Dataflows for Power BI with Dynamics GP Data: Part 1

I was really excited about the Dataflows for Power BI announcement in November and finally got a chance to play around with it.  If you haven’t heard about Dataflows or don’t know what they are, here is an excerpt from that announcement:

“In the modern BI world, data preparation is considered the most difficult, expensive, and time-consuming task, estimated by experts as taking 60%-80% of the time and cost of a typical analytics project. Some of the challenges in those projects include fragmented and incomplete data, complex system integration, business data without any structural consistency, and of course, a high skillset barrier. Specialized expertise, typically reserved for data warehousing professionals, is often required. Such advanced skills are rare and expensive.

To answer many of these challenges, Power BI serves analysts today with industry leading data preparation capabilities using Power Query in Power BI Desktop. Now, with Power BI dataflows, we’re bringing these self-service data preparation capabilities into the Power BI online service, and significantly expanding the capabilities in the following ways:

  • Self-service data prep for big data in Power BI – Dataflows can be used to easily ingest, cleanse, transform, integrate, enrich, and schematize data from a large array of transactional and observational sources, encompassing all data preparation logic. Previously, ETL logic could only be included within datasets in Power BI, copied over and over between datasets and bound to dataset management settings. With dataflows, ETL logic is elevated to a first-class artifact within Power BI and includes dedicated authoring and management experiences. Business analysts and BI professionals can use dataflows to handle the most complex data preparation challenges and build on each other’s work, thanks to a revolutionary model-driven calculation engine, which takes care of all the transformation and dependency logic—cutting time, cost, and expertise to a fraction of what’s traditionally been required for those tasks. Better yet, analysts can now easily create dataflows using familiar self-service tools, such as the well-known Power Query data preparation experience. Dataflows are created and easily managed in app workspaces, enjoying all the capabilities that the Power BI service has to offer, such as permission management, scheduled refreshes, and more.
  • Advanced Analytics and AI with Azure – Power BI dataflows store data in Azure Data Lake Storage Gen2 – which means that data ingested through a Power BI dataflow is now available to data engineers and data scientists to leverage the full power of Azure Data Services such as Azure Machine Learning, Azure Databricks, and Azure SQL Datawarehouse for advanced analytics and AI. This allows business analysts, data engineers, and data scientists to collaborate on the same data within their organization.
  • Support for the Common Data Model – The Common Data Model (CDM) is a set of a standardized data schemas and a metadata system to allow consistency of data and its meaning across applications and business processes.  Dataflows support the CDM by offering easy mapping from any data in any shape into the standard CDM entities, such as Account, Contact etc. Dataflows also land the data, both standard and custom entities, in schematized CDM form. Business analysts can take advantage of the standard schema and its semantic consistency, or customize their entities based on their unique needs. The Common Data Model continues to evolve as part of the recently announced Open Data Initiative

Once dataflows are created, users can use Power BI Desktop and the Power BI service to create datasets, reports, dashboards, and apps to drive deep insights into their business.”

Microsoft has released a white paper on Dataflows, and you can also check out Matthew Roche’s great blog posts at BI Polar.  Both are great guides to get you started and up to speed on Dataflows. 

Getting started with Dataflows

So, let’s try doing some of this ourselves with Dynamics GP data.  To do this, you will need to have already installed an enterprise (on-premises data) gateway connected to your Dynamics GP SQL Server.  Log into your Power BI Pro or Premium service and create a new workspace.  You should now see a “Dataflows (Preview)” option.  Click on that option, then click the “+ Create” button and select Dataflow from the dropdown menu.

Creating your Entities

This will open the window below with the option to “Add new entities” or “Add linked entities.”  Let’s select “Add new entities” so we can add a date dimension to our Dataflow.

From the “Choose data source” window, select the “Blank query” tile.

This will open up the “Connect to data source” window.  Copy and paste the below Power Query code to create your date dimension.


let
  Source = List.Dates(StartDate, Length, #duration(1, 0, 0, 0)),
  #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
  #"Renamed Columns" = Table.RenameColumns(#"Converted to Table", {{"Column1", "Date"}}),
  #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns", {{"Date", type date}}),
  StartDate = Date.StartOfYear(Date.AddYears(DateTime.Date(DateTime.LocalNow()), -10)),
  Today = Date.EndOfYear(Date.AddYears(DateTime.Date(DateTime.LocalNow()), 5)),
  Length = Duration.Days(Today - StartDate),
  Custom1 = #"Changed Type",
  #"Inserted Year" = Table.AddColumn(Custom1, "Fin Year", each Date.Year([Date] + #duration(184, 0, 0, 0)), Int64.Type),
  #"Inserted Month Name" = Table.AddColumn(#"Inserted Year", "Month Name", each Date.MonthName([Date]), type text),
  #"Inserted Day Name" = Table.AddColumn(#"Inserted Month Name", "Day Name", each Date.DayOfWeekName([Date]), type text),
  #"Inserted Month" = Table.AddColumn(#"Inserted Day Name", "Fin Month", each if Date.Month([Date]) >= 7 then Date.Month([Date]) - 6 else Date.Month([Date]) + 6, Int64.Type),
  #"Inserted Day of Week" = Table.AddColumn(#"Inserted Month", "Day of Week", each Date.DayOfWeek([Date]) + 1, Int64.Type),
  #"Inserted First Characters" = Table.AddColumn(#"Inserted Day of Week", "Short Month", each Text.Start([Month Name], 3), type text),
  #"Inserted First Characters1" = Table.AddColumn(#"Inserted First Characters", "DDD", each Text.Start([Day Name], 3), type text),
  #"Reordered Columns" = Table.ReorderColumns(#"Inserted First Characters1", {"Date", "Fin Year", "Month Name", "Short Month", "Fin Month", "Day Name", "DDD", "Day of Week"}),
  #"Added Custom1" = Table.AddColumn(#"Reordered Columns", "Month Number", each Date.Month([Date])),
  #"Inserted Start of Month" = Table.AddColumn(#"Added Custom1", "Start of Month", each Date.StartOfMonth([Date]), type date),
  #"Inserted End of Month" = Table.AddColumn(#"Inserted Start of Month", "End of Month", each Date.EndOfMonth([Start of Month]), type date),
  #"Duplicated Column" = Table.DuplicateColumn(#"Inserted End of Month", "Date", "Date - Copy"),
  #"Calculated Quarter" = Table.TransformColumns(#"Duplicated Column", {{"Date - Copy", Date.QuarterOfYear, Int64.Type}}),
  #"Renamed Columns1" = Table.RenameColumns(#"Calculated Quarter", {{"Date - Copy", "Quarter Number"}}),
  #"Inserted Merged Column" = Table.AddColumn(#"Renamed Columns1", "Merged", each Text.Combine({"Q", Text.From([Quarter Number], "en-US")}), type text),
  #"Current Date" = Table.AddColumn(#"Inserted Merged Column", "Current Date", each Date.From(DateTimeZone.FixedLocalNow())),
  #"Added Custom10" = Table.AddColumn(#"Current Date", "Is Work Day", each if Date.DayOfWeek([Date]) >= 0 and Date.DayOfWeek([Date]) <= 4 then "Is Work Day" else "Weekend"),
  #"Renamed Columns2" = Table.RenameColumns(#"Added Custom10", {{"Merged", "Calendar Quarter"}, {"DDD", "Short Day"}}),
  #"Duplicated Column1" = Table.DuplicateColumn(#"Renamed Columns2", "Date", "Date - Copy"),
  #"Extracted Year" = Table.TransformColumns(#"Duplicated Column1", {{"Date - Copy", Date.Year, Int64.Type}}),
  #"Renamed Columns3" = Table.RenameColumns(#"Extracted Year", {{"Date - Copy", "Calendar Year"}}),
  #"Changed Type3" = Table.TransformColumnTypes(#"Renamed Columns3", {{"Fin Year", Int64.Type}, {"Current Date", type date}}),
  #"Added Custom Column" = Table.AddColumn(#"Changed Type3", "DateKey", each Text.Combine({Date.ToText([Date], "yyyy"), Date.ToText([Date], "MM"), Date.ToText([Date], "dd")}), Int64.Type),
  #"Transform columns" = Table.TransformColumnTypes(#"Added Custom Column", {{"Month Number", type text}, {"Is Work Day", type text}}),
  #"Replace errors" = Table.ReplaceErrorValues(#"Transform columns", {{"Month Number", null}, {"Is Work Day", null}}),
  #"Added Custom" = Table.AddColumn(#"Replace errors", "YYYYMM", each Text.Combine({Date.ToText([Date], "yyyy"), Date.ToText([Date], "MM")})),
  #"Added Custom3" = Table.AddColumn(#"Added Custom", "YYYY-MM", each Text.Combine({Date.ToText([Date], "yyyy"), "-", Date.ToText([Date], "MM")})),
  #"Changed Type1" = Table.TransformColumnTypes(#"Added Custom3", {{"YYYYMM", type text}, {"YYYY-MM", type text}})
in
  #"Changed Type1"
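A note on the fiscal-year steps in the query: shifting each date forward by 184 days (roughly six months) before taking the year, and remapping July–December to months 1–6, produces a July-start fiscal year, as used in Australia. As a sanity check, the same two calculations can be sketched in Python (the function names here are just for illustration):

```python
from datetime import date, timedelta

def fin_year(d: date) -> int:
    # Mirrors Date.Year([Date] + #duration(184,0,0,0)): pushing a date
    # forward ~6 months moves July-December dates into the next
    # calendar year, so FY2020 covers Jul 2019 - Jun 2020.
    return (d + timedelta(days=184)).year

def fin_month(d: date) -> int:
    # Mirrors the "Fin Month" step: July becomes month 1, June month 12.
    return d.month - 6 if d.month >= 7 else d.month + 6

print(fin_year(date(2019, 7, 1)))    # first day of fiscal year 2020
print(fin_month(date(2019, 7, 1)))   # July is fiscal month 1
```

If your organization uses a different fiscal-year start, adjust the duration offset and the month threshold accordingly.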

In the “Edit queries” window, I changed the Name to “Date” and then clicked “Done”.
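One column in the query worth calling out is DateKey, which renders each date as a yyyyMMdd integer. This is a common surrogate-key format for date dimensions because the integer sorts in chronological order and joins cleanly to fact tables that store dates the same way. The idea, sketched in Python with an illustrative function name:

```python
from datetime import date

def date_key(d: date) -> int:
    # Same result as concatenating yyyy, MM, dd and typing as Int64:
    # 2019-03-07 -> 20190307
    return d.year * 10000 + d.month * 100 + d.day

print(date_key(date(2019, 3, 7)))  # 20190307
```

Because the year occupies the most significant digits, comparing two DateKey values always agrees with comparing the underlying dates.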

That’s enough for today’s blog post.  In part 2, we will create entities for our customer, salesperson, sales territory, product, sales transactions, and inventory transactions, build a custom function in our dataflow to help clean up our customer data, and then connect Power BI Desktop to our dataflows so we can create the relationships and DAX measures needed to start visualizing our Dynamics GP data.

Ready to Do Even More with Your Data?

Start organizing, knowing and executing on your data today with dataflows and Power BI to provide a self-service data lake in the future. KTL Solutions works with business leaders every day in helping them lead their organization into becoming a data-driven organization. Need help executing on your data? Contact us today.
