How to track data lineage with SQL Server Graph Tables – Part 1 Create Nodes and Edges


Where did this data come from?
How can I trust this data?
What impact will changing this field have on other systems?

If you have ever heard or asked any of these questions, then this series of blog posts is for you.

As data volumes continue to grow, so too does the need to manage the data estate. One critical aspect of managing the data estate is understanding data ancestry. Thankfully, you can leverage SQL Server 2019 Graph tables to track the lineage of one of your most valuable assets: your data.

In this series of blog posts, I will show you how you can use Graph Tables in SQL Server 2019 to capture and report on data lineage.

Prerequisites

  • SQL Server 2019 Developer Edition (You can download a free copy for development use from here)
  • SQL Server Management Studio 18 (You can download a free copy from here)

*Note: at the end of the series I will show you how you can visualize the data using R, Power BI and Plotly Dash, so expect to install Microsoft R Open, Power BI and Python later in the series.

Overview

The diagram below provides a conceptual overview of the various components that we will use during this series of articles. We will start with the Graph tables in the DataLineage database in SQL Server 2019 because they are the core of the solution.

Here is an overview of the initial data model that we will build.

As you can see in the diagram, this series will cover the movement of data from one system to another. At the end of the series I will explain how this initial solution can be extended to include dependencies between objects, such as the dependency of a field in a report on a field in a table in your data warehouse or data lake. This will allow you to track and report on both the movement of data and the inter-dependencies between data entities.

Create the DataLineage database

Once you have SQL Server installed, we will create a new database called DataLineage, which we will use to store the data needed to track lineage. To create the database, run the following SQL command.

CREATE DATABASE DataLineage

Graph tables are first-class entities in SQL Server databases starting with SQL Server 2017. Graph databases are ideal for storing and traversing relationships between items. For additional information on the graph capabilities in SQL Server you can refer to the following site.

https://docs.microsoft.com/en-us/sql/relational-databases/graphs/sql-graph-overview?view=sql-server-ver15

Create the Node table

Now we will create the node table. A node represents a discrete entity in a graph database. In order to maximize the value of our data lineage solution we need to track the movement of data at the lowest level possible, which is the column/field level. In addition to the column/field we also need to capture the table, database and system it is stored in; this will be useful for filtering the data and understanding context. Based on this requirement we will create a node table that stores the column, table, database and system name using the following SQL command.

USE [DataLineage]
GO

DROP TABLE IF EXISTS dbo.DataEntity
GO

CREATE TABLE DataEntity
(
    [ID] INTEGER IDENTITY(0,1) PRIMARY KEY,
    [ColumnName] NVARCHAR(MAX),
    [TableName] NVARCHAR(MAX),
    [DatabaseName] NVARCHAR(MAX),
    [SystemName] NVARCHAR(MAX),
    [CreatedDt] DATE DEFAULT(GETDATE())
) AS NODE;

GO

I have also included a CreatedDt field for auditing purposes. Next, we will create the edge table which will be used to store information regarding the movement of data from source to target.
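Before moving on, here is a minimal, hypothetical example of what a pair of node rows could look like (the system, database, table and column names are made up for illustration; the next article covers populating the table properly):

-- A column in a hypothetical source system
INSERT INTO dbo.DataEntity (ColumnName, TableName, DatabaseName, SystemName)
VALUES ('CustomerName', 'SalesOrder', 'OrderEntryDB', 'OrderEntrySystem');

-- The same column landing in a hypothetical data warehouse
INSERT INTO dbo.DataEntity (ColumnName, TableName, DatabaseName, SystemName)
VALUES ('CustomerName', 'DimCustomer', 'EnterpriseDW', 'DataWarehouse');
GO

The ID and CreatedDt columns are filled in by the IDENTITY and DEFAULT definitions, and SQL Server maintains a $node_id pseudo-column for each row behind the scenes.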

Create the Edge table

An Edge table represents a relationship between nodes in a graph database. In our scenario we want to track the movement of data between source and target. In order to capture this information, we will create an Edge table using the following SQL command.

USE [DataLineage]
GO

DROP TABLE IF EXISTS dbo.DataFlowsTo
GO

CREATE TABLE DataFlowsTo
(
    [ProcessName] NVARCHAR(MAX),
    [CreatedDt] DATE DEFAULT(GETDATE())
) AS EDGE;
GO

The edge table contains a single user-defined column called ProcessName, which is used to capture the name of the ETL process that moves data from source to target. The ProcessName will also be useful for filtering the data, which is helpful in scenarios where we need to view all the fields involved in an ETL process or all the ETL processes dependent on one or more fields. Like the node table, I have included a CreatedDt field for auditing purposes.
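To see how the two tables work together, here is a minimal sketch that records a data flow between the two hypothetical node rows inserted earlier and then traverses the graph with MATCH (the process name is made up for illustration):

-- Record that CustomerName flows from the order entry table to the warehouse dimension
INSERT INTO dbo.DataFlowsTo ($from_id, $to_id, ProcessName)
SELECT src.$node_id, tgt.$node_id, 'ETL_Load_DimCustomer'
FROM dbo.DataEntity AS src, dbo.DataEntity AS tgt
WHERE src.TableName = 'SalesOrder' AND src.ColumnName = 'CustomerName'
  AND tgt.TableName = 'DimCustomer' AND tgt.ColumnName = 'CustomerName';
GO

-- Traverse the graph: list each source column, the process that moves it and its target
SELECT src.ColumnName AS SourceColumn,
       flow.ProcessName,
       tgt.ColumnName AS TargetColumn
FROM dbo.DataEntity AS src,
     dbo.DataFlowsTo AS flow,
     dbo.DataEntity AS tgt
WHERE MATCH(src-(flow)->tgt);
GO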

Summary

In this article we have created the database as well as the node and edge tables required to capture data lineage.

In the next article I will cover how to populate these tables using a database procedure in order to simplify the process of capturing data lineage.

Hopefully you have found this to be another practical post.

Until next time.

Anthony

Abstract and encapsulate with Power BI and SQL Server Table-Valued Functions – Use Case 2: Change results based on a database point in time and a user defined parameter

This is the second article in which I cover different use cases for using SQL Server Table-Valued Functions with Power BI.

In the previous article I showed you how a table valued function can be used to hide lower levels of a hierarchy based on user id. This is handy if you need to prevent certain users from breaking down aggregate measures while interacting with a report or dashboard in Power BI.

In this article I will show you how you can use table valued functions to filter the results based on a date range stored in your database and allow the report author to control how many years of data to bring back with an optional input parameter.

Prerequisites

Use case 2: Change results based on time

This use case comes in handy if you want your results to be filtered based on a date and time in your database and not on the date time of your Power BI users. This situation may occur in a globally distributed system in which Power BI users and the database they are querying are located in different time zones or in situations where the data lags behind the current date/time of your users. I will cover a situation where you want the data set to only show data up to a point in time stored in your database.

Using the Wide World Importers sample database suppose you wanted to limit the orders that a person can report on to the last year in which there were orders in the database. Run the following query.

USE WideWorldImportersDW
GO

SELECT
MIN([Order Date Key]) AS [Earliest Order Date]
, MAX([Order Date Key]) AS [Latest Order Date]
FROM
WideWorldImportersDW.Fact.[Order]
GO

As you can see from the results, we have orders from 2013 up to 2016. If we were to use the Power BI relative date slicer and set it to only show data from the past 1 year, we would not see any results because the relative date slicer is based on today's date (April 6, 2019) and not on a date in the database. One way to overcome this problem is to use a SQL Server table-valued function and encapsulate the logic to only show orders from June 1, 2015 to May 31, 2016. To do this we will create a new function using the code below.

Create Function

Run the following code in the Wide World Importers database to create a new function. We are including an optional parameter so that the report author can change how many years they want to go back when they connect to the data.

--DROP FUNCTION dbo.ufn_Orders_PastYear
--GO


CREATE FUNCTION dbo.ufn_Orders_PastYear(@NumberOfYears INT = NULL)
RETURNS TABLE
AS
RETURN
(
SELECT
*
FROM
WideWorldImportersDW.Fact.[Order] ord
WHERE
ord.[Order Date Key]
BETWEEN
(SELECT DATEADD(year, -ISNULL(@NumberOfYears, 1), MAX([Order Date Key])) FROM WideWorldImportersDW.Fact.[Order])
AND
(SELECT MAX([Order Date Key]) FROM WideWorldImportersDW.Fact.[Order])
);
GO
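Before connecting from Power BI, you can give the function a quick sanity check in SSMS. Because the parameter has a default, you pass the DEFAULT keyword to use the one-year fallback:

-- Default: one year back from the latest order date in the fact table
SELECT COUNT(*) AS OrderRows FROM dbo.ufn_Orders_PastYear(DEFAULT);

-- Explicit: three years back from the latest order date
SELECT COUNT(*) AS OrderRows FROM dbo.ufn_Orders_PastYear(3);
GO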

Notice that I negate the number of years by adding a negative sign in front of the ISNULL function in the SELECT statement; this is to simplify the report authoring experience when using this function. Next, we will make a DirectQuery connection to the function using Power BI.

Connect with Power BI

Similar to before connect to the SQL Server database using a DirectQuery connection and select the function ufn_Orders_PastYear from the list of database objects.

As you can see in the image above the function parameter @NumberOfYears appears in Power BI as an optional parameter. If you leave it blank and click apply it will pull back 1 year’s worth of data based on what is available in the database. You can enter in your own number to control how many years back you query the Orders fact table. Incorporating parameters is a very powerful way to give the report author control of the results.

Once the data has been loaded in let’s visualize it using a simple bar chart.

Your results should look like the image below.

As you can see in the chart we only have data from 2015 to 2016. To make the chart a bit easier to read, let's add a proper date hierarchy. We will need to build it ourselves because we are using DirectQuery to access the data, so the auto-generated date hierarchies are not available; those are only created when you import data into Power BI and set the data type to date.

Create Year Column

Create a new calculated column and use the following DAX code to pull out the year value from the Order Date Key field.

OrderYear = Year([Order Date Key])

Create Month Columns

Next, we will create two month columns one will be used to sort and the other will be used to display the month name on the report.

Use the following DAX code to create a new month number calculated column.

OrderMonth = Month([Order Date Key])

Now create a new calculated column to store the month name using the following DAX code.

Order Month Name = 
SWITCH (
    [OrderMonth],
    1, "January",
    2, "February",
    3, "March",
    4, "April",
    5, "May",
    6, "June",
    7, "July",
    8, "August",
    9, "September",
    10, "October",
    11, "November",
    12, "December"
)

Now we need to set the sort by column property of the Order Month Name to use the value of the OrderMonth column. To do this navigate to the model viewer and click on the Order Month Name field and then set the Sort by column to OrderMonth.

Now we will create a new Hierarchy based on OrderYear and Order Month Name.

Click on the chart and the Axis value with the new hierarchy we just created. Drill down a level to see the years and months.

As you can see this makes the chart much easier to read. Now let's insert some new data into the table and refresh the report.

Add some data

Run the following SQL to create new Date and Order records.

INSERT INTO [Dimension].[Date]
([Date]
,[Day Number]
,[Day]
,[Month]
,[Short Month]
,[Calendar Month Number]
,[Calendar Month Label]
,[Calendar Year]
,[Calendar Year Label]
,[Fiscal Month Number]
,[Fiscal Month Label]
,[Fiscal Year]
,[Fiscal Year Label]
,[ISO Week Number])
VALUES
('4/1/2019'
,1
,1
,'April'
,'Apr'
,4
,'CY2019-Apr'
,2019
,'CY2019'
,6
,'FY2019-APR'
,2019
,'FY2019'
,14)
GO
INSERT INTO [Fact].[Order]
([City Key]
,[Customer Key]
,[Stock Item Key]
,[Order Date Key]
,[Picked Date Key]
,[Salesperson Key]
,[Picker Key]
,[WWI Order ID]
,[WWI Backorder ID]
,[Description]
,[Package]
,[Quantity]
,[Unit Price]
,[Tax Rate]
,[Total Excluding Tax]
,[Tax Amount]
,[Total Including Tax]
,[Lineage Key])
VALUES
(45901
,0
,175
,'4/1/2019'
,'4/1/2019'
,76
,67
,9073
,NULL
,'April Fools ain''t no joke'
,'Each'
,3
,13.00
,15.00
,39.00
,5.85
,44.85
,9)
GO

Refresh the report and notice how it updates so that it only has one column.

This is because there is only one record in the window the function now returns (the year leading up to April 2019).
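If you want to confirm what the function now returns after the insert, a quick check in SSMS (using the default one-year parameter) shows the new date window:

SELECT
MIN([Order Date Key]) AS [Earliest Order Date]
, MAX([Order Date Key]) AS [Latest Order Date]
FROM dbo.ufn_Orders_PastYear(DEFAULT)
GO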

Combining SQL Server database functions with Power BI is a powerful way to abstract and encapsulate logic in the database, thus simplifying the report author's job and ensuring the right data is presented to report consumers.

Hopefully you have found this to be another practical post.

Until next time.

Anthony

Abstract and encapsulate with Power BI and SQL Server Table-Valued Functions – Use Case 1: Change results based on user

If you've ever required a dynamic data source in Power BI that can change based on who the user is, when they are querying the data source, or whether certain data elements have changed, you can leverage the ability of Power BI to connect to a table-valued function in SQL Server.

Table-valued functions allow you to abstract complex business logic from the report author and encapsulate it into a database object. This simplifies report building and enables you to do things like hide hierarchy levels, filter data based on a certain point in time stored in the database or check for certain data conditions and alter the query results as appropriate.

Prerequisites

  • SQL Server 2016 or later. You can download the SQL Server 2017 developer edition HERE
  • Wide World Importers sample database. A copy can be found HERE
  • Power BI Desktop. You can download the latest version from HERE

In this series of articles I will step through several use cases for direct queries from SQL Server Table-valued Functions in Power BI.

Use case 1: Change results based on user

For this first use case we will cover how you can embed some simple logic in your table-valued function to hide lower levels of a hierarchy. This is useful if you want to prevent certain individuals from breaking down aggregated values but still allow them to use data at a summary level.

Create user accounts

For the purposes of simplicity, we will create some users in the database using SQL Server authentication. Connect to your SQL Server database and execute the following SQL code.

USE [master]

GO
--Create Bob
CREATE LOGIN [Bob] WITH PASSWORD='Bob', DEFAULT_DATABASE=[WideWorldImportersDW], DEFAULT_LANGUAGE=[us_english], CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
GO

--Create Mary
CREATE LOGIN [Mary] WITH PASSWORD='Mary', DEFAULT_DATABASE=[WideWorldImportersDW], DEFAULT_LANGUAGE=[us_english], CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
GO

USE [WideWorldImportersDW]
GO

--Grant Bob access to WideWorldImportersDW
CREATE USER [Bob] FOR LOGIN [Bob] WITH DEFAULT_SCHEMA=[dbo]
GO

--Grant Mary access to WideWorldImportersDW
CREATE USER [Mary] FOR LOGIN [Mary] WITH DEFAULT_SCHEMA=[dbo]
GO

--Grant Bob access to read access to WideWorldImportersDW
ALTER ROLE db_datareader ADD MEMBER [Bob]
GO

--Grant Mary access to read access to WideWorldImportersDW
ALTER ROLE db_datareader ADD MEMBER [Mary]
GO

 

Next, we will create a function in SQL Server with the following code.

Create function

Use the code below to create a new Table-Valued function in SQL Server. The function is what we will directly connect to in Power BI.

CREATE FUNCTION dbo.ufn_Customer()  
RETURNS TABLE 
AS
RETURN   
(  
    SELECT  
       [Customer Key]
      ,[WWI Customer ID]
      ,[Customer]
      ,[Bill To Customer]
      ,[Category]
      ,CASE SYSTEM_USER
        WHEN 'Bob' THEN
            NULL
        WHEN 'Mary' THEN
            [Buying Group]
        ELSE
            [Buying Group]
        END AS [Buying Group]
      ,[Primary Contact]
      ,[Postal Code]
      ,[Valid From]
      ,[Valid To]
      ,[Lineage Key]
  FROM [WideWorldImportersDW].[Dimension].[Customer]
);  

GO

As you can see in the code above, I have created a function called dbo.ufn_Customer which returns the data from the Customer dimension table. In the code I have added a simple CASE statement that returns different data for the Buying Group based on who is executing the function.
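If you want to see the effect of the CASE statement without leaving SSMS, you can impersonate each login and query the function directly. This is just a quick sketch and assumes you have permission to impersonate the logins, for example as sysadmin on a development machine:

USE [WideWorldImportersDW]
GO

-- As Bob, the Buying Group column comes back as NULL
EXECUTE AS LOGIN = 'Bob';
SELECT TOP (5) [Customer], [Category], [Buying Group] FROM dbo.ufn_Customer();
REVERT;
GO

-- As Mary, the real Buying Group value is returned
EXECUTE AS LOGIN = 'Mary';
SELECT TOP (5) [Customer], [Category], [Buying Group] FROM dbo.ufn_Customer();
REVERT;
GO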

Next we will bring this function into Power BI and see the results.

Connect with Power BI

Open Power BI and get data from SQL Server. Enter in the server name and select DirectQuery.

Click on OK. Log in using a Database account. First, we will try using Bob.

Click on connect and select the function ufn_Customer from the list of available objects.

If you wanted to force the report author to use the function rather than the actual customer table you can use database security to only expose the function and not the table. I typically use custom database roles and schemas because it is easier to manage and allows me to enable “data discovery with guard rails”.
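As a rough sketch of that approach (the role name below is made up for illustration), you could create a role that is allowed to select from the function but is explicitly denied access to the underlying dimension table. Because the function and the table share the same owner, ownership chaining means selecting through the function still works even though direct access to the table is denied:

-- Hypothetical role for report authors
CREATE ROLE [report_author];
GO

-- Expose the function, hide the table
GRANT SELECT ON dbo.ufn_Customer TO [report_author];
DENY SELECT ON [Dimension].[Customer] TO [report_author];
GO

-- Add the demo users to the role
ALTER ROLE [report_author] ADD MEMBER [Bob];
ALTER ROLE [report_author] ADD MEMBER [Mary];
GO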

Load the data in and create a simple hierarchy using the fields Category and Buying Group.

Drop a Matrix onto the canvas of the report and use the Category Hierarchy you just created for the rows and the Customer Key for the values.

You should have a report that looks similar to the image below. Notice how the Buying Group is null, because for Bob the function is not returning the Buying Group value but NULL instead.

Now let's switch to Mary and see how the lower-level values of the hierarchy appear in the report. Click on Home > Edit Queries > Data Source Settings. Select the data source that you are using for this report and click on Edit Permissions…

In the Edit Permissions pop-up menu click on Edit…, then under Database enter Mary for both the user name and password.

Click on Save and OK and Close. Refresh the report.

Notice how the lower levels of the hierarchy now appear in the matrix visual, because for Mary the actual Buying Group value is being returned, as specified in the function.

NOTE: The reason you need to refresh is caching. When using this technique to obfuscate lower levels of a hierarchy, make sure to build your visuals so that the default view of the report is at an aggregated level, and minimize the amount of caching so that Power BI re-queries the source and updates the results appropriately.

If you want to add an enterprise semantic layer such as an Analysis Services Tabular model and still have the same dynamic results you will need to build your SSAS model using DirectQuery mode because the results of the model need to change and cannot be processed and stored in memory in advance of the user querying it.

In the next article I will cover how you can use a database function to curate the results based on a date time in your database.


Hopefully you have found this to be another practical post.

Until next time.

Anthony

Instant insights, automation and action – Part 6 Integrate Power BI, Power Apps, Azure Machine Learning and Dynamics 365 using MS Flow

This is the last article in a 6-part series in which I will explain how you can integrate Power BI, Power Apps, Azure Machine Learning and Dynamics 365 using MS Flow.

For reference here are the descriptions and links to the previous articles.

Instant insights, automation and action – Part 1 Create Power App

Instant insights, automation and action – Part 2 Create Azure Machine Learning Experiment

Instant insights, automation and action – Part 3 Create the Power BI Report

Instant insights, automation and action – Part 4 Register Power BI in Azure Active Directory

Instant insights, automation and action – Part 5 Integrate with MS Flow

In this article I will explain how you can kick off an MS Flow by adding an action to your Power App, and then how you can integrate the Power App into a Power BI dashboard. Data alerts can be tied to tiles in the Power BI dashboard that kick off additional flows which insert records into Dynamics 365. The complete system is depicted in the diagram below.


Modify the Power APP

In Part 1 of this series we created a simple app that allowed a user to enter new sales data. We now need to go back to this app and modify it. Navigate to Power Apps and edit the app.


Once the app is open click on the submit button to select it and then from the Action menu at the top select Flows.


This will open up a new pane in which you can select the flow that we created in Part 5 of this series. Once you have selected the flow enter the following code into the formula expression bar.

PowerApptoAzureMLtoPowerBIbkp.Run(NAME.Text, CHANNEL.Text, REGION.Text, FRESH.Text, MILK.Text, GROCERY.Text, FROZEN.Text, DETERGENT.Text, DELICASSEN.Text,CATEGORY.Text)


This will execute the flow and pass the data values from each of the text input boxes into the flow. You can test the flow by clicking on the play button in the top right-hand corner of the screen.

Save the app and publish it so that the new version, with the flow attached to the submit button, is available to integrate into Power BI.

Modify the Power BI Report

Next, we will need to modify the Power BI report to drop in a PowerApps visual. Open the Power BI report that we created in Part 3 and add a new custom visual from the marketplace. We need to add the Power App custom visual to the report.


Once the new visual has been successfully added we will add it to a new page in the report. In the Power BI report create a new page and call it Data Entry. We are doing this to keep the report clean and simple. We will integrate various visuals including the Power App in a Power BI Dashboard once we have finished putting the necessary polish in the report.

Drop the new visual onto the canvas of the new page in the report and add any field from the list of fields in the dataset (I used customer name). You should see a screen like the image below.


We are not creating or editing an app since we already built it in Part 1. Click OK and then select Choose app. Select the app we created for entering new wholesale customer sales data.


Click Add. You may see another warning about creating or editing the app; just ignore this by clicking OK.


The new report page should now look like the image below.


Rename Page 1 and call it Wholesale Customer Report. You can spruce up the first page to make it look more appealing. I modified my report to make it look like this.


Once you are happy with the design of the report you need to publish it to Power BI. You can replace the existing report that we created in Part 3. Once the report has been published navigate to the cloud service and go to the report that you just published.

Build the Dashboard

It’s now time to build a dashboard. With the report open pin the following visuals to a new dashboard.


To pin a visual to a dashboard click on the visual and select the pin from the menu bar.


A menu like the one below will pop up. Give the new dashboard a name such as Wholesale customer dashboard.


Select Pin to create the dashboard and add the visual to it. Repeat this for all of the card visuals in the report, except this time select Existing dashboard instead of New dashboard and, if not already selected, pick the Wholesale customer dashboard that we just created.

Next, we will need to pin the Power App visual. Go to the Data Entry page and pin the Power App just like we did for the card visuals. If you are having trouble selecting the pin option you may need to edit the report to pin the visual.

Your dashboard should now look something like this.


Let’s rearrange the tiles and add some new visuals by using Q&A.

First add a new visual by typing the following question in the Q&A bar at the top of the screen.

Fresh by customer sort by fresh

Pin the visual to the existing Wholesale customer dashboard.


Then place this at the bottom of the dashboard.

Repeat these steps using the following questions:

Milk by customer sort by milk

Grocery by customer sort by grocery

Frozen by customer sort by frozen

Detergent paper by customer sort by detergent paper

Delicassen by customer sort by delicassen

Your dashboard should now look similar to the image below.


Try adding a new customer by using the Power App embedded in the Power BI dashboard. After you have entered data into each of the input boxes in the Power App, hit the submit button and in about 5 seconds or less you should see the customer count go up and your new customer on the dashboard in real time. Also try entering a new customer but leave the Category field blank. Notice how even though the field is blank it is still populated by the time it shows up in Power BI; that is because the Azure Machine Learning model is supplying this data.

Integrate with Dynamics 365

The last step is to add a data alert to one of the tiles which will create a record in Dynamics 365. Navigate to the dashboard if not already there and click the … in the top right hand corner of the Fresh tile.


Then select Manage alerts.


This will open a new menu on the right-hand side of the screen. From this screen click + Add alert rule. Create an alert that will fire once the Fresh goes above a certain value. In my case I used 60,000.


For the purposes of this tutorial an alert based on an absolute value is adequate; however, a better choice would be an alert on a relative value such as % change, since you do not want to have to modify the alert to increase its threshold every time you surpass it. Click Save and close.

Go back to Manage alerts for this tile (Fresh) and this time select Use Microsoft Flow to trigger additional actions.



This will launch MS Flow. Use the default template to create a new flow triggered from a Power BI alert.


Use the template and select the alert for Fresh from the Alert id drop-down menu. Next select Add new step and search for Dynamics 365. Then select Create a new record (Dynamics 365).

Your flow should now look like this.


Enter the details for the Dynamics 365 tenant and select the Entity that you want a record created in. For my purposes I created a new task to follow up with the customer by using the tasks entity. Save the flow and test it out by entering new sales data using the Power App embedded in the Power BI report. If you have wired up the flow correctly a new record should be created in Dynamics 365 once you have triggered the data alert in your Power BI dashboard.

We have now reached the end of this series. Hopefully you have realized that by combining Power BI, Power Apps, Flow, Azure Machine Learning and Dynamics 365 you can open up new possibilities that lead to insights, automation and action at the speed of business.

Until next time.

Anthony


Instant insights, automation and action – Part 5 Integrate with MS Flow

This is the fifth post in a series of articles in which I explain how to integrate Power BI, Power Apps, Flow, Azure Machine Learning and Dynamics 365 to rapidly build a functioning system which allows users to analyze, insert, automate and action data.

In the previous article I covered how to create an API enabled dataset.

In this article I will cover how you can use MS Flow to create an automated workflow which will integrate the various components that we have built thus far, as illustrated in the diagram below.


Prerequisites

Before you can complete this tutorial you will need to make sure that you have access to MS Flow and have registered Power BI with Azure AD so that you can push data into an API enabled dataset. For more information on how to register Power BI you can refer to the previous article.

Create the Flow

Log onto MS Flow using the following URL https://us.flow.microsoft.com

Once you have logged into MS Flow click on My Flows and then select New > Create from blank using the drop down menu.

This will land you on a second screen in which you will need to click Create from blank once more.

After clicking create from blank once more you should see a screen like the one below.

The first thing we need to add is a Power Apps Trigger. To do this search for the word PowerApps and select the first result back from the search.

Add Power BI action

Next we will add an action to our flow. Click on + New step after the PowerApps connector and search for Power BI. Then select Add rows to dataset (preview).

This will open a new window in which you will select the workspace, dataset, and table name. The workspace will be called Customer Segmentation and corresponds to the Power BI Workspace you used to create the customer segmentation report. The dataset should be called WholeSaleCustomer and corresponds to the name of the API enabled dataset that we created in Power BI. The table name is the default table name for all API enabled datasets that are created in Power BI.

Next, we will use the dynamic content wizard to get the data from PowerApps into each of the appropriate columns in the API enabled dataset.

NOTE: You must do this next step correctly and in the right order, because as of the time of this writing there is no way to remove elements once they have been created.

Click on Add dynamic content and then in the pop up menu on the right select Power Apps See More.

This will open an additional option to Ask in PowerApps.

Select Ask in PowerApps and notice how flow automatically generates a field in the flow called Addrowstoadataset_CustomerName.

Complete the same steps for each column in the Power BI dataset. Be sure to select Ask in PowerApps for each new field. Once done your action should look like this.

This will automatically create parameters in the flow which will surface in the Power App once the flow has been added to the Power App. As a sneak peek at what I mean, here is a screen clip of the step to call the flow from inside PowerApps; we will do this step later in this tutorial series.

Add Azure Machine Learning action

Now we will add a new action between the PowerApp connector and the Power BI one to call the Azure Machine Learning API. To do this we will use the HTTP event. As of the time of this writing there is no OOTB connector or action to Azure Machine Learning Studio.

After you have added the HTTP event fill out the details as follows.

You can get the URI from the Azure Machine Learning Experiment > Request/Response page. Also, be sure to include the Content-Type and Authorization information. The authorization information needs to include the keyword bearer as well as the API Key which you can get from the Azure Machine Learning Experiment page.

Next, we will add a dynamic body as follows.

This will take the data that comes out of Power Apps and pass it to the Azure Machine Learning API for scoring. The machine learning model responds with the grouping the customer belongs to, determined by a clustering algorithm. The Azure Machine Learning service responds with a JSON document that needs to be parsed in order to get the appropriate information.
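For reference, the request body assembled by the dynamic content follows the standard Request/Response format of an Azure Machine Learning Studio (classic) web service. A hypothetical example is shown below; the column names are illustrative only, so use the exact names from your experiment's Request/Response page, with the Ask in PowerApps tokens in place of the literal values:

{
  "Inputs": {
    "input1": {
      "ColumnNames": ["Channel", "Region", "Fresh", "Milk", "Grocery", "Frozen", "Detergents_Paper", "Delicassen"],
      "Values": [["1", "3", "12669", "9656", "7561", "214", "2674", "1338"]]
    }
  },
  "GlobalParameters": {}
}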

Parse the JSON response

After the HTTP action we will add a Parse JSON action and take the Body of the HTTP response as input. We will also use the following Schema.

{
	"type": "object",
	"properties": {
		"Results": {
			"type": "object",
			"properties": {
				"output1": {
					"type": "object",
					"properties": {
						"type": {
							"type": "string"
						},
						"value": {
							"type": "object",
							"properties": {
								"ColumnNames": {
									"type": "array",
									"items": {
										"type": "string"
									}
								},
								"ColumnTypes": {
									"type": "array",
									"items": {
										"type": "string"
									}
								},
								"Values": {
									"type": "array",
									"items": {
										"type": "array",
										"items": {
											"type": "string"
										}
									}
								}
							}
						}
					}
				}
			}
		}
	}
}

 

This schema can also be generated automatically by supplying the sample payload found in the Sample Response section of the Azure Machine Learning service's Request/Response document.
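For reference, that sample payload has the same shape as the response the experiment returns at run time. A hypothetical example is shown below; the column names and values are illustrative, and the part the flow cares about is the nested Results > output1 > value > Values array, which carries the predicted cluster assignment:

{
  "Results": {
    "output1": {
      "type": "table",
      "value": {
        "ColumnNames": ["Fresh", "Milk", "Grocery", "Frozen", "Detergents_Paper", "Delicassen", "Assignments"],
        "ColumnTypes": ["Numeric", "Numeric", "Numeric", "Numeric", "Numeric", "Numeric", "String"],
        "Values": [["12669", "9656", "7561", "214", "2674", "1338", "2"]]
      }
    }
  }
}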

Your action should look like the image below.


Your flow should now look as follows.

Complete flow with all components

Using the HTTP event is adequate for this tutorial; however, a more robust solution would be to use Azure Functions.

You can grab the sample C# code generated in Azure Machine Learning Studio to jump start the development of the “server-less” function.

Add conditional logic to Category data

Last, we will add some conditional logic to the Category field in the Power BI event. Click on the Category field in the Power BI event and select Expression.

Enter in the following code to the expression box.

if
(
      equals(triggerBody()['Addrowstoadataset_Category'],''),
      first(first(body('Parse_JSON')?['Results']?['output1']?['value']?['Values'])),
      triggerBody()['Addrowstoadataset_Category']
)

This will check to see if the Category field has been filled out in the Power App and use that value; otherwise, if the Category field is blank, it will use the value from the Azure Machine Learning model.

Test the flow by clicking Test in the top right hand corner of the flow.

In the next post I will show you how you can integrate the flow into the Power App and then integrate the app into a Power BI Dashboard.

Hopefully you have found this to be another practical post.

Until next time

Anthony