Logic Apps cookbook for Business Intelligence

In my previous blog I wrote about how to use Logic Apps in combination with Azure Data Factory to improve ETL processes; read it here: https://jordifrijters.nl/2019/05/31/using-logic-apps-and-azure-data-factory-to-improve-your-etl-processes/. This blog describes how both can be combined and how to check the status of an Azure SQL Database or Data Warehouse.

Now getting that status is useful, but most of the time it only becomes valuable together with an action like scale-up, scale-down, scale-out, pause or resume. This blog can be read as a cookbook with several Logic App recipes that can be leveraged from, for example, Azure Data Factory to build an awesome automated pipeline!

HTTP Request > 202 (Accepted)

Coming from the field of Business Intelligence you are probably used to running ETL packages containing tasks with a couple of statuses like running, succeeded or failed. Such a process waited for a final response before continuing to the next task. With HTTP requests this is different: you get an immediate 202 (Accepted) response that does not hold a final status. It just says: OK, the request is fired. So to see whether a Logic App succeeded in doing what was asked, we can use another Logic App that polls for the current status. The combination of these Logic Apps is what forms the basis of this cookbook.
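To make this concrete, the exchange roughly looks like the sketch below; the Logic App URL and the status values are just placeholders, not taken from a real run.

POST https://prod-00.westeurope.logic.azure.com/workflows/.../triggers/manual/paths/invoke?...
-> 202 Accepted (the request is fired, no final status yet)

GET <status Logic App URL> (a second Logic App polls the REST API for the current state)
-> 200 OK, body contains the current status, e.g. Online or Scaling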

Scaling an Azure SQL Database

The Logic App to accomplish this is shown in the picture above. We start off with a received HTTP request. After saving your Logic App, this trigger holds the unique URL that will trigger this Logic App from, for example, Azure Data Factory.

Because we want to re-use this Logic App for up- and downscaling, we use a Parse JSON component to extract the message that has been sent to this URL.

Finally it fires an HTTP request to the database REST API to scale the database.
Now let's go through these steps one by one.

When a HTTP request is received

As described, we will re-use this Logic App to up- and down-scale, so we need to tell this Logic App which ServiceObjective to scale to. In this example we only scale within the same tier (Standard). Once you get the idea it's easy to add other inputs like tier to this Logic App.

Click on "Use sample payload to generate schema" and enter:

{
  "ServiceObjective": "s3"
}

The JSON schema should look like the image.
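For reference, the schema generated from that payload is roughly:

{
  "type": "object",
  "properties": {
    "ServiceObjective": {
      "type": "string"
    }
  }
}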

Parse JSON

Because we want to use this ServiceObjective input in the HTTP request, we parse the incoming body as JSON so the ServiceObjective can be used directly as dynamic content.

Click in the Content text field and, in the pop-up menu, select the Body from "When a HTTP request is received". Then paste the same JSON schema from the previous component into the Schema field.

HTTP

Choose PUT from the Method dropdown; PUT is needed to perform this change to the database. Paste the URL below, replacing the [] placeholders with your own settings.
https://management.azure.com/subscriptions/[]/resourceGroups/[]/providers/Microsoft.Sql/servers/[]/databases/[]?api-version=2017-10-01-preview

Paste the code below into the Body field. For now only the ServiceObjective is variable, but this could be extended with tier, location, etc.

{
  "location": "westeurope",
  "sku": {
    "name": "@{body('Parse_JSON')?['ServiceObjective']}",
    "tier": "Standard"
  }
}

Choose Managed identity from the Authentication dropdown, but remember to add this Logic App's identity to the SQL Server where this database is hosted. More info about that is in my previous blog under Authentication: https://jordifrijters.nl/2019/05/31/using-logic-apps-and-azure-data-factory-to-improve-your-etl-processes/

Full HTTP reference for PUT can be found at https://docs.microsoft.com/en-us/rest/api/sql/databases/createorupdate

Running the Logic App

The Logic App is finished now, so it's time to test it. Open or create an Azure Data Factory, create a pipeline, add a Web activity to the canvas and paste the URL from the Logic App trigger. Select POST as the method because we want to send something to this Logic App, and specify the Body below:

{
  "ServiceObjective": "s3"
}

These settings would be better off stored in variables and parameters, but that might be content for a later blog.
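For reference, the Web activity in the pipeline JSON would look roughly like the sketch below; the activity name is made up and the Logic App URL is a placeholder for the one copied from your trigger.

{
  "name": "Scale database",
  "type": "WebActivity",
  "typeProperties": {
    "url": "https://prod-00.westeurope.logic.azure.com/workflows/.../triggers/manual/paths/invoke?...",
    "method": "POST",
    "headers": {
      "Content-Type": "application/json"
    },
    "body": {
      "ServiceObjective": "s3"
    }
  }
}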

Now run this pipeline and check out the database in the Azure Portal. It's scaling, right?

Now that our first recipe is ready we can create a lot more in a short time. For example, what if we would like to know when the database is back online?

Getting current state of an Azure SQL Database

Create a new Logic App or clone the previous one and set it up like the image below.

If it is cloned from the previous Logic App, be sure to change all methods to GET (we want to get the status), remove the JSON input and Parse JSON steps, and add a Response action with a status code and body. Also don't forget to add this Logic App's identity to the SQL Server.

From Data Factory, create a pipeline with a Web activity that uses GET as the method and no body, and run!
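As a sketch, the HTTP action in this status Logic App does a GET on the same database URL as before (with your own values in place of the [] placeholders), and the REST API returns, among other things, a status field that can be passed back in the Response action. The response below is trimmed to that field:

GET https://management.azure.com/subscriptions/[]/resourceGroups/[]/providers/Microsoft.Sql/servers/[]/databases/[]?api-version=2017-10-01-preview

{
  "properties": {
    "status": "Online"
  }
}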

Scale and wait until resumed

A great combination could be to scale a database and wait until it is resumed. With both Logic Apps this can easily be accomplished from Azure Data Factory. The example below uses the first Logic App to scale a SQL Data Warehouse. Yes, that's also just a database, with only a tiny difference in the URL: https://management.azure.com/subscriptions/[]/resourceGroups/[]/providers/Microsoft.Sql/servers/[]/databases/[]?api-version=2014-04-01-preview

Until resumed holds an expression that retrieves the current state of the SQL Data Warehouse and finishes once it equals Online:

@equals('Online', coalesce(activity('SQL DW State').output.properties.status, 'null'))

The Until loop runs the Logic App that gets the current state and then waits 60 seconds. As long as the state is not Online, this process continues.
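In pipeline JSON, this Until construction would look roughly like the sketch below; the activity names match the expression above, but the Logic App URL and the timeout are placeholders.

{
  "name": "Until resumed",
  "type": "Until",
  "typeProperties": {
    "expression": {
      "value": "@equals('Online', coalesce(activity('SQL DW State').output.properties.status, 'null'))",
      "type": "Expression"
    },
    "timeout": "0.01:00:00",
    "activities": [
      {
        "name": "SQL DW State",
        "type": "WebActivity",
        "typeProperties": {
          "url": "<status Logic App URL>",
          "method": "GET"
        }
      },
      {
        "name": "Wait 60s",
        "type": "Wait",
        "dependsOn": [
          { "activity": "SQL DW State", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "waitTimeInSeconds": 60
        }
      }
    ]
  }
}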

Pause / unpause SQL Datawarehouse

Another great recipe is a Logic App that can pause or unpause your Azure SQL Data Warehouse so you only run it when needed. That is especially handy for this component, because it's not the cheapest one.

Again, it's just like the first Logic App and even easier. Use POST as the method and the URL below:

https://management.azure.com/subscriptions/[]/resourceGroups/[]/providers/Microsoft.Sql/servers/[]/databases/[]/pause?api-version=2014-04-01-preview

or, to resume:
https://management.azure.com/subscriptions/[]/resourceGroups/[]/providers/Microsoft.Sql/servers/[]/databases/[]/resume?api-version=2014-04-01-preview

Conclusion

It is super easy and useful to combine Logic Apps and Azure Data Factory to dynamically adapt your solution to its needs: resume and scale up when data processing needs to happen, and scale down and pause when nothing is running.

In one of my next blogs I will write about processing Azure Analysis Services via Logic Apps.

Jordi
