Category: Modern data platform

Process Azure Analysis Services through Azure Data Factory & Logic Apps

My first blog was a detailed explanation of how to use Azure Data Factory and Logic Apps together to improve your ETL process. Find it here. That post attracted the majority of user traffic, so I decided it was time for a follow-up!

Azure Analysis Services is often used as an enterprise data model. It can be compared with Power BI (Premium), as they share the same underlying architecture. Microsoft's vision is to make Power BI Premium a superset of Azure Analysis Services features, which might influence your choice of where to host your data model (Power BI Premium or Analysis Services). SQLBI has written a nice blog about choosing Analysis Services for large datasets. Find it here.

Azure Analysis Services supports two different methods of storing data: direct query and import. In a direct query setting, data resides in the source database and is queried at runtime by the Analysis Services engine. This requires the Standard pricing tier, as it is a premium Analysis Services feature. The import mode loads the data into the data model; this load is called processing.

Processing Azure analysis services

There are multiple ways of processing Azure Analysis Services. Besides manual processing, automating the model processing can be accomplished via Azure Functions, Azure Automation, Logic Apps, and even from within SSIS running under the integration runtime in Azure Data Factory. Important to know is that it requires a service principal (app registration) or an actual username and password for authentication.

This blog describes how to set up the processing in combination with Azure Data Factory and Logic Apps.

Getting started

To get started, provision an Azure Analysis Services instance and an Azure Data Factory instance, and prepare a data model in Visual Studio that sources data from, for example, an Azure SQL database. For this blog I will re-use the data model from my previous blog about slicer dependencies.

In a classic Business Intelligence pipeline, Azure Data Factory would function as an orchestrator calling Logic Apps, data flows / SSIS packages, and it would take care of error handling. The purpose is to extract data from sources, transform it, and load it into a data warehouse and data model, enabling the business to better base decisions on facts.

In this last step Analysis Services comes into play as a data model that can be used for easy reporting and self-service analytics. Analysis Services should be processed with data from the data warehouse, probably modelled as a star schema.

This blog is about how to process Azure Analysis Services with the use of Logic Apps in combination with Azure Data Factory.

Initiating the processing

Because Azure Data Factory is going to trigger this new Logic App, we create a Logic App from scratch, triggered by an HTTP request.

The second step is to add an HTTP request so we can call the Azure Analysis Services API to trigger a refresh. All details about the API can be found here.

The base URL should be altered by replacing the region, server name, and model name. /refreshes initiates the actual refresh.

The base url: https://[]/servers/[myserver]/models/[AdventureWorks]/refreshes

Fig 1

Figure 1 shows the Logic App configured thus far. Note the changed URL and the body; this is also described in the API details. For now we have configured a full refresh. One very important setting, and super easy to forget, is to change the asynchronous pattern for this request. This setting can be changed by clicking on the … of the HTTP task. Be sure it's off. The reason is that if we leave this setting on, we get an immediate response from the API, and this does not yet contain the location that we need to check the refresh state.
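For reference, a full-refresh body along the lines of the refresh API documentation looks roughly like this (MaxParallelism and RetryCount are optional tuning values):

```json
{
  "Type": "Full",
  "CommitMode": "transactional",
  "MaxParallelism": 2,
  "RetryCount": 2,
  "Objects": []
}
```

An empty Objects array refreshes the whole model; individual tables or partitions can be listed there instead.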

Authentication can be configured through Active Directory OAuth by providing the app registration details. Other options are possible, but this depends on your approach and policies.

Running this Logic App will trigger Analysis Services to start a full refresh. Triggering this Logic App from Data Factory is as easy as creating a pipeline, dragging a Web activity onto the canvas, and configuring it with the Logic App URL and a sample body, as shown in figure 2.

Fig 2

As we are used to with web requests, we get an immediate response that only tells us the request was successful, not whether the processing succeeded. To find that out we need to check the processing state.

Checking processing state

Compared to checking the state of an Azure SQL database, checking the state of Analysis Services requires the refresh location from the original request. This can be accomplished by returning this location to Data Factory and then using it in combination with the API to get the current processing state.

First the Logic App is extended with a Response task, as shown in figure 3. Note that if the async pattern is still on, this does not yet contain the location URL.

Fig 3

Back in Azure Data Factory, create a variable (unselect all components and navigate to Variables) and drag a Set variable component onto the canvas. Specify the formula to set the location variable:
@activity('Trigger refresh').output.Location. So far the pipeline triggers the refresh and stores the returned location in the Location variable.

To check the state we need a different Logic App. This app will fire an HTTP request against the Analysis Services API based on the location that was returned from the first Logic App. Because we need this location in the Logic App, our HTTP request needs to accept some input. This can be done by providing a body for the request, as shown in figure 4.

Fig 4

The next step in the Logic App is to parse the received JSON and fire the actual HTTP request to get the current processing state. To parse the JSON, add a Parse JSON task, use the body as content, and provide the schema as shown in figure 5.

Fig 5
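Assuming the request body passed from Data Factory looks like { "location": "…" }, the schema for the Parse JSON task would be along these lines:

```json
{
  "type": "object",
  "properties": {
    "location": {
      "type": "string"
    }
  }
}
```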

After parsing the JSON we can add an HTTP request that uses the parsed location to check the current processing state, as shown in figure 6.

Fig 6

The last step is to return this response so we can use it in Data Factory to keep checking until the state is different from InProgress. The complete Logic App, including the details for the response, is shown in figure 7.

Fig 7

Because this Logic App requires a valid URL to check the processing state, we cannot run a test yet. The finishing touch, including testing, can be done from Data Factory.

Putting it all together

In Azure Data Factory, extend the pipeline with an Until loop and use the following expression, as shown in figure 8:

@not(equals('InProgress', coalesce(activity('Check processing state').output.status, 'null')))

It might be smart to configure a reasonable timeout so that if something goes wrong it does not keep polling.

Fig 8

As the activity, use a Web request: set the Method to POST and provide the location variable as part of the body, as shown in figure 9. To prevent this Until loop from running every second, a Wait task of 60 seconds is a smart addition.

Fig 9
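As a sketch of what the Until loop plus Wait task amount to (the helper names here are hypothetical; in the real pipeline the status comes from the "check state" Logic App response):

```python
import time

def poll_until_done(get_status, wait_seconds=60, timeout_seconds=3600, sleep=time.sleep):
    """Keep polling until the refresh status is no longer in progress or the timeout expires.

    get_status is a callable returning the current refresh status string,
    e.g. a wrapper around the 'check processing state' Logic App call.
    """
    waited = 0
    while waited < timeout_seconds:
        status = get_status()
        # Compare case-insensitively; the API and the ADF expression use different casings.
        if status.lower() != "inprogress":
            return status  # e.g. 'succeeded' or 'failed'
        sleep(wait_seconds)  # mirrors the 60-second Wait task in the pipeline
        waited += wait_seconds
    raise TimeoutError("Refresh did not finish within the timeout")
```

Passing a timeout mirrors the advice above: the loop gives up instead of polling forever.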

The pipeline is complete now and can be tested. In the first step the Logic App is triggered, which posts a processing command to the Analysis Services API. It waits for a full response containing the location, which we can later use to check the current state of this processing command. The Logic App responds with this location.

The next step extracts the location from the response and stores it in a variable. Finally, the Until loop fires a web request to the Logic App that gets the current state for this specific processing command. In Data Factory this process is repeated until the response is no longer InProgress, so either succeeded or failed. Figure 10 shows the full pipeline.

Fig 10

This pipeline can be saved as a "Process analysis services" pipeline and can be called from, for example, a master pipeline. This way we get small, separate pipelines that can be combined from a master / orchestrating pipeline.


Processing Analysis Services can be done in different ways. I prefer Logic Apps over Azure Functions because there is no need for any code; it's really transparent and can be re-used for other scenarios. It took me some time to figure out that the key thing in this set-up is to change the async pattern for the HTTP request that fires the process command, but besides that it's super easy to set up.

Logic Apps cookbook for Business Intelligence

In my previous blog I wrote about how to use Logic Apps in combination with Azure Data Factory to improve ETL processes. Read it here. That blog describes how both can be combined and how to check the status of an Azure SQL database or data warehouse.

Now, getting that status is useful, but most of the time it is only useful together with an action like scale-up, scale-down, scale-out, pause, or resume. This blog can be seen as a cookbook with several Logic App recipes that can be leveraged from, for example, Azure Data Factory to build an awesome automated pipeline!

HTTP Request > 202 (Accepted)

Coming from the field of Business Intelligence, you are probably used to running ETL packages containing tasks that have a couple of statuses like running, succeeded, or failed. What happens there is that the process waits for a final response before continuing to the next task. With HTTP requests this is different. You get an immediate response 202 (if successful) that does not hold a final status. It just says: OK, the request has been fired. So to see whether a Logic App succeeded in doing what was asked, we can use another Logic App that polls for the current status. The combination of these Logic Apps is what forms the basis of this cookbook.

Scaling an Azure SQL Database

The Logic App to accomplish this is shown in the picture above. We start off with a received HTTP request. After saving your Logic App, this holds the unique URL that will trigger this Logic App from, for example, Azure Data Factory.

Because we want to re-use this Logic App for up- and downscaling, we use a Parse JSON component to extract the message that has been sent to this URL.

Finally it fires an HTTP request to the database REST API to scale the database.
Now let's go through these steps one by one.

When a HTTP request is received

As described, we will re-use this Logic App to up- and downscale, so we need to tell this Logic App which ServiceObjective to scale to. In this example we only scale within the same tier (Basic). Once you get the idea, it's easy to add other inputs, like tier, to this Logic App.

Click on the “Use sample payload to generate schema” and enter:
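The sample payload itself is not shown here; given the body used later in this recipe, it would be something like this (the value is a hypothetical example; any valid service objective works):

```json
{
  "ServiceObjective": "S1"
}
```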


The JSON schema should look like the image.

Parse JSON

Because we want to use this ServiceObjective input in the HTTP request, we parse the body so it is valid JSON and the ServiceObjective can be used directly.

Click in the Content text field and, in the pop-up menu, select the Body from "When a HTTP request is received". Paste the same JSON schema from the previous component into the Schema field.


Choose PUT from the Method dropdown; PUT is needed to perform this change on the database. Paste the URL below, replacing the [] parts with your own settings:[]/resourceGroups/[]/providers/Microsoft.Sql/servers/[]/databases/[]?api-version=2017-10-01-preview

Paste the code below into the Body field. For now only the ServiceObjective is variable, but this could be extended with tier, location, etc.

  "location": "westeurope",
  "sku": {
    "name": "@{body('Parse_JSON')?['ServiceObjective']}",
    "tier": "Standard"

Choose Managed identity from the Authentication dropdown, but remember to add this Logic App's identity to the SQL Server where the database is hosted. More info about that in my previous blog under Authentication.

Full HTTP reference for PUT can be found at

Running the Logic App

The Logic App is finished now, so it's time to test it. Open or create an Azure Data Factory, create a pipeline, add a Web component to the canvas, and paste the URL. Select POST as the method, because we want to send something to this Logic App, and specify the body below.
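The body was shown in an image; assuming the schema from earlier, it would be something like (again a hypothetical service objective):

```json
{
  "ServiceObjective": "S1"
}
```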


These settings would be better stored in variables and parameters, but that might be content for a later blog.

Now run this pipeline and check out the database in the Azure Portal. It's scaling, right?

Now that our first recipe is ready, we can create a lot more in a short time. For example: what if we would like to know when the database is back online?

Getting current state of an Azure SQL Database

Create a new Logic App or clone the previous. Set up like the image below.

If it is cloned from the previous Logic App, be sure to change all methods to GET (we want to get the status), remove all JSON input and parsing, and add a Response task with status code and body. Also don't forget to add this identity to the SQL Server.

From Data Factory, create a pipeline with a Web activity, GET as the method, no body, and run!

Scale and wait until resumed

A great combination could be to scale a database and wait until it is resumed. With both Logic Apps this can easily be accomplished through Azure Data Factory. The example below uses the first Logic App to scale a SQL Data Warehouse. Yes, that's also just a database, with a tiny difference in the URL:[]/resourceGroups/[]/providers/Microsoft.Sql/servers/[]/databases/[]?api-version=2014-04-01-preview

The Until loop holds an expression that retrieves the current state of the SQL Data Warehouse and finishes when it equals Online:

@equals('Online', coalesce(activity('SQL DW State').output.properties.status, 'null'))

The Until loop runs the Logic App for getting the current state and then a 60-second Wait. While the state is not Online, this process continues.

Pause / unpause SQL Datawarehouse

Another great recipe is a Logic App that can pause or unpause your Azure SQL Data Warehouse so it only runs when needed. This is very handy especially for this component, because it's not the cheapest one.

Again it's just like the first Logic App, and even easier. Use POST as the method and the URL below:[]/resourceGroups/[]/providers/Microsoft.Sql/servers/[]/databases/[]/pause?api-version=2014-04-01-preview

or, to resume:[]/resourceGroups/[]/providers/Microsoft.Sql/servers/[]/databases/[]/resume?api-version=2014-04-01-preview


It is super easy and useful to combine Logic Apps and Azure Data Factory to dynamically adapt your solution to its needs: resume and scale up when data processing needs to happen, scale down and pause when no data processing is happening.

In one of my next blogs I will write about processing Azure Analysis Services via Logic Apps.

Using Logic Apps and Azure Data Factory to improve your ETL processes

Logic Apps are great when it comes to performing recurring tasks. There are a bunch of connectors available, and it seems your own creativity is the limit of what you can do with them. Of course, when it comes to complex, long-running tasks it might be better to look at a different solution.

Another great product is Azure Data Factory, especially after version 2 was released and a visual GUI made it so much easier to build data-driven pipelines.

The combination of these two tools is great, and I use them together a lot. Logic Apps can be triggered from an HTTP request, and in Azure Data Factory you have a Web task available by default. So to run a Logic App from Azure Data Factory, you simply add its URL to a Web task and you're good to go.

The example above fires a web request to the URL that is automatically created after a Logic App is created from an HTTP trigger.

Depending on what the logic app does you need to deal with authentication. For now we leave it this way and start by creating a simple logic app.

New logic app

Go to Logic Apps in the Azure Portal, click Add, and specify a name. In this case I will name it "MyLogicApp".
After submitting, your Logic App should be in the list; if not, hit refresh. Open your Logic App and on the overview page click "View in Logic Apps designer".

This is where the magic starts. Scroll down and find the “Blank Logic App”

Search for “Http” and select the following

From the "Add new parameter" dropdown, select Method and choose GET.
Hit Save and look for the URL that is now available in the "HTTP POST URL" field.

The logic App should look like this now.

If you run this logic app you can see that it succeeds. Of course, there is nothing that can fail 🙂

New Azure data factory pipeline

Go to Data factories in the Azure Portal. Create a data factory if you have none; otherwise open the data factory by selecting it and choosing "Author & Monitor". This brings you to the development GUI for Azure Data Factory.

Under Factory resources hit the “+” and select New Pipeline

From the Activities menu, drag a Web task (under General) onto the canvas and give it a name. Select the Web task and go to Settings. Paste the Logic App URL into the URL field and select the method GET.

This method depends on the method we selected while creating the Logic App. There are several methods, but the most used are GET, POST, PUT, and DELETE. Some reference to these methods:

Run the pipeline by selecting Debug.

This should result in green, succeeded messages. This is the first step in running web requests from Azure data factory.

A practical example

In a modern data platform architecture it might be interesting to do some up- and downscaling before starting a long-running ETL process. In this example we are going to check the current configuration of an Azure SQL database. This might be the first step: after we find out the current service level objective, we can create another Logic App to up- or downscale this database.

Create an Azure SQL database if there is none available. This can be done under SQL Databases in the Azure Portal. If there is no SQL Server, this should be created as well, but that can be done from the database creation wizard.

Navigate back to Logic Apps and create a new Logic App, or go back to the designer interface of the previous Logic App. Start again with an HTTP Request and hit the "+ New step" button. Choose HTTP.

By selecting HTTP, this Logic App is going to perform an HTTP request itself to get the state of a SQL Server database.

Fill in the fields Method, URL, and Authentication.

Change the URL parts between []:[]/resourceGroups/[]/providers/Microsoft.Sql/servers/[]/databases/[]?api-version=2017-10-01-preview

Full reference on the HTTP requests can be found at

It's time to save and run this Logic App. After running, you will notice that it fails to get the database status. We did specify Managed Identity as our authentication, but we did not give this identity access to our SQL Server. Before we go into this part, let me quickly explain the two types of authentication I normally use.


From the Authentication dropdown a couple of options are available. I am not familiar with all of them, but I prefer to use Active Directory OAuth or managed identity. With OAuth you need to create an app registration within Active Directory, which can be referred to as a service principal with a username (unique identifier) and a password. Together with the tenant ID, this account can be used to authenticate on the Azure components you give it permission to.

Managed identities are not yet fully available on all Azure components, but they are quickly gaining ground. For each Azure service an identity is automatically created in Azure Active Directory that can be used between Azure services without putting usernames and passwords in your code. Reference:

In our Logic App we selected the Managed Identity but did not yet give it permission. Time to do so. First we need to set up this identity for the Logic App, because for Logic Apps it is not created automatically in Azure AD. From your Logic App, navigate to Identity and toggle the status to On.

Hit Save and navigate to your Azure SQL Server. Navigate to Access control, click Add, select “Contributor” as role and now search for your Logic App. Select it and hit save.

Now go back to your logic app and rerun! This can be done directly from Run trigger in the Overview tab. It succeeds!

At this point we have created a Logic App that fires an HTTP request to our SQL Server database to return its current state, based on managed identity authentication. Back to Azure Data Factory to tie both together and do something with this result.

Combining Logic App and Azure data factory

Navigate back to your data factory. Create a new pipeline and put a Web task on the canvas. Specify the Logic App URL, GET as its method, and run the pipeline. As you can see it succeeds, but the response is blank.
You can see this response in the output window by selecting the outgoing arrow under Actions. The actual response is:
{ "Response": "" }

This is because we did not specify an output in our Logic App. This is easy to set up. Navigate back to your Logic App and open the designer. Add a new task by searching for Response.

After selecting the Status code field, a popup appears. Select the status code from there and do the same with the Body field.

Go back to your Azure Data Factory pipeline and re-run it. Check the output; it should look like below.

    "sku": {
        "name": "Basic",
        "tier": "Basic",
        "capacity": 5
    "kind": "v12.0,user",
    "properties": {
        "collation": "SQL_Latin1_General_CP1_CI_AS",
        "maxSizeBytes": 2147483648,
        "status": "Online",
        "databaseId": "",
        "creationDate": "",
        "currentServiceObjectiveName": "Basic",
        "requestedServiceObjectiveName": "Basic",
        "defaultSecondaryLocation": "northeurope",
        "catalogCollation": "SQL_Latin1_General_CP1_CI_AS",
        "zoneRedundant": false,
        "earliestRestoreDate": "",
        "readScale": "Disabled",
        "readReplicaCount": 0,
        "currentSku": {
            "name": "Basic",
            "tier": "Basic",
            "capacity": 5
    "location": "westeurope",
    "id": "",
    "name": "",
    "type": "Microsoft.Sql/servers/databases"

Now add an If task to the canvas and drag the green line from the Web task to the If task. Select the If task, go to Settings, and click the "Add dynamic content" text. Specify this expression: @equals('Basic', coalesce(activity('Get database state').output.sku.tier, 'null'))
Be aware that "Get database state" is the name of the Web component.

It should look like this now

Now go to Activities and click the button under true. Add a Wait task to the canvas and now run this pipeline. In the output window you can see that the wait task was triggered so now we could add another web request to scale this database to standard or premium.

Like I said before, with enough creativity great processes can be created with Logic Apps and Azure Data Factory.

But why Logic Apps?

Everything we've done so far with Logic Apps can also be accomplished with Azure Data Factory alone. For example, we could have used the web request to the database from our Logic App directly in Azure Data Factory.

The URL is exactly the same as in the Logic App, the method is also GET, and MSI is selected with the default Azure managed identity URL. So why use Logic Apps at all?

If you run processes or tasks that are not initiated from Azure data factory or you have multiple data factories running it is a good idea to put these actions into a Logic App. With this you create a higher level of flexibility and if anything needs to be changed you only have to change the Logic App.

If you only run processes from Azure data factory you could just create a couple pipelines with each a specific task that can be referred from another pipeline by using the “Execute Pipeline” task.

One of my next blogs will be full of examples of web requests that will improve your ETL process. Start thinking about:

  • Up-/downscale a database / data warehouse
  • Pause / resume a data warehouse
  • Process Azure Analysis Services
  • Start / stop the integration runtime


There are a lot of possibilities with web requests, run from Azure Data Factory alone or in combination with Logic Apps, to get a more dynamic approach to your ETL processes. If you can dynamically scale your whole architecture only when needed, this can deliver great performance improvements while lowering your costs when everything is scaled down or paused after execution.