Deployment Pipeline for LogicApps
Azure LogicApps are sometimes marketed as the cloud replacement for BizTalk. They allow you to define an integration workflow “visually”. The service is very attractive because of the hundreds of integration adapters it offers, but it lacks¹ some development tools, and documentation on how to integrate it into a software development pipeline.
This post describes a couple of techniques I’ve been using to automate the deployment of LogicApps.
Authoring LogicApps
As opposed to BizTalk, which you can run locally, LogicApps need to be authored within a subscription². This is the first problem to circumvent.
In my use case, I author an application that extracts data from Dynamics 365 and Azure DevOps, pushes it to a storage account, and presents it in a webapp. Before it is released, I want to validate it on a test environment. Then, if everything is right (tests pass), I deploy to production.
Since LogicApp authoring needs to be tied to an Azure environment, I also need a development environment, which is used to edit the workflows. The authoring is done in the portal using the usual LogicApp interface³:
The only way to deploy a LogicApp is through ARM templates. So when I’m happy with what I have, I use the “Export Template” tab of the LogicApp to export its content, and save that into my project as a JSON document⁴ in a `logicapp` folder. It looks like this:
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"workflows_crm_export_accounts_name": {
"defaultValue": "crm-export-accounts",
"type": "String"
},
"connections_azureblob_externalid": {
"defaultValue": "/subscriptions/6b4c4699-2167-477f-bc06-24f6c3db80c5/resourceGroups/dev-better-tool/providers/Microsoft.Web/connections/azureblob",
"type": "String"
},
"connections_dynamicscrmonline_externalid": {
"defaultValue": "/subscriptions/6b4c4699-2167-477f-bc06-24f6c3db80c5/resourceGroups/dev-better-tool/providers/Microsoft.Web/connections/dynamicscrmonline",
"type": "String"
}
},
"variables": {},
"resources": [
{
"type": "Microsoft.Logic/workflows",
"apiVersion": "2017-07-01",
"name": "[parameters('workflows_crm_export_accounts_name')]",
"location": "canadacentral",
//...
In the JSON above, you can see the first issue: since the LogicApp is tied to a resource group, everything is hard-coded to API connections that are local to that resource group, and the location is hard-coded too. When I re-deploy that LogicApp to another environment, I want it to use that environment’s connections, not the development environment’s. Something needs to be done.
The first step is to “parameterize” those connections so that the LogicApp gets reconfigured at deploy time. This is done by replacing the hard-coded value `/subscriptions/6b4c4699-2167-477f-bc06-24f6c3db80c5/resourceGroups/dev-better-tool/providers/Microsoft.Web/connections/azureblob` with something dynamic: `[concat(resourceGroup().id, '/providers/Microsoft.Web/connections/azureblob')]`.
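For example, after the rewrite, the azureblob connection parameter from the exported template above ends up with a resource-group-relative default value:
    "connections_azureblob_externalid": {
        "defaultValue": "[concat(resourceGroup().id, '/providers/Microsoft.Web/connections/azureblob')]",
        "type": "String"
    }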
I wrote a small node app to do that automatically. Every time I update an ARM template that contains a LogicApp in my solution, I run this script in bash, and all my LogicApps get “parameterized”, which allows me to deploy them to other groups / subscriptions:
#!/bin/bash
# Parameterize every exported LogicApp template in the logicapp folder.
for LOGICAPP in logicapp/logicapp-*.json; do
    node logicapp/parameterize-logicapp.js "$LOGICAPP"
done
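The node app itself isn’t reproduced here, but a minimal sketch of what such a `parameterize-logicapp.js` can look like is below (the regex and the assumption that the hard-coded ids live in parameter default values are mine, based on the exported template shown above — not necessarily the exact script):
// parameterize-logicapp.js -- minimal sketch, assumptions noted above.
// Rewrites hard-coded API connection ids in an exported LogicApp ARM template
// into resource-group-relative expressions so the template deploys anywhere.
const fs = require('fs');

const file = process.argv[2];
const template = JSON.parse(fs.readFileSync(file, 'utf8'));

for (const param of Object.values(template.parameters || {})) {
    const value = param.defaultValue;
    // Only touch parameters whose default value is a full connection resource id.
    const match = typeof value === 'string' &&
        value.match(/^\/subscriptions\/.+\/providers\/Microsoft\.Web\/connections\/(.+)$/);
    if (match) {
        param.defaultValue =
            "[concat(resourceGroup().id, '/providers/Microsoft.Web/connections/" + match[1] + "')]";
    }
}

fs.writeFileSync(file, JSON.stringify(template, null, 4));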
Environment Creation
LogicApps are made up of two resource types: the app itself, and “API connections”, which contain the binding and authentication information needed to connect to the various providers. The nice thing about this split is that when you re-deploy the app, the connections are kept in the resource group, and no re-authentication is required.
The ARM template I’ve extracted so far only contains the app itself. Creating an environment requires creating those API connections as well. We use the same technique as with the LogicApp to extract the ARM template: open the API connection, export a template, and use that as a basis.
Depending on the connector, it may or may not be possible to provision the credentials automatically.
I assemble all my API connections into a tidy ARM template:
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"location": {
"type": "string",
"defaultValue": "canadacentral"
},
"storageAccountName": {
"type": "string"
},
"storageAccountKey": {
"type": "securestring"
}
},
"variables": {},
"resources": [
{
"type": "Microsoft.Web/connections",
"apiVersion": "2016-06-01",
"name": "azureblob",
"location": "[parameters('location')]",
"properties": {
"displayName": "azureblob",
"parameterValues": {
"accountName": "[parameters('storageAccountName')]",
"accessKey": "[parameters('storageAccountKey')]"
},
"api": {
"id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', resourceGroup().location , '/managedApis/azureblob')]"
}
}
},
//[...]
{
"type": "Microsoft.Web/connections",
"apiVersion": "2016-06-01",
"name": "visualstudioteamservices",
"location": "[parameters('location')]",
"properties": {
"displayName": "visualstudioteamservices",
"api": {
"id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', resourceGroup().location , '/managedApis/visualstudioteamservices')]"
}
}
},
//[...]
]
}
As you can see in this example:
- I use the same “parameterization” trick to make it deployable to any group.
- Storage Account connections can be provided with storage credentials at deploy time⁵ (see the sketch after this list).
- DevOps and Dynamics CRM connections need a token that must be retrieved through an OAuth authorization flow executed after deployment. This post gave me good directions on how to retrieve the field names available for each connector type.
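As footnote 5 hints, the field names vary per connector: an azuretables connection takes `storageaccount` and `sharedkey` instead of azureblob’s `accountName` and `accessKey`. A sketch of what that resource can look like, following the same pattern as the azureblob connection above (field names taken from the footnote, the rest is an assumption):
{
    "type": "Microsoft.Web/connections",
    "apiVersion": "2016-06-01",
    "name": "azuretables",
    "location": "[parameters('location')]",
    "properties": {
        "displayName": "azuretables",
        "parameterValues": {
            "storageaccount": "[parameters('storageAccountName')]",
            "sharedkey": "[parameters('storageAccountKey')]"
        },
        "api": {
            "id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', resourceGroup().location, '/managedApis/azuretables')]"
        }
    }
}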
Once the ARM template gets deployed, the blob storage and table storage connections are ready for use; Dynamics and DevOps require authorization. This is done simply by opening the API connection in the portal and clicking the big “Authorize” button in the “Edit API connection” tab.
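If you want to check from a script whether a connection still needs to be authorized, the generic `az resource` command can surface its status. A sketch, assuming the `properties.statuses` field (which is what the portal blade displays):
# Shows "Connected" once the OAuth consent has been granted, "Error" otherwise.
az resource show -g "$DEFAULT_RESOURCE_GROUP" -n dynamicscrmonline \
    --resource-type Microsoft.Web/connections \
    --query 'properties.statuses[0].status' -o tsv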
In a previous post I mentioned how I usually provision environments. This case is no different, and I have a `provision.sh` script in which you’d find the following lines⁶:
echo "Deploying ARM template api-connections"
az group deployment create -g "$DEFAULT_RESOURCE_GROUP" --template-file logicapp/api.json --name "api-connections" --parameters location=$LOCATION storageAccountName=$DEFAULT_STORAGE_ACCOUNT storageAccountKey=$DEFAULT_STORAGE_ACCOUNT_KEY --query 'properties.provisioningState' -o tsv
echo "Deploying LogicApps"
az group deployment create -g "$DEFAULT_RESOURCE_GROUP" --template-file logicapp/logicapp-export-accounts.json --name "crm-export-accounts" --query 'properties.provisioningState' -o tsv
az group deployment create -g "$DEFAULT_RESOURCE_GROUP" --template-file logicapp/logicapp-export-devops-projects.json --name "crm-export-devops-projects" --query 'properties.provisioningState' -o tsv
I run this script once per environment: it creates all the API connections, then deploys the LogicApps I need. I then go to the DevOps and Dynamics CRM API connections to run the authorization flow, which I only need to do once, since API connections are persistent and decoupled from the LogicApps.
Pipeline
The last step is to put together an actual deployment pipeline with Azure DevOps.
The first step is to “build” the artifact. My LogicApps are all neatly organized in a `logicapp` folder, so this is relatively simple and just consists of a “publish artifact” step⁷.
Next I want to publish those LogicApps to test and prod. I’m not deploying them to dev, because the only reason that environment exists is to author the LogicApps.
I create a task group that deploys the ARM templates for all my LogicApps (but not the API connections; those I want to keep as-is in each environment). The deployment tasks define the target resource group using a parameter `$(RESOURCE_GROUP)`.
Then I include the task group in both test and prod, specifying the resource group to target as a parameter:
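The task group is built in the classic editor, so there is no YAML to show, but what it runs for each environment boils down to the same CLI calls as `provision.sh`. A sketch, assuming `RESOURCE_GROUP` is injected by the pipeline:
# What the task group amounts to for one environment; RESOURCE_GROUP
# is the task group parameter supplied by each stage.
for LOGICAPP in logicapp/logicapp-*.json; do
    az group deployment create -g "$RESOURCE_GROUP" \
        --template-file "$LOGICAPP" \
        --query 'properties.provisioningState' -o tsv
done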
Conclusion
This gives me some degree of automation and repeatability for authoring LogicApps and deploying them from one environment to the next. Since there is no way to run them locally or in isolation, they can’t be unit-tested; you have to rely on integration testing in the test environment to enable continuous integration⁸.
Notes
1. This is my own, non-corporate opinion. Microsoft is great, please don’t fire me! ↩
2. While you can technically author them disconnected by editing a JSON document, I suspect no one has ever done that. ↩
3. You can also use Visual Studio’s interface; they are the same. For this project I’m using Visual Studio Code, so I don’t have that choice. ↩
4. And NOT the “code view”. They are not the same, and we need the actual ARM template for later. ↩
5. Notably, AzureTables and AzureBlob don’t have the same field names (azuretables requires `storageaccount` and `sharedkey`, if you wonder). Go figure… ↩
6. If you’ve been using my provision tool, the YAML template looks like so:
deployment:
  location: canadacentral
  resources:
    - type: storageaccount
      containers:
        - export
      tables:
        - accounts
    - type: snippet
      name: Grab account key
      provisioning: |
        echo "Getting account key for $DEFAULT_STORAGE_ACCOUNT"
        DEFAULT_STORAGE_ACCOUNT_KEY=`az storage account keys list -n $DEFAULT_STORAGE_ACCOUNT --query "[0].value" -o tsv`
      summary: |
        echo "You need to authenticate API connections for Dynamics and Azure Devops before LogicApps can run."
      order: 5
    - type: deployment
      file: logicapp/api.json
      parameters:
        - location=$LOCATION
        - storageAccountName=$DEFAULT_STORAGE_ACCOUNT
        - storageAccountKey=$DEFAULT_STORAGE_ACCOUNT_KEY
      name: api-connections
    - type: deployment
      file: logicapp/logicapp-export-accounts.json
      name: crm-export-accounts
    - type: deployment
      file: logicapp/logicapp-export-devops-projects.json
      name: crm-export-devops-projects
↩
7. The other steps build the rest of the solution. ↩
8. Arguably this is OK, since we’re talking about a tool that is heavily integration-oriented. ↩