Monday, 27 February 2017

Can we replace BizTalk with Logic Apps? (POC)

Question :

Can Logic Apps do for us what BizTalk does?

The following are notes taken during a POC exercise done for my company to recreate a BizTalk process in Logic Apps.

In this case, the task is to reliably download an FTP flat file and process it line by line. The Logic Apps orchestration must be able to :

  1. Reliably pull an FTP file off a remote server (like the BizTalk FTP adapter)
  2. Persist the file until it is successfully processed (like the BizTalk MessageBox)
  3. Selectively choose the correct orchestration to process the file based on its contents
  4. Debatch the file and process each line
  5. Notify of any errors
  6. Enable us to reprocess if any of our dependent systems are down
  7. Display a dashboard showing succeeded/failed orchestrations

The approach will be :
  1. Download an FTP flat file (Logic Apps FTP adapter)
  2. Determine what type of file it is (Azure functions)
  3. Put the file in the correct queue for reliable processing (AzureServiceBus)
  4. Pull the file off the queue (ServiceBus trigger)
  5. Convert the flat file to JSON (Azure function)
  6. Process the lines one at a time (Azure for-each)
  7. Do something with each line using the line data (Azure Email activity)
  8. Fail if the file is incorrect and notify IT (Scopes, try-catch; also look at OMS alerts)
  9. Fail if dependent systems are down, notify IT and allow reprocessing (must be idempotent)
  10. Create a dashboard for IT to monitor that shows Succeeded / Failed Workflows (OMS)
  11. Additional : Look at push-based solutions to minimise the higher cost of Azure polling
  12. Calculate the cost
Essentially, re-create in Logic Apps what BizTalk provides.

Let's go :

1) Azure gives you a free 1-month trial with $300 credit to try Azure out. All you need is your email address. Get yourself set up and create a new Azure Logic App.

Choose the FTP Trigger and enter the details for your remote FTP site that you wish to poll.

You can add the FTP settings to poll the remote server.

*NB : After some work I found it easier for this POC to just use an HTTP Request-Response and post my data to the logic app using 'Postman' than to write to an FTP server and wait for the polling interval to complete.

For convenience, this POC will use the 'HTTP Request' and 'HTTP Response' activities instead of the FTP activity; however, it is understood that either can be used.

Config therefore looks like so :

Copy the new URL and open up Postman (download it if you don't have it).

You can now POST data to your new logic app (congratulations, you also just created an API).
We are going to POST the following flat-file data (a simple flat file with 3 lines).


2. Many different messages could arrive at this location. Now let's use an Azure function to determine what sort of message this is.

Create an Azure function. I've called it itg-get-queue-name and used JavaScript as the language. Paste in the following :

module.exports = function (context, data) {
  var content = data.content;

  // Split the file into lines; the first line holds the queue name.
  var lineArray = content.split("\r");
  var firstLine = lineArray[0];

  // The id lives in the second field of the second line.
  var secondLine = lineArray[1];
  var secondLineFieldArray = secondLine.split("|");
  var id = secondLineFieldArray[1];

  // Response of the function to be used later.
  context.res = {
    body: {
      "queue_name" : firstLine,
      "id" : id
    }
  };
  context.done();
};
You can test the code using the [Test] icon and pasting in source data like so :

    "content": "SOCONF_NZKS\r\n

*NB : Note the data has been wrapped in a "content" JSON object

The result should be : {"queue_name":"SOCONF_NZKS","id":"0008000411"}

The code just reads the first line to determine the queue name. You can of course use whatever logic you wish here. This does the job of BizTalk's orchestration filters.
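If you'd rather test outside the portal, the function can also be exercised locally in Node with a mock `context` object. The function body below is the same as above; the mock harness and the sample message contents (after the queue name and id) are hypothetical:

```javascript
// Minimal mock of the Azure Functions context object -- only the
// members this function actually uses (res and done) are stubbed.
var context = {
  res: null,
  done: function () {}
};

// Same logic as the itg-get-queue-name function above.
var getQueueName = function (context, data) {
  var content = data.content;

  var lineArray = content.split("\r");
  var firstLine = lineArray[0];
  var secondLine = lineArray[1];
  var secondLineFieldArray = secondLine.split("|");
  var id = secondLineFieldArray[1];

  context.res = {
    body: {
      "queue_name" : firstLine,
      "id" : id
    }
  };
  context.done();
};

// Hypothetical sample message in the same shape as the POC data.
getQueueName(context, { content: "SOCONF_NZKS\r\nH|0008000411|2017-02-27" });

console.log(context.res.body);
// { queue_name: 'SOCONF_NZKS', id: '0008000411' }
```

This is handy for iterating on the parsing logic without redeploying the function.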

3)  Now that we know the message type, let's stick it in a queue.

Create an Azure 'Service Bus'. Within the bus, create a queue called 'soconf_nzks'

*NB : At time of writing, the online Azure web interface to the bus is a bit limited. I would recommend downloading the very nice 'Service Bus Explorer'. This enables you to inspect the queues etc.

Add a new ESB Send Message activity to your orchestration.

Note the 'queue name' field. You need to switch to code-view and set the "path" field like so :

  "path": "/@{encodeURIComponent(body('itg-get-queue-name').queue_name)}/messages",

Finally add an 'HTTP Response' activity to write something back to the 'HTTP Request'. Set the body to something like :

{"result" : "OK"}

4)  That's it for this orchestration. All it does is read in files and stick them in the correct queue.

We have deliberately kept this logic simple with no external dependencies so we can guarantee all messages are queued. You can extend the script to use whatever logic you need to determine the correct queue. We could have a reject queue or some mechanism of handling unrecognised messages.
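As an illustration of that extension, the routing logic could whitelist known queues and fall back to a reject queue for anything unrecognised. This is a sketch only; the queue names here are hypothetical:

```javascript
// Hypothetical list of queues this integration knows about.
var knownQueues = ["soconf_nzks", "soconf_other"];

// Returns the queue a message should be routed to, falling back
// to a reject queue when the first line is unrecognised.
function resolveQueue(content) {
  var firstLine = content.split("\r")[0];
  var queueName = firstLine.toLowerCase();
  return knownQueues.indexOf(queueName) >= 0 ? queueName : "reject";
}

var good = resolveQueue("SOCONF_NZKS\r\nH|0008000411");
var bad  = resolveQueue("UNKNOWN_TYPE\r\nH|0008000411");
// good === "soconf_nzks", bad === "reject"
```

Messages landing in the reject queue can then be inspected with Service Bus Explorer rather than being silently dropped.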

Now we will set up a second orchestration that will monitor the service bus queue and execute the logic to process our message.

Create a new Logic App. In this case I have called it 'OmegaSalesOrderConfirmation'.

In the designer, use the ESB Read Message activity and set it to poll, e.g., every 10 seconds.

5) Now we need to read the message. I am going to use a small snippet of JavaScript to generically convert the flat file to JSON.

Create a new Azure function like before and call it itg-flatFileToJson

Paste in the following code

module.exports = function (context, data) {
    context.log('Webhook was triggered!');

    var content = data.content;

    // Split by line break
    var lineArray = content.split("\n");

    var output = [];

    for (var line in lineArray) {
        // Skip empty lines
        if (lineArray[line]) {
            var fieldArray = lineArray[line].split("|");

            var node = [];

            for (var field in fieldArray) {
                node.push({"field": fieldArray[field]});
            }
            output.push({"line" : node});
        }
    }

    context.res = {body: output};
    context.done();
};


*NB : You can [Test] it using the same input snippet from step 2

*NB : This is very simplistic. A slightly more advanced version might take in a 'headers' JSON parameter and name the fields according to the headers.
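A headers-aware version could look something like the sketch below. The header names ("type", "id") and sample lines are hypothetical, just to show the shape:

```javascript
// Converts a pipe-delimited flat file to JSON, naming each field
// after the corresponding entry in the supplied headers array.
function flatFileToJson(content, headers) {
  var output = [];
  var lineArray = content.split("\n");

  for (var i = 0; i < lineArray.length; i++) {
    if (!lineArray[i]) continue;          // skip blank lines

    var fieldArray = lineArray[i].split("|");
    var node = {};
    for (var j = 0; j < fieldArray.length; j++) {
      // Fall back to a positional name if no header was supplied.
      var name = headers[j] || ("field" + j);
      node[name] = fieldArray[j];
    }
    output.push(node);
  }
  return output;
}

var rows = flatFileToJson("H|0008000411\nL|WIDGET", ["type", "id"]);
// rows[0] is { type: 'H', id: '0008000411' }
```

Named fields make the later for-each and email steps much more readable than positional `field[0]` indexing.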

Add an Azure function activity to the orchestration

Open the orchestration in code-view and set the "content" tag within "itg-flatFileToJson" to :

  "content" : "@{triggerBody()?['ContentData']}"

6. Choose the for-each activity to loop over the JSON output

7. Send an email containing information from the lines from the file

Within the email, set the 'Body' to @item.line[0].field

You should now receive 3 emails, one for each line of the flat file (each just shows the first field).

8. We want to return an error if the message that arrived could not be processed. Open up the original 'Omega Receive' logic app we created in Step 1.

Add a scope which includes the Azure function, the ESB Send Message and the OK HTTP Response. You can rename the Scope to 'Queue Message' (click the ... in the top right corner).

Add another HTTP Response and set the body to a failure message :

{ "result" : "Fail" }

You can rename the action to "Fail Response"

Save and open the orchestration in code view and find "Fail_Response". Change the 'runAfter' so that the activity runs after the scope has 'Failed'.

                "runAfter": {
                    "Queue_Message": [
                        "Failed"
                    ]
                }
Save and then switch back to the Designer view. You will see something like

Now deliberately make the orchestration fail by changing the first line (so the queue can't be found) and sending via Postman.

You should see a Fail message in postman.

  "result": "Fail"

9. We can fix the message up in Postman. Then open the ESB and disable the queue.
Now running it again will cause another failure.
Click on 'Overview' for the logic app and we can see our failed run

If we click this row, then we see the result of the run and which items failed

Now go back to the ESB queue and re-enable the queue.
Click the 'Re-submit' icon at the top. Now the logic app will re-run and should succeed.

*NB : There is no such thing as the 'resume' that you get in BizTalk, only 'resubmit'. Therefore all orchestrations must be designed to be idempotent (re-runnable).
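A minimal sketch of what idempotent means in practice: record each processed message id and skip duplicates on resubmit. The in-memory store here is a hypothetical stand-in for whatever durable storage you would actually use (e.g. a SQL table keyed on the id):

```javascript
// Stand-in for a durable store of already-processed message ids.
var processedIds = {};

// Processes a message at most once; a resubmitted message with the
// same id is skipped, so re-running the workflow is safe.
function processOnce(message, handler) {
  if (processedIds[message.id]) {
    return "skipped";
  }
  handler(message);
  processedIds[message.id] = true;
  return "processed";
}

var sent = [];
var msg = { id: "0008000411", body: "SOCONF_NZKS" };

var first  = processOnce(msg, function (m) { sent.push(m.id); });
var second = processOnce(msg, function (m) { sent.push(m.id); });
// first === "processed", second === "skipped", only one email sent
```

With this pattern, clicking 'Re-submit' after fixing a dependency never sends the confirmation emails twice.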

10. We can use the Operations Management Suite (OMS) to set up a dashboard to show our failed / succeeded workflows.

Click the Monitor icon on the left of the main dashboard (looks like a fuel gauge).
Choose Diagnostic Logs
Enable this for the Logic Apps you created
You will need to Configure -> Create a new account
Choose to save logs to 'Log Analytics'
Configure for the new OMS workspace

You can then create a new dashboard and show the workflows succeeded and workflows failed

11. The above solution can actually be quite expensive to run. This is because Azure charges us for an activity each time we poll the service bus (or poll an external FTP site)

For example, if an orchestration polls the service bus every 10 seconds then it will cost us

6 x 60 x 24 x 30 = 259,200 activity executions ≈ $10 / month ≈ $120 / year

 Yes, that's right, Azure charges each time a trigger fires, even if it does nothing. 
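The arithmetic behind that estimate, as a one-liner (the dollar figures above still depend on the per-action rate, which isn't repeated here):

```javascript
// Executions per month when a trigger polls every `intervalSeconds`,
// assuming a 30-day month.
function pollingExecutionsPerMonth(intervalSeconds) {
  var perMinute = 60 / intervalSeconds;
  return perMinute * 60 * 24 * 30;   // per minute * minutes * hours * days
}

var executions = pollingExecutionsPerMonth(10);
// 259200 -- every one of which is billed, even when the queue is empty
```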

It turns out we can create our own 'push' trigger using an Azure Web App. In fact, there's a nice one already written (thanks nabeelp!) that, once set up, means we can fire our logic app using a 'push' notification whenever a message arrives on the bus. This means no polling lag and we only pay when a message arrives. OK!

Download the kindly written :

You will need Visual Studio to compile it and publish it to Azure. Then configure the logic app. You can debug it in Visual Studio.

I found this approach worked very nicely (better than the native polling solution), and Azure lets you scale out the Web App if load on it grows.

Because this uses webhooks, we can add lots of orchestrations and let each subscribe to a different queue.

This only handles the push messages for the service bus. It should be possible to also do the same thing for polling FTP sites. Someone please feel free to write an AzureFTPTrigger webapp!

For some nice background on Logic Apps, here is a nice write-up :

***NB : It is now possible to also do this with AzureFunction triggers. See :

12. So what's it going to cost?

Check out the Azure Price Calculator

For the particular orchestration this POC pertains to, we expect it to be fired 1000 times per day, with 7 activities executed per run (3 in one logic app and 4 in the other).
2 Azure functions running
1 Service Bus
1 SQL Database
1 App Service (Web app)
Log Analytics

For the Logic App, we should have 1000 runs x 7 activities x 30 days per month ≈ NZD250 / month.
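For reference, the execution count behind that estimate (the NZD250 figure implies a per-action rate, which isn't restated here):

```javascript
var runsPerDay = 1000;
var activitiesPerRun = 7;   // 3 in the receive app + 4 in the processor
var daysPerMonth = 30;

// Total billable action executions per month for this orchestration.
var actionsPerMonth = runsPerDay * activitiesPerRun * daysPerMonth;
// 210000 executions, which the POC costed at roughly NZD250 / month
```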

Azure functions will be within the free 1 million executions / month. The rest are fixed(ish) costs.

So that's approx. NZD6,000 / year with this one orchestration.
As we add more orchestrations we would expect the Logic Apps part to increase but the others will mostly remain fixed.

BizTalk currently costs around NZD40,000 / year (non-enterprise license, including AWS server costs), so this compares well. (Of course, there is no server admin to worry about either.)

With Logic Apps you pay per execution. So if you are a high-volume user then Logic Apps will get expensive. But if your volumes are low then it can be cheap.


- There is an 'Integration Pack' you can buy for Logic Apps that lets you do XSD schema validation. This is pretty expensive ($1,500/month). This may be useful to larger clients, but you can probably do what you need to do in an Azure Function.
- These Logic Apps are really simple; as you add more activities they get more expensive. In our case, where we expect 1000 runs/day, each activity added costs ~$35/month to run.
- I am assuming I can dodge the FTP polling charge / can poll at an infrequent interval.
