Sunday, December 27, 2015

Year 2015

Time flies when you're having fun. I did have fun this year working with Microsoft Azure and integration at a utility company. I did an IoT project, bridged on-premise applications to the outside world (Service Bus Relays and Queues) and did some old-school integration. What else did I do? Well, you might have guessed: speaking, travelling, and having an occasional beer. Pictures tell more than a thousand words.

Thanks everyone for reading my blog and I will continue to share my knowledge through speaking and writing in 2016.



Monday, October 12, 2015

Upcoming Events

In two weeks' time I will head out to North America. My first stop will be Calgary to visit my friend Kent Weare. My MVP buddies Saravana Kumar and Michael Stephenson will join me, as we will be speaking at the Azure Hybrid Integration event on the 30th of October at Microsoft Calgary.

The Azure Hybrid Integration event will be a full day with Kent, Saravana, Michael, myself and Darren King from Microsoft speaking on the following topics: SaaS connectivity, IoT, Hybrid SQL Server, BizTalk administration & operations, and Two-Speed IT using Microsoft Azure.

The free event takes place on October 30th, 2015 at the Calgary Microsoft office. You can find more details here.

After this event we will head out to Redmond for the yearly MVP Summit. This is a multi-day event hosted in Bellevue and at the Microsoft headquarters in Redmond, Washington. All MVPs around the world who are able to come will be there to connect with fellow MVPs and the various product groups.

I look forward to both events, as they will enable me to engage with fellow MVPs and the integration community in Canada, and to learn more about the future road maps of integration, Microsoft Azure and the direction of Microsoft itself. The IT world around me is changing rapidly, and I notice it personally as my projects shift from an on-premise integration focus to a hybrid solution focus.

For those of you who register for the Calgary event: I am looking forward to meeting you and having some interesting discussions.



Saturday, August 22, 2015

Azure WebJobs: ServiceBusTrigger

The challenge is to build a WebJob that listens to (monitors) a queue in the Microsoft Azure Service Bus within a certain namespace, picks up each message that is sent there by a message producer, and processes it. The WebJob acts as a consumer of the messages on the queue. Below is a high-level diagram of the scenario that will be explained in this post, and of how I approached the challenge.


You can build WebJobs inside Visual Studio by installing the WebJobs SDK. Once you have installed the SDK you have a template available to build a WebJob in C# or Visual Basic.


You can select this template, specify a name for the WebJob and click OK. You will see that a Program class will be created.


And a Functions.cs.

By default a method will be created for you that monitors (listens to) an Azure Storage Queue, not a Service Bus Queue! To have a method that is triggered/executed when a message is written to an Azure Service Bus Queue, you need the ServiceBusTrigger attribute. This is not available in the project by default; you will need to add the Microsoft.Azure.WebJobs.ServiceBus NuGet package.

Now I can change the attribute to ServiceBusTrigger in the ProcessQueueMessage method and specify the queue I want to listen to.
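As a minimal sketch, the adjusted method could look like this (assuming the queue is named inboundqueue, the name used later in this post):

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;

public class Functions
{
    // Triggered when a new message arrives on the Service Bus queue "inboundqueue".
    public static void ProcessQueueMessage(
        [ServiceBusTrigger("inboundqueue")] string message,
        TextWriter log)
    {
        log.WriteLine(message);
    }
}
```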


The next change is switching the type of the message from string to the BrokeredMessage type. This type is not available in your class unless you add a using statement for Microsoft.ServiceBus.Messaging. The assembly is already in the project, because it is part of the imported NuGet package. The TextWriter object can be used to write log statements that can be viewed in the Azure WebJobs Dashboard.

When a message arrives on inboundqueue it will be picked up by the WebJob and enter the ProcessQueueMessage method at runtime. Here I can extract the message body and send it, for instance, to Redis Cache as a key-value pair (I picked this example based on a request from someone on Twitter to share how to do that). To send it to Redis Cache I need to import another NuGet package, i.e. StackExchange.Redis (the client library). The complete code for the Functions class now looks like below:
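The class could look roughly like the following sketch. The app setting name RedisConnection is my own assumption, and note that a single ConnectionMultiplexer is reused across invocations, as the StackExchange.Redis documentation recommends:

```csharp
using System.Configuration;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.ServiceBus.Messaging;
using StackExchange.Redis;

public class Functions
{
    // Reuse one multiplexer for all invocations (recommended by StackExchange.Redis).
    private static readonly ConnectionMultiplexer Redis =
        ConnectionMultiplexer.Connect(ConfigurationManager.AppSettings["RedisConnection"]);

    // Triggered when a message arrives on the Service Bus queue "inboundqueue".
    public static void ProcessQueueMessage(
        [ServiceBusTrigger("inboundqueue")] BrokeredMessage message,
        TextWriter log)
    {
        // Extract the message body as a string.
        string body = message.GetBody<string>();
        log.WriteLine("Received message {0}: {1}", message.MessageId, body);

        // Store the message in Redis Cache as a key-value pair, keyed on the message id.
        IDatabase cache = Redis.GetDatabase();
        cache.StringSet(message.MessageId, body);
    }
}
```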


Before the WebJob can be deployed to a Web App, a few configuration settings have to be made in the app.config. The connection strings for AzureWebJobsDashboard and AzureWebJobsStorage need to be provided in the connectionStrings section. These are required to view the logs in Azure, i.e. the AzureWebJobs Dashboard. The connection string that needs to be specified is the connection string to an Azure Storage account. The format is as follows:

DefaultEndpointsProtocol=https;AccountName=[Storage Account Name];AccountKey=[Access Key]

The other connection string that has to be provided is for AzureWebJobsServiceBus. The format is:

Endpoint=sb://[Namespace];SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[Access Key]

Finally, in the appSettings section, the connection string for Redis needs to be specified. This connection string has the format: [Cache Name],abortConnect=false,ssl=true,password=[password]

Once the configuration is done, the app.config will look like this:
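As a sketch, using the placeholders from the formats above (the app setting name RedisConnection for the Redis connection string is my own choice):

```xml
<configuration>
  <connectionStrings>
    <add name="AzureWebJobsDashboard"
         connectionString="DefaultEndpointsProtocol=https;AccountName=[Storage Account Name];AccountKey=[Access Key]" />
    <add name="AzureWebJobsStorage"
         connectionString="DefaultEndpointsProtocol=https;AccountName=[Storage Account Name];AccountKey=[Access Key]" />
    <add name="AzureWebJobsServiceBus"
         connectionString="Endpoint=sb://[Namespace];SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[Access Key]" />
  </connectionStrings>
  <appSettings>
    <add key="RedisConnection"
         value="[Cache Name],abortConnect=false,ssl=true,password=[password]" />
  </appSettings>
</configuration>
```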


Before the deployment (Publish to Azure) of the WebJob can be done, a configuration setting in the Web App has to be made to enable the AzureWebJobs Dashboard.

This is an important step. If you forget it, viewing the WebJob logs will result in the following error:


Now the WebJob can be deployed via Visual Studio to a WebApp. Right click on the project and choose Publish as Azure WebJob...


You will see a Publish Web dialog, where you import the publish settings of the Web App. These settings can be downloaded from the Azure Portal.


Next you can click OK and you will go to the next section of the dialog, i.e. Connection.


Click Validate Connection to see if the connection info is correct. When it is valid you can click Publish, and the WebJob will be published to the Web App. In the output window of Visual Studio you will see that the deployment was successful.


In the Azure Portal you can see the WebJob in the WebApp.


When you click on the logs URL you will be redirected to the Microsoft Azure WebJobs portal.


Nothing much has happened so far, other than that the job has started. If I send a message to the queue using, for instance, Service Bus Explorer, I will see some action. So let's send a message to the queue via Service Bus Explorer.


Refresh the Azure WebJobs portal and a new entry is available.


Once you click on Functions.ProcessQueueMessage you can examine the logs.


To explore what is in my Redis cache I navigate to the service in the Azure Portal and open a console. There I enter GET followed by the message id.



As you can see the message is now in the Cache.

It took me some time to get the ServiceBusTrigger working. After some digging around I was able to get it working and to observe its behaviour through the Azure WebJobs portal. The trigger is not limited to queues, as it also works for a Service Bus Topic/Subscription. The signature of the method would then look like:
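A sketch of the topic/subscription variant (the topic and subscription names here are made up for the example):

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.ServiceBus.Messaging;

public class Functions
{
    // Triggered when a message arrives on subscription "inboundsubscription"
    // of Service Bus topic "inboundtopic" (illustrative names).
    public static void ProcessTopicMessage(
        [ServiceBusTrigger("inboundtopic", "inboundsubscription")] BrokeredMessage message,
        TextWriter log)
    {
        log.WriteLine("Received: {0}", message.GetBody<string>());
    }
}
```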


Resources to explore with regard to this blog post:




Wednesday, July 01, 2015

Book Promotion: SOA Patterns with BizTalk Server 2013 and Microsoft Azure - Second Edition.

It’s been a while since a BizTalk title hit the market. Colin, Mahindra, Mark and Johann updated Richard Seroter’s well-received book SOA Patterns with BizTalk Server 2009. I loved Richard’s book and it inspired me to write a book myself. We have met on several occasions, and last year I had the privilege of meeting Mark and Johann in person in Sydney. Both are enthusiastic integration professionals who, with Colin and Mahindra’s aid, created and updated this book. Excellent work guys, and respect, as writing a book is a challenge!

This book covers BizTalk, WCF, JSON and REST support in BizTalk, Azure BizTalk Services, Azure Service Bus, SOA, schemas and endpoints, asynchronous communication patterns, versioning and orchestration patterns, and frameworks and tools. That is a broad spectrum of today’s integration capabilities on the Microsoft platform (on-premise and cloud).


Whether you are new to integration, at an intermediate level, or well versed in it, this book is going to give you more insight into Microsoft’s integration world. The thoughts of these experienced authors and Richard’s original manuscript will help you on your journey to build sustainable integration solutions, especially now that the landscape around us is changing fast and integration has become key to success for enterprises to survive.

You can read this book holding it in your hand, or on your tablet or computer, while enjoying a beer. It’s worth the investment and beneficial to your career in integration. Buy the book online at the Packt Online Store, Amazon or perhaps your local bookstore.

Thanks Richard, Colin, Johann, Mark and Mahindra.


Microsoft Integration MVP 2015 – 6th Time in a row!

Today I have received an e-mail from Microsoft with exciting news that my MVP status has been renewed again!


For me this is the sixth time to receive this award. The fifth year in the program has again been an awesome experience, which gave me the opportunity to do great things and meet inspiring, very skilled people. I have had some interesting speaking engagements, which were fun to do and very fulfilling. I learned a lot through speaking, thanks to the community and fellow MVPs. I was able to share my experiences through these speaking gigs and other channels like this blog, the MSDN Gallery, and above all the TechNet Wiki.

I would like to thank:
  • My old MVP lead William Jansen, and my new MVP lead Birgit Huebsch.
  • The BizTalk Product Team, Mark Mortimore, Guru Venkataraman, Ed Price, Mandi Ohlinger, Allesandro Teglia, Dan Rosanova, Jon Fancey, Paolo Salvatori, and all other Microsoft employees involved.
  • People at my former employers: Rene Brauwers, Eldert Grootenboer, fellow MVP Edward Bakker and many others. 
  • At my current company DutchWorkz: Rutger van Hagen and colleagues.
  • The BizTalk Crew: Saravana Kumar (BizTalk360), Nino Crudele, Sandro Pereira, and Tord G. Nordahl
  • Fellow Microsoft Integration MVP's: Richard Seroter, Kent Weare, Mikael Håkansson, Johan Hedberg, Stephen W. Thomas, Mick Badran (Azure), Michael Stephenson, Tomasso Groenendijk, Nicholas Hauenstein, Salvatore Pellitteri, Sam VanHoutte, Glenn Colpeart, Bill Chesnut, Leonid Ganeline, and Ashwin Prabhu, whom I got to know even better and who supported me in this program.
  • The BizTalk community: Mikael Sand, Lex Hegt, Colin Meade, Naushad Alam, Howard S. Edidin, Johann Cooper, Mark Brimble, Mitch VanHelden, Sven Van den Brande, Jérémy Ronk, Maxime Labelle, Jean-Paul Smit, Dean Robertson and the colleagues at Mexia, Martin Abbott, and many others that make the BizTalk community strong!
  • Andrew Slivker from Sentinet.
  • Finally my wife Lian and children Stan, Ellis and Cato for their support.
I’m looking forward to another great year in the program.



Saturday, March 14, 2015

Upcoming speaking engagements in April

In my last post I created some awareness of the upcoming BizTalk Summit in London, on the 13th and 14th of April. This event will be the biggest Microsoft integration focused summit in Europe. The Microsoft BizTalk product group, Microsoft Integration MVPs and veterans will speak about integration, Azure and API Management. This event is once again, like the previous two events, being organized by BizTalk360 in conjunction with Microsoft and the BizTalk product group. There are various reasons to attend, for instance those described in the blog post by one of the speakers, Sandro Pereira.

BizTalk Summit

My topic is hybrid connectivity, and more specifically what the BizTalk Server 2013 R2 platform offers today. You can read the details below.

Hybrid Solutions with the current BizTalk Server 2013 R2 platform

The IT world has changed with the rise of the internet (cloud). Google, Amazon and Microsoft offer a variety of services in the cloud, ranging from storage to applications. Besides them there are a ton of other vendors selling software as a service (SaaS) or providing a dedicated service, for instance Dropbox offering storage on demand. This means that integrating on-premise systems with cloud services and software will generate a new demand. Enterprises now face these challenges, as they will need to integrate on-premise systems that are not likely to move to the cloud, like SAP, with cloud services or solutions. The latest BizTalk Server release, 2013 R2, offers capabilities to fulfill the demand for this new hybrid type of integration solution. In this talk various hybrid integration scenarios will be discussed, along with how you can leverage Microsoft BizTalk Server 2013 R2 to build these solutions.

The week after this event, I will travel to Sweden for the Swedish User Group events in Gothenburg and Stockholm. I have been invited by Johan Hedberg to join him on stage to talk about API Management, both on-premise (Sentinet) and Azure API Management.

The last stop for me in April will be the BizTalk Bootcamp in Charlotte, US. Two years ago I was invited to come over and talk, but I was unable to make it. This time, however, I will be there! I will speak on topics similar to the London event, and will probably do other talks on API Management and BizTalk extensibility. The event is being organized by Mandi Ohlinger, who works for Microsoft and is responsible for a lot of the technical content that you find on Azure BizTalk Services and BizTalk Server. This is a free event, but registration is necessary!

Bootcamp 2

See you on the road at any of these events.


Sunday, February 08, 2015

Bigger, better, louder: BizTalk Summit 2015 London

On April 13-14 in London it is time for the BizTalk Summit 2015, the third time this event is organized by BizTalk360, maker of the world-renowned BizTalk monitoring product. Integration has its momentum now, as the IT landscape has changed completely with the evolution of the cloud, the growth of devices, and being connected to everything and anything. Information technology has reached a completely new level, where we as people are connected and consume tons of data to process and interpret.

Connectivity has become key to enabling us to be connected. This means applications, systems and services need to integrate (communicate) with each other to exchange data: data that resides in multiple places. We will not see all data move to the cloud, for reasons of privacy, regulations and diverse laws. This is another main driver for integration, as data needs to be pushed around.

In London you will hear and learn about Microsoft’s evolving (cloud) application platform: Microsoft Azure with its numerous services, and BizTalk Server, the on-premise integration product, currently in its 9th release. The Microsoft product group, Microsoft Integration MVPs and a few secret guest speakers will deliver interesting, intriguing presentations. One of these will be a presentation by myself.


All the speakers would like to see all of you who care about integration, and would like to meet you and Microsoft to share our devotion to the current and evolving platform. You can register now for the early-bird price until the 15th of March.

See you there!


Sunday, January 25, 2015

BizTalk Server 2013 R2 Integration with Cloud API

In a previous post I described a way to consume a public REST API using the BizTalk WCF-WebHttp adapter in combination with the JSON decoder, a new component in the BizTalk Server 2013 R2 edition. Now I would like to mix things up a bit and consume a different public API: the API of, an online music discovery service that gives you personalised recommendations based on the music you listen to. To use the API of this service you need to register yourself first, because when you call one of the methods of the API you need to include an api_key as one of the parameters of your request. This is not uncommon, as various cloud APIs have this kind of mechanism.


I have the following scenario, where I have built a client application (I am still one of those old-fashioned guys that use Windows Forms). The client application has the following functionality:
  • Get information of an artist.
  • Get the top albums from the artist.
Information and top albums can be obtained by calling the artist methods of the API. The client will call these API methods via BizTalk, similar to calling the Airport Service to retrieve its status in my previous post. Below you find an overview of the scenario.


The communication with the internal endpoint in this scenario will be SOAP/XML. The endpoint is hosted in a two-way receive port (request/response). It exposes schemas for two operations: GetArtistInfo and GetTopAlbums. The request message will subsequently be mapped to a REST call to the service, indicating that the expected response format is JSON or the default XML. BizTalk will decode the response into XML, so that it is published as an XML message in the MessageBox in case the response message is JSON (GetArtistInfo); otherwise it will just be received by the adapter (GetTopAlbums). The receive port will subscribe to that message, so it will be routed back as the response to the client, which in its turn renders it in the UI (Form). This scenario shows that BizTalk acts as a broker and protocol mediator (SOAP/XML --> REST/JSON --> SOAP/XML or SOAP/XML --> REST/XML --> SOAP/XML) between the internal client and the external API.

The solution of the described scenario consists of the following parts that will be discussed in the subsequent paragraphs:
  • Exposing schemas as an internal service with the operations for the client application that will consume it.
  • Creating a custom receive pipeline to enable decoding of the JSON message to XML (see previous post).
  • Configuring a send port with the WCF-WebHttp adapter (or binding if you like), for send and receive.
  • Demonstrating the solution.

Exposing schemas as a service

To support both calls from the client to the API, the request schemas are as follows:

Both request schemas look the same except for the root name. These could be consolidated into one schema; nevertheless I chose to keep each method call isolated. Both schemas contain promoted properties. The elements need to be promoted for the variable mapping later on, when configuring the send port with the WCF-WebHttp adapter to support dynamic URL mapping.
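To illustrate, a GetArtistInfo request instance might look like the sketch below; the element names and namespace are hypothetical, the actual ones come from the deployed schema:

```xml
<GetArtistInfo xmlns="http://lastfm.schemas">
  <!-- Each of these elements is a promoted property used in the URL variable mapping. -->
  <Method>artist.getinfo</Method>
  <Artist>Metallica</Artist>
  <ApiKey>your-api-key</ApiKey>
</GetArtistInfo>
```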

The response for GetArtistInfo will be JSON, and to examine it I will use the Postman application in Google Chrome:


Here you can see that for calling the API you need a method parameter, an artist name, an api_key and a format. However, the format is optional; by default XML will be returned when no format has been specified. The JSON response can be used as an instance for creating an XSD using the JSON Schema Wizard in the Visual Studio BizTalk templates. The schema looks like:

lastfm response schema 1

A similar approach will be used to get an instance of the response to the GetTopAlbums call. This schema will be based on XML. Having the schemas enables me to create an internal service that exposes the two methods.

Once I have the internal service up and running, the next part is to create a custom pipeline for receiving the JSON response from the GetArtistInfo API method call. The JSON decoder will be configured to serialize that response into XML. For GetTopAlbums no specific custom pipeline is necessary. The schemas and the custom pipeline are then deployed to the BizTalk runtime.

Creating and configuring the Send Port with the Web-Http adapter

To be able to communicate with the API and call both methods, I need two send ports configured with the WCF-WebHttp adapter. The API doesn't require authentication other than supplying the api_key as a parameter in the call to any of the API methods. In the General tab of the WCF-WebHttp Transport Properties the address of the service can be specified (URI). Besides the address I need to specify the HTTP method (GET) here and perform a URL mapping.


The URL mapping is interesting, as I need to add a few parameters to my REST call: ?method=artist.getinfo&artist=Metallica&api_key=<your last fm api_key>&format=json, appended to the service address specified above.
My HTTP Method and URL Mapping will look like:
<BtsHttpUrlMapping><Operation Method="GET" Url="/?method={method}&amp;artist={artist}&amp;api_key={api_key}&amp;format=json"/></BtsHttpUrlMapping>

An interesting thing in this URL mapping is that & has to be written as &amp;. If you try to just use the & you will run into an error like the one depicted below:


Next I click Edit… to do the variable mapping, i.e. to map the parameters to the promoted properties of my incoming request message.


Each variable is mapped to the property namespace that defines API_KEY, ARTIST and METHOD.
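For example, with METHOD resolving to artist.getinfo and ARTIST to Metallica on the request, the adapter would expand the mapped URL roughly to the following, relative to the address configured on the General tab (api_key redacted):

```
/?method=artist.getinfo&artist=Metallica&api_key=<your api_key>&format=json
```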
The General tab is important for specifying the address, method and URL mapping. The other tabs are:
  • The Binding tab provides you the ability to configure the time-out and encoding-related properties.
  • The Security tab provides you the ability to define the security capabilities of the WCF-WebHttp send port.
  • The Behaviour tab provides you the ability to configure the endpoint behavior for the send port.
  • The Proxy tab provides you the ability to configure the proxy setting for the WCF-WebHttp send port.
  • The Messages tab provides you the ability to specify how the message is sent to the REST interface.
Note: In this scenario we only use the GET operation of the API service. You have to specify GET in the Suppress Body for Verbs property, because a payload is not required for this operation. Since BizTalk sends out messages with a payload, the body needs to be suppressed!
For further details on configuration settings for these tabs see MSDN article How to Configure a WCF-WebHttp Send Port.

Test the solution

Building the client to call the API methods indirectly via BizTalk was quite some work, as I wanted to render the information nicely in a Windows Form. When I enter an artist name and click GetInfo, a request will be sent to BizTalk and routed to the send port that communicates with the API, requesting info on the band Metallica.


The response message is nicely rendered in the above form. When I click TopAlbums, another request is sent through a different send port that calls a different API method.


If I look at the traffic between BizTalk Server and using Netmon, I can examine what goes over the wire.


This blog post has demonstrated how fairly easy it is to consume a JSON message with BizTalk Server 2013 R2 after invoking a REST API, and how I was able to leverage an API from The cool thing is that with the WCF-WebHttp adapter, BizTalk Server 2013 R2 is capable of communicating with the tons of REST APIs out there in the cloud. And with JSON support, things get less complex. I haven't tried communicating with an API that requires Basic or OAuth authentication; I will probably have to do some custom coding using behaviours, as explained in the blog post from QuickLearn.