[KOSD Series] IP Addresses of Our Azure App Services that need to be Whitelisted by Our API Providers

KOSD, or Kopi-O Siew Dai, is a type of Singapore coffee that I enjoy. It is basically a cup of coffee with a little bit of sugar. This series is meant to blog about technical knowledge that I gained while having a small cup of Kopi-O Siew Dai.

kosd-azure_web_app-powershell

It is a common scenario for developers to integrate with other parties by using their APIs. Most of the time, the APIs are hosted in a locked-down network environment where only whitelisted IP addresses are allowed to call them. We will then be asked to provide the API providers with the IP addresses of our servers.

If it is our web back end that calls the APIs, and we host our web applications on Microsoft Azure App Service, how do we find out these IP addresses?

As mentioned in a discussion about inbound IP addresses by Benjamin Perkins, an Escalation Engineer on the Azure team, an Azure web app normally has about four outbound IP addresses. To retrieve the outbound IP addresses of an Azure web app, we simply need to get them from the Properties of the web app on the Azure Portal.

outbound-ip-addresses-in-azure-app-service.png

Locate the outbound IP addresses here.

We can also get the same result using the Azure Resource Explorer, which is still in preview at the time of writing. Benjamin covered this in a video clip in his article too.

For PowerShell lovers, as pointed out by Adrian Calinescu, one of the commenters on Benjamin’s article, we can use PowerShell to find out the outbound IP addresses too. With the new Azure Cloud Shell, we can simply run the following command on the Azure Portal to retrieve the outbound IP addresses of an Azure web app directly.

Get-AzureRmResource -ResourceGroupName <resource-group-name> -ResourceType Microsoft.Web/sites -ResourceName <web-app-name> | select -expand Properties | Select-Object outboundIpAddresses
outbound-ip-addresses-in-azure-app-service-using-powershell.png

Managing Azure resources using shell directly on a browser.

For those who would like to have their own dedicated set of outbound IP addresses, please check out the App Service Environment (ASE), which grants users control over inbound and outbound application network traffic.

Finally, we can also whitelist all the IP addresses of the Azure datacentres, which can be downloaded here.

azure-datacenter-ip-range-download.png

The list of Microsoft Azure datacentre IP addresses is available on the Microsoft website.

References

[KOSD Series] First Attempt of Deploying ASP .NET Core to Azure Container Service

KOSD, or Kopi-O Siew Dai, is a type of Singapore coffee that I enjoy. It is basically a cup of coffee with a little bit of sugar. This series is meant to blog about technical knowledge that I gained while having a small cup of Kopi-O Siew Dai.

kosd-docker-azure_container_registry-vsts

Last month, after sharing the concepts and use cases of Domain Driven Development, Riza moved on to talk about containers at the Singapore .NET Developers Community sharing session.

microservices-not-equal-to-containers.png

Riza talking about containers. Yes, microservices are not containers!

Learning Motivation

In the beginning of his talk, Riza mentioned GO-JEK, an Indonesian ride-hailing service. Due to their rapid growth, the traditional monolithic architecture could no longer support their business. Hence, they switched to a modern approach which includes moving their apps to containers.

Hence, after the meetup, I was very excited to find out more about microservices and Docker containers. With .NET Core being cross-platform, and as an Azure lover, I was interested in finding out how I can deploy an ASP .NET Core web app to a container in Azure. So, I decided to write this short article to share with my teammates something they can learn while drinking a cup of coffee.

Creating New Project with Docker Support

Since I am trying it out as a personal project, I chose to start with a new ASP .NET Core project. Then, in Visual Studio, I can easily turn it into a Docker-supported app by checking the “Enable Docker Support” option.

enable-docker-support.png

Enable Docker Support

For existing web application projects, we will not have the screen above. Luckily, it is still easy to add Docker support to an existing ASP .NET Core project in Visual Studio.

add-docker-support-to-existing-project

Enabling Docker Support in existing projects.

Then, by pressing F5 to run the project, I managed to get the following screen (the background is customized by me). The message is displayed using the following line.

System.Runtime.InteropServices.RuntimeInformation.OSDescription;
launched-at-localhost.png

Yay, we managed to run the web app inside a Linux container locally.
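
For context, here is a minimal sketch of how that line might be wired up in a controller to produce the message; the controller and view names here are hypothetical, not necessarily the ones generated by the project template.

using System.Runtime.InteropServices;
using Microsoft.AspNetCore.Mvc;

public class HomeController : Controller
{
    public IActionResult Index()
    {
        // OSDescription returns a string describing the OS the app is running on,
        // e.g. a Linux kernel string when the app runs inside a Linux container.
        ViewData["Message"] = RuntimeInformation.OSDescription;
        return View();
    }
}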

Publishing to Microsoft Azure with Continuous Delivery

Even without Continuous Delivery, we can easily right-click the web application to publish it to a Container Registry on Azure.

publishing-to-container-registry

Creating a new Azure Container Registry to which the Docker image will be published.

Then, on Azure Portal, we will see three new resources added. Firstly, we will have the Container Registry.

Then, we will also have an App Service site which runs the image pulled from the Container Registry. Finally, we have an App Service Plan, which needs to be at least B1 because the Free and Shared SKUs are not available for apps running on Linux (the official Microsoft documentation says the App Service Plan should be S1 or larger, though).

container-registry-on-azure.png

Container Registry for my new web app, Changshi.

To enable Continuous Delivery, I chose to use GitHub + Visual Studio Team Services (VSTS). By doing so, a build and release will be started automatically whenever I push code to GitHub.

build-on-vsts

Build history and details on VSTS.

Yup, this is what I have tried out so far in my first step of playing with containers. If you are interested, please check out the references listed below.

References

Exploring Azure Functions for Scheduler

azure-function-documentdb.png

During my first job after finishing my undergraduate degree at NUS, I worked in a local startup which was then the largest bus ticketing portal in Southeast Asia. In 2014, I worked with a senior to successfully migrate the whole system from on-premise to Microsoft Azure Virtual Machines, which is the IaaS option. Maintaining the virtual machines was a painful experience because we needed to set up load balancing with Traffic Manager, database mirroring, database failover, availability sets, etc.

In 2015, when I first worked at Singapore Changi Airport, with the support of the team, we made use of PaaS technologies such as Azure Cloud Services, Azure Web Apps, and Azure SQL, and we successfully expanded our online business to 7 countries in a short time. With the help of the PaaS options in Microsoft Azure, we could finally have a more enjoyable working life.

Azure Functions

Now, in 2017, I decided to explore Azure Functions.

Azure Functions allows developers to focus only on the code for the problem they want to solve, without worrying about the infrastructure (as we do with Azure Virtual Machines) or even the entire application (as we do with Azure Cloud Services).

There are two important benefits that I like in this new option. Firstly, our development can be more productive. Secondly, Azure Functions has two pricing models: Consumption Plan and App Service Plan, as shown in the screenshot below. The Consumption Plan lets us pay per execution and the first 1,000,000 executions are free!

Screen Shot 2017-02-01 at 2.22.01 PM.png

Two hosting plans in Azure Functions: Consumption Plan vs. App Service Plan

After setting up the Function App, we can choose “Quick Start” to have a simpler user interface to get started with Azure Function.

Under the “Quick Start” section, there are three triggers available for us to choose from, i.e. Timer, Data Processing, and Webhook + API. Today, I’ll only talk about the Timer. We will see how we can achieve scheduler functionality on Microsoft Azure.

Screen Shot 2017-02-05 at 11.16.40 PM.png

Quick Start page in Azure Function.

Timer Trigger

The Timer Trigger will execute the function according to a schedule. The schedule is defined using a CRON expression. Let’s say we want our function to be executed every four hours; we can write the schedule as follows.

0 0 */4 * * *

This is similar to how we define a cron job. The CRON expression consists of six fields: second (0-59), minute (0-59), hour (0-23), day of month (1-31), month (1-12), and day of week (0-6).
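
For illustration, here are a few more schedules written in the same six-field format (these examples are mine, not from the Azure documentation):

0 */5 * * * *       every five minutes
0 30 9 * * *        at 9:30am every day
0 0 9 * * 1-5       at 9:00am on weekdays (Monday to Friday)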

Similar to the usual Azure Web Apps, the default time zone used in Azure Functions is also UTC. Hence, if we would like it to use another time zone, we just need to add the WEBSITE_TIME_ZONE application setting in the Function App.
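
For example, assuming we want Singapore time (the value must be a valid Windows time zone name), the setting would look like this:

WEBSITE_TIME_ZONE = Singapore Standard Time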

Companion File: function.json

So, where do we set the schedule? The answer is in a special file called function.json.

In each function’s directory, there always needs to be a function.json file. The function.json file contains the configuration metadata for the function. Normally, a function can have only a single trigger binding and can have zero or more input/output bindings.

The trigger binding will be the place we set the schedule.

{
    "bindings": [
        {
            "name": "myTimer",
            "type": "timerTrigger",
            "direction": "in",
            "schedule": "0 0 */4 * * *"
        },
        ...
    ],
    ...
}

The name attribute specifies the name of the parameter used in the C# function later. It is used for the bound data in the function.

The type attribute specifies the binding type. In our case here, it is timerTrigger.

The direction attribute indicates whether the binding is for receiving data into the function (in) or sending data from the function (out). For scheduler, the direction will be “in” because later in our C# function, we can actually retrieve info from the myTimer parameter.
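
As a minimal sketch (assuming the standard TimerInfo type from the Azure WebJobs SDK), this is the kind of info we can read from that parameter:

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    // IsPastDue is true when this invocation fires later than its scheduled time.
    if (myTimer.IsPastDue)
    {
        log.Info("Timer is running late!");
    }

    // ScheduleStatus.Next tells us when the next execution is expected.
    log.Info($"Next occurrence: {myTimer.ScheduleStatus.Next}");
}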

Finally, the schedule attribute is where we put our CRON expression.

To learn more about bindings in Azure Functions, please refer to the Azure Functions Developer Guide.

Function File: run.csx

The second file that we must have in the function directory is the function itself. For a C# function, it is a file called run.csx.

The .csx format allows developers to focus on just writing the C# function to solve the problem. Instead of wrapping everything in a namespace and class, we just need to define a Run method.

#r "Newtonsoft.Json"

using System;
using Newtonsoft.Json;
...

public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
    ...
}

Assemblies in .csx File

Just as we always do in a C# project, when we need to import namespaces, we use the using clause. For example, in our case, we need to process JSON, so we make use of the Newtonsoft.Json library.

using Newtonsoft.Json;

To reference external assemblies, for example in our case, Newtonsoft.Json, we just need to use the #r directive as follows.

#r "Newtonsoft.Json"

The reason we are allowed to do so is that Newtonsoft.Json and a few other assemblies are special cases. They can be referenced by simple name. As of Jan 2017, the assemblies that can be referenced this way are as follows.

  • Newtonsoft.Json
  • Microsoft.WindowsAzure.Storage
  • Microsoft.ServiceBus
  • Microsoft.AspNet.WebHooks.Receivers
  • Microsoft.AspNet.WebHooks.Common
  • Microsoft.Azure.NotificationHubs

For other assemblies, we need to upload the assembly file, for example MyAssembly.dll, into a bin folder relative to the function first. Only then can we reference it as follows.

#r "MyAssembly.dll"

Async Method in .csx File

Asynchronous programming is a recommended best practice. To make the Run method above asynchronous, we need to use the async keyword and return a Task object. However, developers are advised to always avoid referencing the Task.Result property because it essentially does a busy-wait on a lock of another thread. Holding a lock creates the potential for deadlocks.
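
As a rough sketch of the difference, reusing the GetFacebookGroupFeedsAsJsonAsync helper that appears later in this post:

// Preferred: await the asynchronous call so the thread is released while waiting.
string feedsJson = await GetFacebookGroupFeedsAsJsonAsync(SG_DOT_NET_COMMUNITY_FB_GROUP_ID);

// Avoid: blocking with .Result (or .Wait()) can deadlock the function.
// string feedsJson = GetFacebookGroupFeedsAsJsonAsync(SG_DOT_NET_COMMUNITY_FB_GROUP_ID).Result;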

Inputs in .csx File and DocumentDB

latest-topics-on-dotnet-sg-facebook-group

This section will display the top four latest Facebook posts pulled by Azure Function.


In our case, the purpose of the Azure Function is to process the Facebook Group feeds and then store them somewhere for later use. The “somewhere” here is DocumentDB.

To get the inputs from DocumentDB, we first need to have a second binding specified in the function.json as follows.

{
    "bindings": [
        ...
        {
            "type": "documentDB",
            "name": "inputDocument",
            "databaseName": "feeds-database",
            "collectionName": "facebook-group-feeds",
            "id": "41f7adb1-cadf-491e-9973-28cc3fca57df",
            "connection": "dotnetsg_DOCUMENTDB",
            "direction": "in"
        }
    ],
    ...
}

In the DocumentDB input binding above, the name attribute is, same as in the previous example, used to specify the name of the parameter in the C# function.

The databaseName and collectionName attributes correspond to the names of the database and collection in our DocumentDB, respectively. The id attribute is the Document Id of the document that we want to retrieve. In our case, we store all the Facebook feeds in one document, so we specify the Document Id in the binding directly.

The connection attribute is the name of the Azure Functions application setting storing the connection string of the DocumentDB account endpoint. Yes, Azure Functions also has Application Settings available. =)

Finally, the direction attribute must be “in”.

We can now enhance our Run method to include the input from DocumentDB as follows. What it does is basically just read the existing feeds from the document and then update it with new feeds found in the Singapore .NET Facebook Group.

#r "Newtonsoft.Json"

using System;
using Newtonsoft.Json;
...

private const string SG_DOT_NET_COMMUNITY_FB_GROUP_ID = "1504549153159226";

public static async Task Run(TimerInfo myTimer, dynamic inputDocument, TraceWriter log)
{
    string sgDotNetCommunityFacebookGroupFeedsJson = 
        await GetFacebookGroupFeedsAsJsonAsync(SG_DOT_NET_COMMUNITY_FB_GROUP_ID);
    
    ...

    // Use dynamic so that the Feeds property can be accessed without a concrete type.
    dynamic existingFeeds = JsonConvert.DeserializeObject(inputDocument.ToString());

    // Processing the Facebook Group feeds here...
    // Updating existingFeeds here...

    inputDocument.data = existingFeeds.Feeds;
}

Besides getting input from DocumentDB, we can also have DocumentDB output binding as follows to, for example, write a new document to DocumentDB database.

{
    "bindings": [
        ...
        {
            "type": "documentDB",
            "name": "outputDocument",
            "databaseName": "feeds-database",
            "collectionName": "facebook-group-feeds",
            "id": "41f7adb1-cadf-491e-9973-28cc3fca57df",
            "connection": "dotnetsg_DOCUMENTDB",
            "createIfNotExists": true,
            "direction": "out"
        }
    ],
    ...
}

We don’t really use this in our dotnet.sg case. However, as we can see, there are only two major differences between DocumentDB input and output bindings.

Firstly, we have a new createIfNotExists attribute, which specifies whether to create the DocumentDB database and collection if they don’t exist.

Secondly, we will have to set the direction attribute to be “out”.

Then, in our function code, we just need to have a new parameter “out object outputDocument” instead of “in dynamic inputDocument”.
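
A minimal sketch of what that might look like is below; the document shape is made up for illustration, and note that an out parameter requires a synchronous (non-async) Run method.

public static void Run(TimerInfo myTimer, out object outputDocument, TraceWriter log)
{
    // Whatever is assigned to outputDocument is serialized and written to the
    // collection defined in the DocumentDB output binding as a new document.
    outputDocument = new
    {
        createdAt = DateTime.UtcNow,
        source = "facebook-group-feeds"
    };
}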

You can read the Azure Functions DocumentDB bindings documentation to understand more about how they work together.

Application Settings in Azure Functions

Yes, our familiar features such as Application Settings, Continuous Integration, Kudu, etc. are available in Azure Functions as well. All of them can be found under the “Function App Settings” section.

Screen Shot 2017-02-18 at 4.40.24 PM.png

Azure Function App Settings

As we have been doing in Azure Web Apps, we can also set the time zone and store app secrets in the Function App Settings.
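
Inside run.csx, such a setting can then be read as an environment variable; the setting name FACEBOOK_ACCESS_TOKEN below is just a made-up example.

using System;

// Application settings defined in the Function App are exposed as environment variables.
string facebookAccessToken = Environment.GetEnvironmentVariable("FACEBOOK_ACCESS_TOKEN");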

Deployment of Azure Functions with Github

We are allowed to link the Azure Function with a variety of deployment options, such as GitHub, to enable continuous deployment too.

One thing I’d like to highlight here: if you also start by setting up your new Azure Function via the Azure Portal, then when you later set up continuous deployment for the function, please make sure that you first create a folder with the same name as your Azure Function. All the files related to the function then need to be put in that folder.

For example, in the dotnet.sg case, we have the Azure Function called “TimerTriggerCSharp1”, so we will have the following folder structure.
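
In text form, the repository layout looks roughly like this (the host.json at the repository root is assumed):

FacebookGroupFeedsProcessor/
    host.json
    TimerTriggerCSharp1/
        function.json
        run.csx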

Screen Shot 2017-02-18 at 4.49.11 PM.png

Folder structure of TimerTriggerCSharp1.

When I first started, I made a mistake when linking GitHub with the Azure Function. I didn’t create the folder with the name “TimerTriggerCSharp1”, which is the name of my Azure Function. So, when I deployed the code via GitHub, the code of the Azure Function on the Azure Portal was not updated at all.

In fact, once Continuous Deployment is set up, we are no longer able to edit the code directly on the Azure Portal. Hence, setting up the correct folder structure is important.

Screen Shot 2017-02-18 at 4.52.17 PM.png

The code becomes read-only once we set up Continuous Deployment in Azure Functions.

If you would like to add in more functions, simply create new folders at the same level.

Conclusion

Azure Functions and the whole concept of serverless architecture are still very new to me. However, what I like is the fact that Azure Functions allows us to care only about the code that solves a problem, without worrying about the whole application and infrastructure.

In addition, we are also allowed to solve the different problems using the programming language that best suits the problem.

Finally, Azure Functions is cost-saving because we can choose to pay only for the time our code is being executed.

If you would like to learn more about Azure Functions, here is the list of references I use in this learning journey.

You can check out my code for TimerTriggerCSharp1 above at our Github repository: https://github.com/sg-dotnet/FacebookGroupFeedsProcessor.

Burger and Cheese

xamarin-cognitive-services-android-voicetext

As a web developer, I don’t have many chances to play with mobile app projects. So rather than limit myself to just one field, I love to explore other technologies, especially mobile app development.

Burger Project: My First Xamarin App

Last month, I attended a Xamarin talk at Microsoft Singapore office with my colleague. The talk was about authentication and authorization with social networks such as Facebook and Twitter via Azure App Service: Mobile App.

Ben Ishiyama-Levy talking about how Xamarin and Microsoft Azure work together.

The speaker was Ben Ishiyama-Levy, a Xamarin evangelist. His talk inspired me to further explore how I could retrieve user info from social networks after authenticating the users.

Because I am a geek first and I really wanted to find out more, I continued to read about this topic. With help from my colleague, I developed a simple Xamarin.Android app to demonstrate authentication and retrieval of the logged-in user’s info.

The demo app is called Burger and it can be found on my Github repository: https://github.com/goh-chunlin/Burger.

Challenges in Burger Project

Retrieving user’s info from social networks.

In the Burger project, the first big challenge was to understand how Azure App Service: Mobile Apps works with Xamarin. Luckily, with the material and tutorial given in Ben’s Xamarin talk, I was able to get a quick start on this.

My colleague also shared another tutorial about getting an authenticated user’s personal details on the Universal Windows Platform (UWP). It helped me a lot in understanding how a mobile app and Azure App Service can work together.

My second challenge in this project was to understand the Facebook Graph API. I still remember that I spent quite some time finding out why I could not retrieve the friend list of a logged-in Facebook user. With the introduction of Facebook Graph API 2.0, access to a user’s friends list via /me/friends is limited to just friends using the same app. Hence, after reading a few other online tutorials, I was finally able to get another subset of a user’s friends via /me/taggable_friends.

This project is also the first time I have applied Reflection in a personal project. It helps me easily get the corresponding social network login class with neat and organized code.
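
A rough sketch of the idea is shown below; the class and namespace names are hypothetical and not necessarily the ones used in the Burger repository.

using System;

// Resolve the login provider class, e.g. FacebookLogin or TwitterLogin, by name
// at runtime instead of hard-coding a long switch statement.
public static object CreateLoginProvider(string providerName)
{
    // Type.GetType returns null if the type cannot be found, so a real app should check for that.
    Type providerType = Type.GetType($"Burger.Logins.{providerName}Login");
    return Activator.CreateInstance(providerType);
}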

microsoftdeveloperday

Microsoft Developer Day at NUS, Singapore in May 2016

Cheese Project: When Google Speech Meets MS LUIS on Android

A few months ago, I was fortunate to represent my company at Microsoft Developer Day 2016 at the National University of Singapore (NUS).

That day was the first time Microsoft CEO Satya Nadella came to Singapore. It was also my first time learning about the powerful Cognitive Services and LUIS (Language Understanding Intelligent Service) in Microsoft Azure, in Riza’s talk.

presentation

Riza’s presentation about Microsoft Cognitive APIs during Microsoft Developer Day.

Challenges in Cheese Project

Every day, it takes about one hour for me to reach home from the office. Hence, I only have two to three hours every night to work on personal projects and learning. During weekends, when people are having fun out there, I spend time researching some exciting new technologies.

There are many advanced topics in LUIS. I still remember that when I was learning how LUIS works, my friend was actually playing Rise of the Tomb Raider beside me. So while he was there going phew-phew-phew, I was doing data training on the LUIS web interface.

luis

Microsoft LUIS (Language Understanding Intelligent Service) and intents.

Currently, I have only worked on some simple intents, such as returning the current date and time, as well as understanding which language I want to translate to.

My first idea in the Cheese project was to build an Android app such that if I say “Please translate blah-blah to xxx language”, the app will understand and do the translation accordingly. This can be done quite easily with the help of both LUIS and Google Translate.

After showing this app to my colleagues, we realized one problem in the app. It’s too troublesome for users to keep saying “Please translate blah-blah to xxx language” every time they need to translate something. Hence, I recently changed it to use a GUI for language selection. This, however, reduces the role played by LUIS in this project.

voicetext

VoiceText provides a range of speakers and voices with emotions!

To make the project even more fun, I implemented the VoiceText Web API from Japan in the Android app. The cool thing about this TTS (Text-To-Speech) API is that it allows developers to specify the mood and characteristics of the voice. The challenge, of course, is reading the API documentation, which is written in Japanese. =P

Oh ya, this is the link to my Cheese repository on Github: https://github.com/goh-chunlin/Cheese. I will continue to work on this project while exploring more about LUIS. Stay tuned.

Screenshots of the app: language selection list, Google speech recognition, and a successful translation.

After-Work Personal Projects

There are still more things in mobile app development for me to learn. Even though most of the time I feel exhausted after a long work day, working on new and exciting technologies helps me get energized again in the evening.

I’m not as hardworking as my friends who are willing to sacrifice their sleep for their hobby projects and learning, so the progress of my personal project development is kind of slow. Oh well, at least now I have my little app to help me talk to people when I travel to Hong Kong and Japan next year!

Journey to Microsoft Azure: Good and Bad Times

I told my friends about problems I encountered on Microsoft Azure. One of my friends, Riza, then asked me to share my experience of hosting web applications on Azure during the Singapore Azure Community meetup two weeks ago.

Azure Community March Meetup in the Microsoft Singapore office. (Photo credit: Riza)

Problems with On-Premise Servers

Our web applications were hosted on-premise for about 9 years. Recently, we realized that our systems were running slower and slower. Our clients kept receiving timeout exceptions. At the same time, we also ran out of storage space. We had to drive all the way to the data centre, which is about 15km away from our office, just to connect a 1TB external hard disk to our server.

Hence, in one of our company meetings in June, we finally decided to migrate our web applications and databases to the cloud. None of the developers, besides me, knew about cloud hosting. So, we all agreed to use Microsoft Azure, the only cloud computing platform that I was familiar with.

Self Learning Microsoft Azure on MVA

When I first heard last year that the top management of our company had the intention to migrate our web applications to the cloud, I had already started to learn Azure on Microsoft Virtual Academy (MVA) in my own time and at my own pace.

MVA is an online learning platform for the public to get free IT training, including some useful introductory courses on Microsoft Azure, as listed below.

  1. Establish Microsoft Azure IaaS Technical Fundamentals
  2. Windows Azure for IT Pros Jump Start
  3. Microsoft Azure IaaS Deep Dive Jump Start
  4. SQL Server in Windows Azure Virtual Machines Jump Start

As you may have noticed, the courses above are mostly related to IaaS. This is because IaaS was the most straightforward option for us, since we were going to migrate existing systems and databases from on-premise to the cloud. If we had chosen PaaS, we would have needed to redo our entire code base.

You can enjoy the fun shows presented by David and David on MVA.

If you are more into reading books, you can also check out some free eBooks about Microsoft Azure available on MVA. Personally, I didn’t read any of the books because I found watching the MVA training videos far more interesting.

I learnt after work and during weekends. I started learning Azure around March, and the day we did the migration from on-premise to Azure was in July. So I basically had a crash course in Azure in just four months.

Now I would say that this learning approach is not recommended. If you are going to learn Azure, it’s important to understand the key concepts by reading books and talking to people who are more experienced with Microsoft Azure and networking. Otherwise, you might encounter problems that are hard to fix at a later stage.

Migration at Midnight

Before doing a migration, we had to do some preparation work.

Firstly, we called our clients one by one. This is because we also hosted clients’ websites on our server, so we needed to inform them to update the A record in their DNS. Later, we found out that, in fact, they should have been using a CNAME record so that a change of IP address on our side wouldn’t affect them.
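
For illustration, the difference looks roughly like this (the host names and IP address are made up):

www.client-site.com.    A       52.187.0.10           ; breaks whenever our server IP changes
www.client-site.com.    CNAME   ourapp.cloudapp.net.  ; keeps working as long as our host name resolves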

Secondly, we prepared a file called app_offline.htm. This is a file to put in the root folder of our web applications hosted on our on-premise server. It would show a page telling our online users that our application was under maintenance, no matter which web page the user visited.

Website is under maintenance. Sorry about that!

Finally, we backed up all our databases which were running on our on-premise servers. Because our databases were big, it took about 20-30 minutes just to back up one database. Of course, this could only be done right before we migrated to the cloud.

We chose to do the migration at midnight because we had many online transactions going on during the daytime. In our company, only my senior and I were in charge of doing the migration. The following schedule lists the main activities during our midnight migration.

  • 2am – 3am: Uploading app_offline.htm and backing up databases
  • 3am – 4am: Restoring databases on Azure
  • 4am – 5am: Uploading web applications to Azure and updating DNS in Route 53

Complaints Received on First Day after Migration

We needed to finish the migration by 5am because that is when our clients start logging in to our web applications. So, everything was done in a rush, and thus we received a number of calls from our clients after 6am that day.

Some clients complained that our system had become very slow. It turned out that this had to do with us not putting our web application and databases in the same virtual network (VNet). Without putting them in the same VNet, every time our web application called the databases, the traffic had to go through the Internet instead of an internal connection. Thus the connection was slow and expensive (Azure charged us for outbound data transfer).

We also received calls complaining that their websites were gone. That was actually caused by them not updating their DNS records fast enough.

Another interesting problem was that part of our system was rejected by our client’s network because they only allowed traffic from certain IP addresses. So, we had to give them the new IP address of our Azure server before everything could work on their side again.

Downtime: The Impact and Microsoft Responses

The web applications have been running on Azure for about 8 months, since July 2014. We encountered roughly 10 downtimes. Some were because of our own incorrect setup. Some were due to Azure platform errors, as reported by the Microsoft Azure team.

Our first downtime happened on 4 August 2014, from 12pm to 1:30pm. Our websites expect a high volume of traffic at noon, so the downtime caused us to lose a huge amount of online sales. The cause of the downtime was later reported by the Microsoft Azure team: all our deployments were in the affected cluster in the Southeast Asia data centre.

Traffic Manager Came to Rescue

That was when we started planning to host a backup of all our web applications in another Azure data centre. We then used Traffic Manager to do failover load balancing. We planned to carry that out so that when our primary server went down, the backup server would still be there running fine.

Azure Traffic Manager helps to redirect traffic to deployments in another DC when the current DC fails to work.

In the reply the Microsoft Azure team sent us, they also mentioned that the uptime SLA for virtual machines requires two or more instances. Hence, they highly recommended implementing the availability set configuration for our deployment. Before that, we had always thought that it was sufficient to have one instance running. However, planned maintenance in Azure was, in fact, quite frequent, and sometimes the maintenance took a long time to complete.

Database Mirroring: DB Will Always be Available

So, in addition to Traffic Manager, we also applied database mirroring to our setup. We then had three database servers instead of just one: one as the principal, one as the witness, and one as the mirror. The steps on how we set that up can be found in another post of mine.

Elements in my simple database mirroring setup.

With all of this set up, we thought the downtime would not happen again. However, we soon realized that the database mirroring was not working.

When the principal was down, there was an automatic failover. However, none of our web applications could connect to the mirror. Also, when the original principal came back online, it would remain a mirror until I did a manual failover. After a few experiments with Microsoft engineers, we concluded that this could be due to the fact that our web applications were not in the same virtual network as the database instances.

Availability Set: At Least One VM is Running

Up to this point, I haven’t talked about configuring two virtual machines in an availability set. That is to make sure that, within the same data centre, when one of the virtual machines goes down, another will still be up and running. However, because our web applications were all using an old version of the .NET Framework, the Azure Redis Cache service couldn’t even help.

Our web applications use session state a lot. Hence, without Redis as an external session state provider, we had no choice but to use SQL Server as the external session state provider. Otherwise, we would be limited to running the web applications on only one instance.

Soon, we found out that we couldn’t even use SQL Server mode for session state because some of the values stored in our session were not serialisable. We had no other option but to rely on Traffic Manager at that moment.

In October 2014, a few days after we encountered our third downtime, Microsoft Azure announced a new distribution mode in the Azure Load Balancer, called Source IP Affinity. We were so happy when we heard that because it meant sticky sessions would be possible on Azure. Soon, we successfully configured the second instance in the same availability set.

Source IP Affinity

High Availability

After all this had been done, there were still downtimes or restarts for one of the virtual machines. However, thanks to the load balancer and Traffic Manager, our websites were still up and running. Regarding the random restarts of virtual machines, the Microsoft Azure team investigated the issue and identified that some of them were due to platform bugs.

There is still more work that needs to be done to achieve high availability for our web applications on Azure. If you are interested in finding out more about high availability and disaster recovery on Azure, please read this article from Microsoft Azure.

Migrating Back to On-Premise?

When we were still on-premise, we had only one web server and one database server. However, when we moved to Azure, we had to set up seven servers. So, it is a challenge to explain the increase in cost to the managers.

Sometimes, our developers are also asked by managers whether moving back to on-premise would be a better option. I have no answer for that. However, if we migrated back to on-premise and a downtime happened, who would be in charge of fixing the problems rapidly?

Hence, what we can do now as developers is to learn as much as we can about how to improve the performance and stability of our web applications on Azure. In addition, we will also need to seek help from the Microsoft Azure team, if necessary, to introduce new cloud solutions to our web applications.

Claudia Madobe, the heroine of Microsoft Azure, is cute, but how much do we really know about her? (Image Credit: Microsoft)