Azure Blob Storage and File API

Azure Blob Storage - Azure SDK - ASP .NET MVC - Entity Framework - HTML5

When my applications were hosted on Windows Azure Virtual Machines (VMs), we stored the images uploaded via our web applications on the hard disks of the VMs (excluding the temporary disk). However, when we started load balancing, we soon ran into a problem: each uploaded image could only be found on one of the VMs. So we needed a centralized storage for those images.

Recently, when we moved to Azure PaaS (aka Cloud Service), we encountered the same issue even without load balancing. That is simply because the hard drives used by Cloud Service instances are not persistent. Hence, a persistent file storage on the cloud is needed.

IaaS vs. PaaS

Blob Storage

Azure Blob Storage, according to the Azure documentation, is a service for storing large amounts of unstructured data that can be accessed from anywhere via HTTP or HTTPS. Hence, it is an ideal choice for our persistent image storage in the cloud.

There are two types of blobs, Page Blobs and Block Blobs. Page Blobs are commonly used for storing VHD files for VMs because they are optimized for random read and write operations.

For most uploaded files, it’s recommended to store them as Block Blobs because large files will be split into smaller blocks and then uploaded concurrently. Hence, Block Blobs are designed to give us faster uploads and better throughput, which is great for image upload.

The maximum size of a Block Blob that can be uploaded in a single operation is 64 MB. Hence, if the uploaded file is larger than 64 MB, we must upload it as a set of blocks; otherwise, we will receive status code 413 (Request Entity Too Large). For my web applications, there is rarely a need to upload an image larger than 5 MB. Hence, I can just limit the size of the images before the user uploads them.

HttpPostedFileBase imageUpload;
...
// Reject anything larger than 5 MB (5 * 1024 * 1024 = 5242880 bytes) or empty.
if (imageUpload.ContentLength <= 0 || imageUpload.ContentLength > 5242880)
{
    //warn the user to resize the image before uploading
}
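
By the way, if you ever do need to accept larger files, the storage client can do the block splitting and parallel upload for you. Below is a minimal sketch, assuming the classic Microsoft.WindowsAzure.Storage SDK used in this post; the two option properties come from that SDK and the values are only examples.

var blobClient = storageAccount.CreateCloudBlobClient();

// Upload as a set of blocks once the file is bigger than this threshold (1 MB to 64 MB allowed).
blobClient.DefaultRequestOptions.SingleBlobUploadThresholdInBytes = 4 * 1024 * 1024;

// Upload up to 4 blocks concurrently for better throughput.
blobClient.DefaultRequestOptions.ParallelOperationThreadCount = 4;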

Let’s Try Uploading Images

I’m going to share how to upload more than one image to Azure Blob Storage from an ASP .NET MVC 5 application. If you are going to upload just one image, simply remove the loop and change the List to a single item in the code below.

First of all, I create a class to handle upload to Azure Storage operation.

public class AzureStorage
{
    public static async Task<CloudBlockBlob> UploadAndSaveBlobAsync(
        HttpPostedFileBase imageFile, CloudBlobContainer container)
    {
        string blobName = Guid.NewGuid().ToString() + 
            Path.GetExtension(imageFile.FileName);

        CloudBlockBlob imageBlob = container.GetBlockBlobReference(blobName);
        using (var fileStream = imageFile.InputStream) 
        {
            await imageBlob.UploadFromStreamAsync(fileStream);
        }

        return imageBlob;
    }
}

So, in my controller, I have the following piece of code, which will be called when an image is submitted via the web page.

[HttpPost]
[ValidateAntiForgeryToken]
public async Task<ActionResult> Create(
    [Bind(Include = "ImageUpload")] PhotoViewModel model)
{
    var validImageTypes = new string[] { "image/jpeg", "image/pjpeg", "image/png" };
    
    if (ModelState.IsValid) 
    {
        if (model.ImageUpload != null && model.ImageUpload.Count() > 0)
        {
            var storageAccount = CloudStorageAccount.Parse 
                (WebConfigurationManager.AppSettings["StorageConnectionString"]);

            var blobClient = storageAccount.CreateCloudBlobClient();
            blobClient.DefaultRequestOptions.RetryPolicy = 
                new LinearRetry(TimeSpan.FromSeconds(3), 3);  

            var imagesBlobContainer = blobClient.GetContainerReference("images");
            foreach (var item in model.ImageUpload) 
            { 
                if (item == null) {
                    continue;
                }
                
                if (validImageTypes.Contains(item.ContentType) && 
                    item.ContentLength > 0 && item.ContentLength <= 5242880)
                {
                    var blob = await AzureStorage.UploadAndSaveBlobAsync(item, imagesBlobContainer);
                    DBPhoto newPhoto = new DBPhoto(); 
                    newPhoto.URL = blob.Uri.ToString();
                    db.DBPhoto.Add(newPhoto); 
                } 
                else 
                {
                    // Show user error message 
                    return View(model); 
                }
            }
            db.SaveChanges();
            ... 
        } 
        else
        {
            // No image to upload
        } 
    }
    return View(model);
}

In the code above, there are a few cool things worth pointing out.

Firstly, there is the connection string to Azure Blob Storage, which I store as StorageConnectionString in web.config. The format of a secure connection string is as follows.

DefaultEndpointsProtocol=https;AccountName=;AccountKey=;
Retrieve the access keys to the Storage Account.

Secondly, there is LinearRetry. It is basically a retry policy which states how many times the program will retry and how long it waits between retries. In my case, it waits 3 seconds after each attempt, up to 3 attempts.
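
If a fixed 3-second wait is not what you want, the same SDK also provides an exponential back-off policy. A minimal sketch of swapping it in (ExponentialRetry lives in the same Microsoft.WindowsAzure.Storage.RetryPolicies namespace):

// Retry up to 3 times, with a back-off that grows roughly exponentially after each attempt.
blobClient.DefaultRequestOptions.RetryPolicy =
    new ExponentialRetry(TimeSpan.FromSeconds(3), 3);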

Thirdly, I get the URL of the image on the Azure Blob Storage via blob.Uri.ToString() and store it into the database table. The URL will be used later for displaying the image as well as deleting the image.

Fourthly, I check whether model.ImageUpload contains null entries. This is because if I submit the form without any image to upload, model.ImageUpload still has one entry. Not zero, but one. That single entry is actually null. So if I don’t check whether each entry in model.ImageUpload is null, an exception will be thrown.

The controller contains quite a long piece of code. Luckily, the code needed in the model and view is short and simple.

For the model PhotoViewModel, I have the following.

public class PhotoViewModel
{
    ...
    
    [Display(Name = "Current Images")]
    public List<DBPhoto> AvailablePhotos { get; set; }
}
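
The ImageUpload property used by the controller and the view sits in the omitted part of this class. Presumably it is a collection of HttpPostedFileBase, roughly like the sketch below (my guess, not the original code).

    // Bound to the multi-file <input> in the view; each selected file arrives as one entry.
    public IEnumerable<HttpPostedFileBase> ImageUpload { get; set; }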

For the view, it is easy to allow selecting multiple files on the same page. The multiple = "true" attribute makes sure more than one file can be selected in the File Explorer. You can omit this attribute if you only want at most one file to be selected.

@Html.LabelFor(model => model.ImageUpload, new { style = "font-weight: bold;" })
@Html.TextBoxFor(model => model.ImageUpload, new { type = "file", multiple = "true" })
@Html.ValidationMessageFor(model => model.ImageUpload)

Image Size and HttpException

The image upload function looks fine. However, when an image larger than a certain size is uploaded, an HttpException will be thrown.

There is no way that having exception would be fun too! (Image Credit: Tari Tari)

In order to prevent denial-of-service (DoS) attacks that upload huge files to the server, ASP .NET running on IIS by default only allows uploads smaller than 4 MB. Hence, although I earlier put a check to prevent images larger than 5 MB from being uploaded, the exception will still be thrown if an image between 4 and 5 MB is uploaded.

What if we just change the if clause above to allow only images of at most 4 MB to be uploaded? This won’t work because the exception is already thrown before the if condition is reached.

Then, can we just increase the limit from 4 MB to, let’s say, 100 MB or something bigger? Sure, this can work. However, it still doesn’t stop someone from uploading something bigger than the limit. Also, it makes it easier for attackers to exhaust the server with big files. Hence, expanding the upload size restriction is not really a full solution.
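
For completeness, if you did decide to raise the limit, both the ASP .NET and the IIS settings live in web.config. A sketch with example values only; note that maxRequestLength is in kilobytes while maxAllowedContentLength is in bytes.

<system.web>
    <!-- ASP .NET request limit: 10 MB, specified in KB. -->
    <httpRuntime maxRequestLength="10240" />
</system.web>
<system.webServer>
    <security>
        <requestFiltering>
            <!-- IIS request limit: 10 MB, specified in bytes. -->
            <requestLimits maxAllowedContentLength="10485760" />
        </requestFiltering>
    </security>
</system.webServer>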

If you are interested, there are many good articles online discussing this problem. I highlight some interesting ones below.

  1. Use HttpModule to Handle File Uploads;
  2. Use RIA (Rich Internet Application) Services in Silverlight (Seriously, we are talking about Silverlight in year 2015?);
  3. SubStatusCode = 13 in IIS 7;
  4. Catch the Exception in Global.asax.

I don’t really like the methods listed above, especially the 3rd and 4th options. It’s already too late to inform the user when the exception is thrown. Could we do something on the client side before the images are uploaded?

Luckily, we have the File API in HTML5. It allows us to loop through the files in JavaScript to check their size. So, after the submit button is clicked, I call a JavaScript method to check the size of the images before they are uploaded.

function IsFileSizeAcceptable() {
    if (typeof FileReader !== "undefined") {
        var filesBeingUploaded = document.getElementById('ImageUpload').files;
        for (var i = 0; i < filesBeingUploaded.length; i++) {
            if (filesBeingUploaded[i].size >= 4194304) { // Less than 4MB only
                alert('The file ' + filesBeingUploaded[i].name + ' is too large. Please remove it from your selection.');
                return false;
            }
        }
    }
    return true;
}
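
To actually run this check before the POST happens, the form can call it from its submit handler. A sketch assuming the standard Razor BeginForm; the controller and action names are placeholders, and the multipart enctype is required for file uploads anyway.

@using (Html.BeginForm("Create", "Photos", FormMethod.Post,
    new { enctype = "multipart/form-data", onsubmit = "return IsFileSizeAcceptable();" }))
{
    @Html.AntiForgeryToken()
    <!-- the LabelFor / TextBoxFor / ValidationMessageFor lines shown earlier go here -->
    <input type="submit" value="Upload" />
}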
File API is currently supported in major modern browsers. (Image Credit: http://caniuse.com/#feat=fileapi)

Remove from Azure Blob Storage

It’s normal that files uploaded to storage will need to be removed later. So how are we going to implement this feature in our ASP .NET MVC 5 application?

First of all, I added the following code to my AzureStorage.cs.

public static async Task DeleteBlobAsync(Uri blobUri, CloudBlobContainer container)
{
    string blobName = blobUri.Segments[blobUri.Segments.Length - 1];
    CloudBlockBlob blobToDelete = container.GetBlockBlobReference(blobName);

    await blobToDelete.DeleteAsync(); 
}

Secondly, I just pass in the Azure Storage URL of the image that I would like to remove and then call the DeleteBlobAsync method.

Uri blobUri = new Uri("<Azure Storage URL of the image here>");
await AzureStorage.DeleteBlobAsync(blobUri, imagesBlobContainer);

Then the image will be deleted from the Azure Storage successfully.

Global.asax.cs and Blob Container

In order to have my application create the blob container automatically if it doesn’t already exist, I added a few lines to Global.asax.cs as follows.

var storageAccount = CloudStorageAccount.Parse(
    WebConfigurationManager.AppSettings["StorageConnectionString"]);
var blobClient = storageAccount.CreateCloudBlobClient();
var imagesBlobContainer = blobClient.GetContainerReference("images");
if (imagesBlobContainer.CreateIfNotExists())
{
    imagesBlobContainer.SetPermissions(new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });
}
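
I call this snippet once at application start-up. A sketch of the wiring in Global.asax.cs; the helper method name here is hypothetical, it simply wraps the lines above.

protected void Application_Start()
{
    AreaRegistration.RegisterAllAreas();
    RouteConfig.RegisterRoutes(RouteTable.Routes);

    // Wraps the container-creation code shown above.
    CreateImagesBlobContainerIfNotExists();
}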

Write a Console Program to Upload File to Azure Storage

So, how is it done if we are developing a console application, instead of web application?

Windows Azure Storage NuGet Package needs to be installed first.

The code below shows how I upload an HTML file from my local hard disk to Azure Blob Storage. Then I can share the Azure Storage URL of the file with my friends so that they can read the web page.

Similar to what I do in the web application, this is how I connect to the Storage account via HTTPS.

var azureStorageAccount = new CloudStorageAccount(
    new StorageCredentials("", ""), true);

This is how I access the container.

var blobClient = new CloudBlobClient(azureStorageAccount.BlobStorageUri, azureStorageAccount.Credentials);
var container = blobClient.GetContainerReference("myfiles");

Then the next thing I do is just upload the local file to Azure Storage by specifying the file name, content type, etc.

CloudBlockBlob blob = container.GetBlockBlobReference("mysimplepage.html");
using (Stream file = System.IO.File.OpenRead(@"C:\Users\ChunLin\Documents\mysimplepage.html")) 
{
    blob.Properties.ContentType = "text/html"; 
    blob.UploadFromStream(file); 
}
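
One thing to note: the URL is only readable by my friends if the container allows public read access. If the container is private, a shared access signature (SAS) can be appended to the URL instead. A minimal sketch using the same SDK; the 7-day expiry is just an example.

// Generate a read-only SAS token that expires in 7 days.
string sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddDays(7)
});

string shareableUrl = blob.Uri.ToString() + sasToken;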

Yup, that’s all. =)

Pricing

Hosting your files on cloud storage is surely convenient. However, Azure Blob Storage is not free. The following table shows the pricing of Azure Block Blob Storage in the Southeast Asia region at the time of writing. To get the latest pricing details, please visit the Azure Storage Pricing page.

Azure Standard Block Blob Storage in SEA Pricing

Summer 2015 Self-Learning Project

This article is part of my Self-Learning in this summer. To read the other topics in this project, please click here to visit the project overview page.

Summer Self-Learning Banner

Entity Framework and Database

By using Entity Framework, we can save a lot of time on writing SQL ourselves because Entity Framework, a Microsoft-supported ORM for .NET, is able to generate the SQL for us.

I started to use ADO .NET when I was building .NET web applications in my first job. I learnt how to call stored procedures with ADO .NET. I witnessed how a colleague wrote 400 lines of SQL to complete a task which we would normally choose to do in C#. I also felt the pain of forgetting to update the stored procedure when the C# code had already changed.

After that, a friend introduced me to Entity Framework when I was working on my first ASP .NET MVC project. Since then, I have been using Entity Framework because it enables me to deliver my web applications faster without writing (and debugging) any SQL myself. I read a very interesting article comparing Entity Framework and ADO .NET. The author acknowledged that the performance of Entity Framework was slower than hand-coded ADO .NET. However, he emphasized that Entity Framework did maximize his productivity.

How I react when I read a 400-line stored procedure submitted by my colleague.

What Is Happening in Database with Entity Framework?

The SQL generated by Entity Framework is believed to be pretty good. However, it’s still nice to be aware of what SQL is being generated. For example, I have the following code to retrieve Singapore weather info.

using (var db = new ApplicationDbContext())
{
    var forecastRecords = db.SingaporeWeathers.ToList();
}

In Visual Studio, I can just mouse over SingaporeWeathers to get the following query.

SELECT 
    [Extent1].[RecordID] AS [RecordID], 
    [Extent1].[LocationID] AS [LocationID], 
    [Extent1].[WeatherDescription] AS [WeatherDescription], 
    [Extent1].[Temperature] AS [Temperature], 
    [Extent1].[UpdateDate] AS [UpdateDate]
FROM [dbo].[SingaporeWeathers] AS [Extent1]

If I have the following code, which retrieves only records with a temperature greater than 37, then I can use ToString() to see the generated SQL.

using (var db = new ApplicationDbContext())
{
    var query = from sw in db.SingaporeWeathers where sw.Temperature > 37 select sw;
    Console.WriteLine(query.ToString());
}
SELECT
     [Extent1].[RecordID] AS [RecordID],
     [Extent1].[LocationID] AS [LocationID],
     [Extent1].[WeatherDescription] AS [WeatherDescription],
     [Extent1].[Temperature] AS [Temperature],
     [Extent1].[UpdateDate] AS [UpdateDate]
FROM [dbo].[SingaporeWeathers] AS [Extent1]
WHERE [Extent1].[Temperature] > cast(37 as decimal(18))

I am using the DbContext API, so I can just use ToString(). Alternatively, you can use ToTraceString(), which is a method of ObjectQuery, to get the generated SQL.
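
For reference, this is roughly how ToTraceString() could be reached from a DbContext by dropping down to the underlying ObjectContext. This is only a sketch; SingaporeWeather is assumed to be the entity class behind the SingaporeWeathers set.

using System.Data.Entity.Core.Objects;
using System.Data.Entity.Infrastructure;
...
using (var db = new ApplicationDbContext())
{
    // Get the ObjectContext behind the DbContext, then build an ObjectQuery.
    ObjectContext objectContext = ((IObjectContextAdapter)db).ObjectContext;
    ObjectQuery<SingaporeWeather> objectQuery = objectContext.CreateObjectSet<SingaporeWeather>();
    Console.WriteLine(objectQuery.ToTraceString());
}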

SQL Logging in Entity Framework 6

It was great news for developers when the SQL logging feature was announced for Entity Framework 6. For example, to write the database logs to a file, I just need to do the following.

using (var db = new ApplicationDbContext())
{
    var logFile = new StreamWriter("C:\\temp\\log.txt");
    db.Database.Log = logFile.Write;
    var forecastRecords = db.SingaporeWeathers.Where(x => x.Temperature > 37).ToList();
    logFile.Close();
}

Then in the log file, I can see logs as follows.

...
Closed connection at 6/6/2015 10:59:32 PM +08:00
Opened connection at 6/6/2015 10:59:32 PM +08:00
SELECT TOP (1) 
    [Project1].[C1] AS [C1], 
    [Project1].[MigrationId] AS [MigrationId], 
    [Project1].[Model] AS [Model], 
    [Project1].[ProductVersion] AS [ProductVersion]
FROM ( SELECT 
    [Extent1].[MigrationId] AS [MigrationId], 
    [Extent1].[Model] AS [Model], 
    [Extent1].[ProductVersion] AS [ProductVersion], 
    1 AS [C1]
    FROM [dbo].[__MigrationHistory] AS [Extent1]
    WHERE [Extent1].[ContextKey] = @p__linq__0
) AS [Project1]
ORDER BY [Project1].[MigrationId] DESC
-- p__linq__0: 'MyWeb.Migrations.Configuration' (Type = String, Size = 4000)
-- Executing at 6/6/2015 10:59:32 PM +08:00
-- Completed in 70 ms with result: SqlDataReader

Closed connection at 6/6/2015 10:59:32 PM +08:00
Opened connection at 6/6/2015 10:59:32 PM +08:00
SELECT 
    [Extent1].[RecordID] AS [RecordID], 
    [Extent1].[WeatherDate] AS [WeatherDate], 
    [Extent1].[WeatherDescription] AS [WeatherDescription], 
    [Extent1].[WeatherSecondaryDescription] AS [WeatherSecondaryDescription], 
    [Extent1].[IconFileName] AS [IconFileName], 
    [Extent1].[Temperature] AS [Temperature], 
    [Extent1].[UpdateDate] AS [UpdateDate]
FROM [dbo].[Weathers] AS [Extent1]
WHERE [Extent1].[Temperature] > cast(37 as decimal(18))
-- Executing at 6/6/2015 10:59:33 PM +08:00
-- Completed in 28 ms with result: SqlDataReader
...

So, as you can see, even the Code First migration related activity is logged as well. If you would like to know what is being logged, you can read an article about SQL Logging in EF6 which was written before the feature was released.
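
By the way, Database.Log is just an Action<string>, so the output does not have to go to a file. For example, it can be sent to the Visual Studio Output window instead:

db.Database.Log = s => System.Diagnostics.Debug.WriteLine(s);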

Migration and the Verbose Flag

Speaking of Code First migrations, if you would like to find out the SQL being generated when Update-Database is executed, you can add the Verbose flag to the command.

Update-Database -Verbose
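
Similarly, if you would like to get the SQL without actually applying it to the database, the Script flag writes the generated SQL to a script instead of executing it.

Update-Database -Script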

Navigation Property

“I have no idea why tables in our database don’t have any relationship especially when we are using relational database.”

I heard from my friend that my ex-colleague shouted this in the office. He left his job a few days later. I think bad code and bad design do anger some developers. So, how do we model relationships in Entity Framework Code First? How do we specify a foreign key?

I quit!

In Entity Framework, we use Navigation Properties to represent foreign key relationships in the database. With Navigation Properties, we can define relationships between entities.

If we have a 1-to-1 Relationship between two entities, then we can have the following code.

public class Entity1
{
    [Key]
    public int Entity1ID { get; set; }
    public virtual Entity2 Entity2 { get; set; }
}

public class Entity2
{
    [Key, ForeignKey("Entity1")]
    public int Entity1ID { get; set; }
    public virtual Entity1 Entity1 { get; set; }
}

By default, navigation properties are not loaded. Here, the virtual keyword is used to enable lazy loading, so that the entity is automatically loaded from the database the first time a property referring to it is accessed.

However, some people are against using the virtual keyword because they claim that lazy loading can cause subtle performance issues in the application. What they suggest instead is eager loading with the Include method, for example:

dbContext.Entity1.Include(x => x.Entity2).ToArray();
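
Note that the lambda-based Include used here is an extension method from the System.Data.Entity namespace, so the following using directive is needed at the top of the file.

using System.Data.Entity; // brings in the Include(x => x.Entity2) extension method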

By specifying the ForeignKey attribute on Entity1ID in the Entity2 class, Code First will create a 1-to-1 relationship between Entity1 and Entity2 using DataAnnotations attributes.

For a 1-to-n relationship, we need to change the navigation property in the Entity1 class to use a collection, as demonstrated in the code below.

public class Entity1
{
    [Key]
    public int Entity1ID { get; set; }
    public virtual ICollection<Entity2> Entity2s { get; set; }
}
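
On the other side of the 1-to-n relationship, Entity2 then carries the foreign key back to Entity1. A sketch along these lines (the property names are only for illustration):

public class Entity2
{
    [Key]
    public int Entity2ID { get; set; }

    // Foreign key pointing back to the "one" side of the relationship.
    public int Entity1ID { get; set; }

    [ForeignKey("Entity1ID")]
    public virtual Entity1 Entity1 { get; set; }
}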

Finally, how about an n-to-m relationship? We just need to change the navigation properties in both the Entity1 and Entity2 classes to use collections.

public class Entity2
{
    [Key]
    public int Entity2ID { get; set; }
    public virtual ICollection<Entity1> Entity1s { get; set; }
}

This works together with the following model builder statement.

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<Entity2>()
        .HasMany(e2 => e2.Entity1s)
        .WithMany(e1 => e1.Entity2s)
        .Map(e12 => 
            {
                e12.MapLeftKey("Entity1ID");
                e12.MapRightKey("Entity2ID");
                e12.ToTable("Entity12");
            });
}

The code above is using Fluent API which won’t be discussed in this post.

Database Context Disposal

When I first used scaffolding in MVC 5, I noticed that the controller class template it generates looks something like the following.

public class MyController : Controller
{
    private MyContext db = new MyContext();
    
    protected override void Dispose(bool disposing)
    {
        if (disposing) 
        {
            db.Dispose(); 
        } 
        base.Dispose(disposing);
    }
}

Before using scaffolding, I had always been using the using block, so I only create a database context where I have to, as recommended in a discussion on Stack Overflow. Also, the using block calls Dispose() automatically at the end of the block, so I don’t need to worry about forgetting to dispose the database context in my controller.
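
For comparison, this is roughly what the using-block style looks like inside an action; DBPhoto and MyContext are just the names already used in this post.

public ActionResult Index()
{
    using (var db = new MyContext())
    {
        // db.Dispose() is called automatically at the end of this block.
        var photos = db.DBPhoto.ToList();
        return View(photos);
    }
}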

Azure SQL: Database Backup and Restore

Before ending this post, I would like to share about how DB backup and restore is done in Azure SQL Database.

First of all, Azure SQL Database has built-in backups and even self-service point-in-time restore. Yay!

For each active database, Azure SQL creates a backup and geo-replicates it every hour to achieve a 1-hour Recovery Point Objective (RPO).

If there is a need to migrate the database or archive it, we can also export the database from Azure SQL Database. Simply click on the Export button in the SQL Databases section of Azure Management Portal and then choose an Azure blob storage account to export the database to.

Finally, just provide the server login name and password to the database and you are good to go.

Export DB from Azure SQL Database.

Later, we can also create a new database from the BACPAC file generated by the Export function. In the Azure Management Portal, click New > Data Services > SQL Database > Import. This will open the Import Database dialog, as shown in the screenshot below.

Create a new database in Azure SQL Database by importing a BACPAC file.

Okai, that’s all for this post on Entity Framework, database, and Azure SQL Database. Thank you for your time and have a nice day!

Summer 2015 Self-Learning Project

This article is part of my Self-Learning in this summer. To read the other topics in this project, please click here to visit the project overview page.

Summer Self-Learning Banner

Summer 2015 Self-Learning

Summer Self-Learning
It has been about half a year since I started to learn ASP .NET MVC and Entity Framework (EF). In this period of time, I have learnt about not just MVC and EF, but also Azure PaaS, Google Maps API, web application security, cool jQuery plugins, Visual Studio Online, etc.

At the beginning of May, I started to note down the useful things I’d learned in my learning journey. Bringing that information together over the summer has helped me compile my notes on what I’ve learned in the past 6 months. I have now finished compiling notes for 17 topics.

I list the titles of the 17 posts below to give you a quick overview of all 17 topics.

Contents

ASP .NET MVC and Entity Framework

Security

Microsoft Azure

Google APIs

Web Development Tools

Learning After Work

I’m working at Changi Airport. Office hours are from 8:30am to 6pm. In addition, I stay quite far from the airport, so it takes about one hour to travel from home to the office. Hence, the only time I have enough time to work on personal projects is on weekends.

This summer self-learning project was originally planned to be done by the end of May. Normally, it takes me about one day to finish writing a post. After that, if I find any new material about a topic, I modify the post again. Sometimes, however, I am just too tired and do not write anything even though it’s the weekend. Hence, I ended up finishing all 17 topics three months later.

This summer learning project covers not only what I’ve learnt in my personal projects, but also new skills I picked up at my workplace. I always enjoy having a chat with my colleagues about new .NET technology, app development, Azure hosting, and other interesting development tools. So yup, these 17 articles combine all the new knowledge I have acquired.

I’m also very happy that I was able to meet developers from both the .NET Developers Community Singapore and the Azure Community Singapore and share with them what I’ve learnt. That gives me a great opportunity to learn from experienced .NET developers. =)

Azure Community March Meetup in Microsoft Singapore office.

I am not hardworking enough to work on personal projects every day. Sometimes I visit family and friends. Sometimes I travel overseas with friends. Sometimes I play computer games or simply sleep at home. So ya, this self-learning project took a longer time to complete. =D

Working on personal projects after work is stressful too. Yup, so here is some music that helps reduce my stress. =)

Journey to ASP .NET MVC 5

When I first worked as a web developer after graduation, I used to think that what I knew about web development was already enough. However, as I learned more from friends and colleagues, I realized how difficult the field is, even though at Easibook.com we were only dealing with ASP .NET for web development.

New Ideas

Singapore .NET Developers Community meetup (Photo Credit: .NET Developers Singapore)

I attended the Singapore .NET Developers Community meetup with my colleagues on 28 January. The theme was web development. We had the chance to learn about ASP .NET MVC 5, Dependency Injection, and how ASP .NET MVC 5 works with AngularJS.

What interested me most was the ASP .NET MVC 5 talk given by Nguyen Quy Hy. At work, I had always been using ASP .NET Web Forms. When I first started an ASP .NET MVC project in Visual Studio, I was already shocked by new terminologies like Razor, Identity, and Scaffold, and all sorts of folders, such as Models, Views, Controllers, App_Start, etc. These are basically not found in my existing Web Forms project.

Working in a startup, there is always more to do and even more to learn, no matter the size of the business. In many ways, my job changes frequently. I always have to take time to learn and challenge myself to play with new technology. Hence, learning ASP .NET MVC became my new challenge for this year.

I thus decided to write this post to share what I’ve learned in my ASP .NET MVC 4/5 projects in February.

Bootstrap

Let’s start with the simple stuff first: the GUI.

Nowadays it’s quite common that people want a website which is responsive and mobile friendly. Luckily, there are frameworks to help. Even better news is that the Visual Studio web application template uses Bootstrap, a framework providing design and theming features, by default.

Previously we were using VS 2008. There was no such thing as Bootstrap in our Web Forms application. Hence, I only started playing with Bootstrap when I did my first ASP .NET MVC 4 project in VS 2012.

ASP .NET web server controls can no longer be found in an ASP .NET MVC project. I was once asked how GridView and paging were going to be handled in ASP .NET MVC without web server controls. I found some online discussions and articles which gave good answers to the question.

  1. Grid Controls for ASP .NET MVC
  2. Bootswatch: Free themes for Bootstrap including table and paging themes
  3. Paging, Searching, and Sorting in ASP .NET MVC 5
  4. ASP .NET MVC Paging Done Perfectly with @Html.PagedListPager()

The 12-column grid system is another thing I learnt while playing with Bootstrap. The grid system allows us to easily create complex grid layouts for different devices.

Grid System of Bootstrap 3

With the help of Bootstrap, even before I do anything, my web application is already responsive and mobile friendly. It’s true that technology is just a tool but with the right tools, we are able to work more efficiently and productively. =)

Native Support of Clean URL: Good News for SEO

My colleague, who was doing SEO, always received requests to do URL rewrites in our existing Web Forms applications. Whenever a new page was created, he had to add a new rule to web.config, sometimes just to get rid of the .aspx extension.

<urlrewritingnet rewriteOnlyVirtualUrls="true" contextItemsPrefix="QueryString" defaultPage="default.aspx" xmlns="http://www.urlrewriting.net/schemas/config/2006/07">
    <rewrites>
        <add name="RedirectInDomain" virtualUrl="^http\://(.*)/SomethingFriendly"
            rewriteUrlParameter="IncludeQueryStringForRewrite" 
            destinationUrl="~/test.aspx" ignoreCase="true" 
            redirectMode="Permanent" rewrite="Domain" />
        ...
    </rewrites>
</urlrewritingnet>

If there are one thousand pages, then there will be the same number of rules. So in the end, we even needed to create a separate config file just to keep the URL rewrite rules.

In ASP .NET MVC 5, with the help of ASP .NET Routing, URLs do not need to be mapped to specific web pages. Hence, in an MVC web application, we always see clean URLs which are friendly not only to web crawlers but sometimes also to users. This is one of the features that I love in ASP .NET MVC.
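
The routing behind those clean URLs lives in App_Start/RouteConfig.cs. The default route that comes with the MVC 5 template looks like this:

public static void RegisterRoutes(RouteCollection routes)
{
    routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

    // /Photos/Details/5 maps to PhotosController.Details(5), no .aspx anywhere.
    routes.MapRoute(
        name: "Default",
        url: "{controller}/{action}/{id}",
        defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
    );
}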

Identity and Social Network Login

Whenever I visit an online store, I always find it more customer-friendly if it accepts Facebook or Google login.

Fortunately, ASP .NET Identity is powerful enough to accept not just application-specific user names and passwords, but also logins from social websites like Facebook, Twitter, and Google+.

I only need to create a Facebook app and then key in the HTTPS URL of my website. After that, I put both the application ID and the secret key into Startup.Auth.cs. Ta-da, users can now log in to my website with their Facebook credentials.

app.UseFacebookAuthentication(
    appId: "<Facebook app ID here>",
    appSecret: "<Facebook app secret here>");
Localhost HTTPS URL is also accepted! =)

Just in case you also encounter an exception saying “Object reference not set to an instance of an object” on the line with AuthenticationManager.GetExternalLoginInfoAsync(), as shown in the following screenshot, please update the Microsoft.Owin.Security.Facebook NuGet package.

Facebook Login Exception. Boom!
Update nuget.org – Microsoft.Owin.Security.Facebook

Entity Framework Code First

Since my project is a brand new one, I used Code First to help me create the tables in a new database according to my model definitions.

There is also a video on the MSDN Data Developer Center website which gives an introduction to Code First development.

I like how easy it is to have all my tables created auto-magically just by defining the model using classes. After that, I can create new views and controllers by adding a scaffold.
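
As a rough idea of what “defining the model using classes” means, a minimal Code First entity and context could look like the sketch below; the names are only for illustration and this is not my actual model.

public class DBPhoto
{
    public int DBPhotoID { get; set; }   // becomes the primary key by convention
    public string URL { get; set; }
}

public class MyContext : DbContext
{
    // Each DbSet becomes a table when the database is created.
    public DbSet<DBPhoto> DBPhoto { get; set; }
}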

Easily create MVC controllers and views with scaffolding

Headache with Migrations

In order to have the database schema updated when the model is changed, I enabled migrations by running the Enable-Migrations command.

Ran Enable-Migrations command in the Package Manager Console

After that, whenever I changed my model classes, I would run Update-Database to have the database schema updated as well. However, I soon encountered a problem.
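
To be precise, unless automatic migrations are enabled, the usual flow after changing a model class is to scaffold a migration first and then apply it; the migration name below is just an example.

Add-Migration AddTemperatureColumn
Update-Database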

When I was working on an ASP .NET MVC 4 project with VS2012, the Id in the Users table was an integer. So, in VS2013, I assumed it would be the same when I created the model classes and updated the database. Unfortunately, nope. The default web application in VS2013 uses a GUID for the user ID. There is an online tutorial on how to change the primary key of Users back to an integer, if you are interested.

Since my project is a totally new one, what I did was simply change my model classes to use GUID as the type for storing the user ID in other tables. However, when I ran the Update-Database command, the console prompted me with an error message saying “Operand type clash: int is incompatible with uniqueidentifier”. To quickly get rid of this problem, I deleted my tables (don’t do this at home =P) from the database. Then when I ran the Update-Database command again, it complained that the table was missing. Finally, I had no choice but to delete the relevant records in the __MigrationHistory table before Update-Database would work again. =P

Yay, successfully updated database schema after deleting migration history.

Yay with Entity Framework

Before using Entity Framework, I worked with stored procedures for a few years. My colleagues have always complained that sometimes the logic is hidden in stored procedures, which makes debugging difficult. Also, having logic in stored procedures means that our business logic is actually split across both C# and SQL. So, sometimes the developers need to spend a few hours debugging the C# code before realizing the stored procedure was actually the culprit.

With Entity Framework, I am now able to keep the table structure and the logic all in C# code, which helps developers easily find out where things go wrong.

Still, sometimes it is good to group related operations into one well-defined stored procedure so that the system only needs to call the database once to get all the work done. However, after reading a 400-line stored procedure once, I decided that doing this may not be the best option because no one in my team was interested in debugging SQL code.

Review a long stored procedure?

There are more related topics online regarding Entity Framework vs. Stored Procedures, as listed below. If you are interested, feel free to check them out.

  1. Entity Framework Vs Stored Procedures – Performance Measure
  2. Stored Procedure or Entities?

Using MySQL Instead of Default SQL Server: I Was Having a Hard Time

By default, the data provider for ASP .NET Identity with Entity Framework is set to MS SQL in VS 2013. However, MS SQL Server is not free. So, I decided to use MySQL instead. Hence, I needed to find a way to configure Entity Framework in my project to work with MySQL.

The first tutorial that I started with is a detailed step-by-step guide on the ASP .NET website about how to use MySQL storage with an Entity Framework MySQL provider. It mainly involves changes to web.config. Some important steps are listed below.

Change database connection string.

<add name="DefaultConnection" connectionString="Server=localhost;Uid=root;Pwd=password;Database=mediablog;" providerName="MySql.Data.MySqlClient" />

Configure Entity Framework to use MySQL.

<entityFramework codeConfigurationType="MySql.Data.Entity.MySqlEFConfiguration, MySql.Data.Entity.EF6">
    <defaultConnectionFactory type="System.Data.Entity.Infrastructure.SqlConnectionFactory, EntityFramework" />
    <providers>
        <provider invariantName="MySql.Data.MySqlClient" type="MySql.Data.MySqlClient.MySqlProviderServices, MySql.Data.Entity.EF6, Version=6.9.5.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d" />
    </providers>
</entityFramework>
<system.data>
    <DbProviderFactories>
        <add name="MySQL Data Provider" invariant="MySql.Data.MySqlClient" description=".Net Framework Data Provider for MySQL" type="MySql.Data.MySqlClient.MySqlClientFactory, MySql.Data, Version=6.8.3.0" />
    </DbProviderFactories>
</system.data>

However, if I am not wrong, part of this can be done easily just by installing the related MySQL NuGet packages. I chose four of them to install in my project: MySQL.Data, MySQL.Data.Entity, MySQL.Data.Entities, and MySQL.Web.

Install the related NuGet packages to make Entity Framework Code First work with MySQL.

After changing web.config, I followed the tutorial to introduce two new classes into the project. One is MySqlHistoryContext.cs, which syncs the model changes with the database schema using MySQL conventions instead of MS SQL.

Following an online post, I added one extra line to the OnModelCreating method in MySqlHistoryContext.cs. It’s meant to fix the famous Error 0040: The Type nvarchar(max) is not qualified with a namespace or alias. Only primitive types can be used without qualification.

modelBuilder.Properties<String>().Configure(c => c.HasColumnType("longtext"));

However, the Error 0040 didn’t disappear because of this line. I will share later the other steps I took to fix this problem.

The famous Error 0040 encountered when doing migrations for MySQL.

Another new class is called MySqlConfiguration, which is used to make sure that Entity Framework uses MySqlHistoryContext instead of the default one.

Besides, I also made changes to Configuration.cs. Remember Error 0040? A discussion thread on GitHub actually suggested adding the following line to fix it.

SetSqlGenerator("MySql.Data.MySqlClient", new MySql.Data.Entity.MySqlMigrationSqlGenerator());

This didn’t fix Error 0040 in my project either.

In the end, I found a Chinese post which said the following.

此时只需要将Data层的Migrations的文件夹删掉即可。因为SqlServer做过一些迁移,有些数据类型与MySql不兼容。 (In English: at this point, just delete the Migrations folder in the Data layer, because some migrations were generated against SQL Server and some data types are not compatible with MySQL.)

In other words, because the earlier migrations were generated for SQL Server, some data types are not compatible with MySQL, so the old migrations have to be removed. So, after I excluded the 201502231459263_InitialCreate.cs file (which was created when I was still using MS SQL for my project) in the Migrations folder from the project, Error 0040 was gone when I ran Update-Database. Yay!

So yup, sometimes it’s very, very useful to know more than one language. And yup, I spent half of my holiday figuring out how to make Entity Framework work with MySQL. =)

Oh well, half a day gone just to make MySQL work in my little project.

By the way, the Chinese web page mentioned above is no longer available. What I shared with you is actually a link to its Google cached copy. I am not sure if the cache will still be around when you visit it.

Self Learning ASP .NET MVC on MVA during Chinese New Year

The talks given during the community meetup were good. However, in order to learn more, I also needed to get advice from colleagues who have more experience with ASP .NET MVC.

In addition, during the Chinese New Year period, instead of watching the new year shows, I stayed in front of my computer to complete the introductory ASP .NET MVC series delivered by two Microsoft experts, Christopher Harrison and Jon Galloway. It’s definitely a good starting point for beginners. And yup, the two speakers are very good at explaining the key concepts and they also tell good jokes, so you shouldn’t find the course boring. =P

Yup, people from Malaysia are watching the live stream too!

The End of the Beginning

I am still a beginner in ASP .NET MVC. I always find that there are many new things to learn in web development alone. Actually, it’s very challenging. For example, getting Entity Framework Code First to work with MySQL already took me half a day to figure out.

Anyway, this is just a post sharing how I got started with ASP .NET MVC. In the future, I will do my best to share more about what I learn in this cool technology. =)