So it’s nearly a month since I have written a blog post. Yikes!

Not that I haven’t wanted to, and lord knows I have enough material to blog about from this current project to keep me going for the next few months. I just have not had time.

However, I figured I would take a 10-minute break and jot down some of my experiences migrating a project to Azure and the pitfalls I have come across.

So read on for some (hopefully) useful tips.

Number 1

Ok, so the first pitfall of developing for Azure is not a new one, and you will encounter it on any platform that hosts a web site in a web farm: session state.

If you are using Azure for scale (and that is one of the reasons you would be) then you won’t be running your site on one box, you will be running it across a few, or perhaps many. As soon as you do that, forget about storing stuff in session state: you will need to push it to the database and/or use cookies so that all boxes have access to the session data.
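If you do still need session-style state, a minimal sketch of the classic ASP.NET alternative is to point session state at an out-of-process SQL Server store in web.config. The connection string below is a placeholder, and note that the stock SQL session state scripts have their own issues on SQL Azure:

```xml
<!-- Sketch only: out-of-process session state so every box
     in the farm sees the same session data.
     Replace the connection string with your own. -->
<system.web>
  <sessionState mode="SQLServer"
                sqlConnectionString="Server=...;User ID=...;Password=..."
                cookieless="false"
                timeout="20" />
</system.web>
```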

Number 2

ASP.NET Membership Provider. If you are using this to manage your accounts then you can’t just go with the run-of-the-mill install of this. You will need to download some updated scripts from here, and there are some limitations.

Number 3

Script combining and compression. If you are a fan of SquishIt and use this for your script combining and compression then you are going to be out of luck unless you modify the source to output the combined files to blob storage. It’s not a lot of work and I have done a partial implementation which gives me all I need for the current project.

There are only a couple of things you need to do.

  1. Update the RenderRelease method of CssBundle.cs and JavascriptBundle.cs to output to blob storage
  2. Update the ExpandAppRelativePath method of BundleBase.cs to output an amended path to your .css or .js files that points to blob storage
  3. Implement your Blob Storage class

The sample code below should get you going if you need to do this. Please note this only caters for combined files that are versioned using the #. You will need to add the Settings used in this code to your project (there are 4 of them), and there are paths specific to my project in that code. Hey, it’s a sample and not a complete solution :)

```csharp
// AZURE CODE TO SEND TO BLOB STORAGE - START
if (Properties.Settings.Default.UseAzure)
{
    BlobStorage blob = new BlobStorage();
    using (MemoryStream memStream = new MemoryStream())
    {
        byte[] content = ASCIIEncoding.ASCII.GetBytes(minifiedJavaScript);
        memStream.Write(content, 0, content.Length);
        memStream.Position = 0;
        memStream.Flush();

        string fileName = renderTo.Replace("~/", string.Empty);
        fileName = fileName.Replace("\\", "/");
        blob.UploadFile(fileName, memStream);
    }
}
// AZURE CODE TO SEND TO BLOB STORAGE - END
```

```csharp
// AMENDED ExpandAppRelativePath TO POINT TO BLOB STORAGE - START
protected string ExpandAppRelativePath(string file)
{
    if (file.StartsWith("~/"))
    {
        string appRelativePath = HttpRuntime.AppDomainAppVirtualPath;
        if (appRelativePath != null && !appRelativePath.EndsWith("/"))
        {
            appRelativePath += "/";
        }

        if (Properties.Settings.Default.UseAzure)
        {
            return file.Replace("~", Properties.Settings.Default.UrlImages);
        }
        else
        {
            return file.Replace("~/", appRelativePath);
        }
    }
    return file;
}
// AMENDED ExpandAppRelativePath TO POINT TO BLOB STORAGE - END
```

```csharp
// AZURE BLOB STORAGE CLASS - START
using System.Collections.Specialized;
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

namespace SquishIt.Framework.Azure
{
    class BlobStorage
    {
        public void UploadFile(string fileName, Stream memStream)
        {
            // Create a blob in the container and upload the file bytes to it
            var blob = GetContainer("content/").GetBlobReference(fileName);

            if (fileName.EndsWith(".css"))
            {
                blob.Properties.ContentType = "text/css";
            }
            else if (fileName.EndsWith(".js"))
            {
                blob.Properties.ContentType = "text/javascript";
            }

            // Create some metadata for this file
            var metadata = new NameValueCollection();
            metadata["Id"] = fileName;
            metadata["Filename"] = fileName;

            // Add and commit metadata to the blob
            blob.Metadata.Add(metadata);
            blob.UploadFromStream(memStream);
        }

        private CloudBlobContainer GetContainer(string container)
        {
            // Get a handle on the account, create a blob service client
            // and get a container proxy
            var account = CloudStorageAccount.Parse(
                string.Format("DefaultEndpointsProtocol=http;AccountName={0};AccountKey={1}",
                    Properties.Settings.Default.AzureAccount,
                    Properties.Settings.Default.AzurePrimaryKey));
            var client = account.CreateCloudBlobClient();
            return client.GetContainerReference(container);
        }
    }
}
// AZURE BLOB STORAGE CLASS - END
```
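As a usage sketch (the file name and script content are illustrative, not from my project), uploading a combined, minified script through the BlobStorage class above looks like this:

```csharp
// Illustrative only: upload an in-memory minified script
// to blob storage via the BlobStorage class.
var storage = new BlobStorage();
byte[] bytes = System.Text.Encoding.ASCII.GetBytes("/* minified js */");
using (var stream = new MemoryStream(bytes))
{
    storage.UploadFile("js/combined_123.js", stream);
}
```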

Number 4

If you are working with Blob Storage, and you want to be, then grab yourself a free app that will help you manage that storage. I use CloudBerry Explorer for Azure Blob Storage and it works a treat. Grab it here.

Number 5

This may (or may not) seem like an obvious one, but get your SQL Azure database and Blob Storage set up as soon as possible for your live project. Storage is cheap on Azure, it’s the Compute time that will cost you, so take advantage of this early.

By doing so you can run your project in the Dev environment against your live storage and flush out a lot of the little problems early on. Plus, apart from clearing out your test data, when you are ready to go live you have all your storage set up and ready to go.

From time to time you can push the solution to the Cloud (eating up some of that Compute time) and do some smoke testing without needing to create the storage side of things.

Number 6 (Update)

Using custom font files with a Web Role from Blob Storage. So this one is a bit of a weirdy, and there may well be a better workaround than the one I have used, which is not to store my custom fonts in blob storage but to leave them within the Web Role folder structure.

So a little background first.

I had stored my custom fonts in blob storage and was accessing them from there via a url in my css file. It all worked fine for Chrome, Opera and Safari but not for Firefox or IE.

The reason is that the font files were being served from a different domain than the website, and Firefox (and, I assume, IE) blocks this.

Using SquishIt (item number 3 above) I combine and compress all my css files and serve them from blob storage, and I cannot emphasize enough what a difference combining and compressing your .js and .css files makes to the performance of your site.

So, not serving my custom font files from blob storage meant I needed to add one additional css file that just contains my custom font reference, refer to that locally from each Web Role, and also have the font files stored locally on each instance of my Web Role.
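In the page head that means one extra, locally served stylesheet reference sitting alongside the combined bundle (the path and file name here are just examples, not my actual ones):

```html
<!-- Served from the Web Role itself, not from blob storage,
     so the font urls inside it stay same-origin -->
<link rel="stylesheet" type="text/css" href="/css/fonts.css" />
```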

All that this one additional css file contains is shown below. It’s tiny, so I can live with this one file not being combined to get around the cross-domain issue.

```css
@font-face {
    font-family: 'CoolveticaRegular';
    src: url('/css/coolveticarg-webfont.eot');
    src: local('CoolveticaRegular'),
         url('/css/coolveticarg-webfont.ttf') format('opentype'),
         url('/css/coolvetica_rg-webfont.svg.svg#webfontkvf4HhkC') format('svg');
    font-weight: normal;
    font-style: normal;
}
```

Apparently you can add a response header ‘Access-Control-Allow-Origin’ with a value of ‘*’ and this will allow the cross-domain access. However, when I tried adding this header to my font files using CloudBerry Explorer for Azure Blob Storage, the header would not remain after saving.
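For reference, this is how that header (its actual name is ‘Access-Control-Allow-Origin’) would need to appear on the HTTP response for each font file if you could get it to stick:

```
Access-Control-Allow-Origin: *
```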

Given that this header apparently can be used, there could well be a way to leave the font files in blob storage. Then again, maybe not.

Right, gotta crack on with the work. If I think of any more I will add them to this post as they come to me.