Forecast Cloudy – Azure Blob Storage Introduction

Windows Azure Blob storage can be used to store and retrieve Binary Large Objects (Blobs), or what we can also call files. There are many reasons why you may want to use this storage mechanism in Azure PaaS – offloading static content from a website is the most common, but there are others as well, such as sharing PDF files with customers. If using Azure PaaS features such as Web Roles and Worker Roles, storing files in Blob Storage instead of in the application speeds up publishing time, especially if the files are large.

There are two kinds of blobs – block blobs and page blobs. When you create a blob, you specify its type. Once the blob has been created, its type cannot be changed, and it can be updated only by using operations appropriate for that blob type, i.e., writing a block or list of blocks to a block blob, and writing pages to a page blob. Therefore, let's take a look at the differences:

  • Block blobs let you upload large blobs efficiently. Block blobs are made up of blocks, each of which is identified by a block ID. You create or modify a block blob by writing a set of blocks and committing them by their block IDs. Each block can be a different size, up to a maximum of 4 MB. The maximum size for a block blob is 200 GB, and a block blob can include no more than 50,000 blocks. If you are writing a block blob that is no more than 64 MB in size, you can upload it in its entirety with a single write operation.
  • Page blobs are a collection of 512-byte pages optimized for random read and write operations. To create a page blob, you initialize it and specify the maximum size to which it can grow. To add or update the contents of a page blob, you write a page or pages by specifying an offset and a range that align to 512-byte page boundaries. A write to a page blob can overwrite just one page, some pages, or up to 4 MB of the page blob. Writes to page blobs happen in place and are immediately committed to the blob. The maximum size for a page blob is 1 TB. Page blobs are more efficient when ranges of bytes in a file are modified frequently.

Which type to use depends on the scenario: if you need random read/write access, you will use a page blob, whereas if you are accessing items sequentially, like streaming video files, you will use a block blob.
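Because every write to a page blob must cover whole 512-byte pages, it is handy to round arbitrary lengths up to the next page boundary before issuing a write. Below is a minimal sketch of such a helper; the `PageBlobMath` class and `AlignToPage` method are my own illustration, not part of the SDK:

```csharp
using System;

static class PageBlobMath
{
    public const int PageSize = 512;

    // Rounds a length up to the next 512-byte page boundary,
    // since page blob writes must align to whole pages.
    public static long AlignToPage(long length)
    {
        return ((length + PageSize - 1) / PageSize) * PageSize;
    }
}
```

For example, a 1,000-byte payload would need a 1,024-byte (two-page) write region.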

Blob Storage itself consists of the following components:


  • Storage Account. All storage access in Azure is done via a storage account.
  • Container. A container provides a grouping of a set of blobs. All blobs must be in a container. An account can contain an unlimited number of containers. A container can store an unlimited number of blobs.
  • Blob. A file of any type and size. There are the two types I just discussed above.

Blobs are addressable using the following URL format –

http://<storage account>.blob.core.windows.net/<container>/<blob>

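To illustrate the address format, here is a tiny helper that composes it from the three parts. The `BlobAddress` class, its method name, and the sample account name are mine for illustration – in practice the SDK gives you the same value via the blob's `Uri` property:

```csharp
using System;

static class BlobAddress
{
    // Composes the public address of a blob following the
    // http://<storage account>.blob.core.windows.net/<container>/<blob> pattern.
    public static string For(string account, string container, string blobName)
    {
        return string.Format("http://{0}.blob.core.windows.net/{1}/{2}",
                             account, container, blobName);
    }
}
```

For example, `BlobAddress.For("gennadyk", "gennadykpictures", "dev.jpg")` yields `http://gennadyk.blob.core.windows.net/gennadykpictures/dev.jpg` (assuming a hypothetical account named `gennadyk`).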
Enough of the big theory? Well, let's start then.

To get started using the BLOB service, we'll first need a Windows Azure account and a storage account. You can get a free trial account, or if you are using MSDN like me, you can add Azure benefits to your MSDN subscription. Once you have done that, we can create the storage account that will store our blobs.

To create a storage account, log in to the Windows Azure management portal and click the large NEW icon at the bottom left of the portal. From the expanding menu select the Data Services option, then Storage and finally, Quick Create:



You will now need to provide a name for your storage account in the URL textbox. This name is used as part of the URL for the service endpoint and so it must be globally unique. The portal will indicate whether the name is available whenever you pause or finish typing. Next, you select a location for your storage account by selecting one of the data center locations in the dropdown. This location will be the primary storage location for your data, or more simply, your account will reside in this data center. If you have created 'affinity groups', which are friendly names for collections of services you want to run in the same location, you will also see those in your dropdown list. If you have more than one Windows Azure subscription associated with your login address, you may also see a dropdown list enabling you to select the Azure subscription the account will belong to.
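Since storage account names must be 3 to 24 characters long and contain only lowercase letters and digits, you can sanity-check a candidate name before even reaching for the portal. A minimal sketch of such a check follows; the `StorageNames` class is my own illustration, not an SDK type:

```csharp
using System;
using System.Linq;

static class StorageNames
{
    // Storage account names must be 3-24 characters long and may
    // contain only lowercase letters and digits.
    public static bool IsValidAccountName(string name)
    {
        return name != null
            && name.Length >= 3 && name.Length <= 24
            && name.All(c => (c >= 'a' && c <= 'z') || (c >= '0' && c <= '9'));
    }
}
```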


All storage accounts are stored in triplicate, with transactionally consistent copies in the primary data center. In addition to that redundancy, you can also choose to have 'Geo Replication' enabled for the storage account. 'Geo Replication' means that the Windows Azure Table and BLOB data that you place into the account will not only be stored in the primary location but will also be replicated in triplicate to another data center within the same region.

So the account is created:


Click on 'Manage Access Keys' at the bottom of the screen to display the storage account name, which you provided when you created the account, and two 512-bit storage access keys used to authenticate requests to the storage account. Whoever has these keys will have complete control over your storage account, short of deleting the entire account. They would have the ability to upload BLOBs, modify table data and destroy queues. These account keys should be treated as secrets, in the same way that you would guard passwords or a private encryption key. Both of these keys are active and will work to access your storage account. It is a good practice to use one of the keys for all the applications that utilize this storage account so that, if that key becomes compromised, you can use this dialog to regenerate the key you haven't been using, update all the apps to use the newly regenerated key, and finally regenerate the compromised key. This would prevent anyone from abusing the account with the compromised key.


Now that we have this account, let's do something with it. Here I will create a little console application to work with Blob storage.


As such a project will not have the correct references by default, I will add the Microsoft Azure Storage library via NuGet.


Finally, here is quick code to upload a picture called dev.jpg that I picked at random from my hard drive:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
using System.Configuration;
using System.IO;

namespace BlobSample
{
    class Program
    {
        static void Main(string[] args)
        {
            // Account name and key come from App.Config (see note below)
            string accountName = ConfigurationManager.AppSettings["accountName"];
            string accountKey = ConfigurationManager.AppSettings["accountKey"];

            try
            {
                StorageCredentials creds = new StorageCredentials(accountName, accountKey);
                CloudStorageAccount account = new CloudStorageAccount(creds, useHttps: true);
                CloudBlobClient client = account.CreateCloudBlobClient();

                CloudBlobContainer sampleContainer = client.GetContainerReference("gennadykpictures");
                sampleContainer.CreateIfNotExists();

                CloudBlockBlob blob = sampleContainer.GetBlockBlobReference("dev.jpg");
                using (Stream file = System.IO.File.OpenRead("dev.jpg"))
                {
                    blob.UploadFromStream(file);
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine("Error: " + ex.Message);
            }

            Console.WriteLine("Done... press a key to end.");
            Console.ReadKey();
        }
    }
}

Note that I am storing the account name and account key that we got from the Manage Access Keys page in my App.Config. To check whether the image was uploaded, you can use a tool like Azure Explorer from Cerebrata. As I connect my Azure Explorer to my storage account, I clearly see my image under the container “gennadykpictures”:
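For reference, the App.Config entries the sample reads might look like this (the values shown are placeholders, not real credentials):

```xml
<configuration>
  <appSettings>
    <!-- Name and key from the Manage Access Keys dialog -->
    <add key="accountName" value="gennadyk" />
    <add key="accountKey" value="YOUR-512-BIT-ACCESS-KEY-HERE" />
  </appSettings>
</configuration>
```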


Well, how do I download the file programmatically? Similarly, you will still use the CloudBlockBlob class, but instead of the UploadFromStream method you will call DownloadToStream, like so:

CloudBlockBlob blob = sampleContainer.GetBlockBlobReference("dev.jpg");

using (Stream outputFile = new FileStream("dev.jpg", FileMode.Create))
{
    blob.DownloadToStream(outputFile);
}


When writing solutions for the cloud, you must program defensively. Cloud solutions are often composed of multiple sometimes-connected products and features that rely on each other to work. Any of those bits could stop working for some reason – Azure websites could go down, the network between the VM and the backend service could start denying access, the disk your blobs are stored on could hit a bad patch, and the controller could be in the process of repointing your main blob storage to one of the replicas. You have to assume any of these things (and more) can happen, and develop your code so that it handles any of these cases.

Obviously, my sample above isn't written that way. I have no retry logic on failure, nor is my error handling decent. If this were a production application, I would have to apply proper coding and architecture techniques (proper error handling and logging, a retry policy on failure, awareness of storage throttling in Azure, etc.).
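The storage client library ships with retry policies you can switch on, and for other calls you can wrap operations in your own retry loop. Below is a minimal sketch of such a wrapper with exponential backoff; the `Transient.Retry` helper is my own illustration, not an SDK type, and a real implementation would retry only on errors it knows to be transient (timeouts, throttling):

```csharp
using System;
using System.Threading;

static class Transient
{
    // Runs an action, retrying up to maxAttempts times with
    // exponential backoff between attempts. Rethrows on the last attempt.
    public static T Retry<T>(Func<T> action, int maxAttempts = 3,
                             int baseDelayMs = 200)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                return action();
            }
            catch (Exception) when (attempt < maxAttempts)
            {
                // Back off: baseDelayMs, 2x, 4x, ... before the next try
                Thread.Sleep(baseDelayMs * (1 << (attempt - 1)));
            }
        }
    }
}
```

You would use it by wrapping the upload call, e.g. `Transient.Retry(() => { blob.UploadFromStream(file); return true; });`.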

For more on Azure Blob Storage see –

This has been different, fun, and I hope you find it useful.

