Monday, 28 May 2012

Configuring Azure Storage Emulator SQL Server Instance

If you're using Windows Azure Storage, you are almost certainly going to be running the storage emulator during development, instead of working directly against your storage account up in the cloud. This emulator (which comes with the Windows Azure SDK - see the "other" category) allows you to test locally against local instances of the Blob, Queue and Table services.

As per the Overview of Running a Windows Azure Application with Storage Emulator reference, the emulator needs a local SQL Server instance. By default, it's configured to run against a SQL Server Express 2005 or 2008 database.

If you want to point it at a different instance of SQL Server, for example a shared development database server, you can do this using the DSInit command line tool. I originally came across this MSDN article on How to Configure SQL Server for the Storage Emulator, which led me to try the following in the Windows Azure Command Prompt:

DSInit /sqlInstance:MyDevServerName\MyInstanceName
This tried to create the storage database, but failed with the following:
Creating database DevelopmentStorageDb20110816...
Cannot create database 'DevelopmentStorageDb20110816' : A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)

One or more initialization actions have failed. Resolve these errors before attempting to run the storage emulator again. These errors can occur if SQL Server was installed by someone other than the current user. Please refer to http://go.microsoft.com/fwlink/?LinkID=205140 for more details.
The correct way to point the emulator at a different database server, rather than the local machine, is to use the server switch instead:
DSInit /SERVER:MyDevServerName\MyInstanceName
See the full DSInit Command-Line Tool reference.

When you then run the Storage Emulator, it will target that database server/instance. You can easily clear down/reset that database by right-clicking the Windows Azure Emulator icon in the taskbar, selecting "Show Storage Emulator UI" and then clicking "Reset". NB. Just to be clear, this will delete everything in your local storage emulator database.

An added "gotcha" to watch out for: if you have the storage account connection string stored in a web.config/app.config and want to specify that the local emulated instance should be used, you need to use

UseDevelopmentStorage=true
exactly as it appears here. If, like me, you initially add a trailing semi-colon, you will get a FormatException with the error: "Invalid account string".
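As a quick sketch using the SDK's CloudStorageAccount class (the exact namespace depends on your SDK version), the difference looks like this:

```csharp
using Microsoft.WindowsAzure;

class DevStorageExample
{
    static void Main()
    {
        // Parses successfully - exactly "UseDevelopmentStorage=true",
        // with no trailing semi-colon:
        CloudStorageAccount devAccount =
            CloudStorageAccount.Parse("UseDevelopmentStorage=true");

        // The SDK also exposes a shortcut for the same account:
        CloudStorageAccount devAccount2 =
            CloudStorageAccount.DevelopmentStorageAccount;

        // This throws a FormatException ("Invalid account string")
        // because of the trailing semi-colon:
        // CloudStorageAccount.Parse("UseDevelopmentStorage=true;");
    }
}
```

The CloudStorageAccount.DevelopmentStorageAccount property sidesteps the string-parsing issue entirely, which is handy if the emulator setting isn't coming from config.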

Thursday, 24 May 2012

Error Creating Azure Blob Storage Container

I received a rather...vague...error when trying out a bit of .NET code to connect to a Windows Azure Blob Storage account, and create a new container in which to store some blobs.

The code

CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    "DefaultEndpointsProtocol=https;AccountName=REMOVED;AccountKey=REMOVED");

CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            
CloudBlobContainer container = 
    blobClient.GetContainerReference("TestContainer1");

container.CreateIfNotExist(); // This error'd

The error

A StorageClientException was thrown saying "One of the request inputs is out of range.". An inner WebException showed that "The remote server returned an error: (400) Bad Request."

The cause

I'd assumed (incorrectly) that a container name could be pretty much any string. But that's not the case. As per this MSDN reference, a container name must:
  • be in lowercase (this was the cause in my case)
  • start with a letter or a number and only contain letters, numbers and hyphens (multiple consecutive hyphens are not allowed)
  • be between 3 and 63 characters long
The "Naming and Referencing Containers, Blobs and Metadata" reference is worth a bookmark.
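As a rough illustration, those rules can be encoded in a small validation helper (a hypothetical sketch - the storage service performs its own validation server-side and returns the 400 above when a name breaks the rules):

```csharp
using System.Text.RegularExpressions;

static class ContainerNames
{
    // Lowercase letters, numbers and hyphens; must start with a letter or
    // number; a hyphen must always be followed by a letter or number, which
    // rules out consecutive (and trailing) hyphens.
    static readonly Regex Valid = new Regex(@"^[a-z0-9](-[a-z0-9]|[a-z0-9])*$");

    public static bool IsValid(string name)
    {
        return name != null
            && name.Length >= 3
            && name.Length <= 63
            && Valid.IsMatch(name);
    }
}
```

So ContainerNames.IsValid("TestContainer1") returns false (uppercase letters), while ContainerNames.IsValid("test-container-1") returns true.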

Wednesday, 23 May 2012

ASP.NET MVC Performance Profiling

Building up a profile of how a web application functions, all the database interactions that take place and where the server-side time is spent during a request can be a challenging task when you are new to an existing codebase. If you're trying to address the generic/non-specific "we need to improve performance / it's slow" issue, you need to get a good picture of what is going on and where to prioritise effort.

There are a number of ways to identify specific problems, depending on the technologies involved. For example, if your application is backed by SQL Server, you can query a set of DMVs to identify the top n worst-performing queries, and then focus your effort on tuning those. Identifying a badly performing query and tuning it can obviously yield huge benefits in terms of the end user's experience. But this doesn't necessarily flag up all the problems. If you are looking at a particular page/view within the application, you could start a SQL Profiler trace to monitor what's going on during the lifecycle of the request - another common and valuable tool. Personally, I usually have SQL Profiler open most of the time during development. If you're developing against a shared dev database with others, you can filter out other people's events from the trace - one way that works nicely is to inject your machine name into the connection string as the Application Name, and then filter on that.
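For example, the machine name can be injected into a connection string like so (a hypothetical connection string - the server and database names are placeholders):

```csharp
using System;

class ConnectionStringExample
{
    static void Main()
    {
        // Injecting the machine name as the Application Name means a SQL
        // Profiler trace can be filtered down to just your own events:
        string connectionString =
            "Data Source=MyDevServerName;Initial Catalog=MyAppDb;" +
            "Integrated Security=True;" +
            "Application Name=" + Environment.MachineName + ";";

        Console.WriteLine(connectionString);
    }
}
```

In SQL Profiler you can then add a column filter on ApplicationName matching your machine name, and everyone else's traffic drops out of the trace.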

MiniProfiler For The Win

Recently, I've started using another extremely valuable tool within an ASP.NET MVC solution - MiniProfiler which is (quote):
A simple but effective mini-profiler for ASP.NET MVC and ASP.NET
It was developed by the team over at StackOverflow. Simply put, it can render performance statistics on the page you are viewing, detailing where the time was spent server-side fulfilling that request. But the key thing for me is that it provides an ADO.NET profiler. Say you're using LINQ-to-SQL: by wrapping the SqlConnection in a ProfiledDbConnection before passing it to the constructor of a DataContext, information on the SQL queries executed within the lifetime of a request is then also included in the statistics displayed. (It can also profile calls via raw ADO.NET, Entity Framework etc, with minimal effort required.)
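The LINQ-to-SQL wiring looks roughly like this (a sketch - MyDataContext and the connection string are placeholders, and the namespaces assume a recent MiniProfiler package):

```csharp
using System.Data.Common;
using System.Data.SqlClient;
using StackExchange.Profiling;
using StackExchange.Profiling.Data;

static class ProfiledContextFactory
{
    // Wrap the real connection in a ProfiledDbConnection so MiniProfiler
    // sees every command executed through the DataContext:
    public static MyDataContext Create(string connectionString)
    {
        DbConnection real = new SqlConnection(connectionString);
        DbConnection profiled =
            new ProfiledDbConnection(real, MiniProfiler.Current);
        return new MyDataContext(profiled);
    }
}
```

When no profiling session is active, the wrapper simply passes calls through to the underlying connection, so the same factory can be used in production.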

Make the invisible, visible

Since integrating this into an MVC application, I've found the benefits priceless. The key thing for me is: VISIBILITY. It provides extremely valuable visibility of what is happening under the covers. Going back to the start of this post, if you're new to a codebase, then having this information provided to you as you browse is invaluable. It enables you to identify problems at a glance, and increases visibility of problems to other developers so the "life expectancy" of those problems is lower - they're a lot less likely to hover undetected just under the radar if the information is being pushed right in front of the developer on screen. It also helps you build up a picture of how things hang together.

MiniProfiler includes functionality to flag up N+1 and duplicate queries - a common pitfall with ORMs if you're not careful. If a view were performing 100 low-hitting queries, these may not show up individually as queries needing tuning. But the fact that 100 database round trips are being made could scream out that perhaps they could be replaced with a single round trip, and a performance improvement gained there.
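To illustrate the pattern (hypothetical LINQ-to-SQL code - db, Orders and Customer are placeholder names):

```csharp
// N+1: one query for the orders, then one extra query per order
// as each Customer is lazily loaded inside the loop.
foreach (var order in db.Orders)
{
    Console.WriteLine(order.Customer.Name);
}

// A single round trip instead - project out what's needed up front:
var rows = (from o in db.Orders
            select new { OrderId = o.Id, CustomerName = o.Customer.Name })
           .ToList();
```

With MiniProfiler on the page, the first version shows up as a wall of near-identical queries flagged as duplicates, which makes this kind of problem hard to miss.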

I'm now a big fan of MiniProfiler, especially due to its simplicity to integrate into a codebase. Working on ASP.NET MVC/ASP.NET applications? You might want to give it a try!