Tuesday, 11 October 2016

Getting started with SignalR

Follows on from previous post: Real-time SignalR and Angular 2 dashboard

I had a couple of questions from someone who forked the project on GitHub and is using it as the basis for their own custom dashboard, which reminded me that there are a few key things I picked up on my learning journey that I should share. These are primarily focused on the SignalR side of things.

An instance of a SignalR hub is created for each request

A single instance is not shared across all clients connected to that hub; this is much like a controller in ASP.NET MVC, where a new instance of the controller is created for each request. Hence in my SignalRDashboard project, the mechanism for holding a single instance of shared data for all connections is a singleton that gets passed into each hub's constructor.
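The actual project does this in C# with SignalR hubs, but the pattern itself can be sketched in a few lines of JavaScript (hypothetical names, illustrative only - not the SignalRDashboard code): a fresh "hub" per request, all sharing one injected singleton.

```javascript
// One shared state object for ALL connections (the singleton).
class SharedDashboardState {
  constructor() { this.counters = {}; }
  increment(key) {
    this.counters[key] = (this.counters[key] || 0) + 1;
    return this.counters[key];
  }
}

// A new hub instance is constructed per request; the shared
// singleton state is passed in via the constructor.
class DemoHub {
  constructor(sharedState) { this.sharedState = sharedState; }
  recordHit(key) { return this.sharedState.increment(key); }
}

// Simulate two separate requests: two hub instances, one shared state.
const singleton = new SharedDashboardState();
const hubForRequest1 = new DemoHub(singleton);
const hubForRequest2 = new DemoHub(singleton);

hubForRequest1.recordHit('visits');
const total = hubForRequest2.recordHit('visits');
console.log(total); // 2 - the state survived across hub instances
```

If the state were created inside the hub constructor instead, each connection would see its own copy and the dashboard components could never share data.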

Set up callback methods in JavaScript for SignalR hubs BEFORE starting the SignalR connection

To open a SignalR connection, you use the following:
    $.connection.hub.start();
To be able to receive messages from the server-side SignalR hub in client-side JavaScript, you have to hook up a callback method. You'd do this using something like:
    var hub = $.connection.myDemoHub;
    hub.client.updateMeWhenYouHaveUpdates = function(message) {
        // Do something useful with what we've just received from the server
    };
The important thing to remember is that once you've started the SignalR connection, you can't hook up these callbacks - so make sure they are done BEFORE you start the connection. The correct ordering would be:
    var hub = $.connection.myDemoHub;
    hub.client.updateMeWhenYouHaveUpdates = function(message) {
        // Do something useful with what we've just received from the server
    };
    $.connection.hub.start();

You can connect to multiple SignalR hubs over the same connection

This is great as it means you can connect to multiple hubs easily - all you need to do is set up the callback methods in JavaScript before you open the connection, as per the previous point.
    var hub1 = $.connection.myDemoHub;
    hub1.client.updateMeWhenYouHaveUpdates = function(message) {
        // Do something useful with what we've just received from the server
    };

    var hub2 = $.connection.myOtherHub;
    hub2.client.callMePlease = function(message) {
        // Do something useful with what we've just received from the server
    };
    $.connection.hub.start();
In the SignalRDashboard project, you'll see this being handled in dashboard.js as follows:
  • each Angular 2 dashboard component registers itself with dashboard.js when it is constructed
  • once the component has been initialised (ngOnInit event), it lets dashboard.js know that it's finished and registration is completed
  • once all components have completed registration, the initialiseComponents() method is called, which calls into each registered component (each must expose a setupHub() method) so they can set up the callbacks they're interested in from the hubs they use
  • finally, the connection is then started
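The coordination steps above can be sketched roughly as follows (a minimal JavaScript illustration with hypothetical names - the real dashboard.js differs in detail, and this sketch assumes all components register before any of them completes registration):

```javascript
// Coordinates component registration so that every component's hub
// callbacks are wired up before the single connection is started.
function createDashboard(startConnection) {
  const components = [];
  let pendingRegistrations = 0;

  return {
    // Step 1: each component registers itself when constructed.
    register(component) {
      components.push(component);
      pendingRegistrations++;
    },
    // Step 2: the component signals (e.g. from ngOnInit) that it's done.
    completeRegistration() {
      pendingRegistrations--;
      // Steps 3 & 4: once ALL components are done, let each one hook up
      // its hub callbacks, then start the connection exactly once.
      if (pendingRegistrations === 0) {
        components.forEach(c => c.setupHub());
        startConnection();
      }
    }
  };
}

// Usage: two mock components; 'start' only happens after both setupHub calls.
const calls = [];
const dashboard = createDashboard(() => calls.push('start'));
dashboard.register({ setupHub: () => calls.push('hub1 callbacks') });
dashboard.register({ setupHub: () => calls.push('hub2 callbacks') });
dashboard.completeRegistration();
dashboard.completeRegistration();
console.log(calls); // [ 'hub1 callbacks', 'hub2 callbacks', 'start' ]
```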

The basics are covered well in the official SignalR site, but the points above were some of the main things that I found myself looking a bit deeper for.

Friday, 7 October 2016

Real-time SignalR and Angular 2 dashboard

tl;dr Check out my new SignalRDashboard GitHub repo
Followup post: Getting started with SignalR

For a while now, I've had that burning curiosity to do more than just the bit of hacking around with SignalR that I'd previously done. I wanted to actually start building something with it. If you're not familiar with it, SignalR is a:

...library for ASP.NET developers that makes developing real-time web functionality easy.

It provides a simple way to have two-way communication between the client and server; the server can push updates/data to any connected client. There are a number of great starting examples out there, but to give a very quick overview, a typical basic example is web-based chat functionality. When a user navigates to the chat page in their browser of choice, a connection is made from JavaScript to a SignalR "hub" on the server. This connection provides the channel for bi-directional client-server communication. When a user enters some text in the UI and presses send, a message is sent across this channel to the server, invoking a method on the hub. In a basic chat example, this then broadcasts the message out to all the clients connected to that hub, which results in a JavaScript method on each client being called to render that message out to each user's screen.
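Stripped of the transport details that SignalR handles for you, the broadcast idea reduces to something like this (an in-memory JavaScript sketch with made-up names, purely to illustrate the shape of a chat hub):

```javascript
// A minimal mock of the hub broadcast pattern: clients connect with a
// callback, and a "server-side" method pushes to every connected client.
class MockChatHub {
  constructor() { this.clients = []; }

  // A browser "connecting" registers the callback that renders messages.
  connect(onMessage) { this.clients.push(onMessage); }

  // The hub method invoked when a user presses send: broadcast to all.
  send(user, text) {
    const message = `${user}: ${text}`;
    this.clients.forEach(render => render(message));
  }
}

// Two "browsers" connect; one sends, both screens receive the message.
const hub = new MockChatHub();
const screen1 = [], screen2 = [];
hub.connect(msg => screen1.push(msg));
hub.connect(msg => screen2.push(msg));
hub.send('alice', 'hello');
console.log(screen1[0]); // "alice: hello"
console.log(screen2[0]); // "alice: hello"
```

The real library does all the hard parts this mock skips: connection negotiation, transport fallback, reconnection and generating the client-side hub proxies.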

No need to make AJAX requests to send data to the server; no need to make polling-style AJAX requests to the server for updates. SignalR chooses how to maintain the communication channel between the client browser and the server, depending on the functionality supported by the browser. It also handles the nuts and bolts of how to broadcast out to any connected clients.

Enter the dashboard

In the dev room I work in, we've had a large screen showing a dashboard of metrics for years - evolving over time as we realise the value of new metrics. I decided that a great learning project to work on in my spare time would be to create a real-time web-based replacement (the original one was a WPF app). I'm a big fan of giving visibility of relevant and timely, high-level application metrics to the business. Current CI build broken? Show it. Recent system error rates? Show it. How many stories do the test team have left to accept? Show it. That ability to take a quick glance, and get a quick feel for the current state of play across a wide area of the business, is invaluable. It's also handy to have things flash and/or play a sound, when there's something of particular importance to flag up.

From a technical perspective, I set myself a number of goals I wanted to achieve:

  • a centralised dashboard accessible across all areas of the business (it's an ASP.NET MVC 5 web app...job done!)
  • no matter how many clients are connected to the dashboard, I don't want to increase the load on the systems/services that the dashboard is drawing data from
  • try out a new (to me) JavaScript framework to aid the creation of slick, dynamic views
  • create a dashboard framework that's easy to extend, with some demo dashboard components, so hopefully others interested in the technology can find it useful - and better still, put it to use in their own environments

Moar learning!

AngularJS has also been on my radar for a while, and with Angular 2 coming along as the successor to Angular 1, it seemed like a chance to have a play around with that too. When I first started hacking about with it, an early beta version was available, which I started off using. More recently, the final release has become available, so I have since upgraded to that, patching things up where there were breaking changes.

GitHub repo

The result is a cunningly named (does-what-it-says-on-the-tin) SignalRDashboard project GitHub repo. There are a few demo components as examples, and I plan to add to those as and when I have time. It is still very much a work in progress and evolving!

Here's a quick screenshot with some of the demo components currently written (all randomly generated data, refreshing every 15 seconds):

As always, all feedback is welcome - if you do find it useful/end up using it, drop me a message as it would be great to know!

Tuesday, 12 August 2014

dm_exec_query_plan returning NULL query plan

I recently hit a scenario (SQL Server 2012 Standard, 11.0.5058) where I was trying to pull out the execution plan for a stored procedure from the plan cache, but the following query was returning a NULL query plan:
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, 
    qp.query_plan, tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE  text LIKE '%MyStoredProcedure%'
 AND objtype = 'Proc'
Each time I ran the stored procedure, the usecounts value was incrementing, but I just could not get the query plan to be returned. Initially I thought I'd found the answer in a blog post suggesting sys.dm_exec_text_query_plan instead. However, dm_exec_text_query_plan also returned NULL for the plan handle, so it was a dead end for this scenario. A bit more digging around turned up a question on StackOverflow describing pretty much the scenario I was experiencing - my stored procedure had a conditional statement that wasn't being hit based on the parameters I was supplying. I temporarily removed the IF condition, ran it again and, hey presto, this time an execution plan WAS returned. Reinstating the condition, sure enough, made it no longer return the plan via dm_exec_query_plan. I tried to create a simplified procedure to reproduce it, with multiple conditions inside that weren't all hit, but a query plan was successfully returned when I tested it - so it wasn't as straightforward as just having multiple branches within a procedure.

I was just starting to suspect it was something to do with the temporary table jiggery-pokery being done within the conditional statement, and was putting together a very simplified repro, when I came across a forum post describing pretty much exactly the scenario I was hitting.
I carried on with my ultra-simplified repro example, which shows the full scope/impact of this issue (see below). As noted in that forum post, it's an issue that occurs when using a temp table in this context; table variables do NOT result in the same behaviour (i.e. switching to a table variable instead of a temp table did, sure enough, result in a query plan being returned by dm_exec_query_plan). N.B. It goes without saying, this is not an endorsement for just blindly switching to table variables!
-- 1) Create the simple repro sproc
CREATE PROCEDURE ConditionalPlanTest 
 @Switch INTEGER
AS
BEGIN
 CREATE TABLE #Ids (Id INTEGER PRIMARY KEY)
 DECLARE @Count INTEGER

 IF (@Switch > 0)
  BEGIN  
   INSERT INTO #Ids (Id) VALUES (1)
  END 

 IF (@Switch > 1)
  BEGIN
   INSERT #Ids (Id) VALUES (2)
  END

 SELECT * FROM #Ids
END
GO

-- 2) Run it with a value that does NOT result in all conditions being hit
EXECUTE ConditionalPlanTest 1
GO

-- 3) Check plan cache - no query plan or text query plan will be returned, 
--    usecounts = 1
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, 
    qp.query_plan, tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE text LIKE '%ConditionalPlanTest%'
 AND objtype = 'Proc'
GO

-- 4) Now run it with a different parameter that hits the 2nd condition
EXECUTE ConditionalPlanTest 2
GO

-- 5) Check the plan cache again - query plan is now returned and 
--    usecounts is now 2.
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, 
    qp.query_plan, tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE text LIKE '%ConditionalPlanTest%'
 AND objtype = 'Proc'
GO

-- 6) Recompile the sproc
EXECUTE sp_recompile 'ConditionalPlanTest'
GO

-- 7) Confirm nothing in the cache for this sproc
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, 
    qp.query_plan, tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE text LIKE '%ConditionalPlanTest%'
 AND objtype = 'Proc'
GO

-- 8) This time, run straight away with a parameter that hits ALL conditions
EXECUTE ConditionalPlanTest 2
GO

-- 9) Check the plan cache again - query plan is returned and usecounts=1.
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, 
    qp.query_plan, tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE text LIKE '%ConditionalPlanTest%'
 AND objtype = 'Proc'
GO

-- 10) Now change the sproc to switch from temp table to table variable
ALTER PROCEDURE ConditionalPlanTest 
 @Switch INTEGER
AS
BEGIN
 DECLARE @Ids TABLE (Id INTEGER PRIMARY KEY)
 DECLARE @Count INTEGER

 IF (@Switch > 0)
  BEGIN  
   INSERT INTO @Ids (Id) VALUES (1)
  END 

 IF (@Switch > 1)
  BEGIN
   INSERT @Ids (Id) VALUES (2)
  END

 SELECT * FROM @Ids
END
GO

-- 11) Execute the sproc with the parameter that does NOT hit all the conditions
EXECUTE ConditionalPlanTest 1
GO

-- 12) Check the plan cache - query plan is returned, usecounts=1
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, qp.query_plan, 
    tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE text LIKE '%ConditionalPlanTest%'
 AND objtype = 'Proc'
GO

-- 13) CLEANUP
DROP PROCEDURE ConditionalPlanTest
GO

Thursday, 19 September 2013

70-486 Developing ASP.NET MVC 4 Web Applications

Last week I passed the 70-486 Microsoft exam - Developing ASP.NET MVC 4 Web Applications - so I thought I'd knock up a quick post on my experience and what materials I found useful as part of my preparation.

Going into this exam, I had just over a year and a half's commercial experience using ASP.NET MVC 2 & 3. Before that, I had experience in ASP.NET and prior to that, when dinosaurs still roamed, classic ASP. It's no secret I'm a bit of a data nerd (a lot of my blog is db related, SQL Server, MongoDB...) and I have tended to be backend focused, but I wanted to even out the balance a bit by pushing myself in this area, and in JS/HTML5/CSS3. I took the opportunity to upgrade the web solution at my current company from MVC 3 to MVC 4, and being able to do that at the start of my preparation was really useful - this is how I like to learn, by actually "getting stuff done". That is the obvious, number 1 recommendation - do not just read and swot up on the theory, actually "do". My brain likes me to be stupid and make practical mistakes - when I then work out how I've done something daft, it reinforces the learning and makes the knowledge stick.

Preparation

The first thing I was disappointed to find was that there is (at the time of writing) no Microsoft exam prep book for this exam. There is one due out, I believe, at the beginning of October 2013 - Exam Ref 70-486: Developing ASP.NET MVC 4 Web Applications by William Penberthy (ISBN-10: 0735677220 | ISBN-13: 978-0735677227) - so I obviously can't comment on how good that book is. The book I went with was Professional ASP.NET MVC 4 from Wrox, by Jon Galloway (Twitter), Phil Haack (Twitter) and K. Scott Allen (Twitter), with a foreword by Scott Hanselman (Twitter). While the book alone isn't enough for the exam, for me it gave good coverage of quite a few areas I needed, so it is definitely worth a read.

Having a Pluralsight subscription was good - while not necessarily geared towards the exam, you can never go wrong with a bit of Pluralsight training. I went through a number of the MVC courses - ASP.NET MVC Fundamentals, ASP.NET MVC 2.0 Fundamentals, MVC 4 Fundamentals and Building Applications with MVC 4. One of the things I like about Pluralsight is that you can control the playback speed, so I went through the stuff I already knew at a faster pace. I also went through the "Building Web Apps with ASP.NET Jump Start" training on the Microsoft Virtual Academy (Scott Hanselman, Jon Galloway and Damian Edwards (Twitter)) - that was quite entertaining!

I found some great study-guide blog posts that collated a lot of links to good material:

These give a lot of useful links to MSDN articles, MS resources, blogs, interesting StackOverflow questions etc.

Last but definitely not least, there was my practical setup - Windows 8 on a VM, with Visual Studio 2012 and a Windows Azure account. To reiterate what I said before - the best way to learn is to do...and make daft mistakes. I started work on a new web app from scratch, with a real-world mindset (i.e. writing production-worthy code), putting to good use the new things I was learning. I also had a scratch-pad web app where I would just dump rough code to try out short, simple snippets.

Bottom line

Overall, I put a lot of time and effort into preparation for this exam and it paid off. The biggest benefit for me is what I learned along the way and the challenge it gave me.

Wednesday, 5 June 2013

SQL Server 2008 R2 in-place upgrade error

Today I encountered the following error during the process of performing an in-place upgrade of a SQL Server 2008 instance to 2008 R2:
The specified user 'someuser@somedomain.local' does not exist
I wasn't initially sure what that related to - I hadn't specified that account during the upgrade process - so I went looking in the Services management console. The SQL Server service for the instance I was upgrading was configured to "Log On As" that account, and it had been happily running prior to the upgrade.

This domain account did exist, but as a sanity check the first thing I did was switch the service to use another domain account that also definitely existed, and retried the Repair process. Same error. Then I spotted, further down the list, another service that specified an account in the "somedomain\someuser" form. So I switched the SQL Server instance service to specify the account in that form (the Down-Level Logon Name) instead of the UPN (User Principal Name) format (reference).

Bingo.

While I was waiting for the Repair process to run through again, I carried on searching and found this question on the MSDN Forums. The very last answer there confirmed it. The SQL Server 2008 R2 upgrade does NOT like user accounts in the UPN format.

Thursday, 21 March 2013

.NET Project File Analyser

I've started knocking together a little app to automate the process of trawling through a folder structure and checking .NET project files (C# .csproj currently) to extract some info from them. The DotNetProjectFileAnalyser repo is up on GitHub. I've been working against Visual Studio 2010 project files, but it could well work for other versions, assuming the project file structure is the same for the elements it currently looks at - I just haven't tried as yet.
Currently, it will generate an output file detailing for each .csproj file it finds:
  • Build output directory (relative and absolute) for the configuration/platform specified (e.g. Debug AnyCpu). Useful if you want to find which projects you need to change to build to a central/common build directory.
  • List of all Project Reference dependencies (as opposed to assembly references). Useful if you want to find the projects that have Project References so you can switch them to assembly references.
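The actual tool is a .NET app, but the kind of extraction it performs can be sketched in a few lines of JavaScript (regex-based and illustrative only - the names and the sample project file below are made up, and a real implementation should use a proper XML parser):

```javascript
// Pull the build output paths and project references out of a .csproj file's XML.
function analyseProjectXml(csprojXml) {
  const outputPaths = [...csprojXml.matchAll(/<OutputPath>([^<]+)<\/OutputPath>/g)]
    .map(m => m[1]);
  const projectReferences = [...csprojXml.matchAll(/<ProjectReference\s+Include="([^"]+)"/g)]
    .map(m => m[1]);
  return { outputPaths, projectReferences };
}

// A cut-down sample project file to run it against.
const sample = `
<Project>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
    <OutputPath>bin\\Debug\\</OutputPath>
  </PropertyGroup>
  <ItemGroup>
    <ProjectReference Include="..\\Core\\Core.csproj" />
  </ItemGroup>
</Project>`;

const info = analyseProjectXml(sample);
console.log(info.outputPaths[0]);        // bin\Debug\
console.log(info.projectReferences[0]);  // ..\Core\Core.csproj
```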

Usage

DotNetProjectFileAnalyser.exe {RootDirectory} {Configuration} {Platform}

{RootDirectory} = start directory to trawl for .csproj files (including subdirectories)
{Configuration} = as defined in VS, e.g. Debug, Release
{Platform} = as defined in VS, e.g. AnyCpu
Example:
DotNetProjectFileAnalyser.exe "C:\src\" "Debug" "AnyCpu"

More stuff will go in over time, along with the ability to automatically update .csproj files as well, to save a lot of manual effort.

Wednesday, 13 March 2013

GB Post Code Importer Conversion Accuracy Fix

In a post last year (Ordnance Survey Data Importer Coordinate Conversion Accuracy) I looked into an accuracy issue with the conversion process within the GeoCoordConversion DLL that I use in this project (blog post). The bottom line was that it was a minor issue, with an average inaccuracy of around 2.5 metres and a maximum of ~130 metres by my reckoning. I've since had a few requests asking if I can supply an updated GeoCoordConversion DLL with fixes to the calculations.

After getting in contact with the owner of the GeoCoordConversion project, they kindly added me as a committer. I've now pushed the fixes up, rebuilt the DLL (now v1.0.1.0) and pushed the latest DLL up to the Ordnance Survey Importer project on GitHub.