Tuesday, 12 August 2014

dm_exec_query_plan returning NULL query plan

I recently hit a scenario (SQL Server 2012 Standard, 11.0.5058) where I was trying to pull out the execution plan for a stored procedure from the plan cache, but the following query was returning a NULL query plan:
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, 
    qp.query_plan, tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE  text LIKE '%MyStoredProcedure%'
 AND objtype = 'Proc'
Each time I ran the stored procedure, the usecounts value was incrementing, but I just could not get the query plan to be returned. Initially I thought I'd found the answer in a blog post that pointed to sys.dm_exec_text_query_plan.
However, dm_exec_text_query_plan also returned NULL for the plan handle, so that was a dead end for this scenario. After a bit more digging around, I came across this question on StackOverflow. This was pretty much the scenario I was experiencing - my stored procedure had a conditional statement that wasn't being hit with the parameters I was supplying to the stored procedure. I temporarily removed the IF condition, ran it again and, hey presto, this time an execution plan WAS returned. Reinstating the condition then, sure enough, made it no longer return the plan via `dm_exec_query_plan`. I tried to create a simplified procedure to reproduce it, with multiple conditions inside that weren't all hit, but a query plan was successfully returned when I tested it - so it wasn't as straightforward as just having multiple branches within a procedure.

I was just starting to suspect it was something to do with the temporary table jiggery-pokery being done within the conditional statement, and was trying to create a very simplified repro, when I came across a forum post describing exactly this behaviour.
This was pretty much exactly the scenario I was hitting. I carried on with my ultra-simplified repro example, which shows the full scope/impact of this issue (see below). As noted in that forum post, it's an issue that occurs when using a temp table in this context, but table variables do NOT result in the same behaviour (i.e. switching to a table variable instead of a temp table did, sure enough, result in a query plan being returned by dm_exec_query_plan). N.B. It goes without saying, this is not an endorsement for just blindly switching to table variables!
-- 1) Create the simple repro sproc
CREATE PROCEDURE ConditionalPlanTest 
 @Switch INTEGER
AS
BEGIN
 CREATE TABLE #Ids (Id INTEGER PRIMARY KEY)
 DECLARE @Count INTEGER

 IF (@Switch > 0)
  BEGIN  
   INSERT INTO #Ids (Id) VALUES (1)
  END 

 IF (@Switch > 1)
  BEGIN
   INSERT INTO #Ids (Id) VALUES (2)
  END

 SELECT * FROM #Ids
END
GO

-- 2) Run it with a value that does NOT result in all conditions being hit
EXECUTE ConditionalPlanTest 1
GO

-- 3) Check plan cache - no query plan or text query plan will be returned, 
--    usecounts = 1
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, 
    qp.query_plan, tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE text LIKE '%ConditionalPlanTest%'
 AND objtype = 'Proc'
GO

-- 4) Now run it with a different parameter that hits the 2nd condition
EXECUTE ConditionalPlanTest 2
GO

-- 5) Check the plan cache again - query plan is now returned and 
--    usecounts is now 2.
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, 
    qp.query_plan, tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE text LIKE '%ConditionalPlanTest%'
 AND objtype = 'Proc'
GO

-- 6) Recompile the sproc
EXECUTE sp_recompile 'ConditionalPlanTest'
GO

-- 7) Confirm nothing in the cache for this sproc
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, 
    qp.query_plan, tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE text LIKE '%ConditionalPlanTest%'
 AND objtype = 'Proc'
GO

-- 8) This time, run straight away with a parameter that hits ALL conditions
EXECUTE ConditionalPlanTest 2
GO

-- 9) Check the plan cache again - query plan is returned and usecounts=1.
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, 
    qp.query_plan, tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE text LIKE '%ConditionalPlanTest%'
 AND objtype = 'Proc'
GO

-- 10) Now change the sproc to switch from temp table to table variable
ALTER PROCEDURE ConditionalPlanTest 
 @Switch INTEGER
AS
BEGIN
 DECLARE @Ids TABLE (Id INTEGER PRIMARY KEY)
 DECLARE @Count INTEGER

 IF (@Switch > 0)
  BEGIN  
   INSERT INTO @Ids (Id) VALUES (1)
  END 

 IF (@Switch > 1)
  BEGIN
   INSERT INTO @Ids (Id) VALUES (2)
  END

 SELECT * FROM @Ids
END
GO

-- 11) Execute the sproc with the parameter that does NOT hit all the conditions
EXECUTE ConditionalPlanTest 1
GO

-- 12) Check the plan cache - query plan is returned, usecounts=1
SELECT plan_handle,usecounts, cacheobjtype, objtype, size_in_bytes, text, qp.query_plan, 
    tqp.query_plan AS text_query_plan
FROM sys.dm_exec_cached_plans cp
 CROSS APPLY sys.dm_exec_sql_text(plan_handle) t
 CROSS APPLY sys.dm_exec_query_plan(plan_handle) qp
 CROSS APPLY sys.dm_exec_text_query_plan(plan_handle, NULL, NULL) tqp
WHERE text LIKE '%ConditionalPlanTest%'
 AND objtype = 'Proc'
GO

-- 13) CLEANUP
DROP PROCEDURE ConditionalPlanTest
GO
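As a footnote: even when sys.dm_exec_query_plan returns NULL for a cached batch, it's often still possible to pull back the plans for the individual statements that have actually executed, by feeding the statement offsets recorded in sys.dm_exec_query_stats into sys.dm_exec_text_query_plan. A sketch of that pattern against the repro sproc above (I haven't tested it against every variation of this scenario):

-- Statement-level plans, via the offsets recorded in sys.dm_exec_query_stats
SELECT st.text, qs.statement_start_offset, qs.statement_end_offset,
    tqp.query_plan
FROM sys.dm_exec_query_stats qs
 CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
 CROSS APPLY sys.dm_exec_text_query_plan(qs.plan_handle,
     qs.statement_start_offset, qs.statement_end_offset) tqp
WHERE st.text LIKE '%ConditionalPlanTest%'
GO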

Thursday, 19 September 2013

70-486 Developing ASP.NET MVC 4 Web Applications

Last week I passed the 70-486 Microsoft exam - Developing ASP.NET MVC 4 Web Applications, so thought I'd knock up a quick post on my experience and what materials I found useful as part of my preparation.

Going into this exam, I had just over a year and a half's commercial experience using ASP.NET MVC 2 & 3. Before that, I had experience in ASP.NET and prior to that, when dinosaurs still roamed, classic ASP. It's no secret I'm a bit of a data nerd (a lot of my blog is db related, SQL Server, MongoDB...) and I have tended to be backend focused, but I wanted to even out the balance a bit by pushing myself in this area, and in JS/HTML5/CSS3. I took the opportunity to upgrade the web solution at my current company from MVC 3 to MVC 4, and being able to do that at the start of my preparation was really useful - this is how I like to learn, by actually "getting stuff done". That is the obvious, number 1 recommendation - do not just read and swot up on the theory, actually "do". My brain likes me to be stupid and make practical mistakes - when I then work out how I've done something daft, it reinforces the learning and makes the knowledge stick.

Preparation

The first thing I was disappointed to find was that there is (as of time of writing) no Microsoft exam prep book for this exam. There is one due out, I believe at the beginning of October 2013 - Exam Ref 70-486: Developing ASP.NET MVC 4 Web Applications by William Penberthy (ISBN-10: 0735677220 | ISBN-13: 978-0735677227) - so I obviously can't comment on how good that book is. The book I went with was Professional ASP.NET MVC 4 from Wrox, by Jon Galloway (Twitter), Phil Haack (Twitter) and K. Scott Allen (Twitter), with a foreword by Scott Hanselman (Twitter). While the book alone isn't enough for the exam, it gave me good coverage of quite a few areas I needed, so it is definitely worth a read.

Having a Pluralsight subscription was good and, while not necessarily geared towards the exam, you can never go wrong with a bit of Pluralsight training. I went through a number of the MVC courses - ASP.NET MVC Fundamentals, ASP.NET MVC 2.0 Fundamentals, MVC 4 Fundamentals and Building Applications with MVC 4. One of the things I like about Pluralsight is that you can control the playback speed, so I went through most of the stuff I already knew at a faster speed. I also went through the "Building Web Apps with ASP.NET Jump Start" training on the Microsoft Virtual Academy (Scott Hanselman, Jon Galloway and Damian Edwards (Twitter)) - that was quite entertaining!

I found some great study-guide blog posts that collated a lot of links to good material - MSDN articles, MS resources, blogs, interesting StackOverflow questions etc.

Last but definitely not least, there was my practical setup - Windows 8 on a VM, with Visual Studio 2012 and a Windows Azure account. To reiterate what I said before: the best way to learn is to do...and make daft mistakes. I started work on a new web app from scratch, with a real-world mindset (i.e. writing production-worthy code), putting to good use the new things I was learning. I also had a scratch-pad web app where I would just dump rough code to try out short, simple snippets.

Bottom line

Overall, I put a lot of time and effort into preparation for this exam and it paid off. The biggest benefit for me is what I learned along the way and the challenge it gave me.

Wednesday, 5 June 2013

SQL Server 2008 R2 in-place upgrade error

Today I encountered the following error during the process of performing an in-place upgrade of a SQL Server 2008 instance to 2008 R2:
The specified user 'someuser@somedomain.local' does not exist
I wasn't initially sure what that related to - I hadn't specified that account during the upgrade process - so I went looking in the Services management console. The SQL Server service for the instance I was upgrading was configured to "Log On As" that account, and it had been happily running prior to the upgrade.

This domain account did exist, but as a sanity check the first thing I did was switch the service to use another domain account that also definitely existed, and retried the Repair process. Same error. Then I spotted, further down the list, another service that had its account specified in the "somedomain\someuser" form. So I switched the SQL Server instance service to specify the account in that form (Down-Level Logon Name) instead of the UPN (User Principal Name) format (reference).
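As an aside, on builds that support it (SQL Server 2008 R2 SP1 onwards), you can check the account each SQL Server service is configured to run as straight from T-SQL, which is a quick way to see which name format is in use:

-- Show the configured service accounts (SQL Server 2008 R2 SP1 onwards)
SELECT servicename, service_account
FROM sys.dm_server_services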

Bingo.

While I was waiting for the Repair process to run through again, I carried on searching and found this question on the MSDN Forums. The very last answer there confirmed it. The SQL Server 2008 R2 upgrade does NOT like user accounts in the UPN format.

Thursday, 21 March 2013

.NET Project File Analyser

I've started knocking together a little app to automate the process of trawling through a folder structure and checking .NET project files (currently C# .csproj) to extract some info out of them. The DotNetProjectFileAnalyser repo is up on GitHub. I've been working against Visual Studio 2010 project files, but it could well work for other versions, assuming the project file structure is the same for the elements it currently looks at - I just haven't tried yet.
Currently, it will generate an output file detailing for each .csproj file it finds:
  • Build output directory (relative and absolute) for the configuration/platform specified (e.g. Debug AnyCpu). Useful if you want to find which projects you need to change to build to a central/common build directory.
  • List of all Project Reference dependencies (as opposed to assembly references). Useful if you want to find the projects that have Project References so you can switch them to assembly references (see the illustrative fragment below).
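
For reference, these are the kinds of elements it reads from a .csproj file - an illustrative fragment only (the names, paths and GUID here are made up for the example):

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
  <OutputPath>bin\Debug\</OutputPath>
</PropertyGroup>
<ItemGroup>
  <ProjectReference Include="..\OtherProject\OtherProject.csproj">
    <Project>{11111111-2222-3333-4444-555555555555}</Project>
    <Name>OtherProject</Name>
  </ProjectReference>
</ItemGroup>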

Usage

DotNetProjectFileAnalyser.exe {RootDirectory} {Configuration} {Platform}

{RootDirectory} = start directory to trawl for .csproj files (including subdirectories)
{Configuration} = as defined in VS, e.g. Debug, Release
{Platform} = as defined in VS, e.g. AnyCpu
Example:
DotNetProjectFileAnalyser.exe "C:\src\" "Debug" "AnyCpu"

More stuff will go in over time, including the ability to automatically update .csproj files as well, to save a lot of manual effort.

Wednesday, 13 March 2013

GB Post Code Importer Conversion Accuracy Fix

In a post last year (Ordnance Survey Data Importer Coordinate Conversion Accuracy) I looked into an accuracy issue with the conversion process within the GeoCoordConversion DLL that I use in this project (blog post). The bottom line was that it was a minor issue, with an average inaccuracy of around 2.5 metres and a max of ~130 metres by my reckoning. I've since had a few requests asking if I can supply an updated GeoCoordConversion DLL with fixes to the calculations.

After getting in contact with the owner of the GeoCoordConversion project, they've kindly added me as a committer. I've now pushed the fixes up to it, rebuilt the DLL (now v1.0.1.0) and pushed up the latest DLL to the Ordnance Survey Importer project on GitHub.

Friday, 8 March 2013

SQL Server Table Designer Bug With Filtered Unique Index

A colleague was getting a duplicate key error when trying to add a new column to a table via the Table Designer in SQL Server Management Studio (2008 R2; not tested on other versions), despite there being no violating data in the table. After a bit of digging around, I tracked the problem down to what appears to be a bug in the Table Designer when there is a unique, filtered index on the table and the table is being recreated (i.e. you're adding a new column anywhere other than at the end, after all the existing columns).

Steps to reproduce
  1. In SSMS in Tools -> Options -> Designers -> Table and Database Designers, uncheck the "Prevent saving changes that require table re-creation" option
  2. Create the table and the unique filtered index:
    CREATE TABLE [dbo].[Test]
    (
    Column1 INTEGER NOT NULL,
    Column2 INTEGER NOT NULL,
    Column3 INTEGER NULL
    );

    CREATE UNIQUE NONCLUSTERED INDEX IX_Test_Column1_Column2
    ON dbo.Test (Column1, Column2)
    WHERE Column3 IS NOT NULL;
    
  3. Create dummy data:
    INSERT dbo.Test(Column1, Column2, Column3) VALUES (1, 1, 1);
    INSERT dbo.Test(Column1, Column2, Column3) VALUES (1, 1, NULL); -- OK as Column3 is NULL
    
  4. Now run the following, duplicate key error is correctly thrown:
    -- Errors, Duplicate Key exception as expected
    INSERT dbo.Test(Column1, Column2, Column3) VALUES (1, 1, 2); 
    
    So at this point we have 2 rows in the table, no violations of the unique filtered index.
  5. Right click the table in SSMS -> Design
  6. Insert a new column "Column4" before Column3 and then press Save.
The error that occurs is:
'Test' table
- Unable to create index 'IX_Test_Column1_Column2'.  
The CREATE UNIQUE INDEX statement terminated because a duplicate key was found for the object name 'dbo.Test' and the index name 'IX_Test_Column1_Column2'. The duplicate key value is (1, 1).
The statement has been terminated.

So what it appears to be doing is losing the WHERE filter on the index. This can be confirmed by clicking "Generate change script" in the Table Designer instead of Save - at the end of the generated script:

CREATE UNIQUE NONCLUSTERED INDEX IX_Test_Column1_Column2 ON dbo.Test
 (
 Column1,
 Column2
 ) WITH( STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
GO
Now, if there was no data in the table, or there were no rows with the same Column1 and Column2 value combination when you went into the Table Designer, then you could save the table change and be blissfully unaware that the filter had been lost from the index - i.e. repeat the repro steps again, but this time move steps 3 and 4 (inserting the dummy data) to the end of the process. The previously OK second data row will now error upon insert.
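As a quick safeguard after any Table Designer change, the index filter can be checked from sys.indexes (the has_filter/filter_definition columns are available from SQL Server 2008 onwards, where filtered indexes were introduced):

-- Verify the filter is still present on the index
SELECT name, is_unique, has_filter, filter_definition
FROM sys.indexes
WHERE object_id = OBJECT_ID('dbo.Test')
 AND name = 'IX_Test_Column1_Column2'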

Personally, I almost never use the Table Designer and as a safeguard, will be recommending to the rest of the team that the "Prevent saving changes that require table re-creation" option is checked as a basic guard.

Update: the following Connect items relate to this issue:
The filter expression of a filtered index is lost when a table is modified by the table designer
2008 RTM, SSMS/Engine: Table designer doesn't script WHERE clause in filtered indexes
Referring to comments in that 2nd item, my "Script for server version" setting was set to "SQL Server 2008 R2".

It sounds like this may have been addressed in SQL Server 2012, but it is still a problem in 2008/2008 R2.

Sunday, 3 March 2013

MongoDB ASP.NET Session Store Provider v1.1.0

Since I created the MongoDB ASP.NET Session State Store Provider (v1.0.0), a few things have moved on in the MongoDB C# Driver. I've pushed a number of changes up to the project on GitHub (incrementing the version to v1.1.0), so it now uses v1.7.0.4714 of the driver. There is no change to the way it is configured in web.config, so if you are using v1.0.0 of my provider the upgrade should be painless. Of course, I'd recommend thorough testing first :)

The changes relate to moving over to the updated driver API - most notably, the way write concern is specified.


web.config recap
These web.config settings have not changed, so should continue working as before.
<configuration>
  <connectionStrings>
    <add name="MongoSessionServices" connectionString="mongodb://localhost" />
  </connectionStrings>
  <system.web>
    <sessionState
        mode="Custom"
        customProvider="MongoSessionStateProvider">
      <providers>
        <add name="MongoSessionStateProvider"
             type="MongoSessionStateStore.MongoSessionStateStore"
             connectionStringName="MongoSessionServices"
             writeExceptionsToEventLog="false"
             fsync="false"
             replicasToWrite="0" />
      </providers>
    </sessionState>
  </system.web>
</configuration>
replicasToWrite is interpreted as follows:
  • < 0 = ignored; treated as 0.
  • 0 = will wait for acknowledgement from the primary node.
  • > 0 = will wait for writes to be acknowledged by (1 + {replicasToWrite}) nodes.

Please note: per the MongoDB driver docs, if (1 + {replicasToWrite}) equates to a number greater than the number of replica set members that hold data, then MongoDB waits for the non-existent members to become available (so the write blocks indefinitely).
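For example, on a replica set with three data-bearing members, a provider registration like the following (same attributes as the recap above, with an illustrative replicasToWrite value) would make each session write wait for acknowledgement from two nodes - the primary plus one secondary:

        <add name="MongoSessionStateProvider"
             type="MongoSessionStateStore.MongoSessionStateStore"
             connectionStringName="MongoSessionServices"
             writeExceptionsToEventLog="false"
             fsync="false"
             replicasToWrite="1" />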

It still treats write concerns as important, ensuring it waits for acknowledgement back from at least one MongoDB node.

As always, all feedback welcome.