Did You Even Know There Were Missing Index DMVs?

In the vein of full disclosure, I am not that great a DBA.  My skills are really on the development side.  Dynamic Management Views, or DMVs, have been around a while.  I have on occasion made use of them, but usually at the suggestion of others.  I just haven’t spent much time looking at them or trying to understand their value.  They are yet another whole world inside SQL Server.

As a Data Architect and a T-SQL developer, I am acutely aware of, and concerned with, how the structures and coding techniques around persisting data affect performance.  We are better at our jobs when we understand what's over the fence in someone else’s yard.  You will be a better developer if you understand how to manage the storage of data while optimizing for performance.  There are a lot of factors; it is a specialty of its own in SQL Server.

Laerte Junior has a post over at Simple-Talk that looks specifically at the DMVs used for identifying missing indexes.  Personally, I have never had much significant success with the Index Tuning Wizard.  Laerte has some examples of working with the data provided by the missing index DMVs that look like a good place to start when you need to begin tuning indexes for performance.

In his article he does point out that there was some ugly code that really could be better optimized.  He was constrained by time, so he needed some fast improvements.  This is where the missing index DMVs came into play.  I still stand by my principle of writing good code and not just throwing your performance issues over the fence to a DBA to fix.  So if Laerte was able to get real benefit from tuning indexes, imagine where he would be if the code were optimized too.

Reducing I/O with the ‘Missing Indexes’ DMVs


Use the Table to Be Modified As Temporary Storage For Itself

R. Barry Young is an active member and occasional presenter at the Philadelphia SQL Server Users Group (PSSUG).  His presentations are known for their touch of humor.  I still chuckle about the “Dude, Where’s My Error?” comment in his Object Broker presentation.

Recently, SQL Server Central reposted an article of Barry’s that is worth a look.  He takes an old-school approach to a common problem.  The technique demonstrates “in-place” storage of intermediary data for the modification of data in the same table.  That is, he uses the existing table, already populated with data, to store a modified version of the data that is needed as a step in a data modification process.

The article is well thought out, with very clear explanations and examples running from start to finish.  You may never need this technique, but understanding it and having it tucked away in the toolbox is a worthy investment of 15 minutes. – Good job, Barry!
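To make the idea concrete, here is a toy sketch of my own, not Barry’s actual example: the intermediate value is stashed in the very column being modified (here by flipping its sign), so no temp table or extra column is needed.  The table and column names are hypothetical.

```sql
-- Goal: rotate Priority values 1->3, 2->1, 3->2 with no temporary storage.
-- Phase 1: write each NEW value negated, so new values can't collide with old ones.
UPDATE dbo.Tasks    -- hypothetical table
SET Priority = CASE Priority WHEN 1 THEN -3 WHEN 2 THEN -1 WHEN 3 THEN -2 END
WHERE Priority IN (1, 2, 3)

-- Phase 2: flip the signs back; the table held its own intermediate state.
UPDATE dbo.Tasks
SET Priority = -Priority
WHERE Priority < 0
```

The negative values can never collide with legitimate data, which is exactly the sort of guarantee this in-place approach depends on.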

Just For Fun: An Impossible Delete does require an account to access the content.  Accounts are free and can include a subscription to their daily newsletter.


Kill SPIDs and Get Away With It By Parameter

So every now and again there is a need to kick everyone out of a database in a hurry.  The simple way is to kill the SPIDs: execute sp_who2 and look for the connections that need to die.  Using the KILL command with those SPIDs is easy enough.  Unless there are a lot of connections, and more coming all the time.

This happens to me from time to time with an application where the database needs to go offline and back as quickly as possible.  Except there tend to be a fair number of users, and there are services that reconnect automatically at short intervals of a minute or so.  The database won’t go offline if there are active connections, so the solution is to kill the connections.  Not a big deal, since they aren’t working anyway.  But there is a timing issue: all the SPIDs have to be killed quickly, before they reconnect and prevent the database from going offline.

This is where this little script comes in handy.  It can be run at will and returns results similar to sp_who2.  The difference is that there are parameters that can be set to filter the results.  Additionally, the first column of the results, DeathToTheseSPIDs, contains the KILL commands, which can be copied from the results pane in SSMS into a query window and executed.

The parameters are all optional.  Just fill in the criteria needed to get only the SPIDs that need to die.  Perhaps you need to kill all the connections from one host, or from a particular user, but only on a specific database.  Just fill in the values you need to filter on and leave the rest as NULL.

Technically the script can be executed from any database, since it only draws from one table in the master db; the three-part name in the FROM clause takes care of that.  However, it is always helpful to ensure you don’t wind up killing your own connection.

It is trivial to modify the WHERE criteria, or to make the script actually execute the KILL statements.  It doesn’t do so automatically, to prevent troubling accidents.

Enjoy, modify to your own needs:

-- Simple script to return a list of SPIDs to kill
-- Purpose: Generate a list of SPIDs to kill, and the code to get it done, based on
--          input parameters
-- Date: 10/27/2009
-- Version: 1.00.000
-- Author: Joshua Lynn

-- Notes:
--     Replace some or all NULL default values for the variables to filter the list
--     Copy & paste DeathToTheseSPIDs into a query window and execute

DECLARE @DB_Name    sysname
DECLARE @dbid       smallint
DECLARE @LoginName  sysname
DECLARE @HostName   sysname

SET @DB_Name    = NULL
SET @dbid       = NULL
SET @LoginName  = NULL
SET @HostName   = NULL

SELECT
      'KILL ' + CAST(sp.[spid] AS VARCHAR(10)) AS DeathToTheseSPIDs
    , sp.[spid]
    , sp.[status]
    , DB_NAME(sp.[dbid]) AS dbName
    , sp.[hostname]
    , sp.[program_name]
    , sp.[cmd]
    , sp.[cpu]
    , sp.[physical_io]
    , sp.[blocked]
    , sp.[dbid]
    , CONVERT(SYSNAME, RTRIM(sp.[loginame])) AS loginname
    , sp.[spid] AS 'spid_sort'
    , SUBSTRING(CONVERT(VARCHAR, sp.[last_batch], 111), 6, 5) + ' '
    + SUBSTRING(CONVERT(VARCHAR, sp.[last_batch], 113), 13, 8) AS 'last_batch_char'
    , sp.[request_id]
FROM
    master.dbo.sysprocesses sp WITH ( NOLOCK )
WHERE
    DB_NAME(sp.[dbid]) = ISNULL(@DB_Name, DB_NAME(sp.[dbid]))
    AND sp.[dbid] = ISNULL(@dbid, sp.[dbid])
    AND CONVERT(SYSNAME, RTRIM(sp.[loginame])) = ISNULL(@LoginName, CONVERT(SYSNAME, RTRIM(sp.[loginame])))
    AND sp.[hostname] = ISNULL(@HostName, sp.[hostname])
    AND sp.[spid] <> @@SPID -- never list your own connection
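For example, to target only the connections a particular app server holds against one database, set just those two filters and leave the others NULL.  The database and host names here are made up:

```sql
-- Hypothetical values: only SPIDs from host APPSERVER01 in SalesDB are listed
SET @DB_Name  = 'SalesDB'
SET @HostName = 'APPSERVER01'
-- @dbid and @LoginName remain NULL, so they match every row
```

Because each NULL parameter compares a column to itself via ISNULL, unset filters simply drop out of the WHERE clause.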


Code Samples from Code Camp 2009.2

As promised here are the links to the Sample code used in the Advanced Workflow Processes in T-SQL presentation from Code Camp 2009.2 held on Saturday October 17, 2009.

EXEC vs. sp_executesql & Temp Tables

DQSM – Data Quality Status Monitoring

The first zip file is the raw contents of the examples provided in previous journal posts for your convenience.

The second zip file contains the source code and sample files for the demonstration.  Inside the main directory there are files with numbers leading the file names; execute them in order.  Some of them you just need to execute and move on; in others you should spend some time looking at the code and the output.  If you dig into the code provided in the subdirectories, you will be able to explore the solution.

Most of the code can be used on any db.  Only the specific UDF and test sprocs designed for AdventureWorks tables are not generic; they serve as an example of what to do in your environment.

There are a few other little goodies in there to explore and play with but they are not critical to the presentation.
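As a taste of the first topic, here is a minimal sketch of the difference between concatenating a value into a string for EXEC and passing it as a real parameter to sp_executesql.  The query uses the AdventureWorks Person.Address table purely as an example:

```sql
-- EXEC with concatenation: the literal is baked into the string,
-- which hurts plan reuse and invites SQL injection.
DECLARE @City NVARCHAR(60)
SET @City = N'Seattle'
EXEC ('SELECT AddressID FROM Person.Address WHERE City = ''' + @City + '''')

-- sp_executesql: the parameter stays a parameter, so one cached plan
-- can serve every city value.
EXEC sp_executesql
    N'SELECT AddressID FROM Person.Address WHERE City = @City',
    N'@City NVARCHAR(60)',
    @City = @City
```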


Bad Habits To Kick Blog Series Worth The Time

Aaron Bertrand has an excellent blog series on bad habits to kick for SQL developers.

I have not had a chance to read all of them yet, but I have liked what I have read so far and managed to learn a few details as well.

As with all advice on the internet, it is wise to use your brain and make up your own mind.  Aaron should be commended for not just stating what you should do, but pointing out and explaining why his recommendations are valid.  The reader learns not only the answer but the reasoning behind the issue at hand.  It is this understanding that can help us make better choices as developers going forward.

Thanks Aaron!