Patrick Purviance – SQL Server DBA Blog
Blogging impactful solutions for SQL Server DBAs worldwide

The distribution agent failed to create temporary files in ‘C:\Program Files\Microsoft SQL Server\110\COM’ directory.

August 7th, 2013

Shortly after configuring a new SQL Server 2012 server as a pull subscriber to a publication on a SQL Server 2008 Publisher/Distributor, we began receiving the following error message from the replication agents/jobs:

SQL Server Replication Error 14151

Replication-Replication Distribution Subsystem: agent PublisherNameDBNamePublicationName-62 failed. The distribution agent failed to create temporary files in ‘C:\Program Files\Microsoft SQL Server\110\COM’ directory. System returned errorcode 5.

The root cause, after much searching and testing theories, was permissions on the subscriber COM directory. 

The fix was to simply grant the account that the distribution agent runs under appropriate write/modify permissions on the COM folder in question ON THE SUBSCRIBER, NOT THE PUBLISHER/DISTRIBUTOR.

Why this is not something done automatically by the replication setup process is beyond me, but once we granted the write permission to the COM folder on the subscriber (we also restarted the SQL Agent job responsible for the distribution, though that was likely not necessary), it all worked.
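For example, assuming a hypothetical DOMAIN\ReplAgentAcct as the account the distribution agent job runs under, the grant can be made on the subscriber from an elevated command prompt along these lines:

icacls "C:\Program Files\Microsoft SQL Server\110\COM" /grant "DOMAIN\ReplAgentAcct":(OI)(CI)M

Here M grants Modify and (OI)(CI) lets the permission inherit to files and subfolders; substitute whatever account your distribution agent actually runs under.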

Hope this helps someone!


SQL Server 2012 SP1 CU1 Installation Failure – Resolved

December 28th, 2012

So I was recently installing SQL Server 2012 on a couple of new machines and ran into the following error when attempting to apply SQL Server 2012 SP1 CU1:

Error during SQL Server 2012 SP1 CU1 Update
Script level upgrade for database ‘master’ failed because upgrade step ‘u_tables.sql’ encountered error 25641, state 0, severity 16. This is a serious error condition which might interfere with regular operation and the database will be taken offline. If the error happened during upgrade of the ‘master’ database, it will prevent the entire SQL Server instance from starting. Examine the previous errorlog entries for errors, take the appropriate corrective actions and re-start the database so that the script upgrade steps run to completion.

This not only leaves the SQL Server installation incompletely upgraded (Data Quality Services, Full-Text and Semantic Extractions for Search, SQL Server Replication, and Database Engine Services fail to be upgraded), but also renders the entire database engine instance unable to start, since the master database cannot be upgraded.  Major bummer!

After a bit of digging around, it became apparent that a temporary choice I had made during the initial installation of SQL Server 2012 was the root cause, although in my professional opinion it’s a bug, and I’ve submitted a Connect issue with a workaround.  Hopefully MS will fix the issue, but since the workaround exists, who knows.

The root cause for this error condition is security permissions for the SQL Server service accounts.  During installation I chose to leverage virtual accounts, which are new in 2012 and are the default configuration option, mind you (although I don’t intend to keep them and wouldn’t recommend using them over correctly permissioned domain accounts).  Essentially, NT Service\MSSQLSERVER did not have the file system permissions needed to apply the scripts required to upgrade the master database and the other features that fail.

The workaround is to simply set the service account to LocalSystem for the duration of the Service Pack/Cumulative Update installation and then set it back to the virtual account (or better yet, permanently to a domain account with appropriate permissions).
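The swap is normally done in SQL Server Configuration Manager (the instance service’s Log On properties), but a sketch of the temporary change for a default instance from an elevated prompt would look something like this (the service name differs for named instances):

sc.exe config MSSQLSERVER obj= LocalSystem
net stop MSSQLSERVER
net start MSSQLSERVER

After the update completes, set the account back the same way; Configuration Manager remains the safer route, since it also takes care of the per-service SID and ACL updates for you.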

My biggest concern here is that Microsoft has chosen to base the default service accounts during installation on these virtual accounts, which can later render the SQL Server instance unusable until you find this post or Microsoft fixes the issue.



SSRS: Failure sending mail: The user or group name ‘Domain\UserName’ is not recognized.Mail will not be resent.

February 6th, 2012

Just ran into this one.   An SSRS report user came to me indicating that they’d attempted to create a subscription and, although the subscription was created, the report failed to be sent via email with the following status:

Failure sending mail: The user or group name ‘Domain\UserName’ is not recognized.Mail will not be resent.

When troubleshooting, I found that the subscription owner account name did NOT match the Windows account name of the user contacting me.  In the end, the root cause was a domain name change for the affected user.  The login name on the SQL Server hosting the ReportServer database still had the old domain name for the user and used that name in the subscription setup, which failed when the credentials were used to generate and send the emailed report.

The quick fix was to execute the following via our SQL Server CMS (Central Mgmt Server):

IF EXISTS (SELECT 1 FROM sys.server_principals WHERE name = 'DOMAIN\OldUserName')
     ALTER LOGIN [DOMAIN\OldUserName] WITH NAME = [DOMAIN\NewUserName]

Hope that helps if you run into this situation in your travels…


SQL Server Agent – Proxy/Credential Permissions for SSIS Packages

January 31st, 2012

Just a quick self-help note today.  If you ever see the error message below from a SQL Agent job calling an SSIS package stored in SQL Server (the MSDB database) and leveraging a proxy built on a specific credential, ensure the credential account under the proxy has membership in the db_ssisoperator role in MSDB.  Without it (or a bunch of individual object-level permissions granted), the credential account doesn’t have the permissions needed to retrieve the SSIS package and execute it.  Remember that the credential account must also have the required permissions on all the objects within the SSIS package, or you’ll just be off to fix another error after you correct this one.

Could not load package “xyz” because of error 0xC001404A. Description: While trying to find a folder on SQL an OLE DB error was encountered with error code 0x80040E09 (The EXECUTE permission was denied on the object ‘sp_ssis_getfolder’, database ‘msdb’, schema ‘dbo’.).

Even after ensuring the proxy was configured and granted SQL Server Integration Services subsystem access, I got this error.  It wasn’t until I granted the credential account access to MSDB and then membership in the db_ssisoperator role that it worked.  Seems like a shortcoming in Microsoft’s design to me; either that or a bug.  If an administrator intentionally gave the proxy access to the SSIS subsystem, why not take care of granting the permissions necessary to carry out such a feat?
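For reference, the grants described above look something like the following (the credential account name is a made-up placeholder; sp_addrolemember is used here since the ALTER ROLE … ADD MEMBER syntax only arrived in SQL Server 2012):

USE msdb;
GO
-- Map the proxy's credential account into MSDB (hypothetical account name)
CREATE USER [DOMAIN\SsisProxyAcct] FOR LOGIN [DOMAIN\SsisProxyAcct];
GO
-- db_ssisoperator covers sp_ssis_getfolder and the other package-retrieval objects
EXEC sp_addrolemember N'db_ssisoperator', N'DOMAIN\SsisProxyAcct';
GO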

To be honest, this is the first time I’ve run into this error.  I don’t recall ever having to grant MSDB permissions before, but if this helps you, glad I ran into it.


SQL PASS Summit 2011 – A First Timers View – Day 3

October 27th, 2011

This post is part 3 of at least a 4 post article on my First Timer view of the SQL PASS Summit as I experienced it personally.  If you missed the first posts, click here.

Day 3

Day 3 for me, Day 2 of the Summit (Thursday), rolls around, and today I’ve gotten up a little earlier to make it onto the shuttle over to the convention center for the free breakfast at the Summit.  Yesterday I mentioned in my Day 2 summation that the hot breakfast from the Marriott was a better choice.  Don’t get me wrong, the breakfast at the Summit in the dining hall was OK, with typical cereal, yogurt, and continental items, but it just wasn’t as good as the hot, fresh breakfast items at the hotel.  So to me, in the end, the 10 block walk is worth it if your hotel is providing a hot breakfast for free too…

Anyway, on to the Summit Day 2 keynote.  Plus, it’s SQL Kilt Day, so Bill Graziano, PASS Executive VP, is in a kilt, along with 30 or so other attendees, to introduce Day 2 and the keynote speaker, Quentin Clark (Corporate Vice President, SQL Server Database Systems Group, Microsoft Corp).

Quentin provided a second keynote that was largely a sales pitch, this time focused mostly on what he called the “Fantastic 12”, the top 12 important features in SQL Server 2012.  Quentin is an entertaining speaker and also brought out quite a few Microsoft technologists and experts to cover the “Fantastic 12” in the keynote:

#1 Required 9’s & Protection – SSIS is a server, AlwaysOn, and HA for StreamInsight
#2 Blazing Fast Performance improvements to the SQL Engine, SSIS, and SSAS (and the new ColumnStore index (really sweet for DWH))
#3 Rapid Data Exploration
#4 Managed, Self-Service BI – Power View, PowerPivot
#5 Credible Consistent Data – Data Quality Services; Master Data Services
#6 Organizational Compliance Auditing; User-defined server roles
#7 Peace of mind – Distributed Replay Utility; System Center Advisor / MPs; Premier Mission Critical support
#8 Scalable data warehousing – Optimized & pre-tuned appliances from multiple vendors
#9 Fast time to solution
#10 Extend Any Data, Anywhere – Drivers for PHP, Java, Hadoop; ODBC drivers for Linux & CDC for SSIS & Oracle
#11 Optimized productivity SQL Server Data Tools (“Juneau”) for both OLTP and BI; deployment + targeting “freedom”
#12 Scale on demand AlwaysOn; Elastic Scale & Federation

Again, lots of demos and a lot of time spent talking about the Parallel Data Warehouse turn-key data warehousing hardware platform.  Still, it was a good amount of information packed into the time.

To see the entire keynote for yourself, click this link.

The rest of the day was just like the first day, filled with sessions and Expert Pods and a quick stop into the SQL Server Clinic put on by Microsoft and staffed with escalation team and SQLCAT team members to answer tough questions you brought with you.  Unfortunately, I had tough questions about merge and transactional replication issues and there was no one in the clinic to answer them.  They recommended I post them to the MS forums or Connect.  I did get to take a bag full of SWAG for my SQL PASS Chapter though, so that was a #win.

Thursday evening was the night of the Microsoft-sponsored Community Appreciation Party, held across the street from the convention center at GameWorks.  This event was awesome.  A huge sports bar/arcade covering a massive amount of square footage.  There was free food and drinks again, of course.  And then, to spur networking, there was a contest where you had to find 5 new people, get to know them a bit, and have them put their name and email addy on a card.  I met some neat people, played a bunch of games, and found out the next day that I’d won in that contest.  A new Xbox 360 Kinect!  #winning #sqlpass rocks!

I headed back to the hotel instead of where a lot of people go, SQL Karaoke, but it was a great night anyhow.  The walk back to the hotel was chilly, but I called my wife and we chatted the entire walk back.  #morewinning

A little work for the company to wind down and some shut-eye to prepare for the final day….I was almost sad it was this close to the end…..


SQL PASS Summit 2011 – A First Timers View – Day 2

October 25th, 2011

This post is part 2 of at least a 4 post article on my First Timer view of the SQL PASS Summit as I experienced it personally.  If you missed the first post, click here.

Day 2

So day 2 began with a fairly early rise.  I was staying about 10 blocks from the Washington State Convention Center (again, at the Marriott Spring Hill Suites), so I would need to walk or catch the shuttle at the top of each hour.  I missed the 7AM shuttle by about 5 minutes trying to get ready, so I opted for a hot breakfast at the hotel (which we’ll find later was a good choice).  After a hot breakfast, I trekked it up to the convention center and checked Guidebook (a fantastic additional scheduling and conference organizational tool that PASS decided to use again this year I hear) for any changes to schedules or sessions I was interested in, and then headed into the main conference room for the first Keynote and opening of the 2011 SQL PASS Summit. 

Although there was a lot of tweeting and blogging going on during the keynote by Ted Kummert (Senior Vice President, Business Platform Division, Microsoft Corp), below are my takeaways:

       *  SQL Server “Denali” is now officially named SQL Server 2012 and will be released to RTM in the first half of 2012
       *  Microsoft has recognized Big Data, and its associated technologies, are very real things and are releasing a provider/connector for Hadoop
       *  SQL Server Deep Dives 2 is being premiered at the Summit and you can get it signed by many of the 54 co-authors here at the Summit

To see the entire keynote for yourself, click this link.

Now, if you’re still with me, the keynote was cool as a first timer because I’d never seen this much of a production put on for a technology conference.  4000 attendees, making this the biggest Summit yet, all chomping at the bit to get the latest information and ready to get on to a full day of sessions and networking.  The anxiety and excitement were almost palpable.  As the keynote closed, everyone was again welcomed to the Summit and we all headed our separate, but collective, ways to our intended sessions.

If I hadn’t already mentioned it, there were 170 sessions over 3 days.  With only five session slots a day, each 45 minutes to 1 hour 15 minutes long (and some sessions even spanned two slots because of their depth), you do the math.  There just aren’t enough hours in the day (or even a solid week, for that matter) to see it all.  Guess that’s what the DVDs are for.

So I attended sessions on Day 2.  All good.  Some great.  Lunch was a chapter luncheon, so I sat at a table specifically designated for the Wichita, Omaha, and Lincoln SQL Server User Groups.  Unbelievably, I was the only one there from the Wichita area.  Got to meet some cool people though, and hope we’ll stay connected and share often.  Later in the day I got to spend some time in the Exhibition Hall looking at the different vendors’ offerings and signing up for all the giveaways to be drawn the next day.  It was fun.  There were also SQL Server Expert Pods located all over the 6th floor covering a range of topics.  Always manned by MVPs or Microsoft experts, these are great places to spend the session hours, especially if you have a question you need answered and you’re going to buy the DVDs anyway.

After the regular conference day is thru, there are plenty of after-hours events and parties to get invites to if you spent your time well with the vendors.  Make an effort to be sure you have options for the evenings, and don’t just head back to your hotel.  Again, I don’t drink, but hanging out with your SQL comrades is fun even when you’re sober.

Day 2 in the books.  Tired, but really having a good time.  Met some cool people.  Chatted with MVPs, experts, and SQL Server Community celebrities.  #SQLWinning

Check back tomorrow for Day 3 (2nd conference day) details.


SQL PASS Summit 2011 – A First Timers View

October 24th, 2011

So, as promised, I’m lightly documenting my experience as a first timer at the SQL PASS Summit. In the next few posts, I’ll be specifically covering the 2011 Summit, held in Seattle, WA. I’ll break it up by day, beginning with travel and what to expect.

Also, there are plenty of other articles out there from well-experienced PASS Summit attendees, so I encourage you to read all you can prior to your first trip to what is truly the largest, best supported, and best executed technical conference I’ve ever attended. Without further ado….

Day 1 of my First Timer experience of SQL PASS Summit 2011

Travel to the Summit
Day 1 was the Tuesday before the first day of the Summit. It began with a two-stop flight from Wichita to Seattle, with the stop in Denver. No hassles anywhere along the way there. It was when I arrived at SEATAC that things began to get interesting. I had read and heard that Seattle had a Light Rail train system and that the cost was only a few dollars to get from the airport to just about anywhere between SEATAC and downtown Seattle, so I immediately set off to find the Light Rail. That was a bit of a trek, but nothing deadly ;). It was on the third level, accessed from a walkway connecting the airport parking garage to the Light Rail. There were plenty of signs along the way. Once you get to the Light Rail, there’s no one posted to talk to or ask questions of, which you’re sure to have if this is your first time. You walk thru the gates and must purchase a ticket to ride the train based upon your destination. For the Summit, the very last stop (Westlake Center) was what I wanted, but the maps are difficult to read and figure out when there’s a line of people behind you at the ticket kiosk, so I chose the University destination by mistake. I then went up the escalator to the train loading areas and luckily didn’t have to wait for a train since one was there already. (SEATAC is the southernmost end of the Light Rail system, so any train is going where you want to go.) I jumped on and took a seat. Then waited for 5 minutes and the train took off, northbound. I only spent all this time talking about the Light Rail train to highlight my mistake on the destination (which forced me to walk a lot more with bags in tow to get to my hotel, the Marriott Spring Hill Suites) and to let you know that it took the train about 50 minutes to get from SEATAC to Westlake.

Summit Registration
Once I got to the hotel, I checked in, dropped off my bags, and headed over to the Washington State Convention Center, the site of the Summit in 2011. Registration was easy. It was on the 4th floor, up two sets of escalators in the main hall of the center. Lots of people were posted on the 4th and 6th floors (both PASS volunteers and convention center staff) to direct you to what you needed. I registered (4th floor thru the doors on the left to enter the Summit site) around 1:15PM Tuesday afternoon and there was zero wait. The registration volunteers were cheerful and very helpful, explaining what everything was for. Not sure if that would be the case if there were lines, but it was nice for me.

PASS Chapter Leader Mtg
I then grabbed lunch at the Subway out in the main hall on the 4th floor and read thru some of the program guide and mini-schedule. For those of us who are PASS Chapter Leaders, there was a pre-conference meeting that I attended Tuesday @ 2:30PM. I met a few other chapter leaders and we chatted for a while before the meeting. Finding people I already had something in common with right off the bat was nice.

First Timer Orientation
After the chapter leader meeting, it was already time for the first timers meeting. This was something new this year. Thomas Larock (Twitter: @SQLRockStar) and other PASS leaders and volunteers coordinated a Big Brother/Big Sister solution to the problem of attendees not returning to the Summit because they felt they arrived alone, ate alone, spent the evenings alone, and left without making any real contacts. Honestly, I don’t know if the Big Brother/Sister program worked well, because it is very difficult to stay in contact with any of the people in your group after you initially meet. Our group never got back together. And with all the different sessions, experience levels, and goals for the Summit, I can understand how that could be difficult. It would have been nice to have some more group lunches or breakfasts or something. In the end I did meet one person I really felt I hit it off with, and we did manage to hang out a few times during the Summit. So that was a win.

Speed Networking
After the first timers met their group and Big Brother/Sister, we all went into a huge room where all 800 first timers sat in on a 45 minute Speed Networking session by Don Gabor. We learned the importance of introducing and selling ourselves within a couple of minutes’ time during this session, which was like a massive 800-person speed dating party. Every 3 or 4 minutes you moved down a chair and met someone new. Interesting for sure.

First Timers Red Carpet Welcome
At the end of the Speed Networking session the two large displays showed the inside of the even larger conference room next door (the main conference room), where the current PASS President (Rushabh Mehta) was on stage welcoming everyone to the 2011 PASS Summit. At the end of that welcome, he introduced the first timers, a huge black curtain between the two rooms fell, and all 800 first timers entered the Welcome Reception thru lots of fog, lights, and a red carpet. Pretty sweet way to start the Summit. We were all rock stars!

Welcome Reception
Alright, so on to the lowdown on the Welcome Reception. It was around 4000 people in a huge ballroom. Dimly lit, free food and drinks, and lots of movement and noise. That’s where I have some complaints. There weren’t any signs or indications of where things were. As a first timer, I had no idea what was going on and decided that if I didn’t just grab some food, I wasn’t going to get any (turned out that wasn’t the case). With 4000 people and cuisine-specific buffet tables in different areas of the room, getting around was tough, and you didn’t know what was available unless you got around the whole room. I was also disappointed that although drinks were free, there was nothing non-alcoholic aside from water.

During the reception party, a group of SQL Server community “celebrities” was brought on stage along with 4 attendees, and 4 teams played Quiz Bowl, a team-based version of Jeopardy, with answers consisting of funny combinations of SQL Server technology and pop culture. Entertaining, and those attendees walked away with prizes.

First Afterhours Event
As the Welcome Reception was drawing to a close, many people began heading out to various after-hours events, and this is where the PASS Summit, to me, is vastly different from other technology conferences. As a first timer I wasn’t sure what to expect, but I had heard that getting yourself invited to after-hours stuff was critical to experiencing the Summit the right way, because you get to hang out with other SQL Server professionals in a mostly social way. Sure, there was still a lot of shop talk, but it was usually surrounded by a party. This night, as a chapter leader, I attended an invitation-only event sponsored by PASS for volunteers and speakers at the Garage, around 10 blocks from the convention center. I caught a cab with a group of people, so it was only a couple bucks each. Most were already blitzed and I don’t drink, but I can have a good time w/o, so it was all good. Finding people to talk to was a little more difficult at this party, but I think that’s because most of them weren’t first timers. They were reconnecting, which is understandable. I did meet some people and have some cool conversations just by sitting down and introducing myself. I shared a cab with some other leaders back to my hotel around midnight and then did some work and hit the sack. The next day (Wednesday) was the first conference day and started at 8AM, so I needed some sleep.

At the end of the 1st day, I’d already met a handful of people I really liked and would chat with off and on during the rest of the Summit. And that’s honestly what the Summit is all about: realizing you’re all there with the same passion for SQL Server, overcoming fear, and putting yourself out there. You’ll be accepted here….

As the PASS mantra goes, Connect/Share/Learn.

Well, that’s it for today. Tomorrow, Day 2.


2011 SQL PASS Summit coming…preparing for the 3-day long blog…ASUS Transformer Tablet purchase

October 2nd, 2011

Well, the 2011 SQL PASS Summit in Seattle, WA, is fast approaching….2 weeks until the biggest SQL Server based technology community convergence begins and being an official first-timer, I’m not totally sure what to expect, but I’m planning to share every day what it’s like on this blog.

To start things off right, I found a killer deal with the help of the community on a new Asus Eee Pad Transformer tablet with a FREE (yeah, I said FREE) keyboard dock….that’s a $149 value. With this puppy in hand (or on the table in front of me, to be more precise), I should have no problem typing everything up and still staying super mobile. Pretty excited to get my hands on this tablet. Almost as excited as I am to finally get the opportunity to attend the PASS Summit.

In the interest of sharing, if you’re looking (and while the deal is still good), here’s the link to the deal:

And if you go thru FatWallet and have an account with them, you can even get 3% cash back!

Anyway, I’m sure the next couple weeks will fly by, but keep an eye on this blog if you’re interested. If not, well, I’m sure you’ll find something better to do with your time.



Procedure to Validate SQL Server Replication Convergence

July 22nd, 2011

Today I’m posting a turn-key solution for those of you out there supporting SQL Server replication. The purpose of the stored procedure created by the script provided below is simply to leverage a little-known, but very useful, tool provided by Microsoft for the sake of convergence validation (validate that all of your table based articles are identical between the publisher and subscriber(s)).

The execution of the SP (once compiled) simply generates syntax in text format that can be saved as a batch file or run as-is (carefully) from a command prompt to check for convergence issues between a published DB and its subscriber’s DB tables.

The underlying tool, TableDiff.exe, again, is provided by Microsoft and can be found with the SQL Server binaries. Because of this, the SP should be executed on the server upon which you will be executing the resulting script, so that the correct binary location will be provided.

I typically compile the SP in a DBA “tools” database, execute it providing all of the required parameters, save the resulting text as .bat file on the desktop of the publisher’s host server, and then execute the .bat. The batch file will create a log report and, optionally, a fix-it T-SQL script for each article to bring the table(s) back into convergence if differences are found.

Note: There currently is NO error checking done for the location of TableDiff.exe. It can be added if needed.

Example Usage:

EXEC DBA_TOOLS.dbo.Repl_Gen_TableDiff_Code_usp
      @Pub_SvrName = @@servername
    , @Sub_SvrName = 'SERVER_A'
    , @Pub_DBName = 'CUSTOMER'
    , @Sub_DBName = 'CUSTOMER'
    , @Provide_Fix = 1
    , @Outputdir = 'C:\'

Where:
    @Pub_SvrName is the name of the publisher server
    @Sub_SvrName is the name of the subscriber server
    @Pub_DBName is the name of the published database
    @Sub_DBName is the name of the subscriber database

This has been tested on SQL Server 2005/2008/2008R2, but only with Transactional & Snapshot Replication. Merge Replication uses some alternative tables for configuration and lookups and often includes parameterized filters for subscribers, so I won’t claim this solution is appropriate for Merge or Peer-to-Peer replication, but am open to comments on the topic.

USE [DBA_Tools]
GO
IF OBJECT_ID('Repl_Gen_TableDiff_Code_usp') IS NOT NULL
	DROP PROCEDURE dbo.Repl_Gen_TableDiff_Code_usp
GO
CREATE PROCEDURE [dbo].[Repl_Gen_TableDiff_Code_usp] 
	  @Pub_SvrName	SYSNAME
	, @Sub_SvrName	SYSNAME
	, @Pub_DBName	SYSNAME
	, @Sub_DBName	SYSNAME
	, @Provide_Fix  BIT = 0
	, @OutputDir	VARCHAR(4000)
	, @DebugFlag	BIT = 0
--==  Procedure:		dbo.Repl_Gen_TableDiff_Code_usp  
--==  Author:			Patrick Purviance
--==  Description:		Simple SP to generate syntax that can be saved as a batch
--==					file or run as-is (carefully) from a command prompt to 
--==					check for convergence issues between a published DB and
--==					its subscriber's DB tables.  TableDiff.exe is provided
--==					with the SQL Server binaries and therefore this SP should
--==					be run from the server upon which you will be executing the 
--==					resulting script so that the correct binary location will
--==					be selected.  
--==					Note:  There currently is NO error checking done for 
--==					the location of TableDiff.exe.  It can be added if needed.
--==  Usage:			EXEC DBA_TOOLS.dbo.Repl_Gen_TableDiff_Code_usp 
--==  								@Pub_SvrName	= @@servername
--==  								, @Sub_SvrName	= 'SERVER_A'
--==  								, @Pub_DBName	= 'CUSTOMER'
--==  								, @Sub_DBName	= 'CUSTOMER'
--==								, @Provide_Fix = 1
--==								, @Outputdir = 'C:\'
--==					Where @Pub_SvrName is the name of the publisher server,
--==							@Sub_SvrName is the name of the subscriber server,
--==							@Pub_DBName is the name of the published database
--==							@Sub_DBName is the name of the subscriber database
--==  Version history:
--==					v1.0 Created, PPurviance, 11/29/2010
--==					v1.1 Modified to use replication database syspublications to be sure 
--==						 to pick up all articles. I found a bad publication_id in 
--==						 distribution..mspublications causing some articles not to 
--==						 get picked up for convergence checking.
AS
BEGIN
SET NOCOUNT ON

DECLARE	  @MSG VARCHAR(500)
		, @SQL VARCHAR(4000)
		, @LogDate VARCHAR(13)
		, @Version VARCHAR(4)
		, @TablediffLoc VARCHAR(256)
		, @FixFile VARCHAR(256)
DECLARE @SysArticles TABLE (pubid int, artid int, artname SYSNAME)
DECLARE @Publications TABLE (pubname sysname, pubid int)

--== Determine the current SQL version:
SELECT @Version = LEFT(CONVERT(VARCHAR, SERVERPROPERTY('ProductVersion')), 2)
SELECT @Version = 
	CASE @Version 
		WHEN '7.' THEN '7'
		WHEN '8.' THEN '2000'
		WHEN '9.' THEN '2005'
		WHEN '10' THEN '2008'
	END

--== Check all supplied values:
--== First check the publisher server name. This must be run on the publisher.
IF RTRIM(@Pub_SvrName) <> @@SERVERNAME
BEGIN
	SET @MSG = 'This procedure must be run on the publisher to work correctly.  The servername provided does not match the local server name.'
	RAISERROR (@MSG, 16, 1)
	RETURN
END
IF NOT EXISTS (SELECT 1 FROM master.sys.databases WHERE name = RTRIM(@Pub_DBName))
BEGIN
	SET @MSG = 'Supplied Publisher Database cannot be found.'
	RAISERROR (@MSG, 16, 1)
	RETURN
END
--==  End check all supplied values

SET @SQL = 'SELECT pubid, artid, name FROM [' + RTRIM(@Pub_DBName) + '].[dbo].[sysarticles]'
INSERT @SysArticles EXEC (@SQL)
IF @DebugFlag = 1
	PRINT @SQL

SET @SQL = 'SELECT name, pubid FROM [' + RTRIM(@Pub_DBName) + '].[dbo].[syspublications]'
INSERT @Publications EXEC (@SQL)
IF @DebugFlag = 1
	PRINT @SQL

--==Build the date and time and tablediff location portion for the log name
SET @LogDate = CONVERT(VARCHAR(8), GETDATE(), 112) + '_' + REPLACE(CONVERT(VARCHAR(5), GETDATE(), 108), ':', '')
SELECT @TablediffLoc = CASE @Version 
	WHEN '2000' THEN '"C:\Program Files\Microsoft SQL Server\80\COM\tablediff" '
	WHEN '2005' THEN '"C:\Program Files\Microsoft SQL Server\90\COM\tablediff" '
	WHEN '2008' THEN '"C:\Program Files\Microsoft SQL Server\100\COM\tablediff" '
	END
SET @FixFile = ''

--==Now build the statement
SELECT @TablediffLoc 
	+ '-sourceserver "' + RTRIM(@Pub_SvrName)
	+ '" -sourcedatabase "' + RTRIM(@Pub_DBName)
	+ '" -sourcetable "' +  sa.artname
	+ '" -destinationserver "' + RTRIM(@Sub_SvrName)
	+ '" -destinationdatabase "' + RTRIM(@Sub_DBName)
	+ '" -destinationtable "' + sa.artname 
	+ CASE WHEN @Provide_Fix <> 0
		THEN '" -f "' + RTRIM(@OutputDir) + '\Make_' + RTRIM(@Sub_SvrName) + '_' + RTRIM(@Sub_DBName) + '_LookLike_' + RTRIM(@Pub_SvrName) + '_' + RTRIM(@Pub_DBName) + '_For_' + RTRIM(sa.artname) + '.sql"'
	  ELSE '"'
	  END
	+ ' >> "' + RTRIM(@OutputDir) + '\Tablediff_Log_' + RTRIM(@Sub_SvrName) + '_' + RTRIM(@Sub_DBName) + '_vs_' + RTRIM(@Pub_SvrName) + '_' + RTRIM(@Pub_DBName) + '_' + RTRIM(@LogDate) + '.txt"'
FROM @SysArticles sa
	INNER JOIN @Publications pub ON pub.pubid = sa.pubid
ORDER BY sa.artname
END
GO


Hey DBAs, how do you provide passwords to end users?

June 6th, 2011

So an end user has decided they need a new login created on your SQL Server to complete a project/assignment. Immediately a question explosion happens in your mind, right? What will the login name be? What will this login need access to? What will be done with it? Who will have access to the password? How complicated should the password be? You get the point.

Well, today I’m offering up a solution for one of those questions that may seem menial, but to me it’s a pet peeve: how to provide the password to an end user in a simple way without putting it in an email, typing it in an IM, or writing it on a sticky note (you wouldn’t do that, would you??? hmmmm). Yes, a phone call is a perfectly good way of controlling the situation, but I find people often want to copy and paste, and sometimes I’d rather not make myself instantly accessible to an end user; this is a nice buffer. Below is a process I’ve used for years to provide this information, and it’s treated me well.

Secure-Table Method
Once you have the login and password defined in your SQL Server and have granted the appropriate permissions:
1) Create a table (often I just do this in master, as every login should have public access to that DB) defined with two columns: one for the login name and one for the password
2) Insert the login name and the password into the table you created
3) Grant SELECT permissions ON the table you created TO the end user who needs to know (only you, the end user you granted permissions to, and any other System Admins should then have access to this table)
4) Once the user has confirmed they have the login name and password, simply drop the table
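
As a concrete sketch of the four steps above (all names here are placeholders and the password is illustrative only):

```sql
USE master;

-- Step 1: a two-column table to hold the credentials temporarily
CREATE TABLE dbo.TempCredentials
(
    LoginName     sysname       NOT NULL,
    LoginPassword nvarchar(128) NOT NULL
);

-- Step 2: insert the login name and password
INSERT dbo.TempCredentials (LoginName, LoginPassword)
VALUES (N'AppUser01', N'S0me$trongP@ssword');

-- Step 3: the login needs a user in this database before it can be granted SELECT
CREATE USER AppUser01 FOR LOGIN AppUser01;
GRANT SELECT ON dbo.TempCredentials TO AppUser01;

-- Step 4: once the user confirms receipt, clean up
DROP TABLE dbo.TempCredentials;
DROP USER AppUser01;
```

The end user simply runs SELECT * FROM master.dbo.TempCredentials from their own connection; only sysadmins and that one grantee can see the row.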

Additionally, you could create a database specifically for this purpose and add a layer of encryption, but that’s up to you.

I realize this may be a bit of overkill for some. But I like security, and I like sleeping at night. Putting a password into an insecure medium, where who knows who can access it for who knows how long, makes me question how long I should expect to keep my job.

Anyway, just a solution I thought I’d document. If it helps anyone, great. If not, only cost me about 30 minutes of my life 😉

SQL Agent Jobs owned by individual users – Quick Fix

June 2nd, 2011

Today I’m sharing a quickly whipped-up script to change the owner of SQL Agent jobs to SA (or another owner of your choosing), so that if an individual job owner leaves the company and their AD account is dropped, the jobs don’t simply begin failing. 

Feel free to use/modify/redistribute for any reason.

-- Generates one sp_update_job call per job; run the generated output to apply the change.
SELECT 'EXEC MSDB.dbo.sp_update_job ' + CHAR(13) +
'@job_name = ' + CHAR(39) + j.[name] + CHAR(39) + ',' + CHAR(13) + 
'@owner_login_name = ' + CHAR(39) + 'sa' + CHAR(39) + ', @description = ' + CHAR(39) + 
CASE WHEN j.[description] = 'No description available.'
	THEN 'Originally owned by ' + l.[name] + '.' + CHAR(39)	-- l.[name] reconstructed; the original column reference was lost
	ELSE j.[description] + CHAR(13) + '.  Originally owned by ' + l.[name] + '.' + CHAR(39)
END
FROM MSDB.dbo.sysjobs j
INNER JOIN Master.dbo.syslogins l
ON j.owner_sid = l.sid
ORDER BY j.[name]

April 2011 Wichita SQL Server User Group Meeting

April 8th, 2011

April 2011 WSSUG Meeting

Topic:    Accelerating your Database and Applications with Fusion-io Technology
Date:   Thursday, April 21st, 2011, 6-8:30PM
Speaker:   Brian Terry
Location:   Glorious Bible Church
Overview:   Fusion-io is a pioneering purveyor of NAND Flash based ioMemory – a non-volatile technology that physically acts like persistent memory, yet logically acts like disk. Our presentation will describe ioMemory as a new performance tier that enables data decentralization and consolidation through performance.

We will explore NAND Flash technology implementations and contrast these with Fusion-io’s approach to leveraging this technology to provide massive performance improvements. Additionally, we’ll discuss database and application use cases to illustrate how this technology can be quickly implemented in many common environments.

Sponsor:   Sogeti USA  

The target principal name is incorrect….

January 28th, 2011

Documenting another solution for a problem I ran into today.

Whilst attempting to connect to a SQL Server 2008 Analysis Services (SSAS) host, I was prompted with the following error:

After fumbling around trying to figure out why the “principal name was incorrect”, I tried the fully qualified domain name for the host and voilà, I was in. It also works via IP address. I’m guessing here that SSAS connections are even more reliant upon SPNs (Service Principal Names).

Failed to initialize the Common Language Runtime (CLR) v2.0.50727 with HRESULT 0x80070005

January 28th, 2011

If you’re experiencing this issue on a Windows 2003 Server, make sure you read UPDATE 3!

So today I’m posting about an issue with SQL Server replication.   This particular posting is with regards to a SQL Server 2008 Enterprise 64-bit platform and transactional replication.

The story goes like this:

I was asked to add some new tables in a published DB to the transactional publication on the publisher to get replicated to the subscriber servers.  Simple request, right?  Right.  Wait.  Wrong.  Yes, the request should have been simple to action.  But upon attempting to generate the publication snapshot for the subscribers to consume, the snapshot fails with the following error:

“Failed to initialize the Common Language Runtime (CLR) v2.0.50727  with HRESULT 0x80070005. You need to restart SQL server to use CLR integration features.”

Upon checking the status of the CLR (mind you we don’t have/use any CLR assemblies here and CLR is disabled on all of the servers here), it states “CLR initialization permanently failed“.  I’m checking that status with:

SELECT * FROM sys.dm_clr_properties

Ok, but who cares…we don’t need CLR, right?  Right.  Wait.  Wrong again.  Microsoft is using internal CLR assemblies for replication.

How’d I fix it?  Like the original error stated: restarted SQL Server.  I hate that that’s the solution, especially in a production environment.  But that’s the solution.  As soon as SQL starts up, and before any CLR integration is necessary, querying the state of CLR again shows “loaded CLR version with mscoree“. 

Ok, it’s loaded again and not “permanently failed”.  So we try the snapshot again, monitoring the CLR state through the snapshot generation.   Guess what…..near the end of the snapshot generation the CLR state changed from “loaded CLR version with mscoree” to “CLR is initialized” and listed the directory (C:\WINDOWS\Microsoft.NET\Framework64\v2.0.50727\) and version (v2.0.50727).  So it’s now allowing the internal CLR for replication to run again.

Could it be that our .NET 2.0 installation is corrupt?  Possibly.  The other rationale behind this behaviour, at least in many similar posts (not exactly like our situation) is that memory pressure causes the problem. 

Will post back with updates as they become available. 


WSSUG website

January 13th, 2011

Check it out!!!  The Wichita SQL Server User Group has an official website.

WSSUG Website:

Facebook Page for WSSUG

December 21st, 2010

The Wichita SQL Server User Group now has an official Facebook page for promoting the group and gaining some social media inroads. Primarily to drum up interest in the group and begin to provide a discussion area, I created the page and will try to update it more regularly than I do here.

I’ll be sending out email blasters thru PASS avenues and am looking for more ideas on how to attract attention for the group so we get off to a decent start and have enough board members to spread the load a little.

So……if you’re following here, or happen to find this blog, jump over to Facebook and check out the Wichita SQL Server User Group page. If you “Like” the page I can send updates to the group that way.

PASS DBA Virtual Chapter Live Meeting Event

December 6th, 2010

Sharing this for those following for WSSUG (Wichita SQL Server User Group) updates. This was shared with me by PASS’ Virtual Chapter and sounded interesting, so I’m passing it along (no pun intended).

DBA Virtual Chapter Live Meeting Event
Wednesday, December 8th


Dr. DMV: How to Use Dynamic Management Views to Monitor and Diagnose Performance Issues With High Volume OLTP Workloads


SQL Server 2005 introduced Dynamic Management Views (DMVs) that allow you to see exactly what is happening inside your SQL Server instances and databases in much more detail than ever before. SQL Server 2008 R2 adds even more capability in this area. You can discover your top wait types and most CPU-intensive stored procedures, find missing indexes, and identify unused indexes, to name just a few examples. This session (applicable to 2005, 2008, and 2008 R2) presents and explains over thirty DMV queries that you can quickly and easily use to detect and diagnose performance issues in your environment.
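
As a taste of the style of query the session covers (this one is not from the session itself), here’s a common top-waits check against sys.dm_os_wait_stats:

```sql
-- Top wait types since the last restart (a few benign system waits filtered out)
SELECT TOP (10)
    wait_type,
    wait_time_ms / 1000.0 AS wait_time_sec,
    waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP',
                        N'SQLTRACE_BUFFER_FLUSH', N'REQUEST_FOR_DEADLOCK_SEARCH')
ORDER BY wait_time_ms DESC;
```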

Meeting link:

More official than ever….Wichita SQL Server Users Group, Inc.

November 19th, 2010

The Wichita SQL Server Users Group, Inc. is now an official non-profit organization with the State of Kansas. Filing was made successfully earlier today.

A few more hurdles to clear and we’ll be able to have our first informational & planning meeting. Woohoo!

Again, I will be welcoming just about any support and volunteerism I can get to get things rolling. Once all of the requirements are in order, I will be leveraging LinkedIn, Facebook, and Twitter to generate interest for the group meetings and would appreciate all viewing this to help where they can to get the word out.

Well, that’s it for now….and in the immortal words of Bartles and James….”Thank You for Your Support”


Open Transactions Stuck in the Log of a Database in Simple Recovery Mode

November 4th, 2010

The situation:

I found I had a transaction log that had grown somewhat out of control on a SQL Server 2008 server and was pressing hard on what little disk space I had left on the system.  I needed to shrink that log file ASAP before I had a critical outage. 

I’m not going to focus on the long drawn out discussion that could be had with regards to how the log got to this condition.  I only plan to document the condition of said database and how I corrected the problem.  So…..YMMV

The facts:

  • The database was in SIMPLE RECOVERY Mode
  • The log was not automatically truncating (marking the inactive VLFs to be re-used) 
  • DBCC OPENTRAN showed NO Active Transactions
  • DBCC LOGINFO showed 187 VLFs with STATUS = 2 and 2 VLFs with STATUS  = 0
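
The checks in the last two bullets boil down to two commands, run in the context of the affected database:

```sql
USE Database_Name;  -- the affected database

-- Any open transactions holding up the log?
DBCC OPENTRAN;

-- VLF layout: STATUS = 2 means the VLF is still active and cannot be reused
DBCC LOGINFO;
```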

Essentially, I had active transactions stuck in the log and no open transactions to correlate to them.  Again, not going to cover how that could have occurred here.  I simply needed the VLFs containing those transactions to become inactive. I needed to be able to clear an unruly transaction log to get going again.

What I did:

Prior to SQL Server 2008, again in an emergency situation, I would have leveraged the BACKUP LOG WITH TRUNCATE_ONLY command, but Microsoft removed that from 2008, and BOL recommends you simply change your database recovery model from FULL to SIMPLE, then back to FULL, and take a full backup to restart your backup/log chain.  Great, but what if my database was already in SIMPLE mode?  Well, I found the same advice helps in this situation. 

The Big Warning
Although there is little chance of corruption or damage to the database, performing the steps below on a production system should only be done in an emergency situation. What I’m documenting here is a modified attack from what is found in BOL (Books Online) for a SQL 2008 system. For 2000/2005, you have the option of using the BACKUP LOG WITH TRUNCATE_ONLY to essentially do the same thing, but again, proceed with extreme caution.

Executed the following:

USE Database_Name
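
For a database already in SIMPLE mode, the sequence described above would look something like this (Database_Name and the logical log file name are placeholders — check sys.database_files for yours):

```sql
USE Database_Name;
GO
-- Flip the recovery model: SIMPLE -> FULL -> back to SIMPLE
ALTER DATABASE Database_Name SET RECOVERY FULL;
ALTER DATABASE Database_Name SET RECOVERY SIMPLE;
GO
-- Force a checkpoint so inactive portions of the log can be marked reusable
CHECKPOINT;
GO
-- Shrink the log file back down (target size in MB)
DBCC SHRINKFILE (Database_Name_log, 1024);
```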

This effectively cleared all the uncommitted/active transactions from the log, and I was then able to shrink the log back down to its original size.

Hope that helps someone out there.  But again, be sure you understand what you’re doing here.  And if you’re not sure, seek additional advice.


Future Wichita SQL Server Users Group

November 2nd, 2010

11/02/2010 Update:

As promised, I’m keeping you all up to date on the progress toward an official Wichita SQL Server Users Group. I have submitted an application to PASS for a Wichita chapter and am currently working with a regional mentor to get things started off on the right foot.  I have a verbal commitment from a sponsor, access to a meeting location for as long as needed, and will be working on policies and documentation to become an official non-profit with working bylaws.  

No official meeting dates yet as I have more groundwork to accomplish, but look for something no later than early first quarter 2011.  

So again, if you’re interested in attending, get the word out to your peers (DBAs, accidental DBAs, developers, and the like). Once we have the first meeting or two under our belt, we’ll need members to step up and fill board positions. Bylaws will dictate, but I imagine there will be two (2) alternating sets of two (2)-year term positions.  This is something to consider if you want to become a contributing member of the SQL Server community (and it won’t hurt to list on your resume either). 

Anyway, this is exciting news! Please keep checking back for updates (even if it’s not for my regular posts 😉 )!  If you have ideas or questions, don’t hesitate to contact me thru the contact page and introduce yourself.