Ryan Adams Blog

SQL, Active Directory, Scripting

The North Texas SQL Server User Group is trying out a new format for the August 19th, 2010 meeting.  The first part of the meeting will be a presentation from Janis Griffin of Confio Software.  She will be presenting “Tuna Helper – Proven Process for Tuning SQL.”  The second half of the meeting is the new part: we will break up into small discussion groups to talk about various topics, and each group will have a moderator to help facilitate the discussion.  I think this is a really great idea and I hope it works well.  My hat is off to the NTSSUG board of directors for this idea!

NTSSUG is a very large group, and we all know how powerful and beneficial networking with other professionals can be, but networking in a large group can also be a challenge.  The NTSSUG board was very forward thinking in recognizing this challenge and trying something new to help the situation.  I really think this idea will help those who are not so outgoing get to know others in the group in a smaller and less intimidating setting.  SQL Server has many different facets (PBM pun intended), and most professionals seem to specialize in or gravitate toward a select few of those facets.  I believe this format will help connect like-minded individuals to each other while still maintaining the broad scope of interests with the standard main presentation.  I am hopeful that this will create strong bonds by connecting members to each other and to the SQL community.  May the connections spread like wildfire and our plates overflow with bacon!

As a side note, I will be moderating the discussion on “How do I get a job as <job title> without experience?”  Here is the list of topics:

  • How do you keep current with a training budget of zero?
  • How do I get a job as <job title> without experience?
  • Using blogs, Twitter, and Facebook for professional networking and career improvement.
  • What benefits do you get from being part of a user group, and how could NTSSUG be a better resource for your professional development?

Also, a new position has opened up on the NTSSUG board of directors, and the candidates will each give a short speech during the meeting.  If you are interested in joining this awesome leadership team, then check out the NTSSUG website for more details, HERE.  As you can see, this is a meeting you will not want to miss.  Here is what some others have posted.

Jennifer McCown (MidnightDBA)

Tim Mitchell

A co-worker messaged me today to ask for my help giving one of our development domain controllers some additional disk space.  Due to space constraints and lack of funding, I had to get creative last year in redesigning our development environment to handle four more servers for a new application with no new hardware.  I had three physical servers, so here is how I set them up.  The first runs Hyper-V with four VMs that are our domain controllers.  The second and third run VMware vSphere and each host two virtual machines for the new application.  The server he needed additional space on was our domestic domain controller running in Hyper-V.


I highly suggest that you make a copy of your VHD file after you shut the guest machine down.  Backups are Gold!

Prepare the Guest

Your first step is to shut down the guest machine.  Next you need to expand the virtual disk so the additional space will be presented to the guest when you turn it back on.  It’s very simple, and here is how:

  1. Open Hyper-V Manager
  2. Select the desired Hyper-V server in the left pane
  3. Select the desired virtual machine in the center pane
  4. Select Edit Disk in the right pane
  5. The Edit Virtual Hard Disk Wizard opens. Select Next on the welcome page
  6. On the Locate Disk page, click Browse and select the virtual disk for your virtual machine
  7. On the Action page select Expand
  8. On the Configure Disk page change the disk size to the new desired size. Current and maximum sizes are provided in the window.
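As a side note, on later versions of Hyper-V (Windows Server 2012 and up) the wizard steps above can, as far as I know, be collapsed into a single PowerShell command; the path and target size here are only examples:

```powershell
# Expand the virtual disk in one step (requires the Hyper-V PowerShell
# module, and the guest must be shut down first)
Resize-VHD -Path "D:\VMs\DevDC1.vhd" -SizeBytes 60GB
```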

Prepare the VHD

We are going to use the tried and true diskpart.exe to expand the disk.  Remember that this is a system partition, and the general rule is that system partitions cannot be expanded without a third-party tool.  Let’s prove them wrong by using diskpart on the drive while the OS is not running.  There are three ways I can think of to do this.  The first is booting the machine to a floppy and running diskpart from there.  I know this can be done, but I personally have never been successful and have turned to third-party tools for the solution.  However, this is a virtual machine and not a physical one, so I have other means of running diskpart on the drive without the OS running.  That leads me to the second option: attaching the drive to another virtual machine as a secondary drive.  You could then run diskpart on that partition from the other virtual machine.  The third option is what I used and will describe here.  It should be noted that Windows 7 and Windows Server 2008 R2 can mount VHD files natively and will not require the tool I will be using.  Here’s how we do it:

  1. Download Virtual Server 2005 R2 SP1 from HERE
  2. Install Virtual Server using the custom option and select ONLY the VHD Mount
  3. Open a command prompt
  4. Type vhdmount /p /f <Fully Qualified Path to your VHD File>
  5. Type diskpart.exe
  6. Type List Volume
  7. Type Select Volume <2 or whatever volume number is your VHD>
  8. Type Detail Volume to verify the free space and that you selected the correct drive
  9. Type Extend
  10. Type Detail Volume to verify the free space is now zero and the size is what you expect
  11. Type Exit
  12. Type vhdmount /u <Fully Qualified Path to your VHD File>
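If you would rather not type the diskpart steps interactively, steps 5 through 11 can be placed in a text file and fed to diskpart in one shot (the volume number 2 below is only an assumption; check the List Volume output first):

```
rem extend_vhd.txt -- run with: diskpart /s extend_vhd.txt
list volume
select volume 2
detail volume
extend
detail volume
```

A scripted diskpart run exits on its own when the file ends, so no Exit command is needed.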

Start up your virtual machine and you should now see your boot partition extended!

The WP-DB-Backup plugin works great for automatically emailing you database backups; however, the email part does not seem to work for many people.  From what I have gathered, this stems from a file size setting in PHP.  This can be remedied in several ways, the most common of which is editing your php.ini file.  I’m not a PHP guy, so I won’t go into all those methods, but feel free to research on your own.  If you find a fix, please post it in the comments for others.

You don’t want to store your backups on your site because if the site data gets destroyed so do your backups.  You need to keep them somewhere else, but how can you automate this?  I have two alternatives.

SMTP Plugin

WP-DB-Backup works if you install an SMTP plugin to handle your blog’s email instead of the native PHP mailer.  This plugin is great because you can do backups on the fly as well as schedule them at your desired frequency.  However, once you cross the PHP default threshold (4 MB, I believe) you will magically stop receiving your database backup emails.  I love this plugin and wanted to make it work, so I contacted the author.  If you’re thinking of contacting him, then good luck, as I have never received a response.  Here is a workaround to get this plugin working for you.

Download, install, and configure WP-Mail-SMTP.  Once you have that set up, the WP-DB-Backup plugin will start working again.  Well, technically it never stops working; it’s just that the PHP mailer never sends the email.  Not only does WP-Mail-SMTP allow you to configure a custom SMTP server other than your ISP’s, it intercepts and handles anything coming to the PHP mailer method from WordPress.

Script a DB Download

Use DBC Backup to back up to your site.  This plugin allows you to define an export directory on your site where it will store the backups.  You can define the backup interval as well as how many backups to retain.  I suggest setting the plugin to remove all old backups so that only the newest one is available; you will be pulling and storing them offsite anyway.  I have written a VBScript, which you can schedule with Windows Task Scheduler, that will download all of the backup files in your export directory to your local machine or server.  I know it’s not PowerShell, and shame on me, but I never seem to find the time to learn it as well as I know VBScript.  Make sure to schedule the task to run AFTER the time you have set the plugin to run and back up to your site.  I am a firm believer in code documentation through comments, so I am not going to explain the code here.  You should be able to follow it easily with my comments.  If you have questions, please email me or put a comment on this post, as others may have the same question.

' NAME: DownloadFromWeb.vbs
' AUTHOR: Ryan J. Adams
' DATE  : 7/21/2010
' COMMENT: This script downloads a WordPress DB Backup file created by the
'			"DBC Backup" WP Plugin.
'			Remember that you are hard coding your username and password so
'			either change the script to take them as input parameters or encode the script
'			I personally prefer encoding the script and encrypting it is even better!

Dim strUser, strPass, strURL
Dim objFSO, objShell, objFile
Dim strCommand, strLocalDir, strRemoteDir

strUser = "MyUser" 'Replace MyUser with your username
strPass = "MyPass" 'Replace MyPass with your password
strURL = "MyURL" 'Replace MyURL with your FTP URL or IP Address
strLocalDir = "c:\temp\" 'Fully qualified path on your local system where you want the backup copied to (note the trailing backslash)
strRemoteDir = "httpdocs/wp-content/backup" 'directory on your FTP where the backup is located

Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objShell = CreateObject("Wscript.Shell")

'Here we delete the text file input for the FTP command if it exists
'The script re-creates the file each time it is run
If objFSO.FileExists(strLocalDir & "FTPCommand.txt") Then
	objFSO.DeleteFile strLocalDir & "FTPCommand.txt"
End If

'Here we create the FTP input command text file and populate it with our options
Set objFile = objFSO.CreateTextFile(strLocalDir & "FTPCommand.txt")
objFile.WriteLine "open " & strURL
objFile.WriteLine strUser
objFile.WriteLine strPass
objFile.WriteLine "cd " & strRemoteDir
objFile.WriteLine "mget Backup*.*"
objFile.WriteLine "quit"

'Change to the local directory and run FTP.exe with our command file as scripted input
strCommand = "%comspec% /c cd \" & Mid(strLocalDir,4) & " && FTP.exe -i -s:" & strLocalDir & "FTPCommand.txt"
objShell.Run strCommand, 7, True

'Here we delete our input file so our passwords are not hanging out there in plain text
objFSO.DeleteFile strLocalDir & "FTPCommand.txt"

'Cleanup objects from memory
Set objFSO = Nothing
Set objShell = Nothing

Lately the organizer team for SQLSaturday #35 in Dallas has had several requests from other SQLSaturday organizers about how to get sponsors and handle sponsorships.  For us, Jen McCown (Blog|Twitter) did an unbelievable job of finding sponsors and getting them involved in the event.  In fact, she did such a wonderful job that it quickly became overwhelming and more than one person could handle.  That’s where I came in to help, but before you finish reading this post you need to read her blog on how she found the sponsors, contacted them, and defined the sponsorship levels. READ HERE.

My part was to coordinate benefits with each of our sponsors to ensure they each got what their sponsorship level entitled them to.  Fortunately, I am a very organized person, which helped.  You have to be organized and able to communicate well, and you have to think ahead about all the small things you need to ask them in addition to their benefits: how many people are they sending, do any of them require a vegetarian lunch (we provided lunch for all sponsors and volunteers), do they require power, and how many items do they want to raffle.  You also need to make a list of things they simply need to be informed about, like how big the tables are, since they might be bringing a tablecloth.

Clear communication is paramount.  I sent each sponsor an email with a bulleted list of their benefits and the status of each item.  I also included a separate list of the questions we needed answered that did not pertain to the sponsor benefits, and finally an ending paragraph of the informational things we wanted them to be aware of.  Getting everything possible into a single, easy-to-read format avoids sending tons of emails about every little thing as you think of it.  This will save you time (which you will need) and help prevent you from annoying the sponsor.  Remember, you not only want them to sponsor this event, but also events in the future.  Make it easy for them and yourself by only including action items in follow-up emails, and be as accommodating as you can while being strict about what their sponsorship level allows.  Sometimes this means doing some of the legwork yourself, like picking up flyers for event bags, having sponsors send material to you for their tables and making sure it is ready for them that morning, and mailing back anything they have left over after the event.

The other thing you absolutely need is a good spreadsheet to track progress with each sponsor.  Make sure to include each benefit for their level, the answers to the questions not specific to their sponsorship, and the items they need to be informed about, so nothing falls through the cracks.  You have to use your best judgment on the frequency of follow-up emails concerning outstanding action items.  You don’t want to be that obsessive and pesky guy.  Do you really need to follow up every week when the event is three months away? No.  Three to four weeks out, on the other hand, you probably do if those items involve them mailing things to you.

The bottom line is that you need to be organized and communicate clearly.  Make sure you thank them in your communications, and certainly be sure to thank them in person at the event.  Again you want them to sponsor future SQLSaturdays, so make sure they know how much you value their sponsorship.  Remember that sponsorships are great targeted business for the sponsors and great for our community!

Every so often I get a call from a user saying he is trying to access some reports I wrote for him in SSRS 2005 and he gets the following error:

Cannot find or load System.EnterpriseServices.dll version=2.0.0

I’m not a .NET expert, so the first thing I do is Boogle it.  You will get a ton of hits on this error, but none of them talk about the solution or cause I found.  Most of them tell you to copy C:\Windows\Microsoft.NET\Framework\v2.0.<subversion>\System.EnterpriseServices.dll to C:\Windows\assembly.  All this does is register the missing DLL and put it back into the GAC (Global Assembly Cache).  The good news is that this method can save you from having to re-install the .NET Framework.  If you find that you do need to re-install, I would suggest first re-applying the latest .NET Framework service pack.

Although this may fix the issue, it did not explain why it kept happening on this system at no defined interval.  What I finally figured out is that it happens every time Windows Update has downloaded a .NET Framework service pack and is ready to install it.  I still cannot explain how or why this would unregister the DLL from the GAC, but once I install the updates, Bingo!
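As a hedged alternative to manually copying the file, if you have the Windows SDK installed you can re-register the assembly in the GAC with gacutil from a command prompt (the <subversion> folder name varies by installed Framework build):

```
rem Re-register the DLL in the Global Assembly Cache
gacutil /i "C:\Windows\Microsoft.NET\Framework\v2.0.<subversion>\System.EnterpriseServices.dll"
```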

How did I get here?  Most of my career has been spent in Windows administration, architecture, and design.  Several years ago I started to see the writing on the wall with Microsoft using SQL Server as the backend of all their new product lines.  I have a passion for technology and a penchant for learning, so I dove right into the SQL pool.  After a couple of SQL certifications and a Microsoft SQL Server launch event, I discovered the North Texas SQL Server User Group.  I still have not found a community geared toward Active Directory and Windows architecture, so when I found a group and community dedicated to my new passion, I knew right away this was my new home.  After all the friends and free monthly training, it was only natural that I wanted to jump in and help wherever I could when the group announced it was holding a Dallas SQL Saturday.

I volunteered and ended up organizing the volunteers and also coordinating with all of the sponsors.  Jen McCown (Blog|Twitter) of MidnightDBA started the sponsor process, but had too much on her plate with other SQL Saturday volunteer responsibilities, moving her family to a new home, and being a speaker.  I’m pretty sure she got all of two hours sleep the entire week of SQL Saturday Dallas.  Thanks Jen for everything you did, you are a true SQL Superstar!

I won’t even delve into all the pre-planning activities, but start with Thursday of that week.  We hold NTSSUG meetings at the Microsoft campus in Irving, TX on the third Thursday of every month.  Since I do not live or work nearby, I office out of one of my company’s other buildings on NTSSUG meeting days to avoid traffic.  That also means I start my day at about 5:30am.  The night before, I loaded my truck (both the bed and cab) with sponsor SWAG and material.  After my regular work day, I headed over to the Microsoft campus for our monthly NTSSUG meeting where Brad McGehee (Blog|Twitter) was speaking on PAL.  After the meeting we asked for volunteers to help us stuff the attendee bags and actually got it done in about an hour.  With my truck re-packed and even fuller than before, I managed to get home around 11pm.

I was up early again on Friday, but this time to be at the Dallas Stars annual warehouse sale.  My wife and I are huge fans, and this event means we can actually afford some Stars gear.  Fast forward to 4pm: I arrived at the venue to unload my truck and start preparing.  My first task was to make sure all the sponsor tables were set up, along with anything any of them had sent me for their setups.  We had 18 sponsors, including the facility; 11 of them had tables and 4 had lunch sessions.  After getting all of that set up and determining where the lunch sessions would be held, it was time to make sure I was prepared to organize all of the volunteers.  Early in the week I had realized there was no way I could handle both a volunteer orientation and the arriving sponsors on the morning of the event, so I had Andy Eggers, one of the NTSSUG volunteers, help me out.  That night I walked him through the room and equipment setup and how everything needed to run the following morning.  Thank you Andy, you saved my life!  After everything was set up we headed over to the speaker party for some awesome networking.

Saturday morning I arrived at the venue at 6:15am.  I am not a morning person and do not function well without sleep, and the late-night/early-morning pattern was starting to catch up with me.  Fortunately I have a little one at home and have had recent training in the art of little sleep, and I knew some coffee would fix me right up.  I am a very organized person and had everything printed up, checklists ready, and a pen in hand.  The event was great, even though my volunteer duties prevented me from sitting in on the entirety of any of my desired sessions.  We provided free breakfast, lunch at a low cost, free afternoon snacks, and even free ice cream!  Every session I attended was top notch.  We ended the day with sponsor giveaways and various other SWAG that included top-end SQL software, an Xbox 360, 1.5TB hard drives, iPod Shuffles, and an iPad.  We can’t thank our sponsors enough for their support!  After some cleanup we headed out to the after party, which was an incredible time of networking.

Did I mention we had a photographer?  Thanks go out to Emad Kamel, and the link for pictures is HERE.

This event was so great you should take the time to read what others are saying.  Here is a list of those that have blogged about SQL Saturday Dallas so far.

Bill Fellows (Blog|Twitter) http://bit.ly/b7q7HC

Wes Brown (Blog|Twitter) http://bit.ly/9orJXc

David Stein (Blog|Twitter) http://bit.ly/aFb3Yw

AJ Mendo (Blog|Twitter) http://bit.ly/bPptxb

Jonathan Gardner (Blog|Twitter) http://bit.ly/a9VQh0

Allen Kinsel (Blog|Twitter) http://bit.ly/aJ3pkB

Jennifer McCown (Blog|Twitter) http://bit.ly/9ihxAA

Stuart Ainsworth (Blog|Twitter) http://bit.ly/daoEzE

Here is a list of the sponsors that helped make this such a great event.

Want to speed up your AD replication?  You might want to if users call your helpdesk for a password reset, try to login right after the reset, and they still get denied.  Maybe you use some sort of identity management software or web site automation where users can update things like title or phone number, and they don’t see their changes for awhile.  That might be acceptable in some environments, but certainly not in all of them.

AD has something called urgent replication, where certain events like account lockouts replicate immediately, as opposed to waiting for the default replication interval.  But what if your network has very large pipes across the forest and you can afford to have everything replicate immediately?  Keep in mind the trade-off: enabling change notifications means constant updates, but each is small in size, while using the default replication interval means less frequent traffic, but larger amounts of data at once.  It is up to you to decide what is best for your environment.

So what are change notifications and how do you enable them?  Enabling change notifications tells AD that all attributes are to be treated as urgent for replication.  Instead of a domain controller queuing up all the changes it receives and then replicating them at its defined replication interval, it replicates all changes as they occur.  Here is how you can enable them:

  1. Open ADSI edit
  2. Connect to and expand the Configuration container
  3. Navigate to Sites-> Inter-Site Transports container
  4. Select CN=IP – Note you cannot enable change notification for SMTP links
  5. Right click and select properties for each site link object for the sites you wish to enable
  6. Select the “Options” property
  7. In the edit attribute box, if the value is <not set>, change it to 1.  If it already contains a value, compute the new value by performing a bitwise OR with 1 (new_value = old_value OR 1).  Current and old values are shown in the window.
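To make the bitwise-OR step concrete, here is a small sketch in Python (the starting value of 4 is just an assumed example of an existing Options value):

```python
# New Options value = old value OR 1; this sets the change-notification
# bit without disturbing any bits that are already set.
old_value = 4              # assumed example: an existing Options value
new_value = old_value | 1
print(new_value)           # prints 5
```

If the bit is already set, OR-ing with 1 leaves the value unchanged, so the operation is safe to repeat.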


The above will enable change notifications on all connection objects in those sites that are managed by the KCC.  If you have created any manual connection objects, which are not managed by the KCC, they will not inherit the change notification setting from the site settings.  You will have to set each of those individually.  Here is how:

  1. Open ADSI edit
  2. Connect to and expand the Configuration container
  3. Navigate to Sites-> My sitename with manual connections-> Servers-> My Server-> NTDS Settings
  4. Right click and select properties for each manual connection object in this folder.
  5. Select the “Options” property.  Note that if the value is 1, it is an intrasite connection object owned by the KCC.  If the value is 5, it is an intersite connection object owned by the KCC.  If it is one of these values and owned by the KCC, do NOT change it; it should be changed at the site level instead.  If you change the value on a connection object that is owned by the KCC, you force it out of KCC control and the KCC will no longer manage it.
  6. In the edit attribute box, change the value to 12.


How much of a difference can this really make?  Mileage will certainly vary with network and forest architecture.  I implemented this for a large worldwide Fortune 100 company with large network pipes.  Here are the total convergence times across each geographic region and also forest-wide.


Before – 24 minutes 25 seconds

After – 48 seconds

Asia Pacific

Before – 9 minutes 28 seconds

After – 51 seconds

North America

Before – 25 minutes 35 seconds

After – 58 seconds

Entire Forest

Before – 58 minutes 4 seconds

After – 2 minutes 57 seconds

I am very excited to be a part of the core planning committee for SQLSaturday #35 in Dallas on May 22nd, 2010.  The Dallas area is a technology hot spot and we have quite a local SQL community.  The North Texas SQL Server User Group averages 80 people every month, and if Sean McCown (Twitter|Blog) is teaching a class before the meeting we rarely have under 100.  SQLSaturday is no exception, and current registrations put those numbers to shame.  This is our first time holding a SQLSaturday in Dallas, and planning an event this size is quite an undertaking.  I’m only handling two things, and that is almost a full-time job itself.  The planning committee has done quite a bang-up job and this event will NOT disappoint.

There’s even a RUMOR that ice cream will be provided! I know what you’re thinking: “Where do I sign up?”  Well, I’m sorry to tell you that you missed the bandwagon, but I’m also excited to say that we are full and at capacity.  We have over 500 registrations with 50 people on the waitlist!  This is going to be one big event.

If you are already registered you can verify your registration and lunch paid status HERE.

A couple of years ago I had a group that used a SharePoint web front end that created AD users on the fly in the backend. The problem was that people would either not use the application again after the first login, or they would just sign up again if they forgot their credentials. Although all of that COULD have been handled by the developers on the front end, it was not, and my concern was AD. For the sake of a clean AD, reduced replication overhead, and SOX compliance, unused accounts needed to be removed. And of course, let’s not forget the security implications.

It was the application owners’ responsibility to maintain the accounts, but without any AD knowledge they needed a dumbed-down way to clean them up. At that time I was still learning VBScript and decided to kick it up a notch and write my script as an HTA. I swear my scripts are much cleaner these days. I’m not going to go over all the code, but please leave a comment if you have questions. After you copy the code and save it as an .HTA, you will need to change the LDAP paths according to your AD, the maximum age for enabled accounts, and the maximum age for disabled accounts. The reason for querying whether the account is disabled is so you can do something like disable an account that is more than 30 days stale and delete it if it is more than 60 days stale. You should also know that I use the LastLogonTimeStamp attribute, since it is replicated to all domain controllers; the LastLogon attribute, by contrast, is not replicated and will vary from DC to DC. For more on the workings of the LastLogonTimeStamp attribute and its replication frequency (14 days), see this TechNet article.
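Since everything hinges on LastLogonTimeStamp, here is a quick sketch (in Python rather than the post's VBScript, purely for illustration) of how that attribute's raw value, which is stored as 100-nanosecond intervals since January 1, 1601 UTC, converts into a readable date:

```python
from datetime import datetime, timedelta

def filetime_to_datetime(filetime):
    """Convert an AD timestamp (100-ns intervals since 1601-01-01 UTC),
    such as LastLogonTimeStamp, into a Python datetime."""
    return datetime(1601, 1, 1) + timedelta(microseconds=filetime // 10)

# 10,000,000 intervals of 100 ns = 1 second past the 1601 epoch
print(filetime_to_datetime(10_000_000))
```

Comparing that converted date against your staleness thresholds (30 or 60 days, in the example above) is the heart of any unused-account script.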

This HTA will not only show you your unused accounts, but will also let you save them to a spreadsheet, selectively disable, enable, or delete the accounts, and save your changes to a spreadsheet.  Just for reference, you can also find this script in the Microsoft Scripting Guys’ Script Repository HERE.  Don’t panic, I didn’t steal it; I submitted it to them to add to their community-submitted scripts.

Download File

I ran across something a couple of years back and just ran across this sweetness again.  I had a database populated by a VBScript that inventoried specific things on our servers, with a web front end.  I wanted to make it easy for users to see at a glance whether a server was still under warranty, without having to do the “datediff” in their heads.  My table had a column that contained the warranty expiration date, so I could have written this logic into the web page, but then it would have to be processed on every page load.  By doing this in SQL, the computation only has to be done once.  The problem is that I wanted a string returned based on a date calculation, not just the result of some calculation on a column or two.  A cool way to do this, I discovered, is to write a user-defined function, because you can use UDFs in your computed column definitions.  Here is the UDF I created:

create function [dbo].[udf_warrantystatusupdate] (@expdate datetime)
returns char(8)
as
begin
 declare @status char(8)
 set @status = case
  when @expdate > getdate() then 'Active'
  when @expdate < getdate() then 'Expired'
  else null
 end
 return @status
end

Here is my column definition in my table:

[WarrantyStatus]  AS ([dbo].[udf_warrantystatusupdate]([warrantyexpdate]))
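For context, here is a minimal sketch of a table carrying that computed column (the table name and the other columns are hypothetical; only the UDF and the warranty date column come from the post):

```sql
-- Hypothetical inventory table; WarrantyStatus is computed at query
-- time by calling the UDF against the warranty expiration date
CREATE TABLE dbo.ServerInventory (
    ServerName        varchar(128) NOT NULL,
    [warrantyexpdate] datetime     NULL,
    [WarrantyStatus]  AS ([dbo].[udf_warrantystatusupdate]([warrantyexpdate]))
);
```

One thing to keep in mind: because the UDF calls getdate(), it is non-deterministic, so this computed column cannot be marked PERSISTED or indexed; it is evaluated each time the row is read.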