Ryan Adams Blog

SQL, Active Directory, Scripting

T-SQL Tuesday

Be the Interpreter

Steve Jones (Blog|Twitter) is hosting this month’s T-SQL Tuesday, and his topic is getting your job done while dealing with business requirements and mandates.  Over my career I have seen many instances of management mandating that something be implemented because they saw an ad on the internet or in a magazine.  The problem is that they don’t understand the technology or the availability of other options.

Our job as DBAs is to be an interpreter: to understand the end result the business requires and the options for getting that result.  The first thing you have to do is recognize a situation where interpretation is needed.  If the request from management has few technical requirements and lots of industry buzzwords, that’s your first clue.  Once you have identified the situation, you need to reverse engineer it and start asking questions that will allow you to discover management’s desired end result.

Recently I received a request from management to consolidate our reporting platforms.  Once we did our discovery work, we realized that a couple of groups were coding their own reports in C#.  The data they were reporting against was in SQL Server, so it was easy to say that SQL Server Reporting Services should be the chosen platform.  Our reasoning was that it is free with the product, would provide the same results, and could seriously reduce report development time.  Although this practice needed to be changed, I suspected this was not what management was really after.  I was right.  Management was not looking for a product standard, but for a single report consolidated from the several they were already receiving.  Management did not clearly articulate their desired end result and used a bunch of buzzwords that indicated something entirely different.  As a result, my team’s interpretation of their request was incorrect, and I guarantee we will ask more questions next time.  Fortunately our detour was not a waste, since product standardization was something we needed, and getting the developers onto SSRS saved tons of labor hours.

Dodging Bullets

Bust out the Matrix

Management is firing out mandates, so you start dodging bullets like Neo.  Not that kind of Matrix!  There is nothing management likes more than a matrix…well, except a matrix that shows a cost reduction.  One way to make sure you are interpreting requirements correctly is to document your interpretation and submit it back to management to confirm that your solution meets their expectations.  Depending on the request you might have to write it up in another form, but over the years I have found that management really likes things presented in matrix form.

Stand-up (Slowly)

I work in a large corporation where politics abound.  It is very common to see something implemented poorly or incorrectly because an executive asked for it.  Remember that internet ad your VP saw?  Well, it would not be prudent to simply implement it without asking what the benefits are and how it helps the business.  This is where business requirements, technology, and your career come together on a fine line.  Will the real DBA please stand up?  I believe you should stand up, but you should do it slowly.  You need to voice your opinion on the right method to use as a solution, but be sure to back it up with facts and explain it well.  You also need to be sure to have it documented in email for the classic CYA.  That being said, you need to be very careful in how you present these technical challenges.  I find that asking questions is a good technique.  You don’t want to overdo it to the point you come across as uninformed and incompetent, but simply spewing out facts and statements can come across as confrontational.

Be a real DBA and stand up, but be smart about it.

I recently had lunch with my boss to pitch him (yet again) on PASS.  He does not make the actual decision, but if he feels it is worthy, he sends it up the management chain.  He sees the value and always sends it up, but it never makes it through.  During our conversation he mentioned something that sent up a red flag for me.  I found it very interesting, and those trying to get their employers to send them to PASS might want to know this inside tip about “large corporation mentality”.  Here is what he said:

“They (upper management) almost always turn down travel for conferences, because they are not considered training.”

Well isn’t that Special?

To be honest, he has a point.  I’ve seen plenty of people at conferences who are just there to get out of work.  I have even seen people not show up at the conference at all.  Unfortunately this seems to have given employers a bad view of conferences in general.  I have never been able to attend a PASS Summit, but I know a lot of people who have attended.  That being said, I think you would be hard pressed not to classify it as training.  The pre- and post-conference sessions are obviously training, and when you combine everything you get 2 days of dedicated training followed by 3 days of specialized break-out training.

If you find yourself in this boat, then my first suggestion is to read Jeremiah Peschka’s (Blog|Twitter) post on Getting to PASS on the Cheap.  My second suggestion is to pitch the PASS Summit to your employer as a training class of 2 dedicated class days and 3 days of break-out sessions based on particular SQL Server features.  Here is part of the email I sent to my boss that more accurately explains what the PASS Summit provides, translated into acceptable corporate terms.

There is a training opportunity called the PASS Summit in Seattle, WA the week of October 10th, 2011.  I am a member of PASS (Professional Association for SQL Server) as well as on the board of directors for the local NTSSUG (North Texas SQL Server User Group) chapter of PASS.  Every year PASS holds a national training conference where the best and brightest SQL Server minds come together to teach highly technical sessions.  This is not a conference but actual training sessions.  The first two days are full training sessions with a dedicated instructor, and the remaining days are break-out sessions.  The advantage of the break-out sessions is that there are several concentrations of expertise within SQL Server, and they allow you to choose the instructor who is an expert in your area of concentration.  These instructors include Microsoft MVPs (Most Valuable Professionals) as well as Microsoft employees right from their headquarters in Redmond.  These instructors are the people who wrote SQL Server itself, members of the Microsoft product team, members of the Microsoft technical support team, and the people who write all the technical books.  There is nowhere else you can get training from these instructors.

I’m crossing my fingers that the terminology gap has been bridged and that I will have more success this year in getting my company to realize the true value of the PASS Summit.  It might be worth PASS marketing the Summit more as training than as a conference to bridge this gap, but I’m not sure how many other companies view conferences the same way mine does.

Setting the Stage

I started my career as a desktop technician working for a small personal PC repair shop in high school.  The company started picking up contracts to do PC installs for WorldCom.  I began taking on all those contracts, and the manager of their IT department was so impressed he offered me a job.  As many people know, WorldCom went bankrupt and was bought by MCI.  During that transition I was moved from the WorldCom sites I was supporting to an MCI site.  Altogether I contracted for WorldCom/MCI for 2.5 years, at which point MCI converted me to an employee.  At that point desktop work was no longer challenging and I moved into a server administration role.  I did server administration for several years, supporting everything on the NT server platform including domain administration.  This too stopped being a challenge, but right at that time Microsoft released Windows 2000 and Active Directory.  This was also the time that large companies started transitioning from the old geographic IT business support model to a role-based business support model to better match the evolving technology landscape.  MCI was no exception, and a virtual team made up of the best domain people from all over the company was put together.  The charter was to create a new Active Directory domain that would support the entire company.  I needed a new challenge and was very happy to be chosen for this new team.

The Light Comes On

I supported Active Directory for many years, during which I became a bit of an AD expert.  Our team architected and supported the entire AD forest, both nationally and internationally, for over a quarter million employees!  One weekend we had a Microsoft patching fire drill and there were some SQL 2000 servers on our list.  After installing the patch, we had to verify the version, and someone on the team showed me the query and how to execute it.  It intrigued me to find out more about this product whose version could not be derived from a Help-About menu.  I started learning SQL 2000, was hooked, and got my MCDBA.

It didn’t take long to see the writing on the wall from Microsoft that many of their products, like SharePoint and MOM/SCOM (I’m a MOM fan as well), had SQL on the backend, and that was the route everything was going.  At the time, there were even rumors of moving AD from the Jet engine to SQL.  For me it was a case of being proactive and “see a need, fill a need”.  I knew we would end up needing a DBA and I wanted to fill that spot.

The Show

I continued my primary role in AD for many years and simultaneously began supporting everything SQL related that came our way.  I knew this dual role would serve me well in the long run, and I continued to shove my head in every SQL book I could get my hands on.  I also earned my MCTS and MCITP.  Part of my AD support role was also supporting Microsoft Identity Integration Server (MIIS/ILM/FIM).  The company decided to make organizational changes by creating a new group that would support just MIIS and an in-house application.  I was put on this team because of my DBA and scripting skills, and my primary function finally became DBA.

Since then I have discovered the SQL Community which took my new passion and breathed a whole other life into it.  A lot has happened since I joined the community and this post has been a long time coming, so here is a list of how I became involved in chronological order.  My path was a natural progression so I hope it can serve as a guideline to those just starting to get involved.

I became an active member of my local user group, the North Texas SQL Server User Group.

Twitter – At the time Peter DeBetta (Blog|Twitter) was our user group President and encouraged everyone to check it out and see what the online SQL community was all about.  I admit that I was skeptical and really did not have the time, but I checked it out anyway.  The help you can get and the minds you have access to sold me instantly and I’ve never looked back.

Blogging – Once you join the SQL community on Twitter the next thing you see are people tweeting about their newest blog posts.  The information you can glean from blog posts is limitless.  I knew my time was finite, but I wanted to give back to the community all the information they have given me.  It was ultimately Brent Ozar (Blog|Twitter) with his post on “How to Start a Blog” that had me sold.  I also suggest using Google Reader to subscribe to your favorite bloggers and keep track of their latest work.

SQLSAT #35 – The NTSSUG began talking about holding its first SQLSaturday, and I was so excited about SQL technology and the SQL community that I jumped right in, feet first.  I took care of organizing all of the volunteers as well as coordinating everything with our sponsors and their benefits for sponsoring the event.  You can read more about my involvement in this event HERE, HERE…and HERE.

Board of Directors – At this point I’ve been riding on the SQLSaturday high (what other kind is there?) for months and I wanted more.  The NTSSUG group had a position open for election on their board of directors, so I went for it and won the election!  I absolutely love serving on the BoD.  There is nothing more rewarding than serving the SQL community and serving your local user group is a great way to do it.

Speaking – Twitter and blogging are great ways to share your knowledge and help others, but speaking is a whole other avenue of interaction.  My first time to speak was at our local user group and on sqllunch.com.  I’ve spoken before and taught a few classes, but it was always internally to the company.  You can find the materials here for my “Mirroring: The Bear Necessities” presentation and you can view the recorded webcast on sqllunch.com HERE.

SQLSat #56 – The NTSSUG decided to have a second SQLSaturday, focused on Business Intelligence, a mere 5 months after the previous SQLSaturday.  Again I was a core organizer and we had a great event.  You can read my recap HERE.

SQLSat #57 – The Houston, Texas user group is hosting a SQLSaturday in January 2011.  Several of these folks attended the SQLSaturday in Dallas, and I was very fortunate to meet this great group.  This is their first SQLSaturday, so I volunteered to help them out where I could.  I plan to be at the event early to help where I can, but so far I have been helping them with their sponsorship process.  In addition, I have submitted to speak at this event, and my fingers are still crossed.

SQLSat #63 – The NTSSUG has already set the date for our next SQLSaturday, and it will be our third in less than a year!  I am a core organizer and plan to again handle the sponsor side of things.  It was a bit much for me to take on both the sponsors and the volunteers last year, because those responsibilities overlap in the morning, so I more than likely will not handle the volunteer side of things this time around.  I have submitted to speak at this event as well.

SQLSat #64 – This event is being hosted by the Baton Rouge user group headed up by Patrick LeBlanc (Blog|Twitter), who also runs sqllunch.com.  I have also submitted to speak here.

The Curtain Call

Well, there is no curtain call for this play, because it will continue on as I learn, hone my skills, and increase my involvement in the SQL community beyond what I have already done.  This blog will continue to serve as the documenter of this journey.  It’s been a long road starting from the bottom, working my way up to a senior guy at a Fortune 100 company, and transitioning my career to become a DBA.  Although this road has been full of layoffs, outsourcing, and stagnant pay, I’m proud of what I have accomplished and where I am.  I’m an over-achiever by nature, so my sights are already set on the goals to come.

SQL mirroring has a TIMEOUT option, but when and why would you want to use it?  First we need to understand what this option does, and that is very simple.  It defines the amount of time that can pass without receiving a ping response from a partner before that partner is considered disconnected.  If you are familiar with clustering, think heartbeat (but over a public connection).  Once the partner is considered disconnected, either a manual failover can be performed or an automatic failover will occur, depending on your configuration.

The other thing we need to know about the TIMEOUT option is that the default is 10 seconds if you do not change it.  Books Online tells us that if you set it to anything between 0 and 4 seconds, the timeout will automatically be set to 5 seconds.  If you set it to 5 seconds or greater, it will be set to that value.

Network Latency

So why would you want to change this option?  The first scenario is a slow network connection, or one where the partners are geographically dispersed.  If your network latency between partners is greater than the defined mirroring session timeout, then a disconnected state might be detected.  This could cause a failover event, even though the partner is actually online, all because the pings did not return to the partner in time.  If this scenario is true for you, then you have to determine how long it takes a packet to get from partner to partner.  You can use “tracert”, but I suggest doing a simple “ping”, which will return a minimum, maximum, and average response time.  To be absolutely certain you pick an appropriate timeout, you need to perform the ping from Principal to Mirror, Mirror to Principal, Principal to Witness, Witness to Principal, Mirror to Witness, and Witness to Mirror.  Every network is different, and you can never guarantee that a packet from Server1 to Server2 will follow the same route as a packet from Server2 to Server1.  This is why I suggest performing your ping tests in both directions.  Write down the maximum response time from each test, then select the largest maximum as your timeout, along with a couple extra seconds of padding to account for times of high network traffic.  Remember that ping gives response times in milliseconds, so don’t forget to convert to seconds.

Failover Clustering

The other scenario where you will want to change this option is when one of the partners is a failover cluster.  In this case you not only need to factor in network latency, but also the amount of time it takes for a node to fail over.  You don’t want a mirroring failover to occur during a cluster failover, so you want to make sure you set a timeout that accounts for network latency plus the time it takes for a node to fail over.  So how do you determine that?  We already discussed how to determine the network latency piece, so now we just need to know how long a node takes to fail over.  Start by failing over the SQL group, look in the cluster logs for the time the failover started and the time it came back online, and subtract the two to get the duration.  You might be thinking about doing a ping test while you fail over the group, but this will not be accurate.  A ping response will be returned as soon as the cluster IP resource comes online, but prior to the SQL resources coming online.  This is because the SQL resources depend on the disk and network name resources, and the network name resource depends on the IP resource.  That means the IP resource comes online first and SQL comes online last.

Now that we have our timing ironed out and have settled on an appropriate TIMEOUT setting, we need to actually set it.  This MUST be run on the PRINCIPAL.

ALTER DATABASE MyDB
SET PARTNER TIMEOUT 15;  -- time in seconds (15 is just an example value)

The last thing you might want to know is how to find out what the current TIMEOUT setting is.  Here is how to get it and I highly suggest you explore the other information in this system view.

SELECT mirroring_connection_timeout
FROM sys.database_mirroring
WHERE database_id = DB_ID('MyDB');

SQLSaturday 56 BI Edition was a great success according to feedback, and we couldn’t be more pleased.  This event was put on by the North Texas SQL Server User Group and supported by MSBIC, Microsoft, and Artis Consulting (a Gold Certified Microsoft Partner).  I may proudly serve on the board of directors for NTSSUG, but even if I didn’t, I would still jump at being an organizer for an NTSSUG event.  This is our second SQLSaturday in 6 months!  If you think that is impressive….keep reading.

If you have helped organize a SQLSaturday event then you no doubt know how much time and work goes into these events.  If you have not then I highly encourage you to volunteer and get involved.  If you’re an attendee my advice is to thank every organizer you see at the event.  My second piece of advice is to fill out session evaluations and thank the speakers.  These folks are giving you free training and most pay for their travels out of their own pockets, so a simple “Thank you” goes a long way.  What does all this have to do with SQLSaturday 56?

This event was a little different from the typical SQLSaturday, as it was BI focused.  It was also different in how we managed it compared to our other SQLSaturday.  This is why I talked about the time and difficulty of putting on these events in the paragraph above.  This was our second event in 6 months and we didn’t want our organizers and volunteers to get burned out, so we scaled the event way back.  First, we used the local Microsoft facility, which meant that the rooms were already equipped for A/V and MS would handle the registration.  Second, we really did not have any sponsorships except Microsoft for the facility, and they, in conjunction with Artis Consulting, provided breakfast and lunch.  It might not sound like a lot, but it’s huge!  That meant there would be no coordinating sponsors and their benefits (which is a full-time job itself), and it meant that breakfast and lunch took no management from us.  This SQLSaturday was designed to be a no-frills, “Let’s get our learning on” event.

The only negatives or complaints I’ve heard are that there was no coffee in the afternoon and that entering the Microsoft buildings was a pain.  I agree that the building access was inconvenient.  Who wants to show up at an event to find the doors locked and have to wait for security to open them?  So yes, that was a pain point.

On the plus side, we had some awesome speakers, amazing sessions, and an incredible time of networking.  We even dropped the last session in the largest room and dedicated it as a networking room.  I met a lot of great people.  Drew Minkin (Twitter) is a local speaker for us, and I had a great time talking with him at the after party.  If you get a chance to hear him speak, you should do it, because he’ll rotate your mind like it’s his personal cube!  I also got to meet Thomas LeBlanc (Blog|Twitter), who is down to earth and a joy to talk to with his engaging personality.  They don’t call him the smiling DBA for nothing!  The other person who comes to mind is John Sterrett (Blog|Twitter).  This guy flew in from Pittsburgh!  John was great to talk to because, like me, he works with a lot of administration things other than SQL, which gave us a great commonality.  He and his wife are thinking of moving to Texas and I hope they do, but he might get a hard time being an OU graduate in UT country.

Let me wrap this up before I get carried away, and yes I know it’s too late for that.  If you have any comments on the event either good or bad please leave them here.  We want to know you had a good time, what you liked, and what you didn’t so we can improve next time.  Now I promised you there was something impressive coming and you hung in with me this far so here it is.  If you think 2 SQLSaturdays in a year is impressive….let’s add a third within a 12 month span.  That’s right, NTSSUG officially brings you

SQLSaturday #63

Below are the slide deck and code samples from my “Mirroring: The Bear Necessities” presentation.  A special thanks to Patrick LeBlanc [Blog|Twitter] and SQLLunch.com for giving me the opportunity to speak and for being such gracious hosts!

The first link is the slide deck, the second is the sample code to run on your Principal server, and the third is the sample code to run on your mirror server.  After you download the sample code you can simply rename the extension from TXT to SQL.  Please feel free to comment or send me an email to ryan at ryanjadams dot com if you have any questions.

If you have seen my presentation please take the time to give me some feedback on

SpeakerRate.com

 

Mirroring Presentation

Mirror Setup – Principal

Mirror Setup – Mirror

I will be doing my “Mirroring: The Bear Necessities” presentation twice this week.  Your two opportunities to catch this presentation are below:

SQLLunch  10/19/2010  11:30AM CT
https://www.livemeeting.com/cc/usergroups/join?id=6J9BKQ&role=attend

North Texas SQL Server User Group  10/21/2010  6:00PM CT
Microsoft Campus Building LC-1
7000 North Highway 161
Irving, Texas 75039

Abstract:
Mirroring can be an integral part of your high availability and disaster recovery planning. We’ll cover what mirroring is, how it can fit into an HA/DR plan, the rules surrounding its use, configuration via the GUI and T-SQL, as well as how to monitor mirroring. This presentation is designed to not only give you an overview of mirroring, but to also walk you through a basic implementation. At the end you will have learned what mirroring is, how it can fit into your environment, what business requirements it solves, and how to configure it.

A while back I was working a DR exercise with a server that was handed down to me.  There were 15 databases mirrored from production to the DR server.  Halfway through failing over the databases, I got a SQL out-of-memory error.  The first thing I checked was how much RAM the server had.  It had 8 GB, had been recently rebooted, and the databases were all relatively small.  While checking the server memory, I also noticed this was a 32-bit server and PAE was enabled.  Now can you guess what the problem was?  That’s right; AWE was not enabled in SQL Server.  With PAE enabled, the OS could address all 8 GB of physical memory, but without AWE enabled, SQL Server could not take advantage of the memory beyond its normal virtual address space.  I turned on AWE, restarted the SQL service for the change to take effect, and it was smooth sailing from there.

Most everything new these days is 64-bit and you don’t have this problem, but not all of us have the luxury of new equipment.  If you have a 32-bit system with more than 4 GB of RAM, you need to look into enabling PAE for the OS and AWE for SQL.  You should also look into the /3GB switch, but make sure you research it, because it is NOT a one-size-fits-all solution.  I’ll try to cover the /3GB switch in a future post.
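For anyone hitting the same wall, here is a sketch of what turning on AWE looks like.  This assumes SQL Server 2005/2008, where 'awe enabled' is an advanced sp_configure option; also note the service account needs the Lock Pages in Memory privilege for AWE to work.

```sql
-- Enable AWE so 32-bit SQL Server can use physical memory above 4 GB.
-- 'awe enabled' is an advanced option, so expose advanced options first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'awe enabled', 1;
RECONFIGURE;

-- The change takes effect after the SQL Server service is restarted.
```

As in my story above, nothing happens until the service restart, so plan a maintenance window.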

The Rant

I had someone vehemently insist on a conference call that you could not mirror more than one user database.  Although that alone is worthy of a blog post to clear up the misconception (I’ll explain the confusion in a minute), what I really want to talk about is making sure you know the facts before you speak about something you don’t know much about.  The team I work on is not a SQL DBA team, but more of a development team that supports Active Directory architecture, Identity Management (FIM), and a custom application to fill in the management blanks that Microsoft left in Active Directory.  I am the only DBA on the team and I have no problem with people asking me SQL questions; in fact, I encourage it.  However, would you insist on being correct about something for which someone else on your team is responsible?  Now I’m not saying I am the ultimate “SQLinator”, but I’m certainly no Doofenschmirtz.

There is nothing wrong with being wrong; that’s how we learn.  On the flip side, if you’re not 100% sure, then either ask, research it, or respond in a meeting by posing it as a question rather than a statement.  It’s better to say, “I thought you couldn’t mirror two databases on the same server, is that not the case?” than it is to say, “You cannot mirror two databases, SQL does not allow it.”  Remember to be professional if someone contradicts you, and simply suggest that some research is necessary to verify the correct method.  We are all learning all the time, so if you’re not sure, do not insist.  Err on the side of caution and make sure you have the facts before you look like a Doofenschmirtz.

The Technical

Okay, I know some of you just skipped to this paragraph after reading the second sentence, but you should really back up for an important lesson in professionalism.  I digress; on to the good stuff.  This individual had stated that you could not mirror more than one user database on an instance.  The truth is that you can mirror more than one user database on an instance.  What you cannot do is have two mirroring sessions for a single database.  This means you can mirror every database on an instance to any other instance you want, but you cannot mirror a single database to two different instances.  For example, you cannot mirror DB1 on ServerA to both ServerB and ServerC.  If you need to get a database to two separate servers or instances, you can set up a mirroring session to one instance and log shipping to the other.
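To illustrate the distinction, here is a hedged sketch.  The server names and endpoint port are made up, and it assumes the mirroring endpoints already exist and the mirror copies were restored WITH NORECOVERY.

```sql
-- Allowed: two different databases on ServerA, each with its own
-- mirroring session to ServerB.  Run on the principal after the
-- corresponding ALTER DATABASE ... SET PARTNER on the mirror.
ALTER DATABASE DB1 SET PARTNER = 'TCP://ServerB.mydomain.com:5022';
ALTER DATABASE DB2 SET PARTNER = 'TCP://ServerB.mydomain.com:5022';

-- NOT allowed: a second mirroring session for DB1.  This fails
-- because DB1 already has a mirroring partner.
-- ALTER DATABASE DB1 SET PARTNER = 'TCP://ServerC.mydomain.com:5022';
```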

There are many ways to set up a DR site, like log shipping, peer-to-peer replication, mirroring, or even backups over the WAN.  I won’t get into the methods here, but I want to discuss a common issue that many DBAs simply forget.  Once you have your DR setup running, have you tested it?  Well, you’d better have!  It’s just as important as testing restores of your backups.  Did you remember to re-create your logins on the DR server?  Hopefully you did, or I bet you found out when you tested your DR plan, didn’t you?  Let’s talk about not only how to create those logins on the DR server, but how to automate it so new logins don’t get left behind.

Windows Logins vs. SQL Logins

Windows logins are easy because you don’t have to worry about SIDs and passwords; Active Directory handles that for you.  If you want to automate copying Windows logins, it’s very easy to create an agent job that executes a stored procedure or a straight SQL query, grabs the login name from sys.syslogins, and writes a create statement.  Since AD handles the SID and password, you don’t have to worry about the login being orphaned from the database user.
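As a sketch of what such a job could run (the filters here are illustrative, and this only covers the basic create, not server roles or explicit permissions):

```sql
-- Generate CREATE LOGIN statements for every Windows login so the
-- output can be applied on the DR server.
SELECT 'CREATE LOGIN [' + name + '] FROM WINDOWS;'
FROM sys.syslogins
WHERE isntname = 1             -- Windows users and groups only
  AND name NOT LIKE 'NT %'     -- skip built-in service accounts
ORDER BY name;
```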

SQL logins are not so easy, because SQL assigns each login a SID upon creation, and you also have to make sure the passwords are the same.  These values are stored in sys.syslogins, but as hashed values in a varbinary data type.  In order to automate this, you have to grab those values and do some data massaging to get them into a text form you can provide to your create statement.  If you don’t match the SID, the database user gets orphaned, and if you don’t match the password, your application can’t log in.
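As an illustration of the massaging involved, here is a sketch only.  It assumes SQL Server 2008, where CONVERT with style 1 renders a varbinary as a 0x… hex string, and it skips password-policy and default-database options.

```sql
-- Script out SQL logins, preserving each login's SID and password hash
-- so the copies on the DR server are not orphaned.
SELECT 'CREATE LOGIN [' + name + '] WITH PASSWORD = '
     + CONVERT(varchar(max), password_hash, 1) + ' HASHED, SID = '
     + CONVERT(varchar(max), sid, 1) + ';'
FROM sys.sql_logins
WHERE name <> 'sa';
```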

Some Options

Microsoft has an article HERE that creates two stored procedures to help with this task.  The first SP does a hexadecimal conversion of the SID and password hash values, and the second one scripts out the create statements.  The problem I have is that I want this automated, so in a DR situation it is one less thing I have to be concerned about.

The other option is the “Transfer Logins” task in SSIS.  This works well, but not everyone runs SSIS in every environment.  Also, if the password on an account changes, the task does not update it.  In that case you would have to delete the login on your DR site and re-run the task.

How to Automate Logins to Your DR Site

I was 95% done writing my own stored procedure to handle both Windows and SQL logins when I ran into a snag.  I turned to Twitter and the #sqlhelp hash tag for advice.  I received a reply from Robert Davis @SQLSoldier (Blog|Twitter) that he had some code that would work, supplied with his Pro SQL Server 2008 Mirroring book.  I told him that I had already tried that code, but it only handled Windows logins.  Well, it turns out that the publisher has the old code for the procedure, so he blogged and provided the new one HERE.  It does not handle changing the passwords for accounts that already exist, but it would be very easy to make that change yourself.  This is one brilliant piece of code, particularly how he uses XML to handle the SID and password hash.  His procedure also handles explicit permissions added to the login as well as any server roles it has been assigned.  If you are looking to automate your logins, I highly suggest using his code.  Thanks, Robert!