Ryan Adams Blog

SQL, Active Directory, Scripting

A couple months ago, one of our local Microsoft Evangelists asked me if I would do a TechNet Radio interview about the North Texas SQL Server User Group. I’m on the board of directors for the group as the Director of Logistics and Communication, and he wanted the point of view from a user group leader.  I was more than happy to do it for two reasons.

First, I love the SQL community and our strong and vibrant user group.  We are about to host our third SQLSaturday in 12 months this weekend, we have an awesome pre-con the Friday prior, and we have petitioned to bring SQLRally 2012 here to the Dallas area.  It is a very exciting and busy time for us!

My second reason is John Weston (Blog|Twitter).  John is one of Microsoft’s local evangelists for the DFW area and surrounding states.  What does an evangelist do?  Think Oral Roberts, but they are champions for Microsoft products and don’t swindle money from you.  Their role is to get the word out about Microsoft products, whether it is an update to an existing product or a brand new one.  Part of that mission is to support local community groups in getting the resources they need.

I’ve worked with other evangelists and people at Microsoft on various user group needs, and none of them have been more willing and helpful than John.  We had a facilities-related issue that had gone on for months; our requests were either ignored or met with an unwillingness to provide what we needed.  One conversation with John and it was fixed before the next meeting.

Thanks John!

Here is the interview for your viewing pleasure, and make sure to click the links above for all the exciting things the North Texas SQL Server User Group is doing.

TechNet Radio Interview

So you created a DSN on a Windows 64bit machine and your application can’t connect to it, huh?  Even more confusing is that the DSN tests fine and connects to your SQL server without issue.  I ran across this issue back when Windows 2008 came out.  I had completely forgotten about it until recently when someone facing the same issue asked me if I knew what the problem could be.

Now in my case, I was moving everything from a Windows 2003 32bit machine to a Windows 2008 64bit machine.  The application consuming the DSN was a 32bit classic ASP site.  The first thing I did was copy the site over to the new server and configure it in IIS.  Let me also interject here and say that this took some time, because MS changed the look of IIS from version 6 to 7 and in my opinion the new layout is less than intuitive.

My next step was to set up the DSN for the application to access my SQL database hosted on another physical server.  No problem, I thought, I will just go under Start and Administrative Tools and use the “Data Sources (ODBC)” utility to set it up.  I set up the DSN in the utility and it tested fine, but my application still could not connect.  Did you remember that I mentioned the application was 32bit?  A 32bit application uses the 32bit architecture and its associated DLL libraries, and a 64bit application uses its own.  The problem is that the “Data Sources (ODBC)” utility under Start and Administrative Tools is the 64bit version, and Microsoft gives no indication in the GUI that a 32bit version exists.  The GUI strikes again!  That’s okay.  We’re DBAs and we know not to trust the GUI.

The solution is to create your DSN for 32bit applications in the 32bit ODBC utility and your DSN for 64bit applications in the 64bit ODBC utility.  So where is the 32bit version hiding?  It’s right here:

C:\Windows\SysWOW64\odbcad32.exe

One of the most useful things I have found in Policy Based Management is the ability to enforce my corporate naming standards.  There are many reasons to have these standards, but I think supportability reigns supreme.  For more of my thoughts on naming standards you can read my post SQL Nomenclature.

Today let’s look at how to enforce the naming convention for stored procedures.  For our example we have two criteria for naming stored procedures.  The first is that we do NOT want the name to start with “sp_”, since that is the system naming convention.  The second is that we want it to start with “usp_”.  We can meet both of these criteria with a single expression in our policy condition.  Our first step is to create our condition by right-clicking Conditions under Policy Management and selecting “New Condition”.  Here we will give it a useful name, use the Stored Procedure facet, and define an expression to evaluate based on the @Name field.  This is how it should look.
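In the condition editor the expression boils down to something like @Name NOT LIKE 'sp[_]%' AND @Name LIKE 'usp[_]%' (the brackets escape the underscore, which is otherwise a single-character wildcard).  If you want to see which existing procedures would already violate the convention before turning the policy on, a quick T-SQL spot-check along these lines should do it (this is just an audit query, not the policy itself):

-- List user stored procedures that would violate the naming convention (run per database)
SELECT s.name AS schema_name,
       p.name AS procedure_name
FROM sys.procedures AS p
JOIN sys.schemas AS s ON s.schema_id = p.schema_id
WHERE p.name LIKE 'sp[_]%'        -- violates rule 1: uses the system prefix
   OR p.name NOT LIKE 'usp[_]%';  -- violates rule 2: does not start with usp_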

Next we want to create our policy definition by right-clicking Policies under Policy Based Management and selecting “New Policy”.  Here we will give the policy a useful name, select the condition we created in the previous step, accept the default targets, select “On Change: Prevent” for the evaluation mode, and enable the policy.  I want to point out two things before I show you what it looks like.  The first is that you cannot enable the policy until you have selected “On Change: Prevent” for the evaluation mode.  “On Demand” is the default evaluation mode, and policies cannot be enabled in that mode.  I’ll elaborate more on that in a future post.  The second thing to point out is that we are sticking with the default targets, which are “Every Stored Procedure” in “Every Database”.  The “Every Database” target is a default target that should really be named “Every User Database”, because it does not include your system DBs like Master, Model, TempDB, and MSDB.  For more information on how to get it to apply to both user and system databases, please come back next week and see my post on Why My Policy Will Not Evaluate System DBs.  Here is what the final policy definition looks like.

The last thing we need to do is fill out the description tab.  First we will create a new category called “All Naming Conventions” because we know we will have other naming convention policies for other database objects.  Next we will define the description, text to display, and address fields.  These fields will be displayed to the users who violate this policy.  The information on this tab is optional, but I highly recommend filling it out so users who violate the policy will understand why, how to conform to the policy, and where to find additional help.  Here is what our description tab will look like when completed.

We have now completed the creation of a stored procedure naming convention policy that will prevent the creation of any SPs that do not follow it.  The policy will also provide the user with pertinent information on how to comply.  The final step is to attempt to create a stored procedure that violates the naming convention to verify the expected policy behavior.  Here is what we should see.
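If you want to reproduce the test yourself, something like this should trip the policy (the procedure name is made up purely for illustration):

-- Hypothetical test in a database covered by the policy: this name violates both rules
CREATE PROCEDURE dbo.sp_GetOrders
AS
    SELECT 1;
GO
-- With “On Change: Prevent” enabled, the CREATE should be rolled back and an error citing
-- the policy (along with the description text we filled in) returned to the session.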

You Should Have a SQL Naming Convention

Everyone should have a naming convention for their database objects.  I’m sure I could rant on about all the reasons, but let me just draw a parallel and then give an example.  Whenever you support multiples of anything in IT, the benefits of consistency and standards will always prevail in easing supportability.  Consider creating a desktop or server OS image when all the hardware is different.  If the machines were all the same you would only need one image, but if they are not you have to create an image for each underlying architecture.  The same applies to naming standards and supporting many database servers.  If you have a defined nomenclature for your database objects, things are easier to find and, more importantly, you can write scripts that will apply across the board.  If you’ve written any code or scripts you know how much time can be saved through code re-use.  SQL Server also has system objects, and you want to make sure you can easily tell your objects apart from the system objects.

Do your stored procedures start with sp_?  It is well known that naming your stored procedures with an sp_ prefix is a bad idea.  Why, you ask?  All of the system stored procedures start with that prefix, which makes distinguishing your SPs from the system SPs more difficult.
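Here is a quick hypothetical demo of the collision.  sp_who is a real system procedure; the local copy below exists only for illustration and is dropped at the end:

-- Create a user procedure that shares its name (and sp_ prefix) with a system procedure
CREATE PROCEDURE dbo.sp_who
AS
    SELECT 'local copy' AS source;
GO
-- The unqualified call resolves to the system sp_who in master, not our local procedure,
-- because SQL Server checks for a system procedure first when a name starts with sp_
EXEC sp_who;
GO
DROP PROCEDURE dbo.sp_who;
GO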

Are You Enforcing Your Convention to Ensure Standards?

Most standards in companies are and should be documented.  Team members are told where the document is located and that they are to follow it.  If this method works for your company then more power to you, but most of the time people rush and the rules are not followed.  So how can you mitigate that?  Here is where SQL 2008 comes to the rescue with Policy Based Management.  You can use PBM to enforce your company standards and prevent people from creating SQL objects that are not within policy.  Using PBM to enforce your naming standards is a whole other series.  Come back next week and you can find the first post here:  SQL Naming Conventions with PBM – SPs

Example

Below is a skeleton outline of a naming standard.  It is provided here as an example and is not complete, but it should give you an idea of the things you might want to build a standard around.  A short DDL sketch applying a few of these conventions follows the list.

  • Stored Procedures – usp_[procedurename]
  • Tables – tbl_[table name]
  • Views – vw_[view name]
  • Columns – [column name]
  • Triggers -
  • Clustered Index – clidx_[table name]_[column name]
  • Nonclustered Index – nclidx_[table name]_[column name]
  • Primary Keys – PK_[tablename-column]
  • Foreign Keys – FK_[tablename-column_referencetablename-column]
  • Constraints – [Constraint Type Identifier]_[table name]_[column name]
  • Constraint Type Identifiers
    • Default – DF
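
Here is a minimal DDL sketch of what a few of these conventions might look like in practice.  All of the table, column, and object names are made up purely for illustration, and your standard may well differ.  Note that the hyphenated key names from the outline have to be written as delimited identifiers (square brackets) in T-SQL:

-- Tables with prefixed names, a named default constraint, and a primary key per the convention
CREATE TABLE dbo.tbl_Customer
(
    CustomerID INT         NOT NULL,
    StoreID    INT         NOT NULL,
    Status     VARCHAR(10) NOT NULL
        CONSTRAINT DF_tbl_Customer_Status DEFAULT ('Active'),
    CONSTRAINT [PK_tbl_Customer-CustomerID] PRIMARY KEY CLUSTERED (CustomerID)
);
GO

-- A second table with a foreign key back to the first
CREATE TABLE dbo.tbl_Order
(
    OrderID    INT NOT NULL,
    CustomerID INT NOT NULL,
    CONSTRAINT [PK_tbl_Order-OrderID] PRIMARY KEY CLUSTERED (OrderID),
    CONSTRAINT [FK_tbl_Order-CustomerID_tbl_Customer-CustomerID]
        FOREIGN KEY (CustomerID) REFERENCES dbo.tbl_Customer (CustomerID)
);
GO

-- Nonclustered index, view, and stored procedure following the same pattern
CREATE NONCLUSTERED INDEX nclidx_tbl_Order_CustomerID ON dbo.tbl_Order (CustomerID);
GO

CREATE VIEW dbo.vw_ActiveCustomers
AS
    SELECT CustomerID, StoreID
    FROM dbo.tbl_Customer
    WHERE Status = 'Active';
GO

CREATE PROCEDURE dbo.usp_GetCustomerOrders
    @CustomerID INT
AS
    SELECT OrderID
    FROM dbo.tbl_Order
    WHERE CustomerID = @CustomerID;
GO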

What Convention Do You Use?

I think it would be great to see the conventions that other people use.  What objects did they create a convention for, and how did they decide to name them?  If you’re reading this and don’t mind, please post your convention in the comments so we can have an online brainstorming session for all to see.

Other Caveats

There are several other things you should take into consideration when developing a naming standard, like avoiding reserved keywords and special characters.  For ideas on things to avoid check out Bob Pusateri’s blog post about bad table and column names HERE.

I presented two sessions last weekend 1/29 at SQLSaturday #57 in Houston.  Below are links to the slides and/or code for those sessions.  If you were able to catch one of my sessions I would appreciate any feedback you can give by commenting on this post or on SpeakerRate.com.

Manage Your Shop with Policy Based Management Server and Central Management Server

Mirroring: The Bear Necessities

SQLSaturday #57 was great.  It’s only about a 4 hour drive from Dallas, so I headed out about 12:30 and picked up Tim Mitchell (Blog|Twitter) and Drew Minkin (Blog|Twitter) on the way.  The weather was great the entire weekend which made for a pleasant drive.  We arrived at the church about 1.5 hours before the speaker dinner to get the lay of the land and see if any help was needed.  The Houston team Nancy Wilson (Blog|Twitter), Jonathan Gardner (Blog|Twitter), and Malik Al-Amin (Twitter) had everything under control so we headed to the hotel to check-in before the speaker dinner.

The speaker dinner was held at the Outback Steakhouse.  It was a little cramped for space, but that was also a good thing because it meant a high attendance.  The Houston group really took care of us by not only providing appetizers, but buying our meals as well.  Thank you guys very much for that!  The dinner was a great time where I was able to catch up with people from the SQL community I only get to see at these events.  I also got to meet a lot of new people and put faces with Twitter handles.  After everything was wrapped up, Tim and I went back to the hotel bar to have a beer with Patrick LeBlanc (Blog|Twitter).  I’ve wanted to meet Patrick ever since I presented a SQLLunch session.  We had a great time getting to know each other and some awesome conversations that will yield some future benefits for the community.  If you want free training you absolutely need to check out SQLLunch.com.

Tim and I were up at 6 the next morning getting ready to head out and pick up some others from the Dallas crew that were staying at another hotel.  I don’t travel much so I didn’t sleep well at all.  I probably only slept for about 4 hours that night.  Anyway, the venue was the Bammel Church of Christ and it was an awesome venue.  I hope the Houston crew is able to use it for next year.  The opening and closing ceremonies were held in the youth building which was huge and space was certainly not an issue.  That room and one other room in the youth building were used for sessions.  Everything else was in the main building.  This building was perfect for a SQLSaturday because all the rooms for sessions were around the outside with a huge space in the middle.  That space was awesome for networking between sessions as well as lunch.

Speaking of lunch, Houston stayed true to the Texas way by having BBQ catered.  The lunch was fantastic.  They also had Panera Bread bring in fruit, breads, juice, and coffee in the morning.  To top it all off they had an afternoon snack of cookies, water, and assorted sodas.  I would say they hit it out of the park on the food front.

At the end of the day, they had the customary closing ceremony and raffle.  This was probably the only pain point I saw, and to be honest the raffle is a tough one to organize.  Part of why it is tough is that there are two facets (PBM pun intended) to the raffle.  First you have the general raffle, which includes items the user group has obtained, like books, and items from sponsors who only provided SWAG and had no physical presence.  The second part is the sponsor raffles for those sponsors who took names at their tables and had a physical presence.  The sponsor raffle is the easy one because you have the sponsors pull the winners on stage themselves.  The general raffle is the tricky part.  Houston put raffle tickets in each attendee bag, which sounds low maintenance until you realize how much attrition there is.  In addition to those who registered and did not show, there are attendees who only came for part of the day or left early.  What you end up with is calling out ticket numbers one after another with no one there to claim them.  Fortunately Nancy Wilson did a great job of improvising on the spot.

In Dallas last year we used speaker evaluation forms for the general raffle drawing.  That seemed to work fairly well because those people willing to fill out evaluations were more likely to attend the closing ceremony.  We still had “no shows”, but to a lesser extent.  Tim and I talked about this at great length on our drive back and I think we came up with a good solution.  If it works at our next SQLSaturday in Dallas, I will certainly blog and let everyone know.

Overall this was a very successful SQLSaturday, and my hat is off to the Houston Area SQL Server User Group.

Below is a link to the slide deck for my presentation on “Manage Your Shop with Policy Based Management Server and Central Management Server”.

If you have any questions please comment or send an email to “ryan at ryanjadams dot com”.
If you have seen my presentation please take the time to give me some feedback on

SpeakerRate.com

PBM Presentation Link

Paul Randal of SQLskills fame has posted the contest of a lifetime.  It’s a free seat in their Master Immersion Event on SQL Internals and Performance.  This post is my entry into the contest.

I have worked for the same company for almost 13 years.  During those 13 years the company has sent me to training exactly two times.  Pretty impressive, huh?  Now think about this.  When I started, I was a desktop guy on a Novell network.  Since then I have been a server engineer, an NT4 domain administrator, and an Active Directory administrator; worked with Microsoft Operations Manager and Forefront Identity Manager; done automation and scripting; and am now a DBA.  That is just the short list, but what I want to point out is that I learned all of those skills on my own and without training.  Guess what though?  The company has had no problem utilizing (exploiting) my skill set without compensation.  If you’re wondering what that’s like, imagine trololo in your head for 13 years.

I have a passion for SQL Server and you can read about that here in my How I Became a DBA post.  I have an even bigger passion for the SQL community, and that’s why I want this free seat in Paul and Kimberly’s class.  I know what it’s like to have a passion for technology and no access to training, so you had better believe I will share everything I learn with the community.  I am presenting two sessions at SQLSaturday 57 in Houston this weekend alone, but I bet they would have been much more in depth after some SQLskills training.  I started blogging and speaking because of the infectious Brent Ozar of SQLskills.  I owe him a debt of gratitude, and if I win (and Brent makes it), his drinks are on me.  If he doesn’t, then I’ll buy Paul’s in hopes he will pay it forward, or share it with his sheep.  These guys have been an inspiration to everyone in the SQL community, and it would be an honor to attend this class.

Are your blog posts falling on blind eyes?  It can certainly seem that way sometimes.  The reason we blog is because we want to be heard, or seen as the case may be.  One of the ways we gauge our success is based on comments left by readers.  As a technical blogger, there is nothing more discouraging than spending a ton of time banging out an awesome post just to find that days later no one has even commented on it.  There are other ways to gauge your success with free tools like Google Analytics and Google FeedBurner, but they only give us raw numbers and no emotion.  I want to know that my content has helped others solve problems, provided new ideas, or added a new perspective.  As a SQL and technical community, what can we do?

The answer is obvious: leave comments for others!  When was the last time you commented on someone else’s blog?  The best way for us to encourage and support each other is to make a more vigilant effort to leave comments.  I bet many of us would not hesitate to leave a comment correcting something we see wrong in a post, but I guarantee it takes less time to comment a simple “Thank You”.  The effort is small and it only takes a few seconds to tell someone that their post helped you out, made you want to research a new topic, encouraged you, or was bookmarked for a future project.  Comments are not only good for the publisher, but also for the other readers.  They can add additional information or ideas, or leave a link to another related blog post.

The other thing we can do is make sure we attribute others for their work.  I’m talking about referencing our sources.  If you remember those dreaded research papers from school, that’s what I am talking about.  Of course I don’t mean to the level of detail that was required then, just a simple link and mention of any resources you may have used.  Doing this results in pingbacks through most major blogging software and is another great way for us to see who appreciates a post.  Since we are talking about referencing others and attributing their work along with the dreaded word “plagiarism”, here is how I would attribute a post that has great additional information.

Brent Ozar (Blog|Twitter) wrote a fantastic post entitled “Plagiarism Week: Finding the Slimey Slimeballs”.

We all want validation and human nature is to compare ourselves to others to gauge that.  This is why comments are so important for both the author and other readers.  Don’t worry; you’re not blogging for the blind.  We just need to let each other know we’re reading and appreciate the content.

T-SQL Tuesday

Be the Interpreter

Steve Jones (Blog|Twitter) is hosting this month’s T-SQL Tuesday, and his topic is about getting your job done while dealing with business requirements and mandates.  Over my career I have seen many instances of management mandating that something be implemented because they saw an ad on the internet or in some magazine.  The problem is that they don’t understand anything about the technology or the availability of other options.

Our job as DBAs is to be interpreters: figure out the end result the business requires and the options for getting there.  The first thing you have to do is recognize a situation where interpretation is needed.  If the request from management has few technical requirements and lots of industry buzz words, that’s your first clue.  Once you have identified the situation, you need to reverse engineer it and start asking questions that will let you discover management’s desired end result.

Recently I received a request from management to consolidate our reporting platforms.  Once we did our discovery work, we realized that a couple of groups were coding their own reports in C#.  The data they were reporting against was in SQL Server, so it was easy to say that SQL Server Reporting Services should be the chosen platform.  Our reasons were that it is free with the product, would provide the same results, and could seriously reduce report development time.  Although this practice needed to change, I suspected it was not what management was really after.  I was right.  Management was not looking for a product standard, but a single report consolidated from the several they were already receiving.  Management did not clearly articulate their desired end result and used a bunch of buzz words that indicated something entirely different.  As a result, my team’s interpretation of their request was incorrect, and I guarantee we will ask more questions next time.  Fortunately our detour was not a waste, since product standardization is something we needed, and getting the developers onto SSRS saved tons of labor hours.

Dodging Bullets

Bust out the Matrix

Management is firing out mandates, so you start dodging bullets like Neo.  Not that kind of Matrix!  There is nothing management likes more than a matrix…well, except for a matrix that shows a cost reduction.  One way to make sure you are interpreting requirements correctly is to document your interpretation and submit it back to management to confirm your solution meets their expectations.  Depending on the request you might have to write it up in another form, but over the years I have found that management really likes things presented in matrix form.

Stand-up (Slowly)

I work in a large corporation where politics abound.  It is very common to see something implemented poorly or incorrectly because an executive asked for it.  Remember that internet ad your VP saw?  Well, it would not be prudent to simply implement it without asking what the benefits are and how it helps the business.  This is where business requirements, technology, and your career come together on a fine line.  Will the real DBA please stand up?  I believe you should stand up, but you should do it slowly.  You need to voice your opinion on the right method to use as a solution, but be sure to back it up with facts and explain it well.  You also need to be sure to have it documented in email for the classic CYA.  That being said, you need to be very careful in how you present these technical challenges.  I find that asking questions is a good technique.  You don’t want to overdo it to the point that you come across as uninformed and incompetent, but simply spewing out facts and statements can come across as confrontational.

Be a real DBA and stand up, but be smart about it.

I recently had lunch with my boss to pitch him (yet again) on sending me to the PASS Summit.  He does not make the actual decision, but if he feels it is worthy, he sends it up the management chain.  He sees the value and always sends it up, but it never makes it through.  During our conversation he mentioned something that sent up a red flag for me.  I found it very interesting, and those trying to get their employers to send them to the PASS Summit might want to know this inside tip about “large corporation mentality”.  Here is what he said:

“They (upper management) almost always turn down travel for conferences, because they are not considered training.”

Well isn’t that Special?

To be honest, he has a point.  I’ve seen plenty of people at conferences who are just there to get out of work.  I have even seen people not show up at the conference at all.  Unfortunately this seems to have given employers a bad view of conferences in general.  I have never been able to attend a PASS Summit, but I know a lot of people who have.  That being said, I think you would be hard pressed not to classify it as training.  You certainly cannot argue that the pre- and post-conference sessions are anything but training.  When you combine the two you get 2 days of dedicated training followed by 3 days of specialized break-out training.

If you find yourself in this boat, then my first suggestion is to read Jeremiah Peschka’s (Blog|Twitter) post on Getting to PASS on the Cheap.  My second suggestion is to pitch the PASS Summit to your employer as a training class with 2 dedicated class days and 3 days of break-out sessions based on particular SQL Server features.  Here is part of the email I sent to my boss that more accurately explains what the PASS Summit provides, translated into acceptable corporate terms.

There is a training opportunity called the PASS Summit in Seattle, WA the week of October 10th, 2011.  I am a member of PASS (Professional Association for SQL Server) as well as on the board of directors for the local NTSSUG (North Texas SQL Server User Group) chapter of PASS.  Every year PASS holds a national training conference where the best and brightest SQL Server minds come together to teach highly technical sessions.  This is not a conference but actual training.  The first two days are full training sessions with a dedicated instructor, and the last three days are break-out sessions.  The advantage of the break-out sessions is that there are several concentrations of expertise within SQL Server, and they allow you to choose the instructor who is an expert in your area of concentration.  These instructors include Microsoft MVPs (Most Valuable Professionals) as well as Microsoft employees right from their headquarters in Redmond.  They are the people who wrote SQL Server itself, members of the Microsoft product team, members of the Microsoft technical support team, and the people who write all the technical books.  There is nowhere else you can get training from these instructors.

I’m crossing my fingers that the terminology gap has been bridged and that I will have more success this year in getting my company to realize the true value of the PASS Summit.  It might be worth PASS marketing the Summit more as training than as a conference to close this gap, but I’m not sure how many other companies view conferences the same way mine does.