Ryan Adams Blog

SQL, Active Directory, Scripting

The syspolicy_purge_history SQL Agent job is used by Policy Based Management (PBM) in SQL Server to clear out the policy evaluation history that PBM keeps.

The job consists of the following three steps.

  1. Check whether PBM is enabled and exit if it is not.
  2. Run the msdb.dbo.sp_syspolicy_purge_history system stored procedure to clear the history.
  3. Use PowerShell to clear the system health phantom records.

If you’re not a PBM user you might not have any idea what this job does (until now).  This job might be failing on you, with the error showing up in step 3.  If that describes you then you are probably also running your SQL instance on a cluster, and that’s why the job fails.

The job is created by default during the SQL Server install, and SQL uses the system name in step 3.  That’s not a problem on a standalone SQL Server, but it is a problem on a cluster.  When you address a SQL instance running on a cluster you don’t connect using the names of the physical cluster nodes.  You have to use the virtual cluster network name given to that SQL instance on the cluster.

The fix is very simple.  You just need to open the job and edit step 3 to replace the system name with the cluster virtual name.  You don’t even need to know PowerShell, so don’t panic if you don’t.  Simply replace the server name with the virtual cluster network name for your SQL instance.  Here’s an example of the PowerShell line.

(Get-Item SQLSERVER:\SQLPolicy\MyServerName\DEFAULT).EraseSystemHealthPhantomRecords()
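If you prefer to make the change with T-SQL instead of opening the job step dialog, something like the following should work.  This is only a sketch: SQLVIRTUALNAME is a placeholder for your instance’s virtual network name, and the stock step 3 command on your server may differ slightly from the line above, so check it first with sp_help_jobstep.

-- Look at what step 3 currently contains before changing anything.
EXEC msdb.dbo.sp_help_jobstep
    @job_name = N'syspolicy_purge_history',
    @step_id  = 3;

-- Swap the physical node name for the virtual network name of the clustered instance.
-- 'SQLVIRTUALNAME' is a placeholder; adjust the last path segment if your instance is named rather than DEFAULT.
EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'syspolicy_purge_history',
    @step_id  = 3,
    @command  = N'(Get-Item SQLSERVER:\SQLPolicy\SQLVIRTUALNAME\DEFAULT).EraseSystemHealthPhantomRecords()';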

T-SQL Tuesday

This month’s T-SQL Tuesday is hosted by Allen Kinsel (Blog|Twitter) and covers “disasters and recovery”.  My favorite DR solution is mirroring.  Mirroring is easy to set up and fast, if implemented properly.  It’s a database-level solution that allows you to keep a remote copy of your database in sync with your principal database.  Everything is kept in sync, transaction by transaction, in near real time.

Here are some tips and tricks for your mirroring environment:

  • If you use mirroring in SQL 2005, I highly suggest you upgrade to 2008.  I have seen several cases where OLTP systems had to use asynchronous mirroring due to load with SQL 2005, but the added log stream compression in 2008 allowed the use of synchronous mode.
  • Did you use the GUI to configure your mirroring solution?  Then you really need to look into the T-SQL way.  There are several options the GUI doesn’t show you that you need to know about, like dedicating mirror traffic to a specific NIC (there’s a quick sketch after this list).
  • Have you checked to make sure everything on your principal has also been configured on your mirror for when a failover occurs?  You need to make sure things like the database owner, CLR assemblies, linked servers, and logins are all set up and ready to go on the mirror server.
  • Are you using a witness for automatic failover?  You should double check that your application has the failover partner setting in its connection string or it will not be able to find the mirror.
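To give you an idea of what the T-SQL route looks like, here is a minimal sketch.  The database name, server names, IP address, and port are all placeholders, and it assumes the mirror copy has already been restored WITH NORECOVERY.  The LISTENER_IP option is the part the GUI hides, and it is what lets you pin mirroring traffic to a specific NIC.

-- Run on both the principal and the mirror: a mirroring endpoint bound to a specific NIC.
CREATE ENDPOINT Mirroring
    STATE = STARTED
    AS TCP (LISTENER_PORT = 5022, LISTENER_IP = (10.10.10.5))
    FOR DATABASE_MIRRORING (ROLE = PARTNER);

-- Point each side at its partner: run on the mirror first, then on the principal.
ALTER DATABASE SalesDB SET PARTNER = 'TCP://PRINCIPAL01.corp.local:5022';  -- run on the mirror
ALTER DATABASE SalesDB SET PARTNER = 'TCP://MIRROR01.corp.local:5022';     -- run on the principal

-- Optional witness for automatic failover (run on the principal).
ALTER DATABASE SalesDB SET WITNESS = 'TCP://WITNESS01.corp.local:5022';

On the application side, the failover partner from the last bullet is just the Failover Partner keyword in the connection string, and it should point at the mirror server.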

Last month I flew out to Tucson, AZ to film a few training sessions for SSWUG.ORG.  One of my sessions was on mirroring as part of their SQL Disaster Recovery Expo.  The timing for this event and this month’s T-SQL Tuesday topic is perfect.  The expo starts this Friday June 17th, 2011.  If you are planning a DR solution I highly suggest you check out their expo.

The expo is delivered online and split into two tracks.  The premium track is $39 for SSWUG members and $49 for non-members.  That’s really cheap for an entire day of training from industry leading SQL experts.

They also have a “Best of SSWUG” track that is absolutely free and features content from experts like Kevin Kline (Blog|Twitter).

If you’re considering mirroring in your environment and want to learn more tips and tricks like the ones above, then make sure you register for this expo and watch my session.  I will also be online in the chat and available to answer any questions you may have.

In my session you’ll learn how mirroring can fit into an HA/DR solution, and we also discuss the other options that are available.  You can’t choose the right solution if you don’t know what the other options are, because there is no one size fits all.  In the session I also walk you through how to configure mirroring both through the GUI and T-SQL.  You will not want to miss seeing the differences.  We even take a brief look at how to monitor your mirroring configuration after you get it up and rolling.
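If you want a head start on the monitoring piece, a catalog view query like this is a reasonable first look.  It simply lists every mirrored database on the instance along with its role, state, and partners.

SELECT  DB_NAME(database_id) AS database_name,
        mirroring_role_desc,
        mirroring_state_desc,
        mirroring_safety_level_desc,
        mirroring_partner_name,
        mirroring_witness_name
FROM    sys.database_mirroring
WHERE   mirroring_guid IS NOT NULL;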

I hope to see you there!

REGISTER <<<< REGISTER NOW!

SQL Rally

It’s Official!!

Hosting duties for SQLRally 2012 have officially been given to Dallas and the North Texas SQL Server User Group.  NTSSUG is elated to have been chosen as the host for this event.  We have hosted three SQLSaturdays in the last 12 months, so we are prepared to dig in our heels.  Well, this is Texas, so we’ll be digging in our spurs.

SQLRally is still in its infancy, but it has some amazing and seasoned event leaders behind it.  That’s why it’s expected that an event this large will need continuous process tuning.  Even getting to this point there have already been a lot of lessons learned.  One of the biggest lessons comes in finding venues.  A lot of venues require hotel room blocks that can run anywhere from $50,000 to $120,000.  If the rooms don’t get filled, that’s a lot of money to put at risk.

Nashville and Denver were also on the slate to host SQLRally, but unfortunately those room blocks ultimately took them out of the running.  The SQLPASS board felt there was too much risk to the organization to put up that kind of money.  I agree, even if that would have taken Dallas out of the running.  PASS is an amazing organization with some top notch people, so when decisions like this are made we should know that they were made with the best interest of the organization at heart.  Even though it came out in our favor, I absolutely think the right decision was made and would have made the same decision myself.

That being said, I still find it disappointing that there was not more than one city in the running.  If there had been more than one option on the table then it would have gone to a community vote.  That would have been an awesome thing for the SQL community to “Rally” around.  Unfortunately, life handed us lemons this time, so maybe we’ll just have to serve lemonade at the rally.

We’re hoping to have an unforgettable event in Dallas next year, so start saving your pennies now!

NTSSUG also wants to give our President Sri Sridharan (Blog|Twitter) a very special thank you!  We appreciate your hard work in advocating to get SQLRally here in Dallas.  You’re a machine and we value your leadership!

I had an interesting issue a couple of months ago while creating an SSRS report.  I developed my report in BIDS, built it, and previewed it just fine.  I was also able to deploy it to my RS server without a problem, but when I viewed the report from the browser I got a cryptic error saying the report could not be rendered.  I started digging around, and somewhere during the course of my research it dawned on me that this was a new SQL 2008 R2 server and I was probably still running BIDS 2008 RTM on my laptop.

I went ahead and upgraded BIDS to 2008 R2 and opened the report.  As expected, BIDS recognized that this was an older version report and offered to update it.  I accepted the update, built the report again, and re-deployed.  Of course, anytime you do this you want to make sure your settings allow for everything to be overwritten, which I did.  The deployment went fine, but I continued to get the exact same error when viewing the report through the browser.  I never figured out why I was still getting the error, and it annoys me to this day.

Fortunately the report was a simple one, so I blew it away and re-created it from scratch.  This time it worked without a hitch, and no rendering issues.  My advice is to always develop your reports in the same version of BIDS that your RS server is running.

I have had reports update just fine through BIDS before so I’m not sure what it was about this particular report that did not like that process.

A few years ago I had a need to find all accounts in Active Directory that had the “Password Never Expires” option set.  I wrote an HTA with VBScript to make it easy to find these accounts for anyone with little AD experience.  I posted the script in the Microsoft Scripting Guys’ community script repository HERE, but I have received several reports recently from community members about errors.  I have had the same reports with my Active Directory Last Logon Utility posted on their site, and it turns out that something gets lost in translation when copying code to the site.  For whatever reason some of the formatting gets changed, so years later I have decided to post it here by request.

The script is written to search two AD LDAP paths, so make sure to change the paths at the top of the script.  The script will return any user account objects that have “Password Never Expires” set on their account within those defined paths.  It also searches all OUs under those paths.  Once the accounts are returned you have a few choices.  You can remove the option from those accounts, you can delete the accounts if no longer required, or you can simply export them to an Excel spreadsheet.  After you have performed actions on those accounts, the script will display the accounts it made changes to and allow you to export those to a spreadsheet as well.

If you have any questions about this utility feel free to leave a comment.  Also if you find this utility helpful, let me know in the comments.  I always love hearing that I was able to help someone else.


Password Never Expires Utility

Today I received the following error on one of my SQL Reporting Services Servers.

“The report server is not responding. Verify that the report server is running and can be accessed from this computer”

Old Solution

This is certainly not the first time I have seen this error, and the situation is always different.  Most of the time the solution is to add the proper permissions for the Reporting Services service account and the ASPNET account to the following folders.

C:\Program Files\Microsoft SQL Server\MSSQL.1\Reporting Services\ReportManager

C:\Program Files\Microsoft SQL Server\MSSQL.1\Reporting Services\ReportServer

To identify whether this is your problem, go look at the Reporting Services error log.  It will be obvious, because you will see an access denied message referencing one or both of the above paths.

HINT: When you change the permissions make sure inheritance is turned on for everything under that folder.

New Solution

Today it was a new issue.  Here is what I saw in the error log this time.

ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.ReportServerDatabaseLogonFailedException: The report server cannot open a connection to the report server database. The logon failed

What makes this tricky is that your first thought is that the service account running the Reporting Services service has an issue.  That can certainly be the case if the password for that account was changed but not updated on the service.  If the server had been rebooted or the service restarted, the service would have failed to start.  Just because the service is running does not mean that there is not an issue.  If the server was not rebooted and the service was not restarted, the service will still be running and you will be none the wiser, until you see the failed authentications in the server security log.

Don’t jump to conclusions, because the Reporting Services service account was not the culprit this time.  Remember when you first installed and configured Reporting Services?  One of the options you define is the account that Reporting Services uses to access the report server database.  That was the culprit today.  Someone changed the password on the account I set up for Reporting Services to access the report server database.  Changing the password for the account in RS is very simple.  Open the Reporting Services Configuration utility and navigate to the Database Setup node.  Here you will see the account configured for Reporting Services to access the report database.  You can change the account or update the password.
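If you want to confirm which account is actually failing before you change anything, searching the SQL Server error log on the database engine side is a quick check.  A minimal sketch (the first parameter, 0, is the current log and the second, 1, means the SQL Server error log rather than the Agent log):

EXEC sp_readerrorlog 0, 1, N'Login failed';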


If you have ever helped organize a SQLSaturday you know how hard it can be and how much time is required.  This was the third SQLSaturday in twelve months for the North Texas SQL Server User Group (one of which was the first and only BI focused event), and we’re proud of that.  I’m on the board of directors for the group so I’m a little biased.  However, based on attendee feedback, all three have been hits and this one was particularly on the mark.  That’s wonderful, but it is not why I’m proud of the events we have hosted.  I’m proud that after three large events we haven’t lost touch with the real reason we do this.

It’s natural to try and outdo yourself every time, but we’ve remained grounded in our passion to bring free training to SQL Server professionals.  Our charter is to not only provide free training but also an environment for SQL Server professionals and developers to come together, share ideas, and network.  I believe we have stayed true and grounded in our charter and I’m very proud we have done that.

Last year I handled both the sponsors and volunteers, which proved to be a logistical complication.  I can’t be in two places at one time (at least not until Buck Woody clones me in the cloud).  This year I just handled the sponsors, but I also had my hands full with the responsibilities of being on the core organizing team, the board of directors, and also being a speaker.  Taking care of your sponsors is a bigger task than one might think, and once the event is over there are still some deliverables, so your job is not done.  A lot of people don’t like the sponsors because they don’t want to be hassled by them, but those are the people who have lost sight of what they are getting from those sponsors.  If it were not for their support, the event and an entire day of totally free training would not have been possible.  I am personally very grateful for our sponsors and I hope I made the experience such that they will not only return again, but will choose us if there is a competing event.  To see all of the wonderful sponsors for this event that are proud to help the SQL Community CHECK THEM OUT HERE.

We’re still putting together our post mortem, but here are some things we definitely got right.

  • Lunch was a boxed lunch with a vegetarian option, a chicken wrap, or a couple of sandwich choices.  Each lunch came with a bag of chips, a salad, and a cookie.  The food was good quality, and this was the second time we have used this vendor.  We always get great feedback on the lunch and have never had bad feedback on it.  A definite win for this one.
  • This year we decided to do different colored shirts for volunteers and speakers.  This really helped the attendees know who to ask when they had questions and kept them from asking speakers questions they did not have the answers to.  We went with wrinkle free Oxford shirts.  These are really nice shirts and we got a lot of compliments on the choice.  The only downside is that being wrinkle free also meant that the name badges did not stick to them very well.  We also went with long sleeved shirts, which got hot, so we will go with short sleeves next time.
  • Anyone who has hosted a SQLSaturday will know that one of the trickier things to figure out is the closing ceremony raffle.
    • Last year we drew out of the session evaluations.  We had a room monitor in each session who collected the evaluations, we compiled them over the course of the day, and we used those to draw from.  The first positive to this method is everyone gets an entry for each evaluation they complete.  The second positive is it helps the speakers by giving incentive to the attendees to provide an evaluation.  On the other hand, everyone is ready to go home at the end of the day so not everyone attends the closing ceremony.  The drawback here is you keep drawing names of folks who are not there.  The other drawback is that you have to sort, compile, scan, and email the evaluations to the speakers after the event.  That takes a lot of time and the speakers are stuck waiting to get their feedback.  Although we liked this method because we were helping our speakers get the feedback they want and need, the drawbacks were not worth the cost.
    • This year we tried to mitigate these issues with a two pronged approach.  First we had the room monitors pass out the evaluations at the beginning of the session so attendees could fill them out during the session.  This increased feedback because handing it out at the end meant attendees would skip it so they could get to the next session.  From what I have heard, this worked really well.  I think every single person in my session gave an evaluation.  The room monitor was also supposed to give the evaluations directly to the speaker at the end of the session.  This means there was no work for us after the event to compile and distribute them, and best yet the speaker got their feedback immediately.  Since we did not use the evaluations for the drawing, we hope this still helped in our goal of encouraging and providing feedback for our speakers.
    • So how did we do the actual drawing?  We put raffle tickets in each attendee bag.  I know that doesn’t seem revolutionary, but what we did differently is put both sides of the ticket in the bag.  Each side of the ticket has the same number and we just had the attendees put one side in a box as they entered the closing ceremony.  This ensured that every number that was pulled would be someone in attendance and it worked perfectly.
  • Last year the attendee bags were very noisy and distracting during sessions.  This year we got different bags made of a different plastic and noise was not an issue.
  • Paciugo gelato.  We did this last year and it was a smash hit.  This has now become our trademark, so we did it again this year during both of the afternoon breaks.
  • We brought in a TV and Netbook to have a live Twitter display.
  • Last year we had a very low after party attendance.  We advertised it much better this year and it paid off with some awesome networking.
  • Last year our sponsors were asking for event guides and we ran out.  This year I set some aside for the sponsors.
  • Last year we ordered the schedule in our event guide by track.  That’s fine if you are sticking to a single track all day long, but most attendees do not.  This time we ordered it by time and it was much easier to read and follow.
  • Although we publicized our user group, PASS, and the SQL community last year we felt we could have done a better job.  I think we delivered on doing a better job with it this year.  We talked about it more in the opening and closing ceremony and I also had a rotating slide deck I ran in the morning and afternoon that advertised our user group, PASS, the PASS Summit, and SQL Rally.

I presented my mirroring session this year and it went great.  There were a ton of questions and lots of discussion, which is what I love the most.  I compiled the scores and got 4.5 out of 5.  I also got some good comments, including the one I like the most: “Excellent session, but I wish there was more time”.  In fact, one thing we are going to consider for next time is having longer sessions.  If you caught my session you can find my slides and code here.  Time was short, so if you have any questions feel free to shoot me an email or leave a comment.

This event was a success without question and we appreciate every single attendee that showed up.  It was an incredible time of learning and networking.  We also had 2 of the 3 fathers of SQLSaturday in attendance, Steve Jones (Blog|Twitter) and Andy Warren (Blog|Twitter), which just made it that much more special.  I know what you’re thinking now that the event is over.  The North Texas SQL Server User Group is a SQL community engine with an unparalleled passion and enthusiasm, so what comes next?

That’s right.  We submitted to host SQLRally 2012 here in Dallas.  We are really excited about this opportunity, but we need your help to make it happen.  Two other cities have also applied and PASS will hold a vote to determine the winning city.  So what can you do?  Glad you asked.  PASS will be sending out emails to all members with a link to vote, but the catch is that you need to be a registered member before the voting opens to get the invitation.  Registration is free so head over there now and get registered.

A couple months ago, one of our local Microsoft Evangelists asked me if I would do a TechNet Radio interview about the North Texas SQL Server User Group. I’m on the board of directors for the group as the Director of Logistics and Communication, and he wanted the point of view from a user group leader.  I was more than happy to do it for two reasons.

First, I love the SQL community and our strong and vibrant user group.  We are about to host our third SQLSaturday in 12 months this weekend, we are having an awesome pre-con the Friday prior, and have petitioned to get SQLRally 2012 here in the Dallas area.  It is a very exciting and busy time for us!

My second reason is John Weston (Blog|Twitter).  John is one of Microsoft’s local evangelists for the DFW area and surrounding states.  What does an evangelist do?  Think Oral Roberts, but they are champions for Microsoft products and don’t swindle money from you.  Their role is to get the word out about Microsoft products, whether it be an update to an existing product or a new product.  Part of that mission is to support local community groups in getting the resources they need.

I’ve worked with other evangelists and people in Microsoft with regards to some user group needs, and none of them have been more willing and helpful than John.  We had a facilities related issue that had gone on for months; our requests were either ignored or met with an unwillingness to provide what we needed.  One conversation with John and it was fixed before the next meeting.

Thanks John!

Here is the interview for your viewing pleasure, and make sure to click the links above for all the exciting things the North Texas SQL Server User Group is doing.

TechNet Radio Interview

So you created a DSN on a 64-bit Windows machine and your application can’t connect to it, huh?  Even more confusing is that the DSN tests fine and connects to your SQL Server without issue.  I ran across this issue back when Windows 2008 came out.  I had completely forgotten about it until recently, when someone facing the same issue asked me if I knew what the problem could be.

Now in my case, I was moving everything from a 32-bit Windows 2003 machine to a 64-bit Windows 2008 machine.  The application consuming the DSN was a 32-bit classic ASP site.  The first thing I did was copy the site over to the new server and configure it in IIS.  Let me also interject here and say that this took some time, because MS changed the look of IIS from version 6 to 7 and in my opinion it is less than intuitive.

My next step was to set up the DSN for the application to access my SQL database hosted on another physical server.  No problem, I thought, I will just go under Start and Administrative Tools and use the “Data Sources (ODBC)” utility to set it up.  I set up the DSN in the utility and it tested fine, but my application still could not connect.  Did you remember that I mentioned the application was 32-bit?  A 32-bit application uses the 32-bit architecture and its associated DLL libraries, and a 64-bit application uses its own.  The problem is that the “Data Sources (ODBC)” utility under Start and Administrative Tools is the 64-bit version, and Microsoft does not show that there is a 32-bit version in the GUI.  The GUI strikes again!  That’s okay.  We’re DBAs and we know not to trust the GUI.

The solution is to create your DSN for 32-bit applications in the 32-bit ODBC utility and your DSN for 64-bit applications in the 64-bit ODBC utility.  So where is the 32-bit version hiding?  It’s right here:

C:\Windows\SysWOW64\odbcad32.exe

One of the most useful things I have found with Policy Based Management is the ability to enforce my corporate naming standards.  There are many reasons to have these standards, but I think supportability reigns supreme.  For more on my thoughts about naming standards you can read my post SQL Nomenclature.

Today let’s look at how to enforce the naming convention for stored procedures.  For our example we have two criteria for the naming of stored procedures.  The first is that we do NOT want them to start with “sp_”, since that is the system naming convention.  The second is that we want them to start with “usp_”.  We can meet both of these criteria with a single expression in our policy condition.  Our first step is to create our condition by right clicking Conditions under Policy Management and selecting “New Condition”.  Here we will give it a useful name, use the stored procedure facet, and define an expression to evaluate based on the @Name field.  This is how it should look.
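For reference, a single expression along these lines should satisfy both rules, because a name that starts with usp_ cannot also start with sp_ (this is just an assumed example of the condition expression, built on the @Name field with the LIKE operator):

@Name LIKE 'usp[_]%'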

Next we want to create our policy definition by right clicking policies under policy based management and selecting “New Policy”.  Here we will give the policy a useful name, select the condition we created in the previous step, accept the default targets, select “On Change: Prevent” for the evaluation mode, and enable the policy.  I want to point out two things here before I show you what it looks like.  The first is that you cannot enable the policy until you have selected “On Change: Prevent” for the evaluation mode.  “On Demand” is the default evaluation mode and policies cannot be enabled in that mode.  I’ll elaborate more on that in a future post.  The second thing to point out is that we are sticking with the default targets which are “Every Stored Procedure” in “Every Database”.  The “Every Database” target is a default target that should really be named “Every User Database”, because it does not include your system DBs like Master, Model, TempDB, and MSDB.  For more information on how to get it to apply to user and system databases, please come back next week and see my post on Why My Policy will not Evaluate System DBs.  Here is what the final policy definition looks like.

The last thing we need to do is fill out the description tab.  First we will create a new category called “All Naming Conventions” because we know we will have other naming convention policies for other database objects.  Next we will define the description, text to display, and address fields.  These fields will be displayed to the users who violate this policy.  The information on this tab is optional, but I highly recommend filling it out so users who violate the policy will understand why, how to conform to the policy, and where to find additional help.  Here is what our description tab will look like when completed.

We have now completed the creation of a stored procedure naming convention policy that will prevent the creation of any SPs that do not follow the policy.  The policy will also provide the user with any pertinent information on how to comply with the policy.  The last thing we need to do is attempt to create a stored procedure that violates this naming convention to ensure the expected policy behavior.  Here is what we should see.
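As a quick test, something like this should do it (the procedure names here are just hypothetical examples).  The first CREATE should be rolled back by the policy with an error explaining the violation, and the second should succeed.

-- Violates the convention: should be blocked by the On Change: Prevent policy.
CREATE PROCEDURE dbo.sp_GetCustomers
AS
    SELECT 1;
GO

-- Follows the convention: should be created normally.
CREATE PROCEDURE dbo.usp_GetCustomers
AS
    SELECT 1;
GO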