Tuesday, November 12, 2013

Database Migrations Made Easy (Or At Least Easier)

I do quite a few database migrations where I work.  How many, you ask?  Just this year I've performed around 18 customer database migrations (and I set up almost half that number of new SQL Servers to support the migrations).  Several of these migrations included high-profile customers with compressed time-frames.

Along the way, I've learned several tricks that I hope will help.

We have a couple of core databases for our application that are critical to the total operation of our software.  The recovery model for these databases is set to Full, and we do transaction log backups every 15 minutes.  We just restore these databases to the new server WITH NORECOVERY so that we can continue to apply the transaction log backups.
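
A minimal sketch of that sequence (the database name and backup paths below are placeholders for your own):

-- Restore the latest full backup, leaving the database able to accept further restores
RESTORE DATABASE [CoreDB]
FROM DISK = N'\\BackupShare\CoreDB_Full.bak'
WITH NORECOVERY, STATS = 10;

-- Apply each transaction log backup in sequence, still WITH NORECOVERY
RESTORE LOG [CoreDB]
FROM DISK = N'\\BackupShare\CoreDB_Log_0815.trn'
WITH NORECOVERY;

-- During the outage window, apply the final log backup and bring the database online
RESTORE LOG [CoreDB]
FROM DISK = N'\\BackupShare\CoreDB_Log_Final.trn'
WITH RECOVERY;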

The rest (and bulk) of our customer databases are generated monthly through bulk loading processes.  Once the monthly cycle has closed out, the databases are cleaned and set to read-only for reporting purposes.  Obviously the databases that have aged out and are currently read-only can just be backed up and restored to the new server.

To reduce the outage window during the migration, we schedule a suspension of the bulk loading process so that all the open and future databases can be moved without taking down the entire system.  Obviously your mileage may vary here depending on the database model you are working with.

The Full recovery model and transaction log backups are your friends in a tight time-frame, as you can get the vast bulk of the data restored on the destination server before the outage window begins.

SQL logins can be easily migrated to the new server using sp_help_revlogin.  You don't even need to know the passwords: it extracts the password hashes so that the passwords on the new server will match the old server.  The script for this SPROC can be found here: http://support.microsoft.com/kb/918992/en-us
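
Once the SPROCs from the KB article are created on the old server, generating the login script is a one-liner (the specific login name below is just an example):

-- Print CREATE LOGIN statements (with password hashes and SIDs) for the SQL logins
EXEC sp_help_revlogin;

-- Or script out just one login
EXEC sp_help_revlogin @login_name = N'AppLogin';

Run the generated output on the new server and the logins come across with matching passwords and SIDs.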

Once the database is online, you can use sp_abFixUserLogins to sync database users to the logins on the new server (thus preserving the exact same rights).  The code for this SPROC can be found here: http://www.sqlservercentral.com/Forums/Topic232234-257-1.aspx#bm238983
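
If you ever need to do this without a third-party SPROC, a rough equivalent using built-in commands looks like this (the user name is a placeholder); run it in each restored database on the new server:

-- Find database users whose SID has no matching server login (orphaned users)
SELECT dp.name AS orphaned_user
FROM sys.database_principals AS dp
LEFT JOIN sys.server_principals AS sp ON dp.sid = sp.sid
WHERE sp.sid IS NULL
  AND dp.type IN ('S', 'U')  -- SQL and Windows users
  AND dp.principal_id > 4;   -- skip dbo, guest, and the system users

-- Re-map an orphaned user to the login of the same name
ALTER USER [AppUser] WITH LOGIN = [AppUser];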

Overall Tips:
  1. Prepare a Project Plan
    • Well before the migration, write out a project plan so that you know what you need to do when and you don't miss any critical steps along the way
  2. Script EVERYTHING!  
    • Clicking through the GUI is OK, but it takes time and it's easy to miss that one little option that will cause you huge headaches down the road.  If you're not familiar with how to script the backup or restore, don't worry!  Just go through the GUI and set the options once, but don't click OK; instead, click the Generate Script button at the top of the GUI.  As the name implies, this will open a new query window with the script to perform the actions you just specified in the GUI.  Find/Replace is much faster than going through the GUI many times, and you can be sure you didn't miss any options as long as you specified them all the first time through (see the sketch after this list).
  3. Test your project plan and your scripts
    • Do a dry run of your project plan and make sure that it is solid.  It's easier (and far less stressful) to find and correct mistakes when you don't have an impatient customer breathing down your neck (or worse yet, your boss' neck).
    • Be sure to test all of your scripts as you work through the dry run.  Just like your project plan, it's much easier and less stressful to test and correct everything ahead of time.
  4. Save your scripts and your project plan in a safe and well known place
    • You need to know exactly where your scripts and project plan are (yes, I've walked into a migration and totally blanked on where I put those pesky scripts that I worked so hard on).
    • Save your scripts and project plan to a location that is secure (i.e. redundant storage) and backed up.  You should NOT save them only to your desktop.  If your hard drive crashes, all your efforts are in vain (yes, I've also walked into a migration and had my PC crap out on me, but my scripts were in a safe place, so I could just use another PC and move on).
    • Having a common location for your scripts and project plan is also handy if you have a peer/code review and others need to access them.
  5. Keep your scripts and project plan after the migration is complete
    • You may be asked after the migration whether you completed a specific task.  By keeping your plan and scripts, you can easily reference your documentation to confirm exactly what actions you did (or did not) perform.
    • You never know when you may need to do another migration and Find/Replace makes quick work of adapting the existing scripts and plan to a new project.  I never delete any of my scripts (unless they are total rubbish and I've already rewritten them).
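
As promised in tip #2, here's the shape of the backup script that Generate Script produces (the database name and path are placeholders); Find/Replace on the database name makes quick work of repeating it for dozens of databases:

-- Back up one of the read-only reporting databases ahead of the migration
BACKUP DATABASE [ReportDB_201310]
TO DISK = N'\\BackupShare\ReportDB_201310.bak'
WITH COPY_ONLY, COMPRESSION, CHECKSUM, STATS = 10;
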
Happy database migrating!

Friday, August 9, 2013

SQL Replication - Monitoring Tips When Delivering Snapshots for Transactional Publications

Monitoring the delivery of a snapshot for Transactional Replication can be a bit tedious, especially if it is a large publication or you are delivering large tables.  However, the more you know about what to expect, the less likely you are to panic if things don't quite seem to be going the way they should.

The key steps in snapshot delivery are:
  • Run PRE scripts (drop the existing published table/article on the Subscriber)
  • Run SCH scripts (recreate the published table/article on the Subscriber)
  • Push BCP files (BULK INSERT the raw data from the published table/article)
  • Create Indexes
  • Push queued transactions (technically this step is post snapshot delivery)
Monitoring the first three steps is pretty simple, as Replication Monitor gives you plenty of feedback during this process.  Unfortunately, index creation does not report any progress (and you may even get the dreaded message "The replication agent has not logged a progress message in 10 minutes" if it takes a while to create the indexes).

Note: If you get that message a lot and you know it's rogue (i.e. the Distribution Agent is still running and there is no blocking of the ALTER TABLE or CREATE INDEX processes on the Subscriber) you can change the heartbeat interval on the Distributor by following the instructions here (Thanks Amit!): http://troubleshootingsql.com/2010/05/28/replication-agent-has-not-logged-a-message-in-10-minutes/  We set ours to 20 minutes by running this script on the distributor:
EXEC sp_changedistributor_property @property = N'heartbeat_interval', @value = 20;

By default, SQL Server will only create the clustered indexes for tables pushed to the Subscriber.  When the clustered index creation process is complete, the Distribution Agent will report "Delivering replicated transactions".  If you are using the default settings (i.e. only recreating the clustered indexes at the Subscriber), this message means exactly what it says: queued transactions are now being pushed to the Subscriber.

This message always appears after the clustered indexes are created.  Thus, if you also replicate the non-clustered indexes (like we do in our environment), this message means that the creation of the non-clustered indexes has started, not that the agent is actually delivering the queued transactions.  In some cases, the Distribution Agent may even report that it "Delivered snapshot from the unc\{path} sub-folder in ##### milliseconds" before the non-clustered index creation is complete.  The only way to know for sure is to either query the Subscriber or wait for the Distribution Agent to report "### transaction(s) with ### command(s) were delivered" in Replication Monitor.
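
If you want to check the Subscriber directly, a quick look at the DMVs will show whether an index build is still running.  A simple sketch, assuming you have VIEW SERVER STATE on the Subscriber:

-- Run on the Subscriber: list any CREATE INDEX statements currently executing
SELECT r.session_id, r.start_time, r.command, t.text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.command LIKE 'CREATE INDEX%';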

There are just a few nuances to monitoring replication and I hope that this helps to shed some light on the process.


Wednesday, July 10, 2013

Maintaining SQL Replication Publications -or- How To Add An Article To A Publication AND Generate A Snapshot

For quite some time, I was stumped as to how to properly script adding an article to an existing publication.  It's a somewhat straightforward process using the sp_addarticle command, but for some reason when I would start the Snapshot Agent, a snapshot would never be generated for the table I added to the publication (and thus it would not be synced to the Subscriber).  

In my case the sp_addarticle command looks like this (the PublicationName and TableName were changed to protect the innocent):
EXEC sp_addarticle @publication = N'PublicationName', @article = N'TableName', @source_object = N'TableName', @type = N'logbased', @pre_creation_cmd = N'drop', @destination_table = N'TableName';

Simply running sp_addarticle with the correct options was not enough to allow a snapshot to be generated for that table.  When I started the Snapshot Agent, it would complete successfully with the message "[0%] A snapshot was not generated because no subscriptions needed initialization."  Perplexed, I decided to perform the same actions through the GUI (I knew it worked properly there) and then run SQL Server Profiler and filter by my local HostName.

Note: If you have enabled the @immediate_sync option (it is off by default), you will not be able to drop individual tables/articles from the publication.  However, you can turn it back off without recreating the publication by issuing the following command:
EXEC sp_changepublication @publication = N'PublicationName', @property = N'immediate_sync', @value = N'FALSE';

In SQL Server Profiler, I saw sp_addarticle executed, and I also observed several helper SPROCs that the GUI uses for validation, but then the diamond in the rough appeared: an additional SPROC executed against the published database, sp_refreshsubscriptions.  Executing sp_refreshsubscriptions refreshes the subscription metadata for any new articles added to the publication, which the Snapshot Agent subsequently uses to determine which articles need a new snapshot.  The syntax is very simple (unlike sp_addarticle):
EXEC sp_refreshsubscriptions @publication = N'PublicationName';

Just replace the PublicationName with the name of the publication to which you have just added article(s) (you can add multiple articles and then execute sp_refreshsubscriptions only once). Once you execute it in the context of the published database, the next time the Snapshot Agent runs, it will generate a snapshot for the article(s).

Note: You do not need to run the sp_refreshsubscriptions SPROC when creating a new Subscription/Publication, as creating the Subscription after the Publication has been created populates all the article Subscription metadata from the Publication.

While I'm on the topic, I also want to quickly touch on how to drop a table from a Publication.  There are two SPROCs to run to remove a table from the publication, sp_dropsubscription and sp_droparticle:
EXEC sp_dropsubscription @publication = N'PublicationName', @article = N'TableName', @subscriber = N'all', @destination_db = N'all';

EXEC sp_droparticle @publication = N'PublicationName', @article = N'TableName', @force_invalidate_snapshot = 1;

Just update the PublicationName and the TableName as appropriate.  

The final tip for today is that you can script starting the Snapshot Agent.  It is another simple SPROC that you run against the published database:
EXEC sp_startpublication_snapshot @publication = N'PublicationName';

We use sp_startpublication_snapshot extensively when setting up or modifying a Publication (it sure beats finding the appropriate SQL Agent job on the Distributor, and we don't have to bring up Replication Monitor).

Thursday, May 2, 2013

The Road To Hong Kong - or rather MCSE: SQL Server 2012 Data Platform

While I was at the PASS Summit 2012 (THE best SQL Server conference on the planet) in November, I decided I needed to start working on updating my Microsoft SQL Server certifications.  The task seemed simple enough, but the challenge was more than I had initially anticipated.

I've been a DBA for over 7 years, so while at the PASS Summit, I took the 70-462 exam: Administering Microsoft SQL Server 2012 Databases.  I passed without studying, as I've been keeping up with the new features of SQL 2012 and I manage multiple SQL 2008 R2 instances (administratively very similar to SQL Server 2012).  Filled with so much elation at passing the first exam, I decided to take the 70-461 exam: Querying Microsoft SQL Server 2012 the very next day.  Alas, I failed that exam (I got a 650), but seeing as I had a 50% off discount, I wasn't too concerned.  I chalked it up to a learning experience.


Fast forward a couple of months (mid-January 2013), and my company was working on renewing our Microsoft Gold Partnership.  This partnership relies on employing people with current Microsoft certifications.  I have my MCITP: Database Administrator on SQL Server 2008 (which also covers SQL Server 2008 R2), so we should be fine, right?  Nope.  Microsoft decided that all of our SQL Server 2008 (and Windows Server 2008) certifications no longer count toward Gold Partnership as they're being deprecated this year.  That seemed somewhat odd since mainstream support for SQL 2008 and SQL 2008 R2 runs until July 2014, not to mention the fact that you can still earn the new MCSA and MCSE certifications on SQL Server 2008.


What does this mean to me?  I was given a business directive to get my MCSE certification on SQL 2012 by August of this year.  GULP!  Really?  Wow!  {deep breath}  OK.  Let's do this thing!

I decided to take a three-pronged approach to preparing for the MCSE exams:
  • Microsoft Training classes (there is one class that "corresponds" with each individual exam)
  • Examine the Skills Measured for each exam and review the appropriate Books Online/MSDN Library pages (I spent a lot of time doing this)
  • Play (I have SQL 2012 installed on a couple systems at home and I practiced with any topics I was not intimately familiar with)
I was most worried about the Data Warehouse exam as I have the least amount of experience with that technology.  I took the class and then reviewed all of the Skills Measured for that exam.  Most of them were pretty straightforward, but I needed to look up a lot of the terms to make sure I understood them and in what situations I would use them.

With the return of the MCSA and MCSE, Microsoft has tried to increase the value of the certifications and make sure you know what you are talking about.  In the past, the Microsoft Training classes that relate to the exams thoroughly covered the exam objectives: if you took the class, paid attention, took good notes, and then went and took the exam, you would pass.  This is not the case with the current generation of exams and classes.  This is not necessarily a bad thing; in fact, I think it's a good thing.  In order to pass the exam, you will need to review the Skills Measured for each exam and make sure that you are familiar with the concepts, syntax, and sometimes even command-line switches.

What's the best way I found to learn and really drive these points home?  Play!  Set up an instance of SQL Server on a system somewhere and just start playing with the features and concepts that you need to learn for the exam.  If you can find someone else who is also interested in taking the exams (or at least in learning the topics covered on the exams), that can help you tremendously, as you need to understand a topic before you can explain it to someone else.

Overall, I'm glad that this business requirement came up as it motivated me to buckle down and get 'er done!  I highly recommend that anyone who is passionate about SQL Server go for this certification.  You'll learn a few things in the process and hold a more valuable certification.


Requirements for the SQL Server 2012 MCSE certification: