Refreshing with production databases

One of the things that organizations tend to want or need during and after an implementation is database refreshes. Right now we have three database servers: production, staging, and test. Full backups are taken of every user database on these servers every night.

I set up a job that automatically moves them nightly after the backups have been taken, and then wrote a generic script that restores them when needed. That way, if I’m out sick and support or users need something refreshed, it can be done with as few clicks as possible, maybe four once you’ve remote desktopped in?
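As a point of reference, here is a minimal sketch of what that generic restore script could look like, using Restore-SqlDatabase from the SqlServer module. The instance name, database name, and backup path below are placeholders for illustration, not the real values from our environment:

# Minimal restore sketch -- instance, database, and path are placeholders.
Import-Module SqlServer

# Restore last night's production backup over the existing test copy.
# If the data/log file paths differ between servers, add -RelocateFile parameters.
Restore-SqlDatabase -ServerInstance 'TESTDBSERVER' `
                    -Database 'BBInfinity' `
                    -BackupFile 'F:\FromProd\BBInfinity.bak' `
                    -ReplaceDatabase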

I added a new folder called “FromProd” to the location where our nightly backups are stored on both the test and staging database servers. I then set them up as shares so they could be accessed more easily.
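If you prefer to create the folder and share from a script, something like this works on Windows Server 2012 and later (the drive letter matches the scripts later in this post, and the domain name is a placeholder):

# Create the folder and share it out (drive letter and domain name are examples).
New-Item -Path 'F:\FromProd' -ItemType Directory -Force
New-SmbShare -Name 'FromProd' -Path 'F:\FromProd' -FullAccess 'DOMAIN\serviceaccount_crm'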

You will want a domain-level service account that you can set up to run the database jobs; let’s call it serviceaccount_crm for now. This service account needs full control of the folders the database backups will be moved to (the FromProd folders), along with the MSSQLSERVER service account. This can be a little tricky, as that account may not always show up in the permissions dialog. Search your database server for NT SERVICE\MSSQLSERVER, select it, and it will resolve to just MSSQLSERVER. Without this you will not be able to restore your database backups from this location; SSMS won’t even see the backups in this folder.
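If you’d rather grant those permissions from a script than through the Security tab, icacls works well. This is just an example, with the folder path taken from the scripts below and the domain name as a placeholder:

# Grant full control on the FromProd folder to the job's service account
# and to the SQL Server service account (domain name is a placeholder).
icacls 'F:\FromProd' /grant 'DOMAIN\serviceaccount_crm:(OI)(CI)F'
icacls 'F:\FromProd' /grant 'NT SERVICE\MSSQLSERVER:(OI)(CI)F'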

After this has been set up, you will want to create a PowerShell proxy on your production and staging database servers (under SQL Server Agent you’ll first need a credential for serviceaccount_crm, which the proxy runs under). Enter your proxy name and select your serviceaccount_crm credential name. See the screenshot below.

[Screenshot: creating the PowerShell proxy and selecting the serviceaccount_crm credential]
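If you prefer to script it instead of clicking through the GUI, the credential and proxy can also be created with T-SQL run through Invoke-Sqlcmd. This is a rough sketch only; the instance name, proxy name, and password are placeholders:

# Rough sketch -- instance name, proxy name, and password are placeholders.
Import-Module SqlServer
Invoke-Sqlcmd -ServerInstance 'STAGINGDBSERVER' -Query @"
-- Credential that maps to the domain service account
CREATE CREDENTIAL [serviceaccount_crm]
    WITH IDENTITY = 'DOMAIN\serviceaccount_crm', SECRET = 'service-account-password';

-- Proxy that SQL Server Agent job steps can run under
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'PowerShellProxy',
    @credential_name = N'serviceaccount_crm',
    @enabled = 1;

-- Allow the proxy to run PowerShell job steps
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'PowerShellProxy',
    @subsystem_name = N'PowerShell';
"@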
You can then create a new job and enter the code below as its one and only step. Make sure to set the step type to “PowerShell” and, in the “Run as” list, select your new PowerShell proxy name.

Below is the PowerShell code that lives on staging and moves the production backup from staging to test. It does the following:

  • Removes previous night’s production backup
  • Gets the latest file from the staging production backup
  • Copies the latest file from staging to test’s “FromProd” folder and renames it to BBInfinity.bak

Simple enough, right?

# Clear out the previous night's production backup on the test server
Remove-Item 'Microsoft.PowerShell.Core\FileSystem::\\[TEST DB SERVER IP]\FromProd\*.*'
# Source folder on staging and destination (with new name) on the test server's share
$mypath = 'F:\FromProd\'
$mydestination = 'Microsoft.PowerShell.Core\FileSystem::\\[TEST DB SERVER IP]\FromProd\BBInfinity.bak'
# Grab the newest file in the source folder and copy it over
$myfile = gci $mypath | sort LastWriteTime | select -last 1 -ExpandProperty FullName
Copy-Item -path $myfile -destination $mydestination

Below is the PowerShell code that lives on production and moves the backup to staging. It’s very similar to the code above, but doesn’t rename anything.

# Clear out the previous night's backup on the staging server
Remove-Item 'Microsoft.PowerShell.Core\FileSystem::\\[STAGING DB SERVER IP]\FromProd\*.*'
# Source folder on production and destination folder on the staging server's share
$mypath = 'F:\FromProd\'
$mydestination = 'Microsoft.PowerShell.Core\FileSystem::\\[STAGING DB SERVER IP]\FromProd\'
# Grab the newest file in the source folder and copy it over
$myfile = gci $mypath | sort LastWriteTime | select -last 1 -ExpandProperty FullName
Copy-Item -path $myfile -destination $mydestination

These are all scheduled to run in sequence. After the nightly production backup has run, the production-to-staging move kicks off, and an hour after that the staging-to-test move runs.
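If you want to script the schedules as well, sp_add_jobschedule can set this up. The sketch below is only an example; the job names, instance names, and start times are placeholders you would swap for your own:

# Rough sketch -- job names, instance names, and start times are placeholders.
Import-Module SqlServer

# Production -> staging move, shortly after the nightly backup finishes (2 AM here).
Invoke-Sqlcmd -ServerInstance 'PRODDBSERVER' -Query @"
EXEC msdb.dbo.sp_add_jobschedule
    @job_name = N'Move backup to staging',
    @name = N'Nightly 2 AM',
    @freq_type = 4,            -- daily
    @freq_interval = 1,
    @active_start_time = 020000;
"@

# Staging -> test move, an hour later.
Invoke-Sqlcmd -ServerInstance 'STAGINGDBSERVER' -Query @"
EXEC msdb.dbo.sp_add_jobschedule
    @job_name = N'Move backup to test',
    @name = N'Nightly 3 AM',
    @freq_type = 4,            -- daily
    @freq_interval = 1,
    @active_start_time = 030000;
"@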

If you have any other questions or run into issues with any of this, let me know. I’ll be more than happy to explain it all.

Happy Coding!
🙂
