Editorials

DBA Skills – Feedback

Today I'm sharing all of the reader feedback received, because the responses were so detailed. Tomorrow I'll share experiences with change scripts and automated deployment in an Agile environment.

Reader Feedback – Software Release Scripts

Anonymous
Another important topic. I'll share my "old school" approach to managing this between software releases within my development teams (generally fewer than 10 developers). I know there are tools that help formalize this process, and even compare two DB versions, and I would be keen to hear of people's experience with them, particularly from those who manage larger development teams, where my approach might prove a little difficult to manage.

During development iterations of an existing piece of software, or when a legacy system is being migrated to a new development platform, I tend to use the following approach:

1) INDIVIDUAL DB CHANGES
Every developer who modifies any structural object within the database (tables, indexes, etc.) assumes responsibility for generating the change script for his/her modifications. In SQL Server we tend to group application functionality using the built-in Database Diagram feature, which does a great job of generating change scripts for multiple related tables at once from within Management Studio (including all foreign key relationships). When a change necessitates a data transformation, the developer is also responsible for scripting this manually in his SQL script.
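A change script of the kind described might look like the following sketch. The table and column names here are hypothetical; the point is the pattern: the structural change comes first, and the developer who made it also owns the data transformation.

```sql
-- Hypothetical change script: add a Status column to dbo.Orders
-- and backfill it from existing data. Guarded so a re-run is harmless.
IF NOT EXISTS (SELECT 1 FROM sys.columns
               WHERE object_id = OBJECT_ID(N'dbo.Orders')
                 AND name = N'Status')
BEGIN
    ALTER TABLE dbo.Orders ADD Status TINYINT NULL;
END
GO

-- Data transformation, owned by the same developer who changed the schema:
UPDATE dbo.Orders SET Status = 1 WHERE ShippedDate IS NOT NULL;
UPDATE dbo.Orders SET Status = 0 WHERE ShippedDate IS NULL;
GO

-- Tighten the column only after the data is in place.
ALTER TABLE dbo.Orders ALTER COLUMN Status TINYINT NOT NULL;
GO
```

The `GO` batch separator between the ALTER and the UPDATEs matters: a new column cannot be referenced in the same batch that adds it.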

2) TRACKING CHANGES
We maintain a global DB structural change script within our source code management system containing all changes to be applied to the production environment. This has two benefits: 1) we can maintain a version history of the change script, and 2) using checkout features, each developer can append their change scripts in chronological order, ensuring that multiple changes to the same structural object are processed in the order they were made, avoiding conflicts with partial changes.
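One way to keep such a global script safe to re-run against a partially migrated database is to guard each appended entry with an existence check. A sketch, with hypothetical object names and dates:

```sql
-- Entry appended 2013-02-14 (developer A): widen the phone column.
-- max_length is in bytes, so NVARCHAR(10) reports 20.
IF EXISTS (SELECT 1 FROM sys.columns
           WHERE object_id = OBJECT_ID(N'dbo.Contact')
             AND name = N'Phone'
             AND max_length = 20)
BEGIN
    ALTER TABLE dbo.Contact ALTER COLUMN Phone NVARCHAR(25) NULL;
END
GO

-- Entry appended 2013-02-20 (developer B): index supporting a new lookup.
IF NOT EXISTS (SELECT 1 FROM sys.indexes
               WHERE name = N'IX_Contact_Phone'
                 AND object_id = OBJECT_ID(N'dbo.Contact'))
BEGIN
    CREATE INDEX IX_Contact_Phone ON dbo.Contact (Phone);
END
GO
```

Because entries are appended chronologically, replaying the whole script from the top reproduces the same end state on any environment that has had none, some, or all of the changes applied.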

3) FINALISING THE MIGRATION SCRIPT
When we are ready to run a mock migration, we use SQL Server's built-in script generation tool to generate a script that creates all non-structural objects (stored procedures, functions, views, etc.), being sure to flag the options that check for existing objects and drop those that already exist. This might sound like overkill when only a minor DB update is needed, but experience has shown me that functional changes are forgotten more often than structural ones, and since these don't touch the data directly, it doesn't add significant running time, even in highly complex databases. We then append the global structural change script we have been maintaining to this generated script, so that it includes all changes to be applied.
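The drop-and-recreate pattern the generated script applies to each non-structural object looks roughly like this (an illustrative sketch with a hypothetical procedure; SSMS emits an equivalent guard when the existence-check and drop options are flagged):

```sql
-- Drop the procedure if it already exists, then recreate it,
-- so the script converges on the current definition regardless
-- of what the target database contained before.
IF EXISTS (SELECT 1 FROM sys.objects
           WHERE object_id = OBJECT_ID(N'dbo.usp_GetOrders')
             AND type = N'P')
    DROP PROCEDURE dbo.usp_GetOrders;
GO

CREATE PROCEDURE dbo.usp_GetOrders
    @CustomerId INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderId, OrderDate, Status
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
END
GO
```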

4) MIGRATION PROCESS
During the development process we perform a mock migration of the production environment database every time we prepare a test environment (unit and acceptance testing), thus ensuring that all tests are performed on a migrated database for quality control. Inevitably we catch some missing change scripts at this stage, which we promptly add to the global change script described under point 2.
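A mock migration of this kind starts from a fresh copy of production. A minimal sketch of that setup (database, file, and path names are hypothetical) might be:

```sql
-- Restore last night's production backup as a disposable test database,
-- then run the combined migration script against it.
RESTORE DATABASE SalesDB_Test
FROM DISK = N'E:\Backups\SalesDB_Full.bak'
WITH MOVE N'SalesDB'     TO N'E:\Data\SalesDB_Test.mdf',
     MOVE N'SalesDB_log' TO N'E:\Logs\SalesDB_Test.ldf',
     REPLACE;
GO
```

Running the finalized script against this restored copy is what surfaces the missing change scripts before they can surface in production.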

Because we have performed many migrations leading up to the actual one, it becomes a familiar process that is more of a formality from the DB perspective. Most of the issues we run into concern subtle differences between the production servers' configuration and the test environment; rarely is there an issue with the DB transformations themselves.

Reader Feedback – Automated Deployment

Matthew
I’m curious to hear what others have to say on this subject. I understand the importance of automation when it comes to DBA work, but change scripts are in a class by themselves… Scripting a new index, or a simple UPDATE statement, is one thing, but when dealing with release-level deployments that alter or create many objects and move data around, one needs to be very careful!

Deployment and automated deployment are two different animals.

We used to create manual change scripts for each "release", keeping track of what changed and scripting the changes in a manner that honored the precedence of entities/objects, ensuring that parent objects and/or data were in place before their dependents. It was a challenge every time, and we ALWAYS tested the execution and results of the deployment on a copy of production prior to actually going "live". Automated deployments of this type were virtually impossible unless we had the exact series of iterative scripts queued up for any given environment when it came time to deploy. The automation would have been prohibitively difficult to manage, not to mention risky.

Then Gert Drapers at MS began discussing declarative DB deployments using Visual Studio (now called SSDT), which we felt was a revolutionary means of deploying databases. No longer did we have to keep the original version in mind and upgrade through X number of deployments to get an environment up to date; now we could point to the target DB, generate a differential script, and off we go. We tested this process exhaustively, and it was a tad buggy, but worth moving forward with. By the time we began full-fledged development using Visual Studio, we still ALWAYS tested against a copy of production before going live, but for automated deployments the automator (in our case, TFS) could generate the scripts against the targets dynamically when needed, though we did not implement this in production.
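The "point to the target DB and generate a differential script" step maps to SSDT's SqlPackage command-line tool. A sketch of generating (but not running) the upgrade script, with hypothetical server, database, and file names:

```shell
REM Compare the compiled SSDT model (.dacpac) against the target database
REM and write out the differential upgrade script without executing it.
SqlPackage.exe /Action:Script ^
    /SourceFile:SalesDB.dacpac ^
    /TargetServerName:PRODSQL01 ^
    /TargetDatabaseName:SalesDB ^
    /OutputPath:SalesDB_upgrade.sql
```

Using /Action:Script rather than /Action:Publish keeps a human in the loop, which fits the reviewed-before-production workflow described here.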

We continue to use Visual Studio as a tool to develop our DB and build deployment scripts. Oftentimes we have release-specific post-deployment scripts that populate certain data based on "recent" changes, but this is still much easier than manual scripting in every deployment scenario we encounter.

I have toyed around with automated deployments using TFS in our dev environment, and it has been successful. Call me a control freak, or someone who is resistant to letting go of their old ways, but I still prefer to run my production script manually, even though the scripts have been tested. I suppose I need to get over that, and once I do, production deployments could be scheduled and executed automatically. I think I just need someone to talk me into it.

Thanks for an excellent topic of discussion!

Get into the discussion by leaving your comments below, or send an Email to btaylor@sswug.org.

Cheers,

Ben


Featured Article(s)
Troubleshooting SQL Server 2012 Roles
In this article, Alexander Chigrik explains some problems that you can have when you work with SQL Server 2012 roles.

Featured White Paper(s)
Encryption Key Management Simplified
Managing encryption keys is a fundamental step to securing encrypted data, meeting compliance requirements, and preventing da… (read more)