Editorials

Miscellaneous Reader Feedback


Free Expo on Career Development for IT Professionals
Expand the scope of your career…attend our free virtual expo tomorrow and learn ways to enhance your professional life in 2012.

Through our in-depth sessions with some of the leading functional specialists in the IT field (Bill Pearson, Chris Shaw, TJay Belt and Tom Roush), you will see many examples and case studies on ways to improve communication between administrators and managers, develop a consulting business, learn from on-the-job mistakes and more.

By the end of our free event, you should have the tools and understanding needed to be a more productive and marketable IT professional who is ready for upward mobility.

To register, click here.

Miscellaneous Reader Feedback

System Testing
Michael writes in regarding System Testing. He is responding to the need to establish baseline data before beginning a test.

Wow. Just saw this article. Currently in my circles, this is certainly the 64 million dollar question!

As data permutations are the source of the majority of bugs (much more so than logic), for most enterprise system architects, dramatically increasing data permutation test coverage is something of a holy grail.

I’ve posed this question to every community I’m involved in, and there is little in the way of pragmatic solutions, other than the incredibly suboptimal create-load-teardown mantra.

In my mind, this approach is simply unacceptable circa 2012!

In most discussions, everyone agrees some type of XML-based ‘data injection’ at the granularity of a single unit test would be preferable, but no one had yet taken the time to actually build such a thing, until you revealed the little-known nDBUnit project.


If this actually works as advertised, it’s a gold mine, at least for the .NET dev.

But the question still remains, “Who…is gonna create all this XML!?!”…
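
One partial answer, at least for bootstrapping the XML rather than hand-writing it: dump a slice of an existing database to DataSet-style XML with plain ADO.NET and then trim it per test. The sketch below is my own and not part of nDBUnit; the connection string, table name and file paths are only placeholders.

## - Rough sketch: generate a DataSet-style XML/XSD pair from live data for later injection.
## - (Connection string, table and file paths are placeholders.)
$conn = New-Object System.Data.SqlClient.SqlConnection "Server=.;Database=AdventureWorks;Integrated Security=True";
$adapter = New-Object System.Data.SqlClient.SqlDataAdapter "SELECT TOP 50 * FROM Person.Person", $conn;
$ds = New-Object System.Data.DataSet "PersonTestData";
[void]$adapter.Fill($ds, "Person");
$ds.WriteXmlSchema("C:\Temp\PersonTestData.xsd");   ## - the schema file
$ds.WriteXml("C:\Temp\PersonTestData.xml");         ## - the data to trim and hand-edit per test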

Data Transfer Techniques Using PowerShell
Max writes in with a PowerShell script for database scripting…

Yes! Coming back to the subject of PowerShell, it’s a great tool that lets you mix technologies already available in your system: SMO, T-SQL, .NET.

It’s documented in the "SQL-DMO Mapping to SMO" topic (http://msdn.microsoft.com/en-us/library/ms162159.aspx) that BulkCopy has been renamed in SMO to "Transfer", and there’s a PowerShell example provided in the following MSDN document under "Transferring Schema and Data from One Database to Another in PowerShell": http://msdn.microsoft.com/en-us/library/ms162563.aspx

I have SQL Server 2012, which includes the new "SQLPS" module, and here’s my slightly (OK! very) modified version of the script from the article:

## - To load SQL Server SMO and SQL PowerShell commands:
Import-Module SQLPS -DisableNameChecking;

## - initialize variables:
$MyServerInstance = "";
$MyDatabase = "AdventureWorks";
$MyCopyDatabase = "Dev_AdventureWorks";

## - Connect to the local, default instance of SQL Server.
$srv = New-Object Microsoft.SqlServer.Management.SMO.Server $MyServerInstance;

## - Reference the source AdventureWorks database.
$dbFrom = $srv.Databases[$MyDatabase];

## - Create a database to hold the developer copy of AdventureWorks
$dbCopyTo = New-Object Microsoft.SqlServer.Management.SMO.Database $srv, $MyCopyDatabase;
$dbCopyTo.Create();

## - Script Create Database to an output file for future use: (Optional)
$dbCopyTo.Script() | Out-File c:\Temp\MyDBtransferScript.sql
## - Trick to save a valid reusable T-SQL script merging the Create db with the Transfer of all other objects:
$AddBatchTerminator = @"
go
USE $MyCopyDatabase
go
"@;
$AddBatchTerminator | Out-File c:\Temp\MyDBtransferScript.sql -Append;

## - Define a Transfer object and set the required options and properties.
$transfer = New-Object Microsoft.SqlServer.Management.SMO.Transfer $dbFrom;
try {
## - Set this object's properties
$transfer.CopyAllTables = $true;
$transfer.DestinationDatabase = $MyCopyDatabase;
$transfer.DestinationServer = $srv.Name;
$transfer.DestinationLoginSecure = $true;
$transfer.CopySchema = $true;
$transfer.Options.WithDependencies = $true;
$transfer.Options.ContinueScriptingOnError = $true;

## - options for creating the scripted transfer: (optional)
$transfer.Options.ScriptBatchTerminator = $true;
$transfer.Options.FileName = 'c:\Temp\MyDBtransferScript.sql';
$transfer.Options.AppendToFile = $true;
$Transfer.ScriptTransfer() | Out-Null;

"Execute Database Transfer";
## - data transfer with TransferData method.
$transfer.TransferData();
}
Catch [System.Exception] {
$ex = $_.Exception
$Detail = $_ | Select CategoryInfo,ErrorDetails,Exception,FullyQualifiedErrorId, `
InvocationInfo,PipelineIterationInfo,TargetObject,PSMessageDetails `
| FL;
Write-Host $ex.Message
$Detail
}
##----------------------------------------------------------
## - This section will script out the database transfer: (BatchTerminator Included)

## - Script the transfer to an output (*.sql) file using the "Out-File" command: (Optional)
$transfer.ScriptTransfer() | Out-File c:\Temp\MyDBtransferScript.sql -Append;

Sometimes you will expect scripts like this to just work. But I got an error, and the only way to identify it was to add a Try-Catch. It doesn’t end there, though: you need to make sure you get the detail information. In most cases just using $_.Exception will work, but not in this case.

So I had to include all the available values to identify that my database had an invalid user belonging to another domain. After I cleared that error, the script worked like a charm.
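
If you are hunting for the same kind of problem, here is a quick way to eyeball the database users and the logins they map to, using the same SMO objects as above (just a sketch; an orphaned or invalid user typically shows up with an empty Login here):

## - Sketch: list users in the source database and the server logins they map to.
## - A user whose Login comes back empty is a likely candidate for the invalid-user error.
$dbFrom.Users | Select-Object Name, Login, LoginType;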

At the end of the transfer, I script everything out to a .sql output file, which can be used again later in the following way:

## - To load SQL Server SMO and SQL PowerShell commands: (assuming local instance, using Windows Authentication)
Import-Module SQLPS -DisableNameChecking;
$MyServerInstance = "MySQLInstanceName";

## - execute saved T-SQL file
Invoke-Sqlcmd -ServerInstance $MyServerInstance -InputFile c:\Temp\MyDBtransferScript.sql

In the end, this is just a basic bulk copy of a database, and there are more options that can be used to do more. At least it’s a start.
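
For instance, a few of the other switches on the same Transfer object look like this (an untested sketch; the property names come from the SMO Transfer and ScriptingOptions classes, and they would be set before calling TransferData):

## - Sketch only: copy more than just the tables and their data.
$transfer.CopyAllViews = $true;
$transfer.CopyAllStoredProcedures = $true;
$transfer.CopyAllUserDefinedFunctions = $true;
$transfer.Options.Indexes = $true;    ## - script out indexes as well
$transfer.Options.DriAll = $true;     ## - keys, defaults and other constraints
$transfer.Options.Triggers = $true;   ## - and triggers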

If you combine this scripting from Max with a little file parsing, you could produce a simple export/import capability that extends beyond the 4 gig limit presented in earlier feedback. Tomorrow we move on to new things.
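
To make the file-parsing idea concrete, here is a rough sketch of one way it could go (my own guess at an approach; the chunk size, file names and paths are arbitrary): split the saved transfer script at its GO terminators into numbered part files, then import the pieces in order.

## - Sketch: break the saved script into part files at GO boundaries, roughly every 10,000 lines.
$part = 0; $lines = @();
Get-Content 'c:\Temp\MyDBtransferScript.sql' | ForEach-Object {
    $lines += $_;
    if ($_ -match '^\s*go\s*$' -and $lines.Count -ge 10000) {
        $part++;
        $lines | Out-File ("c:\Temp\MyDBtransferScript_part{0}.sql" -f $part);
        $lines = @();
    }
};
if ($lines.Count -gt 0) {
    $part++;
    $lines | Out-File ("c:\Temp\MyDBtransferScript_part{0}.sql" -f $part);
}

## - then run the pieces in order on the target instance:
1..$part | ForEach-Object {
    Invoke-Sqlcmd -ServerInstance $MyServerInstance -InputFile ("c:\Temp\MyDBtransferScript_part{0}.sql" -f $_);
}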

A Logging Puzzle

Arvinder writes:

We are having issues trying to write log messages from our database stored procedures, and we don’t know the best way to do logging from a stored procedure.

We have a banking application with extensive stored procedures, and tracking an error down is a big problem at the moment because you don’t know at which point the failure occurred.

I know you can write to a log table within the database, but when an error causes a rollback, all the messages get rolled back too, except for the last one, which you can write after the rollback is done. We have tried using Service Broker, but that too rolls back in case of an error, so it is not much different from writing to a table.

Is there a way to write to a text log file on the SQL Server drive which would keep all the messages even in case of a failure?

Would calling a CLR stored procedure that just writes the log message to a given file name on the file system be efficient enough?
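
One idea to throw into the mix, strictly as a sketch: table variables are not undone by a ROLLBACK, so messages collected in one inside the procedure can be copied into a permanent log table once the rollback completes. The log table, columns, instance and database names below are made-up placeholders, and it’s wrapped in PowerShell in the spirit of Max’s scripts above.

## - Sketch only: table variables survive ROLLBACK, so the collected messages can be
## - persisted afterwards. dbo.ProcLog, the columns, and the server/database are placeholders.
$LoggingIdea = @"
DECLARE @log TABLE (LoggedAt datetime DEFAULT GETDATE(), Msg nvarchar(400));
BEGIN TRAN;
    INSERT INTO @log (Msg) VALUES (N'step 1 done');
    INSERT INTO @log (Msg) VALUES (N'step 2 failed, rolling back');
ROLLBACK TRAN;
-- the table variable still holds both rows; persist them now
INSERT INTO dbo.ProcLog (LoggedAt, Msg) SELECT LoggedAt, Msg FROM @log;
"@;
Invoke-Sqlcmd -ServerInstance "MyBankServer" -Database "MyBankDatabase" -Query $LoggingIdea;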

You can always reach me at my email address, btaylor@sswug.org, with your comments. Perhaps you have some ideas for Arvinder’s puzzle.

Cheers,

Ben


Featured White Paper(s)
Structuring the Unstructured: How to Dimensionalize Semi-Structured Business Data
Written by Interactive Edge

The Business Intelligence industry … (read more)

Featured Script
dba3_sys_IndicesAudit
Audit indices ss2k5 and beyond only… (read more)