Copying a Large Database Through Scripting
I was reminding folks that, for one reason or another, we sometimes have to fall back on the very basic method of copying a database by scripting SQL statements from an existing database.
There are many great tools out there that handle this for you. I have used DB Compare with great success for many years, and there are a lot of competitors.
SQL Server has the ability built in to generate scripts, assuming you wish to script the entire database. In response to this statement, David writes:
I’m neither a newbie nor a DBA. I’m an I.T. all-rounder with considerable experience of SQL, so this could be me being stupid. But even if I am, many others could easily suffer the same fate, and possibly take even longer than me to work out what the problem is, because when it happens it implies that the script itself is faulty when it isn’t. If SQLCMD from a Command Prompt is not the correct way of importing the file, and an alternative method circumvents the 4GB issue outlined below, it would be useful to know, and perhaps you could publish it with a warning NOT to do what I did.
I used the generate scripts method recently to restore a 2008 R2 database to an SQL 2008 server. You can’t load "large files", or even medium-sized files, into SSMS, and using SQLCMD from a command prompt is OK as long as they are not VERY, VERY LARGE.
Specifically, over 4GB.
At the 4GB limit SQLCMD does not read the next character, so the database engine can return almost ANY error message, depending on which character in a batch it happened to have reached. It took me ages to work out what the problem was because each time it happened (different script file) I was getting a different error message, and even on fast hardware it took ages to process the first 4GB of script before it fell over.
Editing a 4GB file to try to pin down the problem isn’t easy either!
Advice – If the WHOLE database script is over 4GB but individual tables’ scripts would not be, use the single file "per object" option, and don’t choose Unicode if you don’t need it.
If a single object’s script is > 4GB, break it up! Why don’t Microsoft AUTOMATICALLY start a new file when it’s about to reach 4GB?
Also, you could ask Microsoft why they haven’t put in an option to script for export to a SQL 2008 or higher database server, as that could use a "row constructor (also called a table value constructor) to specify multiple rows in a single INSERT statement". If, as in our case, a table has a large number of columns, every single INSERT line repeats the WHOLE column list, making the file, in our case, about TWENTY TIMES bigger than it needed to be.
I thought about writing a program / script / T-SQL to convert the unnecessarily large files, but as my above suggestions solved our problem, I could not be bothered.
Perhaps someone would like to produce a solution and post it.
All it needs to do is to convert the generated script from
INSERT INTO TABLE (Col1, Col2, Col3…) VALUES (……………….)
INSERT INTO TABLE (Col1, Col2, Col3…) VALUES (……………….)
INSERT INTO TABLE (Col1, Col2, Col3…) VALUES (……………….)
INSERT INTO TABLE (Col1, Col2, Col3…) VALUES (……………….)
INSERT INTO TABLE (Col1, Col2, Col3…) VALUES (……………….)
… ad infinitum
into a single row constructor for the batch
INSERT INTO TABLE (Col1, Col2, Col3…)
VALUES (……………….),
       (……………….),
       (……………….),
       (……………….),
       (……………….)
circumventing the 4GB limit while it’s at it!
Do Microsoft provide another solution to the 4GB issue which I’ve missed?
I haven’t looked in Denial (or is that Denali?). Perhaps they’ve improved the scripting options. All they need to do is to AUTOMATICALLY start a new file when it reaches 4GB.
My initial thoughts are that SQLCMD is a great tool. I would consider doing something in PowerShell instead; in fact, SQLCMD is more or less integrated into the SQL version of PowerShell.
I’m not particularly strong with PowerShell, personally finding things easier to do in C#, which is clearly overkill.
The thing I like about David’s suggestion is the conversion of multiple INSERT statements to more set-like constructs using the VALUES technique. I don’t know how this differs in performance, but it would make for a much leaner, easier-to-handle script file.
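To make that concrete, here is a minimal sketch of the conversion using a hypothetical Customers table (the table and column names are mine, purely for illustration):

-- Hypothetical table, for illustration only: the generated script's repeated single-row INSERTs...
INSERT INTO Customers (CustomerID, FirstName, LastName) VALUES (1, 'Ann', 'Smith');
INSERT INTO Customers (CustomerID, FirstName, LastName) VALUES (2, 'Bob', 'Jones');
INSERT INTO Customers (CustomerID, FirstName, LastName) VALUES (3, 'Cal', 'Reed');

-- ...collapse into a single table value constructor:
INSERT INTO Customers (CustomerID, FirstName, LastName)
VALUES (1, 'Ann', 'Smith'),
       (2, 'Bob', 'Jones'),
       (3, 'Cal', 'Reed');

One caveat for anyone who takes up David’s challenge: SQL Server limits a table value constructor to 1,000 rows per INSERT, so the converter would need to start a new statement every thousand rows.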
Do you have a tip you’d like to share with David and the rest of our readers? Drop me an email at btaylor@sswug.org.
Cheers,
Ben
SSWUG TV
With Stephen Wynkoop and Sam Brace
In this issue of SSWUG-TV we have interviews with SolidQ Mentor Randy Dyess and EnterpriseDB’s CEO Ed Boyajian.
Watch the Show
Featured Article(s)
10 IT Career Boosting New Year Resolutions (Part 2)
Career-driven professionals in the IT industry should want to surround themselves with energetic and like-minded people.
Featured White Paper(s)
Structuring the Unstructured: How to Dimensionalize Semi-Structured Business Data
Written by Interactive Edge
The Business Intelligence industry … (read more)