Editorials

Small Companies Using SQL Server Standard

Virtual Education in SSIS and Replication

A while back, Steve provided some great insight on how to grow your skills as a SQL professional: take on one new topic at a time, and build on what you already know. The reality we face every day is that SQL Server encompasses a number of extensive technologies, and staying on top of all the different tools is a big challenge. So here is an opportunity to eat the BIG elephant one small bite at a time.

SQL Server MVP Eric Johnson has put together a virtual training series on SQL Server Integration Services (the data import, export, and transformation tool) and SQL Server Replication. Both are BIG topics. Eric provides access to these courses for the entire month of December, and you also have the option to purchase a DVD of the course.

Find out more about the SQL Server Integration Services Virtual Training Course or the SQL Server Replication Virtual Training Course. Don’t wait for someone else to help you extend your knowledge. Dig in now and be ready for those opportunities when they come.

SSWUG Holiday Gift Guide
Companies and organizations! Include your products and tools in our 2010 SSWUG.org Holiday Gift Guide!

Our membership is made up of more than 525,000 database and information technology (IT) professionals with diverse interests in the high-tech industry, so the online gift guide will feature a variety of items, ranging from the hottest electronics to the coolest tech toys.

Our gift guide is a great way to showcase the best products of the season. With our reputation and our reach among an affluent, educated, and tech-savvy membership, your products and tools will be seen and promoted to the highest standards.

Please submit all publicity requests before Tuesday, Nov. 30, 2010, as no late submissions will be considered. Our gift guide will be published on SSWUG.org on Friday, Dec. 10, 2010, which should give shoppers sufficient time to review and purchase your contributions.

Click here for details and to submit your products for inclusion in our Holiday Gift Guide.

Featured Script
dba3_0010_HierarchicalDB_Schema_Article
Create an example HierarchicalDB database and schema objects, and populate them with sample data… (read more)

Small Companies Using SQL Server Standard
There are many ways to work around the limitations of SQL Server Standard compared to SQL Server Enterprise or Datacenter. Today, Andre’ provides some insight into the ways his team has extended SQL Server’s capabilities. Some of those extensions have taken more work, such as manually maintained federated views and stored procedures.

One solution was to move certain kinds of data out of SQL Server altogether (see his comments for some of those storage mechanisms).

Because of their processing lifecycle, they were able to forgo maintaining or updating indexes while loading data. This works well for a data warehouse, or for slowly changing data.

I’d be interested to hear from those of you who are using SQL Server Standard for OLTP purposes, and how you are adapting to its limits. Do your business requirements allow you to use SQL Server Standard out of the box, or have you created your own workaround techniques to fill the gaps?

For example, I have one database that holds 500 GB of XML blob data. Placing the data in files wouldn’t save any disk space, and the design already has it in the database, so we get transactional capability without changing the system to use FILESTREAM. But the space used was making defragmentation of the table very painful, and since SQL Server Standard doesn’t support compression, I wrote my own CLR user-defined function to compress and decompress the data.

There were no queries filtering on the data in the first place; it was for reference purposes only, and a retrieval was always fewer than 10 rows. I saved 90% of the required disk space overnight. Again, business requirements dictate the possible solutions: if we were searching on the XML data (ouch), or needed to return lots of rows in a single query, this approach would never work. In this case, though, the XML data was retrieved only infrequently.
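To give a feel for why this pays off so well, here is a minimal Python sketch (not the actual SQL CLR function, whose code isn’t shown here) that compresses a repetitive XML blob with zlib and round-trips it; the blob and its structure are invented for illustration:

```python
import zlib

# Build a repetitive XML blob -- the kind of reference data that
# deflate-style compression handles extremely well.
xml_blob = (
    "<orders>"
    + "".join(
        f"<order id='{i}'><status>shipped</status>"
        f"<note>standard note</note></order>"
        for i in range(1000)
    )
    + "</orders>"
).encode("utf-8")

compressed = zlib.compress(xml_blob, level=9)

print(f"original:   {len(xml_blob):,} bytes")
print(f"compressed: {len(compressed):,} bytes")
print(f"saved:      {1 - len(compressed) / len(xml_blob):.0%}")

# Decompression must restore the original bytes exactly --
# the same round-trip guarantee a compress/decompress UDF pair needs.
assert zlib.decompress(compressed) == xml_blob
```

Since the rows are read back rarely and in small numbers, the decompression cost is paid only on those infrequent retrievals, which is what makes this trade-off attractive.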

I’m sure there are many other techniques you may like to share. Drop me a note with your insights, tips and ideas. Let’s get them online.

Cheers,

Ben

Andre’ writes

We are a small shop that processes a lot of data using SQL Server Standard (it’s not uncommon for us to process several hundred gigabytes of data on a daily basis), with databases ranging in size from tens of gigabytes up to almost a terabyte. We do miss features like partitioning, and have had to perform our own “virtual” partitioning using views and stored procedures, and in some cases look at competing products such as MySQL and MongoDB to make up for the lack. Online index operations aren’t as much of an issue for us, as we can perform most operations offline. We used to store all of our raw binary data in SQL Server, and even looked at FILESTREAM when it came out, but found it too limiting (the database becomes a single point of failure). We currently use a MogileFS cluster of about 8 TB to store all of the binary data we need (we’ll probably triple that in the next year), and we are looking at, and have begun using, S3 and Azure Blob Storage for our future products.

We’ve thought about using SQL Azure, but there are potential limitations: Microsoft caps how many databases you can have out of the box unless you request a waiver. Ideally, we’d like as many databases as we need to support each of our clients. I know they want us to shard the data (also future functionality, which limits the directions we can go in), but security needs dictate that much of the data be segmented separately by client.

We’d like to make more use of Analysis Services, but it has been hard to find the time to come up to speed on its features and figure out how to integrate it into our development processes.