Editorials

SQL Server Updates Eye Core Database Functionality

Microsoft is finally letting the cat out of the bag with specifics about the upcoming release of SQL Server. There are so many interesting things to think through, and many of them are focused on the core engine this time around, which is exciting.

First up – in-memory processing for OLTP-type workloads. Pretty incredible. I’ve seen some nearly unbelievable demonstrations of the processing power, and the differences are amazing. One of the cool things is that you don’t have to learn a whole new programming model to enable it. Yes, there are some new tools and new options, but it’s largely a matter of settings and away you go. You don’t have to install separate products, figure out how to retrofit your servers and so on.
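
Based on what has been shown so far, turning it on is mostly declarative T-SQL: add a memory-optimized filegroup, then mark tables as memory-optimized. Here’s a minimal sketch – the database name, table, file path and bucket count are my own placeholders, and the CTP syntax could still shift before release:

    -- Add a memory-optimized filegroup and a container for it
    ALTER DATABASE SalesDemo
        ADD FILEGROUP SalesDemo_mod CONTAINS MEMORY_OPTIMIZED_DATA;

    ALTER DATABASE SalesDemo
        ADD FILE (NAME = 'SalesDemo_mod',
                  FILENAME = 'C:\Data\SalesDemo_mod')
        TO FILEGROUP SalesDemo_mod;
    GO

    -- Create a table that lives entirely in memory; SCHEMA_AND_DATA
    -- keeps it fully durable across restarts
    CREATE TABLE dbo.ShoppingCart
    (
        CartId    INT IDENTITY(1,1) NOT NULL
                  PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        UserId    INT NOT NULL,
        CreatedAt DATETIME2 NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);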

It’s also built to run on standard servers. You won’t have to go out and get the world’s largest systems to take advantage of it. A few more notes:

– You’ll be able to compile stored procedures, optimizing them to machine code. It’s not a requirement of the new Hekaton features, but it helps transaction throughput considerably. In the demonstration provided, transaction processing went from 2,400/sec to 17,000/sec just by getting started with Hekaton enabled. From there, compiling the stored procedures takes it all the way to 60,000 (!). Please don’t get caught up in the specific numbers; it’s the scope of the capability that is so incredible. My numbers are from my notes, and we’ll have to see how things flow in the release product. There’s a sketch of a compiled procedure after this list.

– Many features were built based on "Scenarios" – a new approach Microsoft is using to determine what gets done and what doesn’t. They identify scenarios with customers – things that are important to the customer’s world. Those then get defined, selected to be addressed, and built into the product. I really like this approach – it goes beyond features and moves well into solutions. There’s a pretty big difference between a complete solution and a singular feature.

– AlwaysOn gets a boost or two – from an increased number of secondaries to improved diagnostics. Performance and reliability in the AlwaysOn areas were an important focus for this release, and expanded support for scale is part of the release plans as well. A second sketch after the list shows the multi-secondary setup.

– Cloud is a big piece of the update as well, but it’s a surprising bit of work in that the emphasis is on removing the lines between your systems, rather than just on being able to run your databases remotely. The tools are now all cloud-aware, to the point where it doesn’t matter where your solution lives – it will just work, and you’ll be able to manage your systems with the same tools whether they’re local, in the public cloud, in a third-party cloud and so on.
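
On the compiled stored procedures: here’s a hedged sketch of what one looks like against the ShoppingCart table above. The procedure name and body are my own placeholders; the WITH options and the required ATOMIC block reflect the syntax in the CTP materials and may change by release:

    -- A natively compiled procedure, turned into machine code at create
    -- time; SCHEMABINDING, EXECUTE AS and BEGIN ATOMIC are required
    CREATE PROCEDURE dbo.AddCartItem
        @UserId    INT,
        @CreatedAt DATETIME2
    WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
    AS
    BEGIN ATOMIC WITH
        (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')

        INSERT INTO dbo.ShoppingCart (UserId, CreatedAt)
        VALUES (@UserId, @CreatedAt);
    END;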
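
And on the AlwaysOn side, the release is slated to raise the cap on secondaries from four to eight. Details may shift, so take this as a rough sketch using today’s availability-group syntax, with hypothetical server and database names – a synchronous automatic-failover partner plus a readable asynchronous secondary:

    -- Availability group with multiple secondaries; all names and
    -- endpoint URLs are hypothetical
    CREATE AVAILABILITY GROUP SalesAG
    FOR DATABASE SalesDemo
    REPLICA ON
        N'SQLNODE1' WITH (
            ENDPOINT_URL      = N'TCP://sqlnode1.contoso.com:5022',
            AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
            FAILOVER_MODE     = AUTOMATIC),
        N'SQLNODE2' WITH (
            ENDPOINT_URL      = N'TCP://sqlnode2.contoso.com:5022',
            AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
            FAILOVER_MODE     = AUTOMATIC),
        N'SQLNODE3' WITH (
            ENDPOINT_URL      = N'TCP://sqlnode3.contoso.com:5022',
            AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,
            FAILOVER_MODE     = MANUAL,
            SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));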

There is more – be sure you check out the Community Technology Preview (I still call ’em Betas). One great suggestion I’ve seen in several places is to set up a virtual machine and use that for your installation of the CTP; it makes your life much easier when it comes time to uninstall the CTP in favor of the release software. Great approach – and one that will save you headaches later.

You can get a copy of the CTP here.