Editorials

New Features in SQL Server Denali

Webcast: Versions – A Document History
In this session we will discuss versioning within SharePoint document libraries: how to configure, manage, and use versioning to support the document creation process. We will also point out some of the issues to plan for when using versions with your document libraries. Presented by: Daniel Galant Sponsored by: AvePoint

> Register Now
> Live date: 11/17/2010 at 12:00 Pacific

New Features in SQL Server Denali
The first CTP of Denali was released to the development community last week. There are so many new things in the product that it will be a while before developers can absorb the possibilities, let alone determine how to apply them to current business needs.

Over the next few days I will be reviewing the messages from the PASS Summit and how this next release can help you perform your tasks better or provide more value to your end users.

One of the things they built in order to optimize Power Pivot was an in-memory construct called VertiPaq. VertiPaq is a compressed, optimized "Column Store" hosted by Analysis Services, allowing Excel-like capabilities and performance at massive scale. One of the coolest demos from the first night’s keynote at PASS was a Power Pivot model querying 2 billion sales records. The demo was able to filter and sort those 2 billion rows amazingly fast; the screen updated immediately. Hosting this query capability was a VertiPaq "Column Store".

The demonstration continued in Business Intelligence Development Studio (BIDS), showing how you could import something developed by your end-user community and bring it into a more managed developer environment interacting directly with the Column Store in SQL Server Analysis Services. The Excel Power Pivot definition was automatically translated into BIDS format, where it could be version controlled, wrapped with security requirements, and so on.

OK, I’m already impressed…but then they continued to demonstrate how they have extended the Column Store even further. They showed a table in the relational query engine that supported the Column Store. A query was run using a covering index on the data table containing the original 2 billion rows of sales; this was the most optimized way to get the data out using T-SQL. The query contained some filtering, grouping, and sorting.
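
To make the setup concrete, here is a minimal T-SQL sketch of the kind of covering index the demo relied on. The table and column names (dbo.FactSales, SaleDate, RegionID, SalesAmount) are hypothetical stand-ins for the demo schema, not the actual one used on stage.

    -- Hypothetical covering index: every column the query touches lives in the index
    CREATE NONCLUSTERED INDEX IX_FactSales_Covering
        ON dbo.FactSales (SaleDate, RegionID)
        INCLUDE (SalesAmount);

    -- A query with filtering, grouping and sorting that the index fully covers,
    -- so no lookups against the 2 billion row base table are required
    SELECT RegionID, SUM(SalesAmount) AS TotalSales
    FROM dbo.FactSales
    WHERE SaleDate >= '20100101'
    GROUP BY RegionID
    ORDER BY TotalSales DESC;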

While the standard T-SQL query was running against the covering index, the demonstrator ran three other queries (also standard T-SQL) without supplying any index hints. Each of the other queries returned immediately, while the first query using the covering index took 20 seconds. How was that done? They have brought the VertiPaq technology into the database engine. SQL Server was able to leverage the capabilities of VertiPaq while running a traditional T-SQL query, returning data virtually immediately. So, when the index hint wasn’t supplied, SQL Server determined that the query would be serviced more quickly by a Column Store object instead.
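
In later Denali builds this capability surfaced as a new index type in the relational engine. Assuming the CREATE COLUMNSTORE INDEX syntax Microsoft eventually shipped, and reusing the same hypothetical dbo.FactSales table, a sketch looks like this; note the query itself is unchanged and unhinted:

    -- Columnstore index over the fact table: each column is stored and
    -- compressed separately, so a scan reads only the columns a query references
    CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_ColumnStore
        ON dbo.FactSales (SaleDate, RegionID, ProductID, Quantity, SalesAmount);

    -- The same aggregate query, with no index hint supplied; the optimizer
    -- is free to service it from the columnstore instead of the row store
    SELECT RegionID, SUM(SalesAmount) AS TotalSales
    FROM dbo.FactSales
    WHERE SaleDate >= '20100101'
    GROUP BY RegionID
    ORDER BY TotalSales DESC;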

Let’s face it, Google has set the bar very high when it comes to performance and end-user expectations. End users want to see results almost immediately, regardless of how many records have to be analyzed to answer their questions. The Column Store looks like a tool that will help the rest of us provide similar performance.

Drop me a note with your thoughts on or experiences with the Denali release. How are we going to use these new technologies? What skills will we need to develop in order to implement and manage them? Are these tools for everyone, or only for big business? Send your comments to btaylor@sswug.org.

Cheers,

Ben