Editorials

Bursting to the Cloud

In SQL Server 2016, we’ll be seeing some pretty incredible features that fully integrate cloud and hybrid cloud solutions. One of those is bursting to the cloud: when additional capacity is needed, it will be possible to extend into Azure dynamically, in real time.
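If this ships in the shape the SQL Server 2016 previews call Stretch Database, enabling it is only a few statements. A minimal sketch below – the database, table, and Azure server names are placeholders, and preview syntax may well change before release:

```sql
-- Enable the instance-level option (option name from the SQL Server 2016 previews).
EXEC sp_configure 'remote data archive', 1;
RECONFIGURE;

-- Stretch the database into Azure; the server name is a placeholder.
ALTER DATABASE SalesDB
    SET REMOTE_DATA_ARCHIVE = ON (SERVER = N'yourserver.database.windows.net');

-- Stretch an individual table so its cold rows can migrate to Azure.
ALTER TABLE dbo.OrderHistory
    SET (REMOTE_DATA_ARCHIVE = ON);
```

Note that this works table by table, which is exactly why the per-table planning discussed below matters.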

This is a fantastic feature, full stop. I think it will relieve a number of capacity issues and make it easier to build systems that work at capacity and process very large amounts of information. As an aside, this may be more pertinent than ever: I’ve been seeing a number of articles talking about archiving trends at companies. Specifically, people are saving more information than ever before – instead of archiving it off (or outright removing it), we’re keeping it online and available for analysis, for processing, for reporting. It’s a solid tool for getting great insights into the systems we build out.

The risk of the cloud bursting capability is one of budget and information management. While I’m not so concerned about managing that storage, since Azure will be taking care of it for us, I do see this as something that will need to be monitored and managed. What we’re essentially saying is that storage is limited only by imagination.

I don’t know about you, but I can imagine a lot of storage need if I’m given unlimited capacity. This is one area where we’ll have to do some careful planning. I won’t suggest you not keep things in storage. I won’t suggest you not use the capabilities.

What I do suggest is that you add this as a component of your planning. Having the ability to grow databases and storage on a whim will impact performance planning in big ways. You’ll want to make sure you’re taking the different database uses into account as you lay out your designs. Make sure the hybrid environment is on the table as a tool, but also as a planning point. What goes where, and how it impacts your applications, will be a big deal.

In addition, while many of us enjoy rock-solid Internet speeds and capacity, include bandwidth planning in your guidelines. Make sure you have the capacity for potentially large data pushes going through your lines. It might not seem like a big consideration, but I can tell you already that many things are tugging on the purse strings when it comes to bandwidth. Video consumption, feature-heavy web sites, security, and data transfers are not trivial, and all need to be considered as part of the data access management process when it comes to performance and access.
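To put rough numbers on that, a back-of-the-envelope calculation is worth doing during planning. A minimal sketch – the 500 GB push and 100 Mbps line are made-up example figures, and it ignores protocol overhead and contention, so treat the result as a best case:

```python
def transfer_hours(gigabytes: float, mbps: float) -> float:
    """Estimate hours to push `gigabytes` of data over an `mbps` link.

    Ignores protocol overhead, compression, and line contention,
    so the real transfer will take longer.
    """
    megabits = gigabytes * 8 * 1000  # GB -> megabits
    seconds = megabits / mbps
    return seconds / 3600

# Pushing 500 GB of archive data over a 100 Mbps line:
print(f"{transfer_hours(500, 100):.1f} hours")  # roughly 11.1 hours
```

Even this crude math makes the point: a burst that moves serious data to Azure will compete with everything else on the line for hours, not minutes.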

How are you adding elements for hybrid solutions to the mix? What have you found that has jumped up your own list of priorities?