Monday, May 5, 2014

The Rise of Hybrid Cloud Computing


Even if you’re not a technologist, I want you to understand that hybrid cloud computing is all about choice:
Choice about where your data resides. Choice about how your data is managed. Choice about where your data processing actually happens.
These choices can be used:
  • to make economic decisions to lower the total cost of ownership of data,
  • to maximize your quality of services, or
  • to comply with regulatory constraints on data sovereignty.
In today’s environment, vendors are moving fast. If we wind the clock back a year, the main cloud services like Microsoft Azure, Amazon, Terremark, or Rackspace were fairly proprietary, closed environments. But they all quickly realized that IT heterogeneity is what customers want.

If I take my infrastructure and my workloads and move them to the cloud, my ability to do so in a closed, homogeneous cloud is very limited.

Customers Want Choice, Not Monolithic Options 
Managing a mix of platforms is a reality for CIOs. And ultimately, that’s what the cloud is: a deployment model.
It’s the transportability of workloads that makes the hybrid cloud so important. Terremark, Rackspace, and Amazon all have visions to make this happen: to seamlessly transport workloads so that it doesn’t matter where your workload resides, whether on premises or in the cloud.
Three years ago, this was called cloudbursting. The idea stalled and fizzled because the technology hadn’t arrived. But now we’re able to seamlessly transport workloads and data across multiple clouds, public and private.
Amazon is just starting down this path—where you can submit a workload to a queue and Amazon will understand your needs for specific types of storage, compute cycles, and memory. Amazon will also give you some options for creating this cloud environment.
These options may include a priority queue where you pay extra to move to a higher priority. But if you’re okay with waiting a few hours and don’t mind the workload running somewhere else in the world, you’ll be charged a lower fee.
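To make the idea concrete, here is a minimal sketch in Python of how a priority queue with tiered pricing might behave. The `JobQueue` class and the per-hour rates are hypothetical illustrations of the concept, not any vendor’s actual API or prices.

```python
import heapq

# Hypothetical per-hour rates, purely illustrative: priority jobs pay a
# premium to jump the queue; standard jobs wait but cost less.
RATES = {"priority": 0.50, "standard": 0.10}

class JobQueue:
    """Toy scheduler: priority-tier jobs run before standard-tier jobs,
    and are billed at a higher hourly rate."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker preserves submission order

    def submit(self, name, hours, tier="standard"):
        # Lower sort key runs sooner; the priority tier sorts first.
        rank = 0 if tier == "priority" else 1
        heapq.heappush(self._heap, (rank, self._counter, name, hours, tier))
        self._counter += 1

    def run_all(self):
        """Drain the queue, returning (job name, cost) in execution order."""
        results = []
        while self._heap:
            _, _, name, hours, tier = heapq.heappop(self._heap)
            results.append((name, hours * RATES[tier]))
        return results

q = JobQueue()
q.submit("overnight-batch", hours=8)                 # cheap, can wait
q.submit("urgent-render", hours=2, tier="priority")  # pays extra, jumps ahead
print(q.run_all())  # the priority job runs first despite later submission
```

The same trade-off shows up, under different names, in real offerings: pay more for guaranteed immediate capacity, or accept flexible scheduling in exchange for a discount.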
In the past, high-performance computing had to be physically located on premises. With the cloud, you remove the sunk capital costs. Instead, you get on-demand access, priced according to the urgency and priority of the work to your organization.
Cloud computing allows anyone to gain access to supercomputer-like power, without traveling anywhere. Projects that require massive amounts of big-data manipulation and storage, like space exploration, genome sequencing, or finding energy reserves, can all benefit.
Not Everyone Has A Supercomputer In The Basement 
A few years ago, I was working with a company in Boston doing human-genome sequencing. This is the perfect example of the value of big data, because it’s going to affect you and me as human beings.
To run a simulation of genomic sequencing data, this organization needed time on the IBM Blue Gene supercomputer, one of the fastest computers in the world at the time. They actually had to physically travel to the machine’s location and wait for processing time to become available.
Now, fast forward to the present: You can contact Amazon or Rackspace, who now have this type of computing capability, and you can rent the time and processing power. This really illustrates what cloud computing is all about.
I can now offer that Blue Gene machine to someone who wants to access it for a little while, in the cloud.
These Changes Are Bringing A Tectonic Shift 
And it’s all being driven by hybrid cloud computing. From the business perspective, it’s the route to a seamless, data-centric world.
Things that were once limited by on-site physical capacity and the storage behind my four walls are no longer holding me back. Suddenly, I’m only limited by my imagination and my ability to build the business.
