Thursday, May 15, 2014

Local Clouds and The Coming Death of Legacy Stacks in the Cloud


Let me tell you a little secret about the “cloud.” It’s that right now in the enterprise, it’s a local game.

A few options come to mind when we think of enterprise, like the predominant force of AWS in test and development environments. Then you have providers like HP-ES, IBM/SL, Rackspace, Terremark, and Google that are all trying to play enterprise production-load catch-up.

For the time being, I think it’s safe to call the enterprise private cloud a local game.

Take KIO Networks in Mexico City, or LGCNS in South Korea, or T-Systems in Germany; the strength these local service providers have is that they are entrenched in the local economy. They are tied in with governmental entities through contracts, investment tax incentives, or, in some cases, board relationships. Governments encourage these kinds of businesses, as they are clean, provide local jobs, and put a good high-tech face on the country.

The key characteristics of these service providers are that they have the local connections and the P&L margin expectations for the long-tail economics of a service provider business.

In the case of KIO Networks, their margins are so tight that they build their data centers in cooler zones of Mexico City or bury them into the side of a mountain, because not running their chillers for several months out of the year is a competitive differentiator. Coming from a world of plump enterprise software/systems deployments, I found margins like these a foreign concept.

Just like our friends at AWS, service providers don’t write the books; they just sell them, and they are perfectly happy living with the retail economics.

But in this never-ending quest for margin, service providers need to standardize on control layers to manage each plane: compute, networking, storage, and provisioning. Managing chargeback across these heterogeneous resource pools, at times customer-dictated and at other times commodity, is vital.
This long-term platform migration, driven primarily by the evolution of the service provider, is what poses a sea-change threat to the major legacy profit pools of the likes of IBM, HP, EMC, and NetApp.
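To make the chargeback idea concrete, here is a minimal sketch of what metering usage across mixed resource pools might look like. Everything in it is a hypothetical illustration: the pool names, planes, rates, and tenants are made up for the example, not any real provider’s schema or pricing.

```python
# Hypothetical sketch of per-tenant chargeback across heterogeneous pools.
# Pool/plane names and unit rates are illustrative assumptions only.
from collections import defaultdict

# Assumed unit rates: ("pool", "plane") -> price per metered unit.
RATES = {
    ("commodity", "compute"): 0.04,          # per vCPU-hour on commodity gear
    ("commodity", "storage"): 0.01,          # per GB-month on commodity gear
    ("customer-dictated", "compute"): 0.12,  # premium proprietary gear
    ("customer-dictated", "storage"): 0.05,
}

def chargeback(usage_records):
    """Roll metered usage up into a per-tenant bill.

    usage_records: iterable of (tenant, pool, plane, units) tuples.
    Returns {tenant: total_charge}, rounded to cents.
    """
    bills = defaultdict(float)
    for tenant, pool, plane, units in usage_records:
        bills[tenant] += RATES[(pool, plane)] * units
    return {tenant: round(total, 2) for tenant, total in bills.items()}

usage = [
    ("acme", "commodity", "compute", 700),          # 700 vCPU-hours
    ("acme", "customer-dictated", "storage", 200),  # 200 GB on dedicated arrays
    ("globex", "commodity", "storage", 5000),
]
print(chargeback(usage))  # → {'acme': 38.0, 'globex': 50.0}
```

The point of the sketch is the shape of the problem, not the numbers: once every pool, proprietary or commodity, is metered through one control layer, the provider can see exactly which gear earns its keep.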

In the early stages of a customer’s journey into the private/public cloud, they generally dictate the same legacy platforms they have run for decades, with these environments lifted and shifted into the service provider’s data center. In other words, let’s move my expensive proprietary boxes off of my data center floor and onto yours.

This is precisely where the market is now.  But this phenomenon is just a hosting/colo way station on the way to the true public cloud.

Going forward, the confluence of software-driven reliability, fault tolerance, and compliance, delivered on top of commodity infrastructure selected by the service providers, will be a way of life.

Service providers’ margins will never tolerate the proprietary stacks of today.

These business models are too far out of sync. AWS’s cloud doesn’t run on proprietary gear, so why should yours?

Monday, May 5, 2014

Rewriting the Entire Customer Experience


As I embark on my next career move as Chief Customer Officer of Formation Data Systems, I’m struck by the sheer magnitude of the opportunity: the prospect of disrupting the traditional enterprise storage market, and of a next-generation data management layer that can holistically unlock the value of traditional databases, NoSQL databases, and AWS S3. The technical challenges and broad transformational opportunities are exhilarating.

But what gets me fired up beyond belief is the chance to truly define how a new company rewrites the entire customer experience.

Now that’s cool stuff.

To be part of a revolution in how customers obtain knowledge about Formation Data Systems: the company, the people, and the products. To provide the fuel that enables customers to make intelligent decisions and interact with a product and a company in a completely new way.
   
In the past, IT was “sold” through the traditional means of marketing awareness, campaigns, and marketing touches turning into leads, which turned into prospects, which turned into deals and sales.

We thought we were getting fancy when we started selling to both LOB and IT, or took the populist approach of bypassing IT altogether. The whole experience was an asynchronous push. The sales “firewall” was built to protect the customer from the technical complexities and harsh realities of the product.

Enter the spin doctors obfuscating complexity with PowerPoint.  

I believe this cycle is antiquated and outmoded, not only for how companies can and should interact with their customers, but also for how customers and potential customers seek to understand disruptive technology and how it can improve their lives.

Customers expect and deserve more. 

Customer interactions should be enlightening and educational, where technical and business ideas are exchanged and refined collaboratively. Where flexible problem solving and options define customer success. 

Today, via social and affinity networks, technically savvy customers are exchanging ideas with scores of like-minded colleagues. Via these informal networks, the true customer experience begins long before a salesperson ever interacts with a customer. Customers don’t want to see high-level PowerPoints because chances are they’ve already pre-read them on SlideShare.


So, after a long run at some of Silicon Valley’s most distinguished companies (PeopleSoft, Vignette, Documentum, EMC, and SAP), I’m truly honored to be able to take that depth of experience and define the next generation of customer experience with you at Formation Data Systems.

The Rise of Hybrid Cloud Computing


Even if you’re not a technologist, I want you to understand that hybrid cloud computing is all about choice:
  • Choice about where your data resides.
  • Choice about how your data is managed.
  • Choice about where your data processing actually happens.
Choices can be used:
  • to make economic decisions to lower the total cost of ownership of data,
  • to maximize your quality of services, or
  • to comply with regulatory constraints on data sovereignty.
In today’s environment, vendors are moving fast. If we wind the clock back a year, the main cloud services like Microsoft Azure, Amazon, Terremark, or Rackspace were fairly proprietary, closed environments. But they all quickly realized that IT heterogeneity is what customers want.

If I take my infrastructure and my workloads and move them to the cloud, my ability to do so in a closed, homogeneous cloud is very limited.

Customers Want Choice, Not Monolithic Options 
Managing a mix of platforms is a reality for CIOs’ deployment models. And ultimately, that’s what the cloud is: It’s a deployment model.
It’s the transportability of workloads that makes the hybrid cloud so important. Terremark, Rackspace and Amazon have visions to make this happen: To seamlessly transport workloads, so it doesn’t matter where your workload resides—whether it’s on premises or in the cloud.
Three years ago, this was called cloudbursting. This idea stalled and fizzled because the technology hadn’t arrived. But now we’re able to seamlessly transport workloads and data across multiple clouds: Public and private.
Amazon is just starting down this path—where you can submit a workload to a queue and Amazon will understand your needs for specific types of storage, compute cycles, and memory. Amazon will also give you some options for creating this cloud environment.
These options may include a priority queue where you pay extra and move to a higher priority. But if you’re okay with waiting a few hours and don’t mind the workload being run somewhere else in the world, then you’ll be charged a different fee.
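That tiered-queue model can be sketched as a toy scheduler: urgent jobs pay a premium rate and jump ahead, while deferrable jobs wait and pay less. Everything below is an illustrative assumption; the tier names, rates, and job names are made up, not Amazon’s actual API or pricing.

```python
# Toy sketch of priority-tiered batch scheduling and pricing.
# Tier names and per-hour rates are assumptions for illustration only.
import heapq

TIER_RATES = {"priority": 0.50, "deferred": 0.08}  # assumed $ per compute-hour

class BatchQueue:
    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker: preserves submission order within a tier

    def submit(self, name, hours, tier):
        # Lower rank runs first, so priority jobs jump ahead of deferred ones.
        rank = 0 if tier == "priority" else 1
        heapq.heappush(self._heap, (rank, self._seq, name, hours, tier))
        self._seq += 1

    def run_all(self):
        """Drain the queue in rank order, returning (name, charge) pairs."""
        results = []
        while self._heap:
            _, _, name, hours, tier = heapq.heappop(self._heap)
            results.append((name, round(hours * TIER_RATES[tier], 2)))
        return results

q = BatchQueue()
q.submit("genome-run", 10, "deferred")    # cheap, happy to wait a few hours
q.submit("quarter-close", 2, "priority")  # pays extra to jump the queue
print(q.run_all())  # → [('quarter-close', 1.0), ('genome-run', 0.8)]
```

Note the trade the sketch captures: the deferred job was submitted first but runs second, and in exchange its ten hours cost less than the priority job’s two.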
In the past, high-performance computing was physically located on-premises. But with the cloud, you remove the sunk capital costs. Instead, you get on-demand access, paid for based on the urgency and priority to your organization.
Cloud computing allows anyone to gain access to supercomputer-like power, without traveling anywhere. Projects that require massive amounts of big-data manipulation and storage, like space exploration, genome sequencing, or finding energy reserves can all benefit.
Not Everyone Has A Supercomputer In The Basement 
A few years ago, I was working with a company in Boston doing human-genome sequencing. This is the perfect example of the value of big data, because it’s going to affect you and me as human beings.
To run a simulation of genomic sequencing data, this organization needed time on the IBM Blue Gene supercomputer, one of the fastest computers in the world at the time. They actually had to physically travel to the machine’s location and wait for processing time to become available.
Now, fast forward to the present: You can contact Amazon or Rackspace, who now have this type of computing capability, and you can rent the time and processing power. This really illustrates what cloud computing is all about.
I can now offer that Blue Gene machine to someone who wants to access it for a little while, in the cloud.
These Changes Are Bringing A Tectonic Shift 
And it’s all being driven by hybrid cloud computing. From the business perspective, it’s the route to a seamless, data-centric world.
Things that were limited by on-site physical capacity and storage behind my four walls are no longer holding us back. Suddenly, I’m only limited by my imagination and my ability to build the business.