Cloud computing’s second movement
By Drew Turney, ZDNet. April 1, 2015.
The first years of the cloud movement saw companies both big and small dipping their toes into the water and seeing what they could do, often with non-critical applications like marketing and CRM.
We're not only seeing more critical data and processes being handled in clouds now, but also more movement than ever between private, public, and on-premises cloud architectures. It seems as though the hype is behind us, and now we're really figuring out what the cloud can do.
Welcome to cloud computing's second wave.
Seth Robinson, senior director of technology analysis for IT industry trade association CompTIA, said many users are now in the thick of second migrations.
"[After] hands-on experience gained through initial forays into the cloud, businesses are able to compare the theoretical to the practical, and make decisions on how best to optimise their IT infrastructure," Robinson said.
According to CompTIA's most recent Trends in Cloud Computing study, 44 percent of companies had moved infrastructure or applications between public clouds, 25 percent from public to private, and 24 percent from public to on-premises.
Breaking down the new paradigm
Cloud computing has undoubtedly become a mainstay, but why has there been such a marked change in its direction, especially from public systems back to private or on-premises?
One theory is that the public internet, originally built to handle HTTP requests and small HTML files, just isn't standing up to the high-bandwidth, always-on demands of today's cloud processes. There's a reason, after all, that most early cloud activity was in the dedicated pipe space.
Ian Hamilton, CTO of file movement software company Signiant, said the necessary tools just weren't part of the original tool set. "There's a lot of focus on server-to-server networking in virtualised environments, but not much on client-to-server networking," he said.
From a user's perspective, Joe Kelly, CEO of Legal Workspace, a cloud provider that hosts law firm systems, thinks access is just one piece of the puzzle.
"When I think of the public internet infrastructure, I think of access, not applications or viruses or data breaches," Kelly said. "It's enabling cloud-based technologies to work for more companies."
Theory number two: Maybe we're just learning that cloud computing simply doesn't suit some applications and processes. Ken Shaw, CEO of cloud data protection platform provider Infrascale, attributes the second wave to people better understanding what the cloud is good for, and what it is not.
"Learn from your mistakes in the first wave, or the mistakes of others," Shaw said. "Those mistakes were nearly always centred in misaligned expectations."
Ajit Gupta, CEO of cloud provider Aryaka, calls the second wave a strategic retreat.
"Many of the first-wave cloud tools were experiments, like R&D workloads," he said. "Many others, like marketing automation software, grew up with the cloud, so there was no need to shift the delivery model. That was the low-hanging fruit, and it's been picked."
One thing the early hype did was make cloud computing seem like an all-or-nothing proposition. It's something that David Polley, cloud product senior director at BI, mobile software, and cloud provider MicroStrategy, said has never been the case.
"Since a re-platform of applications is typically a major transition, companies are being more selective about what they transition to cloud," Polley said. "Most companies are resisting a complete change, because of a very real fear of error and business loss."
Aryaka's Ajit Gupta added that the rise in private clouds shines a light on some of the public cloud's inherent weaknesses.
"While the public providers ignore the problem, private clouds have rushed in to fill the gap, offering better security options, the ability to track and report compliance, and, most importantly, the ability to confine applications and workloads to a nearby datacentre so networking obstacles can be overcome."
Locking down clouds
Despite the assurances, certificates, and accreditations that the big cloud providers offer, incidents such as those that impacted Target, Sony, and Home Depot mean security still casts a long shadow over any new IT strategy.
According to a recent finding by Accenture, almost three out of five companies believed that moving everything to the cloud was riskier than removing passwords from all office computers.
Even if you believe the hype about security, speciality needs sometimes exceed even the fancy ISO badges the majors spruik.
"It's a huge mistake to believe all content is created equal," Jeetu Patel, general manager of network share provider Syncplicity by Axway, said.
"The question isn't whether the solution is working in the cloud; it's how you implement a cloud that meets the security and compliance needs of your organization. Enterprise file sync and share needs a hybrid approach that gives IT choice in how and where content is stored."
Putting data and applications in the cloud also exposes the company to risks that more departments have a stake in, and Max Dufour, a partner at Harmeda technology and strategy consulting, is seeing more people coming to the table to influence the decision.
"There was less transparency and competition in the market before," Dufour said. "Companies now lead structured vendor selection versus simply trying out a cloud vendor to understand how it actually works."
And on top of everything else, David Cope, CMO of enterprise hybrid cloud management provider CliQr, thinks a hybrid strategy offers savings in the development cycle.
"Businesses once sceptical about public and private clouds are rapidly embracing them, because they can select and deploy applications on the best execution venue," Cope said. "The real promise ... is the interchangeable movement and management of workloads on different environments."
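The "best execution venue" idea Cope describes can be pictured as a placement policy: score each workload's attributes and route it to the environment that fits. The sketch below is purely illustrative — the attributes, rules, and workload names are hypothetical, not drawn from CliQr or any real product.

```python
# Toy "best execution venue" policy for hybrid cloud placement.
# All attributes and routing rules here are hypothetical illustrations.
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    regulated_data: bool      # compliance-sensitive data?
    latency_sensitive: bool   # needs a nearby datacentre?
    bursty: bool              # spiky demand favours elastic capacity


def place(w: Workload) -> str:
    """Suggest a venue for a workload based on its attributes."""
    if w.regulated_data or w.latency_sensitive:
        return "private cloud"    # keep it close, auditable, and compliant
    if w.bursty:
        return "public cloud"     # pay only for the demand spikes
    return "on-premises"          # steady load on already-owned hardware


if __name__ == "__main__":
    for w in [Workload("payroll", True, False, False),
              Workload("campaign-site", False, False, True),
              Workload("batch-reporting", False, False, False)]:
        print(w.name, "->", place(w))
```

A real hybrid management platform would weigh far more factors (data gravity, egress costs, licensing), but even this toy version shows why different workloads land in different venues rather than all moving one way.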
The truth about costs
Part of the reason for a race in every direction, rather than simply towards the public cloud, might be the failure of the cloud to live up to the jaw-dropping cost savings that the original hype promised.
Jim Murphy, senior consultant at IT strategy service Garnerin Group, said there's been a disconnect between perception and reality when it comes to cloud costs.
"Some organisations are finding they would have been paying the same money to keep some data and processes on-premise the whole time," Murphy said. "It's interesting that cost savings was an early motivator, and then trouble delivering on those savings is now among the top challenges organisations are facing."
Of course, this doesn't mean that everything you heard about the cloud being cheaper is wrong; it's just that there are a lot of cautionary tales.
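The arithmetic behind those cautionary tales is simple: amortised on-premises capex plus running costs can land very close to an always-on cloud bill. The figures below are invented placeholders for illustration, not real pricing from any provider.

```python
# Toy break-even comparison: amortised on-premises cost vs. pay-per-hour cloud.
# Every number here is an invented placeholder, not real vendor pricing.

def monthly_onprem_cost(capex: float, years: int, monthly_opex: float) -> float:
    """Spread hardware capex over its service life and add running costs."""
    return capex / (years * 12) + monthly_opex


def monthly_cloud_cost(instance_rate: float, hours: float) -> float:
    """Simple pay-per-hour cloud pricing model."""
    return instance_rate * hours


if __name__ == "__main__":
    onprem = monthly_onprem_cost(capex=60_000, years=5, monthly_opex=800)
    cloud = monthly_cloud_cost(instance_rate=2.50, hours=730)  # always-on
    print(f"on-prem: ${onprem:.2f}/month, cloud: ${cloud:.2f}/month")
```

With these made-up inputs the two columns come out within a few percent of each other — exactly the "we'd have paid the same on-premise" outcome Murphy describes for steady, always-on workloads.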
"In the early days, many business units wanted to cloudify certain processes just to generate immediate savings without fully understanding the strategic considerations," said Trevor Nagel, who wrote the 2012 Cloud Computing Report for the World Bank Group.
"Several experienced CIOs have said it took two to three years to really understand they weren't just remediating legacy processes, but changing the way technology underpinned and structured their operations."
Learning what does and doesn't work in the cloud might also be a good thing: despite individual losses and missteps, it's all part of the long-term maturation of the field.
"Disaster recovery and business continuity in the public cloud can provide very compelling economics," said Lynn LeBlanc, CEO of hybrid cloud provider HotLink, about one such use case. "IT applications and operations are certainly not a one-size-fits-all proposition. May the best infrastructure win."
Another reason the cloud may be getting even more hybridised is that on-premises computing keeps getting more powerful.
Supercomputer manufacturer Cray is still around, and its revenue has more than doubled since 2011, according to company marketing and business development vice president Barry Bolding. He added that Cray is finding a vibrant market because of big data analytics.
"Some applications will always be most cost and workflow efficient on-premise," Bolding said. "While we've watched the cloud sector grow steadily, so has high-performance computing. The supercomputer market is still very robust, and, according to IDC, it's projected to grow by almost 30 percent between now and 2018."
The cloud's future
With public cloud services growing just as fast as the on-premises and private cloud markets, mass cloud hybridisation seems like the only certainty.
Writing in Wired in late 2014, Ginna Raahauge, CIO of network application delivery company Riverbed, called hybrid cloud computing the "new normal", and Raahauge is hardly alone in those thoughts.
Though Raahauge spoke about having frequent "road map reviews" to combat what she calls the "architectural collision" of having too many cloud environments, she referred to RightScale's 2014 State of the Cloud report, which claims that 74 percent of enterprises surveyed have a hybrid cloud strategy.
As David Polley of MicroStrategy put it: "Circumventing the long-term investment by first embracing a hybrid cloud approach lowers risk and lowers overall cost."