Data storage considerations for a DevOps environment
A few months back, I was researching a framework for deploying container clusters to the cloud. The research was pretty straightforward: twiddle with the technology to create some container instances from an image and then get them into the cloud as a cluster. In this case, I was working on Amazon Web Services.
So, I did the work, learned what I wanted and called it a day. At the end of the month, my AWS bill arrived: Turns out my little experiment in the cloud cost me about $150 in additional usage fees. The framework had created a few good-sized virtual machine (VM) instances behind the scenes, and I forgot to nuke them when my research was over. Such is the price of learning the myth of cloud computing cost benefits.
Big Enterprise and cloud conventional wisdom
A few days later, I had lunch with a friend, David. David does Big Enterprise architecture work, and at the time he was helping a Big Enterprise move its entire infrastructure to the cloud -- specifically, AWS. David liked the work, but he found the business reasoning for the migration baffling. David did the math. According to his calculation, the millions of dollars the Big Enterprise planned to spend on a cloud provider would exceed the cost of keeping things on its own hardware.
During the drive back from my lunch with David, I got to thinking about my recent AWS bill. David's Big Enterprise was doing on a massive scale what I did in research: It was going to the cloud more out of unconscious conformity to the conventional wisdom of cloud computing cost efficiency than out of a thoughtful analysis of comparative costs. I had already paid the price. The Big Enterprise was about to pay it, too, only on a scale thousands of times larger.
That $150 extra I spent on AWS was an unnecessary expense. I already had the hardware. For research purposes, I buy older boxes off eBay and configure them to meet my needs. My practice is to load a box up with as much RAM as it can take and install 500 GB to 1 TB of storage capacity. Then, I install a Linux server and a hypervisor. The result: I can create as many VMs as I want when I need them.
My last box cost around $200. Is this enterprise-grade computing? No way, but it's perfectly good for research.
Still, as my AWS experience exemplifies, sometimes I get lured in by the conventional wisdom of cloud computing cost benefits and use a public cloud provider, despite the fact I have a perfectly good cloud of my own. I sometimes spend money when I don't have to.
Can the same be said of David's Big Enterprise? Is the lure of conventional wisdom causing it to spend money it could be saving were it just to stick with its private data center?
Are all companies turning to the cloud?
Because my research scenario doesn't map directly onto the challenges faced by a Big Enterprise, I decided to do a reality check. I sought out Michael Curry, data protection and cloud strategy specialist at Dell EMC.
The company is still one of the biggest hardware manufacturers on the planet. It has a lot of equipment powering the clouds that make up modern computing. How many companies have been lured away from hardware by the promise of cloud computing cost efficiency?
"On prem and hybrid are both very effective models," Curry said. "Very few organizations in the midmarket or enterprise space are going pure public cloud. In the end, using someone else's hardware will always cost more. Same concept as rent to own versus buy."
Curry seemed to confirm my theory. Companies are still buying hardware. The accountants at these companies are not dumb. They want real ROI. They're not going to allow good money to be spent on boxes that turn into bricks. It seems staying on premises is still a viable option.
Big data meets the cloud
Big Data Day LA is a yearly conference that tackles the state of big data. The event draws a lot of significant players in the big data space: Netflix, Hortonworks, Cloudera, Disney and Warner Bros., to name a few. It's a big deal.
One of the vendors on hand was Pure Storage Inc., which manufactures flash-based storage hardware for data centers. I asked its representatives what a hardware manufacturer was doing at a data analytics conference.
Turns out they were doing the obvious: trying to sell hardware. Pure Storage has a compelling value proposition. For many companies, it's cheaper to churn big data on premises than with a cloud provider, and flash storage technology provides speed, versatility and cost savings.
Pure Storage has done the math. As is true of many other hardware companies, Pure Storage understands that hardware ownership is still a viable strategy for IT, given the right scenario. Its challenge is to convince those under the influence of the conventional wisdom of cloud computing cost benefits to think otherwise.
What does all this have to do with DevOps?
We in DevOps look at physical infrastructure as an abstraction and something to be programmed. Rarely do we think about the hardware at the bottom of the stack -- let alone, its cost. We take for granted cloud computing cost benefits and that providers such as AWS, Azure and Google Cloud must be the cheapest way for an enterprise to do computing.
Different enterprises have different requirements. For a company starting out, going with the public cloud makes sense. Startups rarely have the budget or staff to support their own infrastructure, and mapping virtual environments onto physical hardware requires a special -- and expensive -- skill set. With most money devoted to product development, going with a cloud provider is rational.
For some very large companies that can negotiate cost-effective relationships with public cloud providers (think Netflix and AWS), pure cloud implementations make sense, too. But for midsize to large companies that presently support their computing needs on premises, rushing to the public cloud might not be the wisest decision.
DevOps makes outsourced support difficult
Even when DevOps is firing on all cylinders, you can run into issues. That's why having at least some support staff on premises can expedite the resolution of any problems -- and there are always problems -- your DevOps team encounters.
It boils down to the efficiency of the dollar spend. Yes, cloud computing cost efficiency means you pay only for what you use, but the same can be said for larger companies that are maximizing the use of their on-premises resources. If your private infrastructure is operating at full capacity, your usage is cost-effective, provided your procurement department knows what it's doing.
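The utilization argument can be made concrete with a break-even sketch. Assuming a hypothetical $200 box amortized over three years (about 26,280 hours) against a hypothetical $0.10/hour cloud instance -- both figures are illustrative, not quotes from any provider -- the effective cost per useful hour of owned hardware depends entirely on how busy you keep it:

```python
def on_prem_hourly_cost(capex, lifetime_hours, utilization):
    """Effective cost per useful hour of owned hardware.

    capex: purchase price, amortized over the hardware's lifetime.
    utilization: fraction of lifetime hours doing useful work.
    """
    if utilization <= 0:
        raise ValueError("utilization must be positive")
    return capex / (lifetime_hours * utilization)


# Hypothetical numbers: $200 box, 3-year life, vs. $0.10/hour in the cloud.
cloud_rate = 0.10
for util in (0.05, 0.25, 1.0):
    owned = on_prem_hourly_cost(200, 26_280, util)
    winner = "on-prem" if owned < cloud_rate else "cloud"
    print(f"utilization {util:.0%}: on-prem ${owned:.4f}/hr -> {winner} wins")
```

At 5% utilization the cloud wins; at full capacity, the owned box costs a fraction of a cent per hour. That's the whole dispute between the conventional wisdom and David's math, in three lines of arithmetic.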
The public cloud has its place, as long as the benefits justify the expense. But, sometimes, given a company's purpose, size, history and resource utilization, maintaining an on-premises infrastructure is the way to go. It's a question of real cost versus real benefit, and not the mesmerizing attractiveness of a bright, shiny object. You need to do the math.
Just because you can use a public cloud doesn't mean you should. It's a lesson that cost me $150 to learn, but it could cost a Big Enterprise millions.