Trends 2012: Big Data Necessitates DevOps


The term “DevOps” refers to a set of methods for helping operations and engineering teams work together more efficiently, and for helping operations staff apply some of the practices of agile software development to their own work, often with automation tools that make managing infrastructure more like programming. We’ve been talking a lot lately about DevOps at SiliconANGLE, and we plan to launch a new DevOpsAngle site in 2012. So why the obsession? In part because DevOps addresses the needs of companies embracing two of the trends we’ve been most focused on: cloud computing and big data. Big data in particular is driving the adoption of DevOps methods, because new technologies like Apache Hadoop require a high level of cooperation between engineering and operations.

Here’s Facebook’s Jonathan Gray talking about how the company’s operations and engineering teams work together to manage Hadoop and HBase:

[Embedded video from SiliconANGLE.com on www.justin.tv]

I heard similar things at the Life in Hadoop Ops session at HadoopWorld, where technical staff from AOL Advertising, AT&T Interactive and Facebook discussed their experiences managing Hadoop and HBase. There’s a growing need for operations staff to have a good sense of the applications they support: perhaps not knowledge of an application’s actual inner workings, but at least an understanding of what it is intended to do.

Companies like Facebook and Google are engineering-driven and have this sort of DevOps culture built in, but elsewhere the cultural shift toward having these teams work together can be difficult. Is it possible, or even desirable, to make a cultural shift toward DevOps in a more conservative company?

At the Surge 2011 conference, Google CIO Ben Fried talked about his experience at Morgan Stanley before coming to Google. Sean Gallagher covered the event for Ars Technica:

Fried said the process of fixing the problems at Morgan Stanley “forced me to rethink how we do operations, and what the culture of operations should be. Operations is engineering. We need generalists in operations, and we can’t allow the tech barriers to separate us because that will result in failure.” He said that it’s important to reward and recognize generalist skills and broad understanding of systems, and added that he thinks Google gets this right. “We go to great lengths to hire people with engineering skills, put engineers in operational roles and give them power and accountability.”

This was regarding a problematic deployment of a typical internal enterprise application, not a newfangled big data cluster, but it demonstrates how useful a DevOps approach to operations and engineering can be even within the traditional enterprise. As more companies start to use big data, manage large clusters of commodity servers and bring in multidisciplinary data scientists to create new applications, they will be forced to embrace DevOps. From automation tools like Puppet and Chef to logging tools like Loggly and Splunk to virtual infrastructure and cloud computing, DevOps brings a host of tools and processes to the table that will make the transition to big data manageable.
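The core idea behind tools like Puppet and Chef is declarative, idempotent configuration: you describe the desired state of a resource, and the tool changes the system only if it doesn't already match. As a rough illustration only (this is a toy Python sketch, not real Puppet or Chef code; the file path and contents are hypothetical):

```python
# Toy sketch of the idempotent "converge to desired state" pattern that
# configuration-management tools like Puppet and Chef are built around.
# Not actual Puppet/Chef code; path and content here are made up.
import os
import tempfile

def ensure_file(path, content):
    """Converge a file toward the desired state; do nothing if already correct."""
    if os.path.exists(path):
        with open(path) as f:
            if f.read() == content:
                return "unchanged"  # already in the desired state, no action
    with open(path, "w") as f:
        f.write(content)
    return "changed"

# Applying the same desired state twice is safe: the second run is a no-op.
demo_path = os.path.join(tempfile.mkdtemp(), "motd")
print(ensure_file(demo_path, "Welcome to the cluster\n"))  # changed
print(ensure_file(demo_path, "Welcome to the cluster\n"))  # unchanged
```

Because every run converges toward the same declared state rather than replaying a script of steps, the same definition can be applied safely across hundreds of commodity servers, which is exactly what makes this style of tooling a fit for big data clusters.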

More Trends 2012 Articles

Identity Management in Age of the Cloud, Mobile and Social

With Big Data comes Big Expectations

Enterprization of the Consumer

Big Data Necessitates DevOps

Integration-as-a-Service

Photo by Leonardo Rizzi

About Klint Finley

Klint Finley is a Senior Writer at SiliconANGLE. His specialties include IT services, enterprise technology and software development. Prior to SiliconANGLE he was a writer for ReadWriteWeb. He's also a former IT practitioner, and has written about technology for over a decade. He can be contacted at angle@klintfinley.com.