by Bernard Golden

Beware Cloud Computing Advice from IT Research Firms

Dec 09, 2011 | 6 mins
Cloud Computing | Private Cloud | Virtualization

It's nearly 2012, and if IT research firms are telling you to consider moving low-risk applications to the cloud, it means two things: 1) You're hopelessly behind the times. 2) You need advice that's much more cutting-edge.

I don’t know how I missed this, but at the Gartner IT Symposium in October, Darryl Plummer (Chief of Gartner Cloud Research) apparently stated that enterprises should deploy applications with a public cloud provider by default, deploying them in a private cloud only when the public alternative is not appropriate.

I became aware of Plummer’s recommendation, which caused quite a stir in the blog world when he first announced it, via Twitter earlier this week.

Naturally, much of the furor over Plummer’s pronouncement was a reaction to the quick summary: Gartner prefers public cloud. Wow. That’s a big deal, right? Gartner is probably telling all of its clients that they should trim their private cloud plans and instead focus on public cloud service providers. And, in response, all of its clients are scrapping their private cloud initiatives and planning a big move to public providers, right?

Actually, that’s quite unlikely, for some very sensible reasons.

First, people misunderstand the nature of analyst firms. They assume that these firms are corporate in nature and monolithic in their positions. In fact, a better way to look at analyst firms is that they are much like professional firms (e.g., law firms, consulting partnerships, etc.). Such firms are made up of relatively independent individuals, each with his or her own opinion.

For example, one can present the same issue to two attorneys within the same law firm and get two different recommendations about what to do (I speak here from personal experience). Likewise, two analysts from the same firm will hold different opinions about the right approach to a specific technology issue.

Consequently, even if one or more (or most) analysts at a firm hold one opinion, there are probably others who hold a different opinion. At the very least, when presented with a specific issue, analysts will likely proffer different recommendations, based on their interpretation of the issue. Of course, it’s important to keep in mind that every situation is specific and different. If blanket advice were sufficient, there would be no need for analyst firms. Let me be clear, I’m discussing this phenomenon in general—not picking on Gartner specifically. As I said last week, I am not one to gainsay Gartner.

Second, as a complement to the fact that opinion at analyst firms differs, clients tend to take analyst recommendations selectively. Companies have their own goals and seek support and affirmation for them, searching until they find third-party advice that can be cited as impartial evidence for pursuing the direction they have already decided upon. This is crudely referred to as “shopping for an opinion.”

By citing an outside party, the client is able to justify to its own management why a particular course of action is acceptable. Some IT executives will search until they find an analyst whose recommendations match the actions they seek to implement.

Back to Gartner’s recommendation: Does the research firm recommend public cloud as the default deployment choice? It’s hard to know, as the company hasn’t made any follow-up announcements about public cloud. However, the article about Plummer’s pronouncement that I came across via Twitter does present his specific recommendations:

“While the cloud hype has reached a fever pitch, Plummer points out that there are a number of potential risks. Those include security, transparency, assurance, lock-in and integration issues. If you do decide to start moving applications to the cloud, start at the edges and work your way into the core, says Plummer. The most common apps to start with are email, social, test and development, productivity apps, and Web servers.”

This is appropriate advice, but it raises a question: Why is this being reported as news? These recommendations seem extremely mild and unworthy of any particular note. It’s 2011, and if you’re getting advice that you ought to think about moving test/dev into a cloud environment, it suggests there is something far more worrying than whether public or private cloud should be your default deployment choice.

More disconcerting is the fact that this advice is being proffered as something IT executives should do. It implies that moving to the cloud is something they’re not yet doing. If that’s the case, they are far behind the pack with little hope of catching up.

As a CIO, if you’re going to a conference and learning that you should be working from the outside in, moving low-risk applications to a cloud environment, it’s likely that your company’s business units—your customers—have been doing this for six to eighteen months. And while you take six months to put together a strategy, and then build out your private cloud (so that you can deploy applications to either a public or an internal cloud), the momentum of the business units toward public cloud computing is only going to grow.

We’ve talked to a number of companies who seem to fall into a trap: They decide to start an internal cloud computing pilot program, get it underway, and then the initiative bogs down in the press of everyday business. While it’s understandable that today’s needs must be met, the crush of the present ends up crowding out the critical needs of tomorrow. In effect, to use Stephen Covey’s formulation, IT organizations are skewing their efforts toward “Interruptions” at the cost of “Important Goals” and “Critical Activities.”

The issue of what areas to focus on isn’t academic. As Clayton Christensen describes quite vividly, any time there’s a major shift in technology, incumbent vendors are in danger. Cloud computing represents such a shift, and IT organizations represent the incumbent vendor.

The importance and immediacy of this danger is delineated in another of Gartner’s predictions: “By 2014 CIOs will have lost effective control of 25% of their organizations’ IT spending.” Coincidentally, this prediction was issued at the same IT Symposium as Plummer’s. The article in which this prediction was summed up also reported:

“Mature technologies are code for obsolete,” said Peter Sondergaard, senior vice president at Gartner and global head of research. “You must dare to employ creative destruction to eliminate legacy, and selectively destroy low impact systems.”

One wonders how the Symposium went down with attendees and whether these snippets reflect the general tone of the event. Certainly the recommendations quoted here seem quite forceful and pretty blunt about the need to shift IT attention and investment. If your cloud computing plans are at the level of researching what to do by attending a conference, you’re likely to find this level of advice disquieting, to say the least.

Stepping back a bit, it seems obvious that our industry is at a turning point. The traditional approaches to IT management that have worked for so long are colliding with new approaches that are immensely faster. CIOs who react to the new way of doing things at the pace that used to work risk falling further and further behind.

Bernard Golden is CEO of consulting firm HyperStratus, which specializes in virtualization, cloud computing and related issues. He is also the author of “Virtualization for Dummies,” the best-selling book on virtualization to date.

Follow Bernard Golden on Twitter @bernardgolden. Follow everything from CIO.com on Twitter @CIOonline.