Is cloud computing a dirty word?

I’m sitting at the airport in Montreal, waiting to head to Las Vegas for the Enterprise Cloud Summit. It’s been three years since we ran the first ECS at Interop, and much has changed. Back then, we spent a lot of time talking about what clouds were, helped by heavyweights from big public clouds like Amazon, Google, Rackspace, and Microsoft.

A funny thing happened on the way to the clouds, though. The following year, incumbent vendors started preaching private clouds, preying on fears of lost control and invaded privacy. Change-resistant IT executives listened to them, and soon after, hybrid clouds were a hot topic.

I’ve no idea what a hybrid cloud is. I can’t go and buy one, or subscribe to one. I can, however, have an application that relies on hardware I own and hardware I rent. I think that’s what they mean. To me, hybrid clouds are a gateway drug to the adoption of public computing—as economies of scale and skill take over, companies will be drawn inexorably into a more on-demand world.

With much of what passes for clouds, I call shenanigans. The co-opting of cloud computing by organizations that don’t want change ignores the promise of cloud computing: the end of the IT monopoly. No surprise, then, that the monopolists are resisting.

In DC a few weeks ago, I joked that cloud computing was IT socialism. Nobody laughed. Inside the Beltway, it seems, socialism is a dirty word. But the statement rings true: cloud computing is computing for the 99 percent, and the 1 percent that controls technology today is resisting change. IT conservatives worry about retaining control when instead they should worry about delivering competitive IT services.

The ultimate purpose of clouds, in the clouds-as-utility model, is to abstract a layer of complexity away. It’s the same as payroll services, or the electrical system: immensely complex, but the business has a simple interface to them. The one percent forgets this at its peril.

That doesn’t mean a company can’t be cognizant of those services. But I no longer need to understand payroll tax, or home wiring, now that we have ADP and the utility grid. Clouds are about managing and quantifying complexity, abstracting it through simple interfaces. James Urquhart, who got me thinking about this a few months ago, is right that complexity abounds. It simply does so at a higher layer of abstraction: data centers instead of machines, services instead of ports, and applications instead of subroutines.
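To make the interface point concrete, here’s a minimal sketch in Python. Everything in it is invented for illustration (the BlobStore class, its node count, its placement scheme); it isn’t any real cloud API, just the shape of the idea: a two-method interface in front of placement and replication the caller never sees.

    import hashlib

    class BlobStore:
        """The two-method interface the business codes against."""

        def __init__(self, nodes=3, replicas=2):
            # Each dict stands in for a machine in some data center.
            self._nodes = [{} for _ in range(nodes)]
            self._replicas = replicas

        def _placement(self, key):
            # Deterministic placement: hash the key, pick consecutive nodes.
            h = int(hashlib.sha1(key.encode()).hexdigest(), 16)
            start = h % len(self._nodes)
            return [(start + i) % len(self._nodes) for i in range(self._replicas)]

        def put(self, key, data):
            # Placement, replication, and (in a real cloud) retries and
            # failover all hide behind this one call.
            for n in self._placement(key):
                self._nodes[n][key] = data

        def get(self, key):
            # Serve the read from the first replica that has the key.
            for n in self._placement(key):
                if key in self._nodes[n]:
                    return self._nodes[n][key]
            raise KeyError(key)

    store = BlobStore()
    store.put("invoice-42", b"march payroll run")
    print(store.get("invoice-42"))  # the caller never saw a machine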

Look at it another way. Physics is applied math. Chemistry is applied physics. Biology is applied chemistry. By the time you get up the scientific stack to biology, systems are hopelessly complex, and we have to understand them by observing emergent behaviors and working with patterns. Down at the math level, we can work with the numbers themselves.

At the lower, simpler levels we use deduction, causality, and the burden of proof. At the higher, more complex levels we use induction, correlation, and the strength of probability. This is true in the sciences; it’s also true for complex computing systems. This may be why many of the best IT operators I know are biologists.

What clouds do is allow us to transition from device-level thinking to system-level thinking. Just as biologists, ecologists, and anthropologists work by observation, making up the imprecise “wet” sciences, so cloud architects are doing “chaotic IT.”
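As a toy illustration of that inductive stance, here’s a Python sketch; the latency numbers and the two-sigma threshold are invented. It flags statistical outliers in a stream of response times without claiming anything about which machine or subroutine caused them.

    from statistics import mean, stdev

    # Invented response times, in milliseconds, with one slow request among them.
    latencies_ms = [102, 98, 105, 110, 97, 101, 480, 99, 103, 95]

    mu = mean(latencies_ms)
    sigma = stdev(latencies_ms)

    # Induction, not deduction: anything more than two standard deviations
    # from the mean is "probably wrong," with no root cause attached.
    anomalies = [x for x in latencies_ms if abs(x - mu) > 2 * sigma]
    print(anomalies)  # [480]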

This chaos has its rewards. We get a lot more emergent complexity and interestingness from biology than from math, at least from a practical, hands-on perspective.

Now I’m really going to wax rhapsodic and philosophical. Gottfried Leibniz spent a lot of time trying to figure out what the best of all possible worlds was. He concluded that it’s the one of plenitude, where the fewest starting conditions give us the most outcomes. In his words, the best world would “actualize every genuine possibility.”

I suspect Leibniz would have considered today’s connected, abstracted, service-oriented Internet a better world than yesterday’s islands of client-server and mainframe computing. Biology, and cloud computing, are complex. They’re messy. And according to Leibniz, they’re also better, because they allow more possibilities.

It’s interesting to note that Leibniz also used this reasoning to argue that the world needs to have evil in it: a world that admits evil is “better” because it allows more possibilities. So the next time your cloud app dies a horrible, complex death, remember that Leibniz says it’s for the best.

The move to turnkey computing

At this year’s Cloud Connect, Werner Vogels predicted a future in which everything-as-a-service is the norm. While enterprise IT often equates virtual machines with the cloud, the reality is that virtual machines are only one of dozens of services Amazon offers. Its competitors aren’t far behind: companies like Google offer a horde of APIs, and even more traditional memory/compute/storage providers like Joyent are adding turnkey products for large-scale storage.
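As one concrete example of the turnkey idea, here’s a minimal sketch using the classic boto library; it assumes AWS credentials are configured in the environment, and the queue name is invented. It rents a durable message queue with a few API calls instead of installing and nursing one on a virtual machine.

    import boto
    from boto.sqs.message import Message

    # Assumes AWS credentials in the environment; "orders" is an invented name.
    conn = boto.connect_sqs()
    queue = conn.create_queue("orders")  # a durable queue, provisioned on demand

    msg = Message()
    msg.set_body("order #1001")
    queue.write(msg)  # no broker to install, patch, or monitor

    received = queue.read()
    if received is not None:
        print(received.get_body())
        queue.delete_message(received)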

In the end, nobody wants to see the sausage being made. Recent announcements by folks like VMware, public provider acquisitions of PaaS products, competing private stacks like OpenStack and Cloud.com, and private cloud tools that run higher up the stack remind us of one thing above all else: herding your boxen is a distraction from the business of building software and deploying applications.

I tried to argue this point at Cloud Connect, in a presentation entitled “The Move to Turnkey Computing.” Here it is on SlideShare, as a PDF with speaker’s notes.