Is cloud computing a dirty word?

I’m sitting at the airport in Montreal, waiting to head to Las Vegas for the Enterprise Cloud Summit. It’s been three years since we ran the first ECS at Interop, and much has changed. Back then, we spent a lot of time talking about what clouds were—helped by heavyweights from big public clouds like Amazon, Google, Rackspace, and Microsoft.

A funny thing happened on the way to the clouds, though. The following year, incumbent vendors started preaching private clouds, preying on fears of lost control and invaded privacy. Change-resistant IT executives listened to them, and soon after, hybrid clouds were a hot topic.

I’ve no idea what a hybrid cloud is. I can’t go and buy one, or subscribe to one. I can, however, have an application that relies on hardware I own and hardware I rent. I think that’s what they mean. To me, hybrid clouds are a gateway drug to the adoption of public computing—as economies of scale and skill take over, companies will be drawn inexorably into a more on-demand world.

With much of what passes for clouds, I call shenanigans. The co-opting of cloud computing by organizations that don’t want change ignores the promise of cloud computing: the end of the IT monopoly. No surprise, then, that the monopolists are resisting.

In DC a few weeks ago, I joked that cloud computing was IT socialism. Nobody laughed. Inside the beltway, it seems, socialism is a dirty word. But the statement rings true: cloud computing is computing for the 99 percent, and the 1 percent that controls technology today is resisting change. IT conservatives worry about retaining control, when instead they should worry about delivering competitive IT services.

The ultimate purpose of clouds, in the clouds-as-utility model, is to abstract a layer of complexity away. It’s the same as payroll services, or the electrical grid: immensely complex, but the business has a simple interface to them. The one percent forgets this at its peril.

That doesn’t mean a company can’t be cognizant of those services. But I no longer need to understand payroll tax, or home wiring, now that we have ADP or the utility grid. Clouds are about managing and quantifying complexity, abstracting it through simple interfaces. James Urquhart, who got me thinking about this a few months ago, is right—complexity abounds. It just does so at a higher layer of abstraction: data centers instead of machines, services instead of ports, and applications instead of subroutines.

Look at it another way. Physics is applied math. Chemistry is applied physics. Biology is applied chemistry. By the time you get up the scientific stack to biology, systems are hopelessly complex, and we have to understand them by observing emergent behaviors and working with patterns. Down at the math level, we can work with the numbers themselves.

At the lower, simpler levels we use deduction, causality, and the burden of proof. At the higher, more complex levels we use induction, correlation, and the strength of probability. This is true in the sciences; it’s also true for complex computing systems. This may be why many of the best IT operators I know are biologists.

What clouds do is allow us to transition from device-level thinking to system-level thinking. Just as biologists, ecologists, and anthropologists work by observation in the imprecise “wet” sciences, so cloud architects are doing “chaotic IT.”

This chaos has its rewards. We get far more emergent complexity and interestingness from biology than from math, at least from a practical, hands-on perspective.

Now I’m really going to wax rhapsodic and philosophical. Gottfried Leibniz spent a lot of time trying to figure out what the best of all possible worlds was. He concluded that it’s the one of plenitude, where the fewest starting conditions give us the most outcomes. In his words, the best world would “actualize every genuine possibility.”

I suspect Leibniz would have considered today’s connected, abstracted, service-oriented Internet a better world than yesterday’s islands of client-server and mainframe computing. Biology, and cloud computing, are complex. They’re messy. And according to Leibniz, they’re also better, because they allow more possibilities.

It’s interesting to note that Leibniz also used this reasoning to argue that the world needs evil in it: a world that admits evil is “better” because it allows more possibilities. So the next time your cloud app dies a horrible, complex death, remember that Leibniz says it’s for the best.

The Democratization of IT slides from Interop

Last week, I presented a session on the democratization of IT. The short version is this: When every employee has better technology in their pocket than they do on their desk at work, and when it’s easy and cheap to deploy new applications that fly under the radar of enterprise IT controls, IT is no longer a monopoly, and it needs to shift what it does dramatically in order to stay relevant.

IT needs to stop being so Canadian

In modern companies, information drives everything from product planning to sales to finances. The flow of knowledge throughout a company is a critical asset.

There’s gold in that traffic—real-time business intelligence, risks and threats, customer insight. IT is the custodian of that information, but most of the time it simply passes raw data on to the rest of the company. And that’s wrong.

If it is to remain relevant, IT must stop being a resource economy and become a producer of finished goods. This has happened before, and it’s a history lesson anyone in information technology needs to study.
