The Democratization of IT slides from Interop

Last week, I presented a session on the democratization of IT. The short version is this: When every employee has better technology in their pocket than they do on their desk at work, and when it’s easy and cheap to deploy new applications that fly under the radar of enterprise IT controls, IT is no longer a monopoly, and it needs to shift what it does dramatically in order to stay relevant.

What Kitchen-Aid taught me about cloud computing

If you’re even slightly interested in utility computing — the move towards on-demand, pay-as-you-go processing platforms — then Nick Carr’s The Big Switch is a must-read. You may not agree with everything he says, but his basic thesis is compelling: Just as we went from running our own generators to buying electricity from the power company, so we’re going to move from running our own computers to buying computing from a utility.

Because I spend a lot of my time writing, I’m constantly trying to out-guess the future. And something I’m obsessed with right now is appliances. Not virtual appliances, or network appliances, but simple appliances like pasta makers, bread machines, meat grinders, blenders, and so on.

If you look at the history of the electrical industry, the businesses that became interesting immediately after ubiquitous power was available were those you could plug into it. Generators were boring; but fans, irons, and fridges were really, really cool.

I touched on the topic back in May at Interop (there’s a SlideShare of the deck here on Bitcurrent). And I think it’s worthy of a lot more consideration because, well, Costco had a sale on Kitchen-Aid mixers.

My wife is an extraordinary cook and an even better baker. And she’s long lusted after a Kitchen-Aid. They’re something of a cult, with a powerful motor, a custom-fit bowl, and dozens of attachments. Most people are happy with a hand-mixer, or a whisk, but there’s an obsessed segment of the market, the Really Serious Home Baker, full of those who simply must have a Kitchen-Aid. So this grey, intimidating, vaguely Cylon-like appliance dominates our countertop.

The Kitchen-Aid is at its core a motor. Its most common use is as a mixer, whisk, or dough hook. But it has attachments that can grind sausage, make ice-cream, roll pasta, shuck peas, and so on. It was conceived in an era when motors were expensive and attachments were cheap. Here’s a great photo of a precursor to the modern Kitchen-Aid.

Today, motors are cheap. We don’t even think about them. We build them into everything, which is why gift tables at weddings are festooned with single-purpose appliances. And the Kitchen-Aid is the workhorse of near-professionals who demand a 600-watt motor that can tug even the toughest foods into submission.

User interfaces are the modern equivalent of appliances. Until recently, the Internet’s user interface was a desktop computer. Connecting to the Internet was a lot of work for a device: Network signaling, properly rendered graphics, keyboard and mouse, a display with enough resolution, and so on. It required a dedicated machine. The “motor” was expensive, the attachments were cheap. So we put many applications on our PC: Mail, instant messaging, games, document viewers, file storage, mapping software, videoconferencing, and so on.

But all that has changed. We now have set-top boxes, game consoles, PDAs, cellphones, book readers, SANs — hundreds of devices, all able to access the Internet, all purpose-built. That PC in the room is increasingly the jack of all trades and master of none. The motor is cheap; the attachments matter now.

There are things the PC is still best for: Workstation tasks, like graphic design or software development. But if you want to understand the future of consumer electronics and user interfaces when CPUs are ubiquitous, consider what happened to kitchens when the motor was everywhere.

Self-powered appliances were all about convenience and portability: You don’t have to set up, dismantle, and clean your Kitchen-Aid every time you want to do something, and you can use an immersion blender single-handed over a hot stove-top. In other words, while many cooks crave a Kitchen-Aid, few use it to grind their morning coffee.

We still have to deal with gadget sprawl. Just as everyone has spare hand mixers and blenders secreted away at the back of their kitchen cupboards, so we’re struggling with multiple devices and seeking a way to reduce them. Certainly, high-end PDAs like the BlackBerry, iPhone, Windows Mobile devices, or the Nokia N95 are tackling this challenge.

It’s also important to remember we’re not just dealing with physical devices, we’re dealing with information. Having multiple blenders isn’t bad; it just wastes space. But having multiple gadgets, each with a part of your digital life on it, is horrible. Which is why synchronization and architectures like Microsoft’s Live Mesh, Google Apps/iGoogle, and Apple’s MobileMe are so important: It’s not just about decentralizing the physical interface, it’s about decentralizing the information.
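To make that concrete, here’s a minimal sketch of the problem those architectures tackle: merging copies of the same data from several devices so each ends up with the newest version. The records, devices, and merge policy below are invented for illustration; none of this is any vendor’s actual API.

```python
# A minimal last-write-wins sync sketch: every record carries a
# timestamp, and merging keeps whichever copy of each key is newest.

def merge(*replicas):
    """Combine per-device copies into one view; the newest entry wins."""
    merged = {}
    for replica in replicas:
        for key, (value, updated_at) in replica.items():
            if key not in merged or updated_at > merged[key][1]:
                merged[key] = (value, updated_at)
    return merged

# Each device holds a partial, possibly stale copy of your data.
phone  = {"contact:alice": ("555-0100", 1214150400)}
laptop = {"contact:alice": ("555-0199", 1214236800),   # edited more recently
          "note:groceries": ("milk, eggs", 1214100000)}

synced = merge(phone, laptop)
print(synced)   # contact:alice -> 555-0199; note:groceries survives too
```

Real services add conflict handling, partial sync, and push notification on top, but the shape is the same: the authoritative copy lives in the cloud, and each device holds a replica.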

When I talk with people about cloud computing and SaaS, I’m always surprised how little mention is made of mobility and ubiquitous computing. To me, these are as big a driver of on-demand platforms like Amazon Web Services or Google App Engine as any of the cost savings or fast development cycles that a cloud can offer.

Future of computing: Forecast calls for partly cloudy

Cloud computing is the hottest Internet insider buzzword since the technologies to which it owes its existence: virtualization and grid computing.

At May’s Interop Unconference, we explored their intersection in an informal jam session, with enthusiastic audience participation, starring Jinesh Varia (Amazon), Kirill Sheynkman (Elastra), Reuven Cohen (Enomaly), Jacob Farmer (Cambridge Computer), and Louis DiMeglio (ScienceLogic).

It’s taken some time to fully digest the results.

To many of us, the cloud is that amorphous blob of semicircular squiggles the IT crowd has been using on whiteboards to represent the Internet since the mid-nineties. Clouds mean we don’t care what’s in them.

[Image: Cloud Computing - everything and the kitchen sink]

Once upon a time, the cloud in the middle of the whiteboard just represented the network — how to get from here to there. All the interesting stuff happened outside its borders. More recently, however, we’ve started moving the rest of the shapes on the whiteboard into the cloud. Applications and infrastructure are now drawn within the borders of that formerly ill-defined and anarchic etherspace.

If you listen to some overzealous cloudnuts, you’ll hear that pretty much everything is rushing headlong into the Internet’s troposphere. But the truth is much more complex, and rational opinions seem to favor a hybrid future of rich clients, hardware, and software. We’ll have a hugely diverse mix of private and public cloud-based services providing both a back-end and a matrix for device interaction.

Aside: I’ll leave defining cloud computing ad nauseam to other bloggers. For our purposes, it’s the trend of outsourcing what you’d normally run in your datacenter to an indefinitely flexible computing platform that’s billed to you as a utility. Traditional hosters don’t count (for me) as cloud providers, but newer managed service hosters might, depending on the level of automation and scalability they employ.
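To see why “billed as a utility” matters, here’s a rough back-of-the-envelope sketch. The prices and workload are made up, not any provider’s actual rates; the point is that utility billing charges for what runs, while owned hardware must be sized for the peak.

```python
# Hypothetical numbers: a workload that needs 2 servers for most of
# the day but spikes to 20 for two hours.
RATE_PER_INSTANCE_HOUR = 0.10             # made-up utility price
hourly_demand = [2] * 22 + [20] * 2       # instances needed, hour by hour

instance_hours = sum(hourly_demand)                       # 84
utility_bill = instance_hours * RATE_PER_INSTANCE_HOUR    # $8.40
owned_capacity = max(hourly_demand) * 24                  # 480 instance-hours, busy or idle

print(f"utility bill for the day: ${utility_bill:.2f}")
print(f"owned hardware utilization: {instance_hours / owned_capacity:.1%}")  # 17.5%
```

The elasticity is the story: you pay for 84 instance-hours instead of owning 20 servers that sit more than 80% idle.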

So what did the Interop crowd conclude?


Defining cloud computing: It’s all about the layers

Cloud concepts can be pretty confusing. But when you tell a small business owner or early-stage startup it means not having to spend a lot of money, it gets simple fast.

Denise Deveau wrote about this recently in the Globe and Mail (and I got quoted a bunch, which was nice). But defining what “cloud” really means is a contentious subject. At the upcoming Cloudcamp in San Francisco (running before Structure, and organized by the energetic Reuven Cohen), this is sure to be a subject of debate.

My overly simple soundbite for the Globe article was that cloud computing was “having computing resources available to you when you don’t own the machines.” But that might get me into trouble: There’s a taxonomy of on-demand services, from platform-as-a-service to hardware-as-a-service. And then there’s grid computing. And of course SaaS gets lumped in with this.

So I’m going to try a more detailed description:

Cloud computing means having a set of abstracted resources available to you, and not worrying about what’s below that abstraction.
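Here’s a minimal sketch of what that taxonomy looks like as layers. The slicing of the stack below is my own rough cut for illustration, not a formal standard: each flavor of -as-a-service just moves the abstraction line, and everything beneath the line stops being your problem.

```python
# The stack, bottom to top. Each *-as-a-service layer draws the
# abstraction line higher, leaving you less to manage yourself.
STACK = ["facility", "hardware", "virtualization",
         "operating system", "runtime", "application", "data"]

# What you still manage at each layer (a rough, illustrative cut);
# everything below it is the provider's problem.
LAYERS = {
    "hardware-as-a-service": STACK[3:],   # you get (virtual) machines
    "platform-as-a-service": STACK[5:],   # you get a runtime; bring code
    "software-as-a-service": STACK[6:],   # you get an application; bring data
}

for layer, yours in LAYERS.items():
    hidden = [part for part in STACK if part not in yours]
    print(f"{layer}: you manage {yours}; abstracted away: {hidden}")
```

The definition above is really just that line: pick the abstraction you want, and stop worrying about what’s below it.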


Moving to SaaS

Here’s a copy of the presentation I’m giving today at Interop. It looks at the perils and best practices of moving an application from internally-run to software-as-a-service.
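As a taste of the sort of peril involved, here’s one hypothetical example of my own (not taken from the slides): the moment an internally run application becomes SaaS, it has many customers in one database, and every query has to be scoped to a tenant.

```python
# Hypothetical illustration: tenant isolation, the classic schema
# change when a single-customer internal app becomes multi-tenant SaaS.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE invoices (tenant_id TEXT, amount REAL)")
db.executemany("INSERT INTO invoices VALUES (?, ?)",
               [("acme", 100.0), ("acme", 250.0), ("globex", 75.0)])

def invoices_for(tenant_id):
    # The internal version could SELECT * FROM invoices; the SaaS
    # version must filter by tenant on every query, or customers
    # will see each other's data.
    return db.execute("SELECT amount FROM invoices WHERE tenant_id = ?",
                      (tenant_id,)).fetchall()

print(invoices_for("acme"))    # [(100.0,), (250.0,)]: only acme's rows
```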