If you’re involved in any aspect of the web or technology, you’re probably aware of Web2Expo. Spun off from Web2Summit, it has become the go-to conference for people who have to deliver on the things that VCs and board members dream up in their reality distortion fields.
We’re involved in a couple of things with Web2Expo. First of all, Sean Power and I are teaching a three-hour course on web monitoring, which is a synopsis of our Complete Web Monitoring book (AKA the Raven Book), coming out from O’Reilly later this year. It’s called Watching Websites: A Report from the Frontlines of Web Monitoring. We’re hoping to provide a holistic view of all the tools and techniques companies need to use to understand their presence online (no small task for just three hours!).
Second, I’m moderating a panel on cloud computing with Benjamin Black of Opscode, Lew Moorman of Rackspace.com, Kevin Gibbs of Google, and possibly a fourth participant we’re still confirming. This will be a great discussion — everyone’s been so busy talking about how cloud computing is utopian idealism that we often forget about the job of managing all those virtual components.
There’s an interesting response from Chris Hoff over at Rational Security to my GigaOm piece about cloud computing and security. Chris makes some great points (and flagged a good study on computer fraud that refutes some of what I said).
Worth a read. What do you think? Are clouds less secure than in-house computing? The usual answer seems to be “it depends” — but what does it depend on? Can we come up with some rules for what’s safe to do in a cloud and when?
Maybe I can convince Chris to come to Vegas and get into a pointed argument about cloud computing risks.
Jennifer Bell and the folks at Visible Government took the covers off their much-needed I Believe In Open project. If you’re a Canadian, you should go sign up. Simply put: any elected official who isn’t willing to be transparent and accountable to their electorate has something to hide, and we now have the technology to track their record.
Which makes me wonder what Bitcurrent’s record is. Once upon a time, many of the folks behind Bitcurrent were part of Networkshop, a consulting firm that became Coradiant, a web performance company that helped create the end user experience management space.
Back then, Networkshop talked a lot of trash. We blew the whistle on SSL performance issues, and wrote a huge (250+ page) study on load balancing. We also prognosticated a lot.
Using the Internet Archive’s Wayback Machine, I decided to go scoop up some issues of Networkshop News and see how they stood up to scrutiny nine years later. Here’s one on how networks change if the PC is no longer the dominant client, from March 2000.
How do you think it stacks up?
Continue reading “Keeping ourselves honest”
We speculated on vertical stratification of clouds at Interop Unconference back in May: demand for specialized cloud platforms will arise despite the availability of highly centralized, low-cost utility computing (e.g., Google, Amazon), because specific privacy or business-process requirements will call for value-added services and specialized architectures. Could one imagine a cloud provider specializing in HIPAA compliance?
Well, an example just hit our radar: Fedcloud offers “Federally Compliant Trusted Cloud Computing” (thanks, Data Center Knowledge!): “A Trusted Cloud Computing Environment: Apptis and ServerVault combined our capabilities to provide you computing in an on-demand infrastructure that enables you to acquire, utilize, and disengage without contractual dependency (subscription fees, licenses, or long-term commitments). This extraordinary capability offers a utility bundle inclusive of hardware, software, personnel (24x7x365 engineering and operations, and application management) all with federally compliant security, processes, and procedures.”
Why this verticalization? Architecture and operations can matter a lot when specific requirements are introduced, and there’s an opportunity for premium margins in utility computing that addresses specific industries. You may need to be in a very narrow geographical area, or need technologies specific to your trade running in the cloud data center. Perhaps you aren’t allowed to share a hypervisor with other organizations, or you need on-site staff trained in particular arcane skills. Some vertical clouds could theoretically rest on top of infrastructure-service clouds, the same way that Rightscale and Elastra sit on top of AWS; others will need an entirely different architecture. Look for wide diversification and layering of these “vertical clouds” in the next few years, and a healthy ecosystem of options for cloud consumers!
While on the topic of cloud computing, Todd Hoff has an excellent short list of other cloud computing blogs to check out!
Google’s new Insights feature, which shows statistics on search terms, yields some interesting results when it comes to Cloud Computing.
There’s no doubt that it’s a hot topic; Insights shows important events related to that topic over time, which is fascinating: like a stock ticker, but for ideas.
Continue reading “Cloud computing, worldwide”
I wrote a piece a while back about how centralized computing makes a cloud a big target. I didn’t want to get into the biological origins of this stuff, but one commenter was right: Monoculture is a precursor to extinction.
In university (which seems a long, long time ago) I wrote my thesis on evolutionary theory and product life cycles. Admittedly, not a screamingly fun topic, but it did give me a chance to read up on the Burgess Shale and other such things.
Now comes word that Amazon’s EC2, by virtue of the independence it affords hosters, is being used by bad guys for nefarious deeds (thanks to Rachel Chalmers of The 451 for pointing it out). This introduces an additional risk: many of the Internet’s defense mechanisms involve black-holing specific hosters when the sites they’re operating do bad things.
Of course, when you’re hosting many applications, having one of them get blacklisted can be a nuisance for all the others. What’s interesting is the back-pressure we’re seeing arise against the popularity of cloud computing: At Structure, we debated the fear of lock-in; Stacey has a great piece on enterprise obstacles to adoption; and here, we’re seeing the downside of on-demand, easy-access platforms.
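The black-holing mechanism itself is easy to sketch. Here’s a minimal, illustrative example of how a DNS-based blackhole list (DNSBL) query works, assuming the widely used Spamhaus `zen.spamhaus.org` zone: an IP that resolves inside the zone is listed, an NXDOMAIN answer means it isn’t, and every application sharing that address inherits the verdict.

```python
import socket

def dnsbl_name(ip, zone="zen.spamhaus.org"):
    """Build the DNSBL query name: reverse the IPv4 octets, append the zone."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError("expected a dotted-quad IPv4 address")
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ip, zone="zen.spamhaus.org"):
    """True if the address resolves inside the blackhole zone (i.e., it's listed)."""
    try:
        socket.gethostbyname(dnsbl_name(ip, zone))
        return True
    except socket.gaierror:
        return False  # NXDOMAIN: the address isn't on the list

if __name__ == "__main__":
    # One bad tenant can land a shared address here for everyone behind it.
    print(dnsbl_name("203.0.113.7"))
```

The granularity is the IP address, not the application, which is exactly why one misbehaving tenant on a shared platform can take its neighbors down with it.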
In other words, the bigger they are, the harder they fall. And that doesn’t just apply to dinosaurs.
Cloud computing is the hottest Internet insider buzzword since the technologies to which it owes its existence: virtualization and grid computing.
At May’s Interop Unconference, we explored their intersection in an informal jam session with enthusiastic audience participation, starring Jinesh Varia (Amazon), Kirill Sheynkman (Elastra), Reuven Cohen (Enomaly), Jacob Farmer (Cambridge Computer), and Louis DiMeglio (ScienceLogic).
It’s taken some time to fully digest the results.
To many of us, the cloud is that amorphous blob of semicircular squiggles the IT crowd has been using on whiteboards to represent the internet since the mid-nineties. Clouds mean we don’t care what’s in them.
Once upon a time, that cloud in the middle of the whiteboard used to just represent the network — how to get from here to there. All the interesting stuff happened outside its borders. More recently, however, we’ve started moving the rest of the shapes on the whiteboard into the cloud. Applications and infrastructure are now drawn within the borders of that formerly ill-defined and anarchic etherspace.
If you listen to some overzealous cloudnuts, you’ll hear that pretty much everything is rushing headlong into the Internet’s troposphere. But the truth is much more complex, and rational opinions seem to favor a hybrid future of rich clients, hardware, and software. We’ll have a hugely diverse mix of private and public cloud-based services providing both a back-end and a matrix for device interaction.
Aside: I’ll leave defining cloud computing ad nauseam to other bloggers. For our purposes, it’s the trend of outsourcing what you would normally run in your data center to an indefinitely flexible computing platform that is billed to you as a utility. Traditional hosters don’t count (for me) as cloud providers, but newer managed-service hosters might, depending on the level of automation and scalability they employ.
So what did the Interop crowd conclude?
Continue reading “Future of computing: Forecast calls for partly cloudy”
Craig Balding recently launched a blog, cloudsecurity.org, looking at the intersection of cloud computing and security.
The challenges are significant: the premise of cloud computing is that you’re a tenant within someone else’s world (generally achieved through virtualization). Consequently, you can’t ever see your entire environment down to the hardware; that would defeat the economics.
Being a virtual machine is like being Neo in the Matrix: You don’t know if the machines are benevolent or not.
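Unlike Neo, though, a guest can usually tell it’s in the Matrix, even if it can’t see past it. Here’s a small sketch, assuming a Linux guest where firmware identity is exposed via sysfs (`/sys/class/dmi/id/sys_vendor`); the vendor strings are illustrative, not exhaustive:

```python
from pathlib import Path

# Vendor strings commonly reported by hypervisors (illustrative, not exhaustive).
KNOWN_HYPERVISOR_VENDORS = ("Xen", "QEMU", "VMware", "Microsoft Corporation", "innotek")

def guest_vendor(dmi_path="/sys/class/dmi/id/sys_vendor"):
    """Return the DMI system vendor string, or None if it can't be read."""
    try:
        return Path(dmi_path).read_text().strip()
    except OSError:
        return None

def looks_virtual(vendor):
    """True if the vendor string matches a known hypervisor."""
    return vendor is not None and any(h in vendor for h in KNOWN_HYPERVISOR_VENDORS)

if __name__ == "__main__":
    v = guest_vendor()
    print("vendor:", v, "| virtual?", looks_virtual(v))
```

Knowing you’re virtualized is the easy part; knowing whether the hypervisor underneath is benevolent is the part you have to take on trust.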
Wonder if Craig runs his own machines?
Web2Expo San Francisco is coming up. The conference has grown quickly; it began as a reaction to the popularity of O’Reilly’s more blue-sky conference, Web2.0 (now called Web2Summit). Following the success of Web2Summit, whose organizers sold out their 2,000 registrations quickly and had to turn away thousands more, they launched the Expo.
Web2Expo isn’t just a spill-over conference; it’s now got an identity of its own. If the folks at Web2Summit speculate and scheme, then it’s left to the Web2Expo attendees to figure out how to build what their less grounded peers have already promised.
Continue reading “Web2Expo San Francisco”
After an interesting weekend, I wrote an article for GigaOm about WordPress themes and vulnerabilities. It got lots of press (it even made the front page of Digg!), and several people propelled the story to new levels.
Nice to see the amount of activity on the topic and how much coverage it got. Derek, Paul, and Mark had all rung the warning bell earlier on.