Keeping ourselves honest

Jennifer Bell and the folks at Visible Government took the covers off their much-needed I Believe In Open project. If you’re a Canadian, you should go sign up. Simply put: any elected official who isn’t willing to be transparent and accountable to their electorate has something to hide, and we now have the technology to track their record.

Which makes me wonder what Bitcurrent’s record is. Once upon a time, many of the folks behind Bitcurrent were part of Networkshop, a consulting firm that became Coradiant, a web performance company that helped create the end user experience management space.

Back then, Networkshop talked a lot of trash. We blew the whistle on SSL performance issues, and wrote a huge (250+ page) study on load balancing. We also prognosticated a lot.

Using the Internet Archive's Wayback Machine, I decided to go scoop up some issues of Networkshop News and see how they stood up to scrutiny nine years later. Here's one on how networks change if the PC is no longer the dominant client, from March 2000.

How do you think it stacks up?

How do networks change if the PC isn’t the dominant client any more?

Ask any network manager to draw a client-server relationship on a dusty white-board, and you’ll get a squiggly box. The assumption is that the client is a PC. The sooner we recognize this as false, the less likely we are to miss a major shift in clients.

Here come the games
Sony’s new Playstation 2 will play DVDs and surf the web. The barriers to entry — and inherent complexity — in a set-top box are slim. A teenager can buy one with a few mowed lawns these days. And the lean boxes have no legacy of ISA busses, serial ports, or interrupts to contend with.

The PS2 (no relation to IBM’s failed attempt at a legacy-free PC) has a fast processor at a fraction of PC prices, free-PC models notwithstanding. But there’s more to this than a price war. The PS2 is a different context for networking.

What if the processor’s not the core?
We assume that the processor is the critical piece of a computer. Until now, the CPU has indeed been the bottleneck, and continues to be the bulky center to which all peripherals are tethered. My Toshiba ultraportable looks like an octopus when CD-ROM, LAN, modem, mouse, keyboard, and monitor cables are attached.

As Donald Norman has pointed out, when electric motors first hit the market, you bought one motor and connected it to things — from the washing machine to the vacuum cleaner. As time went by, the motors became cheap, almost standardized, and manufacturers could include one in every device. This greatly reduced the complexity of each device for the operator.

Handspring’s Palm-like PDA features a slot for expansion. The assumption is that this slot adds features to the PDA such as MP3 decoding or GPS location. I think we may have this backwards: I see devices like Handspring’s as plug-on consoles to larger devices. I could, for example, plug my Visor onto a car and get a display, along with contact information and directions. I could plug it onto a stereo to program song names. I could plug it onto a VCR to set recording schedules.

So is the interface the key?
I tend to think that the interface has more sustainable “user gravity” than the processor. As everything in our lives grows a CPU, from our car to our cell phone, we get fragmented content and inconsistent usability. I don’t want one contact list in my car, another in my phone, and another on my computer. Similarly, I don’t want to have to learn multiple interfaces — I want to access them in a consistent manner.

We get shared, structured information from networks, with directories behind them. Problem solved. No, it’s not perfect yet, but it’s good enough for most of us. But what about the interfaces? Let’s look at some consequences:

  • As the number of network clients explodes, network designers need to recognize that a “user” may be a person accessing information from various locations (car, phone, computer, set-top game box) using a common client (a plug-in, clip-on interface of some sort).
  • There’s the issue of context. The user’s frame of mind — and expectations of responsiveness — will change when they’re playing games, or on the phone, or driving.
  • Wireless users have to contend with other factors, such as their proximity to one another (does this mean the network neighborhood really is the neighborhood? Do you really want to browse Ned Flanders’ hard drive?)
  • Ergonomics will alter surfing patterns; I’m certain that people browse differently on a Playstation controller, a roll-and-click interface, a mouse, and a steering wheel.

I don’t really want to delve into wireless issues just yet (I’d be stealing my own thunder from another installment) but if the interface, rather than the processor, becomes the critical factor, the types of users and the way they use the network may change significantly. When a dating website for singles ties in to wireless PDAs so you can track down compatible partners in a bar, the surfing experience is radically different. If you want to mail your Playstation’s saved games to a friend’s house, who runs the mail server?

Bottom line: the cell phones, PDAs, and gaming consoles of the next year are going to change things radically. New people will be defining the protocols, as well as trying to shoehorn old ones into new roles for which they weren’t designed (if I see another acronym ending in ML, I’m going to run for the hills).

Consider the context of use
Increasingly, e-business application design will have to recognize the context of the user — mobile, gaming, in public, and so on. It will also have to consider the expectations in terms of performance, scalability, security and availability that this implies. This means different navigational methods, different network architectures, and different types of information such as voice and iconographics.

Nets will change
We’re kidding ourselves if we think this is all just layer 7 stuff. The underlying networks will have to change to support this traffic. With 65% of all Internet traffic being web-based on port 80, networking devices have web functions burned into their silicon. We optimize for application protocols. As WAP and instant gaming change the traffic dynamic, the nature of the Internet itself will shift.

I don’t pretend to know how this will play out. Will Sony and Nintendo, the PC and Mac of the gaming world, invite Sega (Sun?) to their party, and have Microsoft’s set-top box along for the ride? Will we see RFCs for gaming, and will specifications start to recognize user context and proximity? Books like About Face deal with interactions and workflows, while networking protocols ignore them at their peril. If devices are going to be lean, cheap, and portable then the underlying networks will need to be smart enablers of advanced services.

I tend to think that Japanese networking firms, skilled in “gradual evolution” of products, will challenge networking incumbents. Nokia may also surprise us. Its forays into firewalling, wireless LANs, and clustering (through a recent acquisition of clustered VPN vendor Network Alchemy) have all met with approval, and it certainly has the handset share to command a portable market.

Anyway, the next time I surf a site, I’m going to ask myself what I’d do with a fire button and a joystick or a cell-phone scrolling wheel — and what this means for the underlying network infrastructure. A million gamers can’t be wrong.