There are hundreds of tools available to show what’s going on with a production website. But problems arise when people use the wrong tool for the job, which often leads to bad conclusions. In my experience, operational questions fall into four major categories:
- What did my users do?
- Could they do it?
- Why did they do it?
- How did they do it?
There are four classes of tool that answer these four questions. But they’re all similar enough to cause confusion. Here’s a clarification.
What did they do?
This is the domain of web analytics. Omniture, having swallowed Visual Sciences and WebSideStory, is the big player here on the commercial side. When we talked with Omniture’s CMO, Gail Ennis, at the firm’s annual summit, she said the firm was “spending a lot of time” on the economies of scale and on keeping users of its competitors’ platforms satisfied.
“They’re a passionate group. When you’re thinking about acquiring a competitor it isn’t always so obvious. But it feels really good to everybody; we find they’re very passionate about web analytics, even though it’s interesting how unintegrated [WebSideStory and Visual Sciences] were. [Visual Sciences] are coming from an offline, multi-channel perspective, but [Omniture and WebSideStory] are coming from an online perspective. But when the two of those come together, you watch the sales organization interact around the possibilities you can get online and offline. They say things like, ‘What about my [interactive voice response] system?’”
Web analytics started out as a way of tracking users, but it has since evolved into a set of tools for maximizing desired outcomes such as purchases or enrollment. By combining a desirable outcome (buying) with a visitor’s history (clicking on an ad, navigating through a site a certain way), companies can tune their marketing, designs, offers, and partners to get better results.
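The segmentation idea described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor’s implementation: tie each visit’s outcome to one piece of the visitor’s history (here, just the referral source; the function name and data shape are my own) and compare conversion rates across segments.

```python
from collections import defaultdict

def conversion_by_source(visits):
    """Group visits by where the visitor came from and compute the
    fraction that ended in the desired outcome (e.g., a purchase).

    visits: iterable of (source, converted) pairs, where `source` is a
    referral channel and `converted` is True if the visit ended in a sale.
    """
    totals = defaultdict(lambda: [0, 0])  # source -> [conversions, visits]
    for source, converted in visits:
        totals[source][1] += 1
        if converted:
            totals[source][0] += 1
    return {s: conv / n for s, (conv, n) in totals.items()}

# Example: the banner-ad segment converts at half the rate of search.
rates = conversion_by_source([
    ("search", True), ("search", False),
    ("banner", False), ("banner", False), ("banner", True), ("banner", False),
])
```

A real analytics package does this across many dimensions at once (campaign, landing page, geography), but the core calculation is the same.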
Most analytics is report-centric, but in the real-time Internet we’re starting to see self-optimizing analytics. Omniture purchased Offermatica (now called Test and Target), which can dynamically modify a site to get the best results. And startup Adchemy does what it calls “operational” analytics to maximize the return on ad spending.
Of course, analytics only show you what happened. But if, for example, a marketing campaign is outrageously successful, so successful that it brings down the site, then what the analytics will show is that very few people came to the site. Which brings us to the next category.
Could they do it?
For folks who run sites, it’s less about the ROI of the site (that’s marketing’s concern) and more about performance and availability. This is where End User Experience Management (EUEM) comes in; it’s the IT complement to analytics tools. Companies like Coradiant* and Tealeaf, as well as many of the large network management companies, have products that watch a site’s users and see where they get stuck, both at the aggregate level (“the login page is unusually slow”) and at the individual level (“here’s where Bob got that error”).
Initially deployed as simple collectors, EUEM tools have evolved over the years with automation, advanced reporting, and other enhancements that let IT do everything from alerting to capacity planning to SLA reporting. While these products were first deployed for high-value applications (such as Software-as-a-Service offerings) and e-commerce sites, they’re starting to be used in enterprise software applications as well.
There’s one other important thing to consider: tools that watch end user experience only have data when there are users. So site operators also rely on some form of synthetic testing to ensure their site is reachable from the Internet. There are lots of hosted services that do this, including Keynote, Gomez, Webmetrics, Alertsite, and Dot-com Monitor. Lately, I’ve been impressed by Pingdom and its integration with SMS for notifications.
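At its core, a synthetic test is just a scheduled probe. Here’s a minimal sketch of one such probe, assuming nothing about how any of the hosted services above actually work; the thresholds are hypothetical and would be tuned per check.

```python
import time
import urllib.request

# Hypothetical thresholds; hosted services let you tune these per check.
TIMEOUT_S = 10
SLOW_MS = 2000

def classify(http_status: int, elapsed_ms: float) -> str:
    """Turn one probe result into an up/slow/down verdict."""
    if http_status >= 500:
        return "down"
    if elapsed_ms > SLOW_MS:
        return "slow"
    return "up"

def check(url: str) -> dict:
    """Fetch a URL once, the way a synthetic monitor probes from each
    of its locations, and classify the result."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=TIMEOUT_S) as resp:
            elapsed_ms = (time.monotonic() - start) * 1000
            return {"verdict": classify(resp.status, elapsed_ms),
                    "http": resp.status, "ms": round(elapsed_ms)}
    except Exception as exc:  # DNS failure, timeout, connection refused...
        return {"verdict": "down", "error": str(exc)}
```

A real service runs something like this from many geographic locations on a schedule, and pages someone (over SMS, say) only after several consecutive probes come back “down.”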
But let’s say you have a site where users are arriving and booking hotel rooms, then abandoning at the last minute. This happens all the time. Your site has the right offers — it’s pulling people in — and the servers and network are delivering pages flawlessly. What’s going on? To answer this, you need to get inside the head of your customer.
Why did they do it?
In a classic example of this, iPerceptions’ CEO Jerry Tarasofsky cites a hotel chain experiencing exactly the problem described above. Following the installation of iPerceptions’ surveys, the hotel chain learned that people were checking availability but had no intention of making a reservation. So the hotelier was able to adjust its site to accommodate them, and to encourage them to book.
Of course, surveys aren’t always filled out — many users ignore them, either because they’re too busy or because they don’t want to share data. But brief survey models (like the free, four-question-long one that iPerceptions recently launched in conjunction with blogger/analytics guru Avinash Kaushik) can yield useful insights into why users visited the site and whether they accomplished what they’d hoped to.
How did they do it?
The final part of the visibility problem is understanding how someone went about accomplishing a task. Did they click on the blue text or the pink button? There are many ways to navigate an application, and not all of them are obvious to designers.
Clicktale is a notable company doing this, and they have a number of interesting reports on usability analysis. Robot Replay has a similar offering. But recording every mouse and key gesture puts a lot of burden on the client, particularly if that client has to record not only user actions but what the page looked like at the time they interacted with it.
On the other hand, a more lightweight version of this analysis is available from Crazyegg, which shows mouse clicks on a page and groups them by traffic source. It’s lighter because it doesn’t try to record every gesture on the page, just the outcome. But it’s a great way to tell whether, for example, users are clicking on a picture thinking it’s supposed to lead somewhere.
Putting it all together
These four perspectives give a web business a far greater understanding of its site. Each set of tools answers a different, complementary question. Armed with data about what users did, whether they could do it, why they did it, and how they did it, operators can offer a much more satisfying experience. Support is easier, and ROI goes up.
But what I find most revealing about these tools is the accountability they produce within organizations. Instead of going by instinct, opinion, or what the guy in the suit thinks, companies start to make decisions based on fact.
[* Disclaimer: Alistair Croll co-founded web performance firm Coradiant, which makes End User Experience Monitoring appliances.]