Tuesday 15 March 2011

Answering Avinash: is Time on Site a useful metric?

I'm currently reading Web Analytics 2.0 by Avinash Kaushik. It's an amazing book, and I learn something new every time I pick it up.

But there is one thing I don't agree with: Avinash argues in chapter 3 that Time on Site is a very important and useful metric. Here is why I think Time on Site is not a good metric.

First of all, a question: is it better when people spend more time on your site? Yes?
Let's put this another way: do you have a better website than your competitor if people spend more time on your site than on theirs? The answer seems so obvious: of course it's better. More time means that visitors are happier with what they see. It means more 'engagement with your brand', as marketers would say. We all have very little time, so spending more time on a site must be better.

We could consider this as our basic hypothesis: 'more time on site = a better site'.

Not that good a site?

This also seems to be the general opinion when I talk to people from agencies and clients who work in on-line marketing: Time on Site is an important metric. And everybody seems to say: more time on site is better, so we should take action to increase it.

Let's try and test our hypothesis.

We all know one obvious example where more Time on Site isn't better: Google's search engine has a very clear strategy to decrease Time on Site. Google understands that people come to its site wanting to find what they're looking for as quickly as possible. Google optimizes its search engine to shave off every microsecond it can and to send every visit to one of the search results in the shortest time possible. It refines its algorithm and implements features like Google Suggest and Google Instant so we can search faster and leave the page sooner.

So if the success of Google is defined by the idea that more time on site isn't necessarily better, maybe we need to rethink this a bit?

Let us have a look at this example: imagine these 4 visitors coming to my e-commerce site.

Visitor 1 comes to my e-commerce site to choose and buy a product. He finds the info he needs and buys the product he likes. He stays 5' on my site and is very happy with his visit.

Visitor 2 comes to my site to see where his order is, since it wasn't delivered on time and he hasn't heard anything yet. He tries to find a 'track order' section or function, doesn't find it, tries to log in but can't recall his password, and gets lost in the 'recover password' procedure. His only other option is calling an expensive 902 customer-care number, and he gives up after a while. Visitor 2 stays 8' on the site and leaves frustrated and angry.

Visitor 3 doesn't like buying on-line and wants to talk directly to a rep about some technical issues with your product. He comes to the site, easily locates the 'contact' function and gets a toll-free number in no time. He stays 2' on the site and is very satisfied with the info the salesperson gives him on the phone.

Visitor 4 comes to your site after clicking on a banner ad with a promotion. He doesn't see the promotion on the landing page. He's a good sport, so he clicks on a 'promotions' button, but the link is broken and he gets a 404 error page. Visitor 4 stays just 1' on your site and, needless to say, he's not a satisfied customer.

So we have an average Time on Site of 4'.
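Just to make the arithmetic concrete, here's a small sketch in Python using the hypothetical numbers from the example above, showing how the average is computed and how little it tells us about satisfaction:

```python
# Hypothetical visit data from the example above: minutes on site
# and whether the visitor left satisfied.
visits = [
    {"visitor": 1, "minutes": 5, "satisfied": True},   # bought a product
    {"visitor": 2, "minutes": 8, "satisfied": False},  # lost in password recovery
    {"visitor": 3, "minutes": 2, "satisfied": True},   # got the toll-free number
    {"visitor": 4, "minutes": 1, "satisfied": False},  # hit a 404 on a promo link
]

average = sum(v["minutes"] for v in visits) / len(visits)
print(f"Average Time on Site: {average}'")  # 4.0'

# Time on Site says nothing about satisfaction: happy and unhappy
# visitors sit on both sides of the average.
for v in visits:
    side = "above" if v["minutes"] > average else "below"
    mood = "happy" if v["satisfied"] else "unhappy"
    print(f"Visitor {v['visitor']}: {v['minutes']}' ({side} average, {mood})")
```

Run it and you see one happy and one unhappy visitor on each side of the 4' average, which is exactly the problem.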


Visitor 1 stayed 25% longer than average on your site, which seems to confirm that people who spend more time on site are happier and that it is 'better for your brand'.

Visitor 2 stayed twice the average time on your site and had an awful experience. This makes us question the hypothesis that a longer visit is a better visit.

Visitor 3 was happy, but stayed just half the average time. So maybe we have to turn our hypothesis around. Maybe it's not true that 'more time = better site'. Maybe it should be: 'less time = better site'.

Visitor 4 stayed the shortest time, just 1', and wasn't happy either, so 'less time' doesn't guarantee a 'better site'.

So as you can see, we had 2 happy visitors and 2 unhappy visitors. Of the happy visitors, one stayed above average and the other below. Of the unhappy visitors, one also stayed above average and the other below. By just looking at the Time on Site metric, we can't say whether 'more time' is better or 'less time' is better. So we have to start interpreting: one person may think that for the majority of visits 'more time' is better; another will argue the opposite.
This brings us back to Avinash, who states the following when writing about Exit Rates:

If you have to overlay your own opinions and interpret any metric whether the data is 'good' or 'bad', then you have a bad metric on your hands.

Avinash states this when discussing whether a high Exit Rate of a page is good (people got to do what they wanted to do and left happily) or bad (people got stuck and left on that page).

In my opinion, the same applies to Time on Site: there is no way to know whether the data relates to a good or a bad experience.

What's the solution?

A better metric than Time on Site would be 'Time to Task Completion'. It's a metric that Gerry McGovern uses in his 'Top Task Management' strategy. McGovern argues that you first have to find out what the top tasks on your site are: the most important tasks visitors come to your site for. These tasks can be subscribing to a newsletter, booking a flight, or buying a product. Once you've got the top tasks, McGovern argues, you have to use remote usability tests to see what percentage of your users can complete them and how long it takes. For the latter he uses the 'Time to Task Completion' metric. And he's clear about it: the time to completion should be as low as possible. On the web, people are very impatient, and you do them a favour when completing a task takes as little time as possible.
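As a rough sketch of how you might report this metric (the task names and timings below are invented for illustration), assume each usability-test participant either completes a task in some number of seconds or gives up:

```python
from statistics import median

# Hypothetical remote usability-test results: for each top task,
# completion times in seconds (None = the participant gave up).
results = {
    "subscribe to newsletter": [35, 48, None, 40, 62],
    "track an order":          [95, None, None, 120, 88],
}

for task, times in results.items():
    completed = [t for t in times if t is not None]
    completion_rate = len(completed) / len(times)
    print(f"{task}: {completion_rate:.0%} completed, "
          f"median time {median(completed)}s")
```

Unlike Time on Site, both numbers here point in a clear direction: you want the completion rate up and the median time down.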

Let's look at some practical examples:

YouTube: you would think it's important for YouTube that people stay 'as long as possible' on their site. The main purpose of the site is to let people watch videos, so more Time on Site means more time to watch videos.
But we could repeat the exercise from above:

Visitor 1 comes to YouTube and looks for a video he wants to see. It's a 2' video. He watches it and leaves the site. He stayed 3'.

Visitor 2 comes to YouTube, looks for a video, only finds it after 3 searches, starts watching, doesn't like it, searches again a couple of times and leaves unhappy after 5'. So is it really better to have a higher Time on Site?

If I were YouTube, I'd find out my top tasks, which could be 'find a video' (= how well does my search work?) and 'find and watch another video' (= retaining people who've watched a video), and I'd measure the time it takes people to complete these tasks. That would give me ideas for offering a better experience.

Facebook: instead of measuring Time on Site for Facebook, I'd optimize the 'time to comment on a friend's wall post', 'time to upload pics' and 'time to create an invite for an event'.

News sites: especially for content sites, like news sites, it is said that Time on Site is an important metric. When people stay longer on your site, they read more articles, and that would mean a better value offer.
Again, I'm not sure about this: personally, I also use these sites to have a quick look at what's going on, and long articles aren't what I'm especially looking for. So maybe a 50'' visit in which I can see the 5 highlights at a glance is what I'm after, not the in-depth 30' visit.

One last remark on Time on Site: metrics should always invite action. In my experience, Time on Site is a very difficult metric to improve with marketing actions. So if you're using the metric in your action plan, you'll find it very hard to make your actions influence it the way you want.