
The New Perspective on Performance

Call it the Moore’s Law of User Impatience: the more time users spend online and the richer and more complex Web pages become, the less time users are willing to wait for pages to load. In 2006, users would tarry a leisurely four seconds while a page loaded. By 2009, that threshold had dropped to two seconds, and users exited sites in droves at the three-second mark (Akamai, "Akamai Reveals 2 Seconds as the New Threshold of Acceptability for eCommerce Web Page Response Times," 9/14/2009). Today, Microsoft reports that a minuscule difference of 250 milliseconds (that’s one-quarter of a second; if you just blinked, that took longer) is enough to give one site a competitive advantage over another (The New York Times, "For Impatient Web Users, an Eye Blink Is Just Too Long to Wait," by Steve Lohr, 2/29/2012).

The need for online speed is well-documented through stats and stories of abandoned carts, visitors lost, brands tarnished. The rule today more than ever is that every millisecond of delay costs real money in lost revenue.

Speed is an elusive and moving target, in large part because of the tremendous complexity of Web pages today. Websites strive to create ever more engaging, interactive and visually dazzling experiences to satisfy users who expect to be impressed.

For perspective, in 1995 the average Web page was just 14.1k and contained 2.3 objects. In 2010, the average page was 498k with 75 objects. If that trajectory holds, this year the average page will have 83 objects weighing in at 684k, or nearly three-quarters of a megabyte ("Visualizing Web Performance" infographic).

It’s not dissimilar to the trajectory of desktop software (remember desktop software?) where, as computers gain more and more horsepower (thanks to the real Moore’s Law), software gets bloated at a slightly greater pace, ensuring that the hardware will always be taxed. So too, as broadband has gained speed and penetration, websites have put on more and more weight, perpetuating the struggle to meet users’ expectations for speed.

In a world where the blink of an eye means the difference between success and failure, site developers and owners face the ongoing challenge of shaving milliseconds anywhere they can and ensuring that their pages are delivered quickly no matter how many users are accessing their site or where in the world those users might be. Fortunately, new tools just introduced to the marketplace are designed to give site owners the data to do just that.

Focusing performance on the user

There’s always been a danger in performance management of getting caught up in the data and potentially forgetting what it means for the user. Page load time is the holy grail; it’s the number everyone looks at, the number everyone tries to shrink. But can a single number, representing a start-to-finish process, really reflect what the experience is like for the user on the other side of the browser?

“It’s very dangerous when you disconnect performance from the user experience,” says Ben Rushlo, Keynote’s director of Internet technologies. “Numbers in and of themselves can be misleading and not necessarily very useful. But if you start marrying those numbers with questions like, ‘What does it mean to the user? Are they getting content quickly in the browser? Are they getting to interact with this page very quickly? How does that flush out in terms of page performance?’ That’s when I think performance management becomes very powerful.”

The answers to those questions recently became a lot more accessible. The W3C’s Web Performance Working Group has released its Navigation Timing API, which collects timing data directly from within the browser and provides visibility into each of the major phases that make up page load time. It’s currently available in the latest releases of Internet Explorer, Chrome, and Firefox.

With Navigation Timing, says Internet Explorer Program Manager Jatinder Mann, who also works in the W3C working group, "the browser itself saves timestamps from various events in the process of navigating to a page, including timestamps for the starting and ending of phases like prompting to unload the previous document – the time it takes to run any unload script of the previous document – re-directions, hitting the application cache, the DNS lookup time, TCP connection time, server request response time, as well as processing time for DOM operation, like DOM loading, DOM interactive, DOM content loaded, DOM complete events."

This is "timing information on things that developers can change," Mann continues, and it comes directly and accurately from the browser itself, rather than from a JavaScript-based workaround or other intercept.
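As a sketch of how those phases can be read out, the snippet below computes per-phase durations from Navigation Timing timestamps. In a browser the timestamps would come from `window.performance.timing`; here a sample object with illustrative millisecond values stands in so the arithmetic is visible.

```javascript
// Sample timestamps (milliseconds since the epoch) standing in for
// window.performance.timing; the values are illustrative.
const timing = {
  navigationStart: 1000,
  domainLookupStart: 1010,
  domainLookupEnd: 1035,
  connectStart: 1035,
  connectEnd: 1080,
  requestStart: 1080,
  responseStart: 1250,
  responseEnd: 1300,
  domInteractive: 1600,
  domContentLoadedEventEnd: 1700,
  loadEventEnd: 2100
};

// Each phase Mann describes is simply the difference between two of the
// API's timestamps.
function phaseDurations(t) {
  return {
    dnsLookup: t.domainLookupEnd - t.domainLookupStart,        // 25 ms
    tcpConnect: t.connectEnd - t.connectStart,                 // 45 ms
    serverResponse: t.responseStart - t.requestStart,          // 170 ms
    domProcessing: t.domContentLoadedEventEnd - t.responseEnd, // 400 ms
    pageLoad: t.loadEventEnd - t.navigationStart               // 1100 ms
  };
}

console.log(phaseDurations(timing));
```

The same arithmetic applies to the live `performance.timing` object once the page’s load event has fired.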

The next generation of performance management

Being able to see page performance at this more granular level enables two significant types of insight. First, a more accurate understanding of what the site experience is for the user, which can be represented in ways that are important to the business, not just IT. Second, it can help site developers and operations teams work together to shave precious milliseconds from users’ perceived wait times.

Not all of this data is new. For some time, Keynote has measured and reported critical page-load milestones such as URL redirect time and resource timing. Operations teams have long used this information to tune their websites and correct performance issues. But collaborating with developers on problems impacting user experience was more difficult.

Now operations teams can monitor, measure, and parse page performance in a way that offers a far more telling picture of user experience: information that’s both actionable for developers and meaningful to business owners.

To give business owners this insight, Keynote in Transaction Perspective 11 leverages Navigation Timing to measure distinct phases of User Experience:

Time to First Paint: When the user sees something happening on the screen; the site has begun to render in response to their request. This critical first step tells the user that the site is responding to their action.

Time to Full Screen: When most users would perceive their browser space is filled above the fold; rendering may still be happening out of sight, but from the user’s perspective, they’re looking at a full page.

User Experience Time: The total elapsed time the page took to complete. The browser is done with the page and is now waiting for and responding to additional user input. This is analogous to the standard page load time or user time; it can also be used to measure a complete multi-page transaction.
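The first and last of these measures can be roughly approximated from data the browser already exposes. The sketch below assumes the IE-specific `msFirstPaint` timestamp that Internet Explorer 9 added to its `performance.timing` object; Time to Full Screen depends on Keynote’s own rendering analysis and has no direct browser-API equivalent.

```javascript
// Rough browser-side analogues of two of Keynote's three measures.
// Sample timestamps (ms since the epoch) stand in for performance.timing;
// msFirstPaint is an Internet Explorer-specific field.
function userExperienceMetrics(t) {
  return {
    // Time to First Paint: first pixels rendered after the request.
    timeToFirstPaint: t.msFirstPaint !== undefined
      ? t.msFirstPaint - t.navigationStart
      : null,
    // User Experience Time: roughly the classic end-to-end load time.
    userExperienceTime: t.loadEventEnd - t.navigationStart
  };
}

const sample = {
  navigationStart: 0,
  msFirstPaint: 420,    // user sees something at 420 ms
  loadEventEnd: 5000    // page fully complete at 5 s
};

console.log(userExperienceMetrics(sample));
// { timeToFirstPaint: 420, userExperienceTime: 5000 }
```

On this sample, the two numbers tell the story the article describes: a five-second page that starts painting in under half a second.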

Is it a page yet?

In a hyper-impatient world where success for one website over another happens in the blink of an eye, a single page-load metric may not be precise enough to represent how a user experiences the page. With increasingly sophisticated websites, the perception of how quickly a page loads can differ considerably from when it is technically fully loaded.

“There’s a disconnect that’s starting to happen between user experience metrics and performance metrics,” Rushlo says. “It used to be that they were both more or less synonymous, because you could measure performance and it was very close to what the user experienced.”

In reality, though, because many things can be happening behind the scenes even as the user sees the page on the screen, the page load number may not reflect what the user is perceiving.

“The end-to-end number might say ten seconds,” Rushlo points out, “but the reality is you may experience it in two or three seconds as a customer of that site.”

So two sites, each with a page load time of five seconds and therefore offering identical “performance,” may offer very different user experiences.

“From a user’s perspective,” Rushlo says, “if that end-to-end page load is five seconds, but they’re getting content on one in say, half a second, and on the other page they don’t get any content until four seconds, the user’s perception of that is going to be completely different. They’re going to be much more satisfied with that five-second end-to-end where I start to get content in half a second – what we call ‘time to first paint’ – versus the five-second end-to-end where I get content in four seconds.”

When users click, they want to see something happening; they want to know the site is responding. The visual cue of seeing something begin to render on the page, as long as it doesn’t take forever to finish, may be enough to hold on to that user. A page load time of four, five, or more seconds may be acceptable, if the user sees enough activity happening earlier on.

Measuring what’s important

Of course, nothing is quite so simple. Different websites have different priorities, ideally reflecting the priorities of their users. It is those user priorities that a business should be focused on measuring and satisfying.

“A site like Twitter might want to measure when the first tweet on the page is displayed,” says Keynote Senior Product Manager Dan Galatin. “Other sites, a news site for example, may be looking for particular articles to display. We can now set a marker to indicate when that event has occurred. We can measure it with Time to Full Screen.”
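Keynote sets such markers through its own tooling, but the W3C’s companion User Timing API gives developers a browser-standard way to flag the same kind of content-specific milestone. The mark and measure names below are illustrative; a page’s own script would call `performance.mark` at the moment its key content renders.

```javascript
// Mark the start of the interval we care about (e.g. when the
// request begins).
performance.mark('content-start');

// ... the page renders; when the first tweet or lead article appears,
// the page's own script records the milestone ...
performance.mark('first-article-rendered');

// Measure the interval between the two marks.
performance.measure('time-to-first-article',
                    'content-start', 'first-article-rendered');

const [measure] = performance.getEntriesByName('time-to-first-article');
console.log(`first article rendered after ${measure.duration.toFixed(1)} ms`);
```

A monitoring script can then read such measures back and report them alongside the standard Navigation Timing phases.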

“Users probably don’t expect a travel site or a banking site to have the same performance as Google,” Rushlo adds. “You always have to keep user context in mind. This data is going to help get to the next level of user context. It’s going to marry performance metrics, which are objective – no grey areas – with user experience metrics, which will offer more detail on what the customer’s actually experiencing on the site.”

Additionally, sites typically want users to do more than visit a single page; success is gauged by how sticky visitors are, which in turn is impacted by how smoothly and quickly they progress through various pages and transactions. This, too, can now be measured. User Experience Time can be measured for an entire transaction as well as individual pages.

The new standard for performance precision

What this all means is that, using the new data now available through the browser, if you can determine what’s important to the user, you can measure it and monitor it. And everyone, from the boardroom to the development lab, can be on the same page. Which means no excuses for not continuously improving the user experience.

“Customers now have a much more pinpoint laser focus on the things that need to be optimized,” Rushlo says. “They can take a hard look at the critical path – the number of elements and objects between the zero second and the one second – and that will become the area of focus.”

And that can spell the difference between a transaction completed, an ad sold, a sale made, and blink! the user is gone.
