Webcast

Keynote Tech Talks: NetSuite Discusses, When Is Good, Good Enough?

About the Webcast

In this live roundtable discussion, Jeff Binder, Head of E-Commerce Performance & Scalability at NetSuite, and Keynote senior technology experts Dave Karow and Ben Rushlo discuss performance in context: first, the relationship between business metrics, such as abandonment rate, and site performance; and second, how to use competitive intelligence to benchmark site performance and target meaningful improvements that positively affect your standing against the competition.

Webcast Transcription

Ben:                            

This is a great magazine cover. I'm sure many of you subscribe to Tails magazine, which I guess is about dogs. And I love dogs. I thought it was describing the magazine, but I did find this online. And if you look at the context there, under the caption for Rachael Ray, it basically says Rachael Ray finds inspiration in cooking her family and her dog. I've always thought Rachael Ray is a little shady, to be honest, but I don't think she's a cannibal, at least not that we would want to prove or talk about.

But what you see in that is that the missing commas, which really are a way of providing context in our language, make all the difference. We go from an understanding of the sentence to not understanding it at all, and in fact taking it completely out of context and taking it to a different place. And so I thought this was pretty funny. But our approach at Keynote is that context does matter. And so while Keynote does do a lot with data, and we provide data to our customers, it's really about putting that data in context. So collecting data, collecting alarms, collecting metrics in big buckets – even in this world of big data, it's only interesting when you know how to put that in context.

And that really means looking at the right data with the right understanding of is it good, bad, how is my competition faring, how are other customers faring; and then taking the right action. And so today I think we want to talk a little bit more practically about what does Keynote mean when we talk about context? And what are the tools and products and experts that are available to you as a Keynote customer that can help put your data in context?

So some examples of this at a very high level – I'll talk about each of these very briefly – but if you're a My Keynote user, which I think all of you probably are, you'll notice that in the last month, we've released our health score in beta. And so some people have looked at us and said, "Oh, wow, this is shocking. I'm getting Ds or Fs, or I thought my site was better than it was." But the idea of this is that we know what makes a good site good, right? We have tons of competitive data, we have lots of big data.

Keynote measures something like 700 million data points a day across all our customers. And our team particularly does lots of work with customers to help optimize their site. And so we know from the best practice framework what makes a good site. And it’s not just us. If you read books by Steve Souders or if you looked at YSlow or PageSpeed, there’s a lot of agreement about what makes a good site. So the idea of the health score is that we add a grade – and not just the grade. The grade is sort of the high-level eye catcher that says, “Hey, I have a problem.” But then underneath that grade, you can open up and actually talk about “What are the key areas where I'm outside of balance?” So I have too many CSS files, and so that will show up as red. And that gives me an area of focus. Or maybe I have too slow of a render start, so time to first paint should be half a second and mine’s three seconds. And so that will show up as red. And there’ll be some verbiage around why that is.
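
The grading idea Ben describes can be sketched as a simple rule-based check. The thresholds and the grading formula below are illustrative assumptions, not Keynote's actual scoring rules.

```python
# A minimal sketch of a rule-based "health score": each rule checks one
# best-practice metric; failed rules pull the letter grade down.
# All thresholds here are invented for illustration.

RULES = {
    "css_files":        lambda v: v <= 5,    # too many CSS files hurts
    "render_start_sec": lambda v: v <= 0.5,  # time-to-first-paint target
    "total_requests":   lambda v: v <= 80,
    "page_weight_kb":   lambda v: v <= 2000,
}

def health_score(metrics):
    """Return a letter grade plus the list of rules that failed."""
    failed = [name for name, ok in RULES.items()
              if name in metrics and not ok(metrics[name])]
    checked = [name for name in RULES if name in metrics]
    ratio = 1 - len(failed) / len(checked) if checked else 0
    grade = "ABCDF"[min(4, int((1 - ratio) * 5))]
    return grade, failed

grade, problems = health_score(
    {"css_files": 12, "render_start_sec": 3.0,
     "total_requests": 60, "page_weight_kb": 1500})
print(grade, problems)  # the 12 CSS files and 3s render start show up as red
```

The point of the sketch is the shape of the output: not just a grade, but the specific out-of-balance areas underneath it that give you a place to focus.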

And I think that's super interesting. It doesn't replace, thankfully, the need for consulting services or analytic services. But it does start the conversation about adding context to data. So instead of just the waterfall with a bunch of metrics, I now have something that says this is good or bad based on what Keynote knows about mobile performance or desktop performance. And I think that's a really good start. So I think you'll see more about it in our tool as we begin to bring more analytical context to our data set. And the goal is to make it easier for the customer so that you can take action, so that you can make the right decisions.

The middle section is really about data mashups. So we have all this great data, but we also realize that Keynote data is not the only data source, and that in some cases, data becomes exponentially more powerful as you add data sources. So you take Keynote data and systems data, or Keynote data and Omniture analytics data, or you take conversion data and Keynote data. And I know, Dave, you've had some experience recently with customers talking about how they've used this sort of data feed, data mashup idea, and maybe you can share a little bit about this one.

Dave:                          

What I would say is those customers who have a fairly large investment in performance, who have teams of multiple people, the larger ones have, for several years now, been walking down a path of mashing up more and more data. So they take our data and they have multiple streams of internal data and mash them together. And it gives them not just context but even faster time to respond. And they put things into business context, so the business users know what this means and how is it affecting our orders or something else.

And where we're going is that while the large firms with huge teams have been building these mashups for a little while now, we want to make that more and more approachable to more and more customers, so that you don't have to have a team of coders mashing it up for you; we would provide more and more of the foundation, so you don't have to be a giant to be doing best practices. So we're kind of excited about where we're going with more and more analytics in our products.
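
A data mashup of the kind Dave describes can be sketched as a join of two hourly streams. The field names and numbers below are invented for illustration and stand in for real feeds such as a Keynote data feed joined with an internal order log.

```python
# A toy "mashup": synthetic response-time measurements joined with internal
# order counts by hour, so slowdowns can be lined up with business impact.
from collections import defaultdict

perf = [  # (hour, avg_response_seconds)
    ("09:00", 2.1), ("10:00", 2.3), ("11:00", 6.8), ("12:00", 2.2),
]
orders = [  # (hour, completed_orders)
    ("09:00", 410), ("10:00", 395), ("11:00", 140), ("12:00", 402),
]

def mashup(perf, orders):
    joined = defaultdict(dict)
    for hour, rt in perf:
        joined[hour]["response_s"] = rt
    for hour, n in orders:
        joined[hour]["orders"] = n
    return dict(joined)

combined = mashup(perf, orders)
# Flag hours where slowness and an order drop coincide.
suspect = [h for h, row in sorted(combined.items())
           if row.get("response_s", 0) > 4 and row.get("orders", 10**9) < 300]
print(suspect)  # the 11:00 slowdown lines up with the order dip
```

This is the "business context" payoff: the joined view answers "how is it affecting our orders?" directly, which neither stream answers alone.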

Ben:                            

Yes, and that makes sense. So right now, everybody has access to the API, data feed, and data pulse. But going forward, as you said, it will probably be a two-way system where we’re going to push data into Hadoop and big clusters and Splunk. But also maybe have our own Data Ingestor analytics kind of model. And I think both of those things make sense for different types of customers. So that’s exciting.

And then the last area is competitive. I talked at Velocity a little bit about performance and context, and if you’re bored and you want to look that up on YouTube, go ahead. But one of the big things I think that makes sites change and kind of improve the most rapidly is when you have business sponsorship. When you kind of come out of that IT bat cave and you start talking to product owners and business owners and line of business people. And they start to speak the language of performance and start to care about it.

And one of the best ways I've seen to do that, actually, to start that conversation at an organization is competitive data. Because it’s one thing to go to somebody and say, “Here’s some metrics about our site and I think they’re too slow,” or even, “Here’s a health score and Keynote thinks this is too – we get a D.” It’s easy for that to be dismissed. “Oh, well that’s your opinion, that’s Keynote’s opinion, that’s some data.” But once you have competitive data and you can kind of compare and contrast in a very fair and straightforward way, I think that can actually drive a lot of change within the organization.

So you start to say, “It’s not just our opinion that our cart is too slow or that our checkout pages are too slow. It’s actually, compared to the top 10 e-tailers that we consider our competition, or compared to some publicly available data that Keynote publishes on our website,” and we have public data that we publish all the time on keynote.com. So I think that competitive data is really, really important for context. I don't think you need it necessarily – unless you’re a very advanced customer – every day, every moment, all the time. But I do think that having it in a quarterly context or in a quarterly deliverable, or some sort of weekly publication like we do with the index on our site, is very helpful in moving that discussion forward. Are we improving while our competitors are also improving, or are there major changes in the market that we need to be aware of and respond to? That data really can only come from a competitive viewpoint. So I think that’s a third area where Keynote is very focused and will continue to expand.

And I think later we’ll talk a little bit more about our newer product, Keynote Competitive Ranking (KCR). In general, we’re going to see a lot more in the Keynote toolset as well as in the analytics product space around competitive data.

So here's a great example of why context really, really matters, and of the merging of UX and performance. And if you have heard our CEO talk, she's always talking about how performance relates to business outcomes. And so there's a merging, in her mind, of performance and business conversion, or time on site, or user engagement, or effectiveness of a marketing campaign. And I think the same is really true of performance and UX: how a page renders and how it performs has to be understood in context.

This is data from a cruise study, one of the KCRs that I was just mentioning earlier, that we completed about two months ago. And this is comparing the home page render of the top 10 cruise sites. What's interesting is that if you were just using metrics – and Jeff, who is on the call as well, and I talk about this all the time – what's the perfect metric to compare? And we say things like, “we'll use onload, use time to first paint, and use end to end.”

And if you did that, you would still not get a full picture. And Norwegian is a little hard to see, there, but it’s the second from the bottom. They begin to paint relatively quickly. In fact, they’re the quickest to paint. They go from a blank screen to something very, very quickly. But what they actually paint is only the top nav. And then it takes them a very long time to actually paint the search bar, and then a very long time to actually paint the core content. And so if you’re a business person, what you want to drive people to on your cruise site is searching for cruises and then buying a cruise, or even just researching it, but probably searching and buying. And that search box, which is critical to that experience, doesn’t come until almost 60 percent of the page is loaded.
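
The Norwegian example can be made concrete with a "time to key content" metric computed from filmstrip-style paint events: the standard numbers can all look fine while the element users actually need paints late. The element names and timings below are invented for illustration.

```python
# Compare "time to key content" (when the search box paints) against the
# usual metrics. A site can win on first paint and still deliver the
# element that matters most very late.

def time_to_key_content(paint_events, key_element):
    """paint_events: list of (seconds, element) in paint order."""
    for t, element in paint_events:
        if element == key_element:
            return t
    return None

paints = [(0.8, "top_nav"), (5.9, "search_bar"), (7.4, "hero_content")]
onload = 9.5

tkc = time_to_key_content(paints, "search_bar")
print(f"first paint: {paints[0][0]}s, search box: {tkc}s "
      f"({tkc / onload:.0%} of onload)")
```

A quick first paint of the top nav alone would make this site look fastest on that metric, while the search box, the thing the business cares about, is still most of the way through the load.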

And so using video capture, which is what we’re doing here and something my team does in our analysis, helps kind of bring the metrics and the experience together. And you can start having a discussion with the business owner, with the technology owner around yes, these metrics are important, but how does this actually affect the rendered view of the page? And this is the sort of data that we’re bringing in the KCR, which I think is quite interesting. And we also do it in some of our other analytics products.

Dave:                          

Great. Thanks, Ben. So this is a graph to show – in the load testing practice, historically, our biggest raving fans are business owners with a bottom line to defend or a risk to watch out for: a big day or a big season, peak traffic events. And we've often used our consulting expertise to deliver something we would call a lost value analysis, which is where you'd say, “well, at this load level you were doing so poorly that these users wouldn't get a usable experience, and so you're leaving this much money on the table.”

What you're looking at here – soon to ship – is that same expertise put live into the product. So rather than an analysis that has to be done after the test and takes a couple of days, this is something where, during a test, you would look and see “wow, right now I'm losing users at a pretty high rate and it's costing me about $400,000 an hour in lost business at this particular load level.” And the way this works is, you can read the bullets on the left. We have a very sophisticated algorithm that models the users. And each one carries their own individual threshold of patience – just like real humans do. And when they hit that threshold, they leave. And we then total up the cost of all that.
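
The lost-value model Dave outlines, where users each carry their own patience threshold and abandon when it is exceeded, can be sketched as a small simulation. The patience distribution, traffic volume, and order values below are assumptions for illustration, not the product's actual algorithm.

```python
# Simulate users with individual patience thresholds: anyone whose patience
# is shorter than the current response time abandons, and the abandoned
# sessions are priced at conversion_rate * avg_order_value.
import random

def lost_value(response_time_s, n_users, avg_order_value, conversion_rate,
               seed=42):
    rng = random.Random(seed)
    # Each user's patience: roughly 2-10 seconds, varying individually.
    abandoned = sum(1 for _ in range(n_users)
                    if rng.uniform(2.0, 10.0) < response_time_s)
    return abandoned * conversion_rate * avg_order_value

# At an 8-second response time, how much are we leaving on the table per hour?
loss = lost_value(response_time_s=8.0, n_users=50_000,
                  avg_order_value=120.0, conversion_rate=0.03)
print(f"~${loss:,.0f}/hour in lost orders")
```

The translation step is the whole point: a response-time curve means little to a business owner, but a six-figure hourly loss figure is immediately legible.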

So we’re pretty excited that this is yet another example of taking our consulting expertise and pushing it more into the product so that when you’re using the product live, you’re getting that data without any delay. And we can focus our consulting on providing even more value add on top of that.

So as an example of context, a business user watching a load test – their eyes would probably get kind of swimmy with a lot of technical data about time to first byte and all that stuff. But when they see a graph that shows that people are leaving at a certain rate and the cost of that is a six-figure number, they get pretty clear on whether they're happy or not happy about how the test is going. So we love this stuff.

Ben:                            

Great. Thanks, Dave. So now we're going to transition into our roundtable discussion. As a reminder, please feel free to submit any questions during this discussion. You can do this in the Q&A panel in the bottom section of the window. We're going to go ahead and bring in our three presenters, and we'll start asking a few questions.

Male Speaker:            

Great. Hi, everybody. So let me start with Ben. I know this all sounds great. Obviously, context is important, performance makes sense. So Ben, the first question is for you. When it comes to context, obviously it’s important. What is the value of this context in the work that you do and the work that you do for your customers?

Ben:                            

I think I sort of spoke to it, but the idea is that without that context, it's hard to get motivation behind technical changes, or business funding. Because if you don't understand how your site compares – and part of comparison is knowing how far off the average, or how far outside of normal, you are – then you can't take action. It's much like when you go to the doctor and they take all your blood work and they do a huge number of tests. At the end of the day, what they're looking for is: where are you outside of normal?

And the way that they do that is with context. They say, for someone who's your age and your gender, etc., this is what is normal. And that then allows them to pinpoint an issue and then diagnose or give you a treatment plan. It's really very similar in our work: without the context for what is normal, what is healthy, what our competitors are doing, where the industry is going, how this affects the render, for example – you don't know where to focus.

And so a lot of the work that we do around this is really putting the data through kind of that context engine to come out with here are the five, six, seven areas that are out of balance, and here’s my prescription pad and go do that, go fix those things. And so I think that’s really important.

Male Speaker:            

Great, thanks. Dave, how about you or Jeff, you want to answer that question?

Dave:                          

I actually want to hear what Jeff has to say about this because I think he spends most of his days worrying about it.

Jeff:                            

You're right, Dave, actually. I guess for us at NetSuite, we're a little different because we are a platform as opposed to a retailer. But being a platform, of course, we power thousands of retailers. And as a platform, context is really everything, because we want to make sure that A) the product that we're delivering is fast, and B) the service that we're delivering – i.e., the sites that we build for our customers, in conjunction with our customers – is also fast. And so we use Keynote in a variety of ways, but one of the ways that we use them is by profiling major retailers in the world and looking at their performance.

And so by doing that, we understand how fast a home page should be, how fast a proceed-to-checkout should be, a log-in, a guest checkout, an add-to-cart, etc. And really, without doing that type of analysis, it's impossible to know where you should be. Because when someone comes to me and says, “Is two seconds fast enough?” I say it totally depends. What are we talking about, and how are you measuring it? With two seconds, you're going to get different speeds from browser to browser; depending on whether you're doing a last-mile test or a backbone test, that's going to vary; and then, what metric are we measuring? Are we talking about – as Ben was alluding to earlier – end-to-end time, time to first byte, time to interactive page, etc., etc.? So by profiling the industry, we as a company have a much better understanding of where we need and want to be for establishing our own standards of delivery.

Ben:                            

And Jeff, does that kind of filter both backwards in the development cycle, to when you're making technical decisions in dev and QA, and also pour into the business community, so that you get your marketing folks, so that you get the funding and the support that you need? Or is there a certain kind of area where you see that context having the most effect at NetSuite?

Jeff:                            

Well, really for us, it's first and foremost in two areas. One is product development – so everything from infrastructure all the way up, really, to our own professional services. So, effectively, when we're working with a customer to build a site for them on the platform, or working with their respective partners involved in that process, all of that ties into performance for us.

Ben:                            

Makes sense.

Dave:                          

Right, makes sense.

Ben:                            

Fantastic, thanks. I guess this next question is for you, Jeff. What are the ways that your customers are getting context, and how are they using it to drive improvement?

Jeff:                            

I think for us, again, it's because we have essentially our own performance standards that we have built up as a company for the sites that we build and for the platform. Those numbers and those targets are all based, again, on industry data, and on us making decisions based on our platform and measuring our sites and our customers, as well as the industry. So when we're basically working with a customer – my experience is that a lot of customers, again not all customers, probably very few customers, do this type of competitive analysis.

And by virtue of the fact that we do it, we essentially can tell them the average top US retailer home page is this fast using this form of analysis, and we’ve run that exact same approach looking at your site, and so you’re here, and you really should be here. So then we can start to have an intelligent discussion really on a business management basis to say you’re here and you should be here, and then dive into the details and show them why and what they need to do differently or what we need to do differently, etc., etc.

And that kind of again covers all 360 degrees of the business. So we help our customers really to get some of that context from a performance perspective.

Ben:                            

Do you feel like that opens up the conversation in a way with the customer, maybe even if they’re not super-technical, that you wouldn’t have without that context? I kind of feel like in our practice with that data, then you’re not talking about Jeff’s opinion, Ben’s opinion, some esoteric book that was published three years ago about performance; but you’re really kind of showing them something that they can get their teeth into and also feel like it’s a fair comparison. Do you feel like that kind of changes the dynamic?

Jeff:                            

Oh, absolutely. Because again, a lot of times customers will typically come in and have some – maybe a number in their head. But without it being based on quantitative data, it's somewhat meaningless. So being able to do that makes a huge difference, really, with customers, in my opinion.

Ben:                            

I always wonder. I've had calls with CEOs even, or CIOs, and they're like – it usually comes up in availability, but it's like, the site should be five nines. And I know where that comes from: Six Sigma and all this sort of stuff. But it didn't translate well into the web. And so it just always cracks me up – or you have people that say: oh, I came from this other company, and at this other company we used six seconds. It's like, okay, where does that come from?
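
The "five nines" figure has a concrete meaning that is easy to compute: the downtime allowed per year at a given availability level, which shows why the number rarely transfers sensibly from manufacturing to the web.

```python
# Allowed downtime per year for a given availability percentage.
# "Five nines" (99.999%) works out to only about five minutes a year.

MINUTES_PER_YEAR = 365 * 24 * 60

def downtime_minutes_per_year(availability_pct):
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99, 99.999):
    print(f"{pct}% availability -> "
          f"{downtime_minutes_per_year(pct):,.1f} minutes of downtime/year")
```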

Jeff:                            

How many years ago was that?

Ben:                            

It's a funny kind of world out there when people start to pull numbers out of nowhere – to put it politely – basically to justify why their site is okay, or they don't need to invest, or they don't need to change anything. And it very well might be that their site's okay, but without an objective, quantifiable, reliable kind of measurement, it's really hard to tell.

Dave:                          

I would say that the thing that strikes me – you guys have probably seen this as well – is that you can be in the numbers and you can talk tech, you can argue about the numbers all day long, but then there's another trend that keeps emerging, which is that people are continuing in this quest to be able to explain performance in a context that business people can understand, so they have sponsorship for initiatives. If you want to actually address performance, you need alignment that, hey, it matters. And when you can see that a competitor is delivering a usable page four seconds faster than you are, that's huge. So I think it's not all about just competition; it's also about understanding the market – what are the expectations, what's normal. Because normal changes – I would say not every year; it's literally like every six months normal is changing.

Ben:                            

Oh, yeah. I think definitely. So even with the health score idea, you don't have to have competitive data. Again, to my medical analogy, it's like if you just took the tear sheet off and said, this is what's normal for people. That's what we're really trying to do with the health score: you don't necessarily have to pay for competitive data – though we'd sure love you to, and there is some value in paying for it – the health score provides that context for what's normal. But we're already thinking, how do we keep that constantly updated?

Because as we moved, in mobile, for example, from 3G to LTE and 4G, that changed things dramatically. What you could do in mobile over that connection speed, and what the best practices were, changed radically. And you can't just have this book, again, that was published five years ago, or some rules-based thing somebody talked about three years ago, as the context. You have to kind of consistently evolve that. And one of the ways we do it is competitive data that we then feed into the health score. But you're exactly right. You have to have some objective way that is moving as the industry moves – and not even just the industry.

A lot of sites now are looking at: I don't want to just be the best in my industry, I want to look at other best-in-class sites outside of my vertical. So it's not just ecommerce versus ecommerce: it's ecommerce versus travel conversion sites – checkout in travel – versus wireless providers who are doing buy-a-phone, buy-a-plan, and banking sites that are doing authentication. I want to look at all of those and say, “OK, how do I stack up multi-vertical,” which I think is interesting, too.

Dave:                          

Yes, and as the notion of omnichannel becomes more and more present, we're going to see the convergence to a chief digital officer. So instead of these islands, it's kind of converging up to a chief digital officer or something like that – someone in IT who's focused on all of the channels, or someone on the business side who's focused not only on brick and mortar but on the online stores. As those forces come together, you'll see an app performing in a store where a customer is trying to get more information about a product.

There are all these reverberations that are happening, and that experience has to be really, really solid – showing an ad or a coupon to a customer in the store. This is some of the more leading-edge stuff. And it's all going to happen fast and work well, or it's just an idea on paper that doesn't pay off.

Ben:                            

Great. Thanks, Dave. The next question is for Jeff. How do you present technical performance data to a business or marketing-focused audience? How have you done this? Do you have any best practices around this? What's your feedback on that?

Jeff:                            

I think, very much in the context of what we've been talking about, it's twofold. One is – I mean, business users, I think, like to keep things simple. And when you get into performance data, you can get analysis paralysis, I think, quite easily. So 1) keep it simple, and 2) I think you've got to keep it in context. What I mean by that is, again, for us, what we do is we sit down and we have a discussion about their performance, and their performance relative to industry performance, and relative to maybe their historical performance if they've been a site with us for a while.

So we can see kind of how the site has been performing over time. As they have made changes to it, have they essentially refined and minimized, or have they bloated up their site, which happens easily in pretty much all website design? Sometimes customers go through that exercise of just building function, function, function, function until all of a sudden things start to slow down and someone asks the question, “Why is this taking as long as it is?”

And then we sit down and go through an optimization exercise with the customer and go through and say hey, as this is built up, really now the number of requests that make up a page is so far above average, let alone the average really fast site, or the size of the page has now become quite significant – again, above average. So because we have that data, we can sit down and do that type of comparative analysis not just in terms of the customer’s own path, but in terms of the industry. So we now have paths as they should be, and we know – we can drill into the detail of sites that are delivering performance that is on average or on par with the retail industry.
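
The comparative analysis Jeff describes, checking a page's request count and weight against industry averages, can be sketched like this. The benchmark numbers below are placeholders for illustration, not real industry data.

```python
# Flag which page metrics sit above the industry benchmark, and by how much
# (as a multiple of the benchmark), so the conversation can be "you're at
# 2x the average" rather than a bare number.

BENCHMARKS = {"requests": 75, "page_kb": 1800, "js_kb": 450}

def over_benchmark(page_stats, benchmarks=BENCHMARKS):
    return {metric: round(page_stats[metric] / limit, 2)
            for metric, limit in benchmarks.items()
            if metric in page_stats and page_stats[metric] > limit}

# A site that has built up function after function:
bloated = {"requests": 160, "page_kb": 3900, "js_kb": 300}
print(over_benchmark(bloated))
```

Only the out-of-range metrics surface, which matches the "here's where you are versus where you should be" framing used with customers.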

And we can show them why it is what it is, and why their particular design may not be and what they can do differently. That, I think, is helpful for shaping those business decisions to help the customer understand that – again, and this comes into play a lot for us because we are a platform. You can take a platform and make it very fast and efficient, or you can take a platform and make it very slow, and it depends on if you’re following best practices or not. And Ben probably wants to jump all over that one.

Ben:                            

Yeah, I think that's a great point, because people ask us all the time, what's the best platform? And of course we recommend NetSuite. But no, we try to stay out of recommending. But what's the best platform, what's the best technology? Is responsive design good, what CDN is good? And the reality is that it's really what you do with it. The devil is in the details. And what we've found with business people is that they want fact-based data to make decisions.

And I think that’s the challenge with some of this. Performance is separate from business, or IT is separate from business. And the IT folks are in some ways almost sometimes obscuring the data to protect their job or protect the view that hey, the business is going to be mad if they really see how bad the site is. And what I've found, because most of our work in the consulting space is with business people, is they want to understand the technical details. Not to the minutia that an IT person does, but they want to be able to make that tradeoff, like you said about functionality, time to market, experience, and performance.

And if you don’t have data about both of those – what is the cost to build the site in this with this experience, with this functionality, and what is the impact on performance, and is there another way to do that similar sort of functionality with better performance; if they don’t have that data, then they don’t make a decision. Or they make the wrong decision, which is time to market only, or functionality only. And so I think there is a big role that performance people who are brave enough to start talking to business people – and in some organizations that’s easier said than done – they have a really crucial role to play, I think, in bringing that context. So I think you’re exactly right.

Dave:                          

I was going to say, on the load testing side, what we see – and it's funny, the whole which-platform question – we have kind of a trend going right now where you have people who are running a platform, but they know that the third party who coded for that platform could have done a good job, a bad job, or a really bad job. And it's not until you actually stress that site that you find out. So it's amazing. Some of these platforms you might think are fairly standardized; they all have a shopping cart, etc. But there's a lot you can do to help or hurt yourself. And until they're actually hitting it with high load, they don't realize: oh, my goodness, this thing's got itself twisted in knots. And so the trend we're seeing is that people who actually have to run these sites are very eager to push them sooner rather than later, to figure out what the people who coded the site have given them. And it's great if everyone is kind of all on the same page, but some of the economics are such that you have development teams that are separate from the ops teams. In a DevOps situation, that doesn't happen. But in an ecommerce situation, it's quite often that you're delegating things.

So it's pretty cool to actually see that even though someone is on a particular platform – and even at NetSuite, you probably have about as many different implementations as you do customers, at some level, right? – you actually can't assume anything just because it's like some other site. Until you actually exercise that code, you don't really know how it's going to behave.

Jeff:                            

Yes – the value of load testing, QA and dev testing, production testing as people release things. It's sort of basic 101, but Dave, as you know, there are lots of people that kind of skip it or shortcut it, and I think there's a lot of value in making sure it's part of the process.

Male Speaker:            

Excellent. Awesome. Thanks, guys. Next question is for Ben. Performance and performance management has typically been about speeds and feeds. How is that changing, and how do you see performance and user experience intersecting?

Ben:                            

I think it's a great question. And really, where that started to change was with the advent of real user monitoring, where you're collecting information much like Omniture, Google Analytics, and Webtrends do. You're collecting information about the real customer session and their behavior, as well as performance. And so I think that's starting to change very radically how we think about performance.

Because if I can see a user, and I can look at things like time on site, bounce rate, engagement, and the conversion funnel in a typical ecommerce platform, and I can also look at their performance experience – what device they were on, what browser they were on, where they’re coming from, and what experience they had in terms of load time – I can start to make very interesting correlations. And I can also, I think, provide context that’s maybe even more powerful than anything we’ve ever talked about, because I can go to a business person and say: hey, we can plot out cart speed against cart abandonment, and it’s the same data set, the same users.

If you move that needle slightly – instead of that intersection being three seconds, you move it down – you’ll convert another 10 percent of your customers. That’s the sort of thing, I think, that is very, very powerful. Then you start getting people really excited: oh, it’s bottom-line impacting, or top-line impacting; it’s affecting my revenue. So I think you’re going to see more and more of this with real user data, particularly at this intersection of user behavior and performance. Our CEO talks all the time to customers about how what we really do is ensure business outcomes. As a company, we’ve been thought of as a monitoring company, an alerting company, a measurement company. And all those things are true. But at the end of the day, all of that is really to ensure a business outcome: to make sure that you capture that moment with a customer, and that errors and performance issues and regional issues and cloud issues and CDN issues and third-party issues – I could go on – don’t cause you to lose that fleeting opportunity. So that’s the foundation, but specifically with real user monitoring, I think you’ll see more and more. And Dave, your little demo slide there is a really good example of that: pairing this with load testing technology. You could talk a little bit more about that.
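To make that concrete, here is a minimal sketch of the kind of correlation Ben describes: one session record carrying both a performance metric and a behavior metric, so abandonment rate can be compared at different load-time thresholds. The field names (`load_time_s`, `abandoned`) and the sample data are illustrative assumptions, not any real Keynote or analytics API.

```python
# Hypothetical session records from a real-user-monitoring feed, where each
# session carries both its cart load time and whether the cart was abandoned.
sessions = [
    {"load_time_s": 1.2, "abandoned": False},
    {"load_time_s": 2.8, "abandoned": False},
    {"load_time_s": 3.4, "abandoned": True},
    {"load_time_s": 5.1, "abandoned": True},
    {"load_time_s": 1.9, "abandoned": False},
    {"load_time_s": 4.2, "abandoned": True},
]

def abandonment_rate(records, max_load=None):
    """Share of sessions abandoned, optionally restricted by load time."""
    if max_load is not None:
        records = [r for r in records if r["load_time_s"] <= max_load]
    if not records:
        return 0.0
    return sum(r["abandoned"] for r in records) / len(records)

overall = abandonment_rate(sessions)
under_3s = abandonment_rate(sessions, max_load=3.0)
print(f"overall abandonment: {overall:.0%}, under 3s: {under_3s:.0%}")
```

In this toy data set, slower sessions abandon far more often; comparing the two rates side by side is the “same data set, same users” framing that makes the business case.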

Dave:                          

If you put money up there, their attention perks up a lot more. You might have a curious person who thinks the numbers are interesting, but when you start putting dollars out there, no one’s just going to sort of glance over it: “Yeah, we were losing about $400,000 an hour – what else did we talk about today?” No, there’s going to be a conversation about how long that went on, that we were losing $400,000 an hour. You said something earlier – some of this is so intuitively obvious. It’s not necessarily rocket science. It’s a matter of stopping, stepping back, and saying: how can I do that obvious thing that people aren’t doing as a general practice, and what will the impact be? Sometimes there’s technology you have to overcome to bring things together a certain way, but sometimes it’s literally just stepping back and making that context. And so we’re excited about making more of that kind of thing a default: you shouldn’t have to go out of your way to do the numbers; the numbers should leap right out at you.

Ben:                            

I agree. So Jeff, how do you guys think about that? I’m sure you’re collecting lots of statistics for your customers around business outcomes and user behavior. I know a lot of that’s private data for them, but are you looking at how you mash up or correlate those things with platform performance, whether that’s measured by Keynote or internally? How are those things interrelating for you at NetSuite?

Jeff:                            

Basically for us, we look at performance on a site-by-site basis. We have a whole series of demo sites; even our developers will sit down and build sites for new releases. But we also profile and test using the same basic approach, the same methodology. And then we actually aggregate data up from a platform perspective as well, so we look at how performance is going for particular steps across a multitude of sites that are in the same vein.

Aggregating that information helps us understand how we’re doing as a whole and look for ways we can do things better and faster. Our SuiteCommerce Advanced product does have Akamai CDN built in. We do have responsive design built in. We use multiple layers of caching that improve performance. And we’re always looking at newer technologies we can leverage to better the experience, and at new approaches to caching more things.

We get into a lot of detail, even dividing up different assets to ask: how can we maximize the cache life of particular calls at an API level? So it gets pretty detailed over here.
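A simple sketch of what “dividing up assets by cache life” can look like in practice: different asset classes get different `Cache-Control` lifetimes. The categories and TTL values here are illustrative assumptions, not NetSuite’s actual configuration.

```python
# Assumed cache lifetimes (in seconds) per asset class. Long-lived,
# fingerprinted static assets cache aggressively; volatile API calls don't.
CACHE_POLICY = {
    "image":       60 * 60 * 24 * 30,  # fingerprinted images: 30 days
    "script":      60 * 60 * 24 * 7,   # versioned JS/CSS bundles: 7 days
    "catalog_api": 60 * 5,             # product catalog calls: 5 minutes
    "cart_api":    0,                  # cart state must never be cached
}

def cache_control_header(asset_class: str) -> str:
    """Build a Cache-Control header value for a given class of asset."""
    ttl = CACHE_POLICY.get(asset_class, 0)
    if ttl == 0:
        return "Cache-Control: no-store"
    return f"Cache-Control: public, max-age={ttl}"

print(cache_control_header("catalog_api"))  # Cache-Control: public, max-age=300
print(cache_control_header("cart_api"))     # Cache-Control: no-store
```

The design point is exactly the tradeoff Jeff describes: the longer you can safely cache a given call, the fewer requests hit the origin, so it pays to classify calls at a fine grain rather than using one blanket policy.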

Ben:                            

That is actually great information. But I guess my very nuanced – if not very well spoken – question was: are you seeing your customers, who obviously are using Google Analytics or Webtrends or Omniture, try to cross-correlate pure business outcomes – the ecommerce conversion funnel, cart abandonment, whatever – with the performance of their site? Probably not your platform so much as their site. Is that something you’re starting to see as a trend, where those teams are no longer in separate silos?

Jeff:                            

I think really, within the industry as a whole, speaking to many CEOs and business owners, what I find is that most of these business owners understand there’s a direct relationship between top-line revenue and performance. If their site is performing poorly, it’s going to hit their top line; I think that’s pretty much a given. But when it comes to quantitatively understanding it – as Dave’s pointing out, knowing that an outage is costing you $400,000 an hour – I don’t think a lot of customers have gotten to that extent.

That’s a great feature; kudos to you, Dave. But I think the general understanding, from a principle standpoint, that slow performance equals lost revenue is very clear in the minds of most CEOs.

Dave:                          

And I think the challenge comes especially when you move outside of ecommerce to marketing or product sites that support a larger brand ecosystem. There, it’s a little more challenging for people to understand how user behavior correlates to performance. On an ecomm site – I think there’s still a lot that could be done there – there is at least a natural “oh, yeah, if the site’s down, we’re not making money.” On a brand site, there’s probably more that could be done; I think the leading edge in this space is around user engagement metrics, which are not really conversion metrics, and how those might relate to performance.

Jeff:                            

I definitely agree with you there. I think the other thing that comes up with brand sites is the tradeoff side of things. If you have a site that’s very image-heavy, Flash-heavy, with a lot of big, rich design, it’s more common to have performance issues than with something that’s incredibly lean and tight. So you get into more discussions about tradeoffs, because performance is often a discussion of tradeoffs.

You can’t deliver a 10-megabyte homepage and be fast; it doesn’t matter who you are. That comes up, in my experience, a lot more often on sites that have incredibly rich and vivid multimedia content than it does on a typical pure-play shopping site.

Dave:                          

Yes, and I totally agree. The only thing I’d add about that tradeoff is that sometimes we make it too binary, as site owners or as the marketing team working for the site owner: either you get a rich experience and bad performance, or you have a Google-style page and good performance. And again, I think the devil is in the details.

We have many luxury brand customers, many of them based in Europe, and you can have a very rich, immersive experience – Flash, though I’m not a big fan of Flash, or HTML5 – and do it in a way that, yeah, probably won’t meet a sub-two-second homepage load, but will still be a high-quality experience in terms of render: how quickly things start to paint above the fold, how you place your tags, how available the site is.

And I think that’s the discussion that has to happen at the business level: it’s not a choice between a Google-style page and richness, between good performance and bad. It’s really about your vision, translated into best practices for performance, and getting as good as you can get in that context.

Jeff:                            

I couldn’t agree with you more. It isn’t black and white – an either/or type of scenario. And I see this a lot, too. We do optimization work for our customers, and many times I’ll see two different versions of the same site: one’s a prototype running the same platform, same everything, looks the same, but loads in half the time. It all comes down to best practices and design approach. And as part of that, you’ve got to be a bit innovative, a bit creative. You want to render above the fold of the page first and foremost, so the user can interact with things, then use Ajax to pull in the rest after that.

There are all kinds of techniques you can apply so that you’re giving the user what they do need first, rather than what they don’t, and still loading those other things afterward. That level of innovation, I think, applies to site design on the whole.

Ben:                            

I totally agree.

Male Speaker:            

I just have a couple questions I want to address when it comes to competitive intelligence. First, for Ben: when it comes to competition as context, does it make sense to look at best-in-class sites outside of one’s own industry?

Ben:                            

Yes, I sort of mentioned this already, so sorry about that, preempting your question.

Male Speaker:            

No, it’s okay.

Ben:                            

Yeah, I think we’re seeing more and more of that, especially if your industry is small or niche. A good example would be the telecommunications industry in the US, where we have four major carriers. The idea that being the best in that industry means meeting my customers’ expectations has a few flaws. One is that most customers probably don’t go to Verizon first, then AT&T, and build up a competitive comparison in context; they’re comparing the experience they have on a bunch of other sites with what they’re having on Verizon.

So number one, it doesn’t always make sense just to look at your direct competition, because users’ psychological perception isn’t shaped by the competition alone. The other thing is that you can learn a lot from best-in-class players outside of your space, especially if yours is an older or less innovative industry; there’s a lot to be gained there. Most functionality can be broken down into very comparable buckets: login, for example, happens across financial services, retail, B2B, travel – as do search functionality, product functionality, marketing functionality. You can create big buckets that are really industry-agnostic. And I think we are seeing more and more people look to other industries, especially when their industry is niche, or maybe they even know it’s kind of a dog industry – I won’t mention which those are – where being the best in that industry is not really that great. Ecommerce, on the other hand, is a very, very wide space.

So in that case, I see people actually going the opposite way: I’m a luxury retailer, I’m going to only look at luxury retailers, because that’s who my market is. It depends, but there is definitely a move to look at players outside of the space.

Male Speaker:            

Great, and maybe this is a good opportunity to give you a chance to talk about what Keynote offers when it comes to competitive intelligence.

Ben:                            

So without doing a product pitch, there are really three things that we offer, primarily. One is the competitive indices that we publish on our newly launched keynote.com. You’ll notice that some of them have been stripped out; Jeff and I have actually talked about that around the retail industry index. We’re really thinking about how we expand there. It’s an omnichannel, multi-screen experience, so rather than just having many desktop indices and a few mobile ones, doesn’t it make sense to do mobile, tablet, and desktop – or desktop, tablet, and app – across the indices?

So I think you’ll see some changes in our index strategy in the next six months. That data is free and available, and it’s great for context; I think a lot of customers use it that way. In fact, I have tons of people who aren’t customers harassing me constantly about our index, so it’s being used. The second thing is that you can always come through the analytics team – my team – and we can do a custom competitive study, which is very, very popular. You choose the competitors, the industry – in your vertical or outside it – the pages you want to compare, the business process, and we do that study for you.

And that’s extremely useful in terms of context. The last one I’ll talk about – there are others, but this is the other main one – is Keynote Competitive Rankings, or KCR. Those are syndicated studies that we run in travel, financial services, auto insurance, and others – probably we’ll eventually expand into retail – and the idea is that we combine user experience research, the kind of brand uplift, brand engagement, and customer experience work you would typically do with a UX firm, with performance.

And we’re doing it in a syndicated model, which means it’s much less expensive than a private study. It’s super interesting, because you get to see what real users are saying about your site and your competition’s: what do they think about my brand and my user experience? It’s all competitive. And then, how does that relate to my performance? We’re finding very interesting correlations – for instance, if users perceive your site as slow, most of the time it is slow.

So we let you look at the intersection of how those things work. It’s a unique study in that it balances behavioral and real user data with performance. Information about it is on the site if you’re interested, and we’re running syndicated versions – probably three or four a quarter – in various industries.

Male Speaker:            

Great, thank you. I know we only have a few minutes left and I really want to get to a few questions, so let me just dive right in. The first one is for Ben. I know you touched on this a little earlier, but I wanted to revisit it with a specific question: what would be a good method of tracking site abandonment when a cart or purchase is not involved – say, in the context of using a site for service and support?

Ben:                            

It becomes more challenging, but some of the more common metrics are bounce – people who come to one page and leave; exit – which pages are being exited from the most; and general engagement metrics like time on site. Now, time on site might be negative or positive, depending on the experience. If you’re running a support site and you want someone to look up a support article and get the heck out of there, then what you might actually be looking for is something like average clicks per session.

If people are spending 10 clicks where it really should be three, and that’s increasing over time – those are the sorts of metrics we hear our customers looking at for non-ecommerce, non-conversion situations. Now, those are not performance metrics, and they’re not all going to come from Keynote. Our RUM product will have some of that in there: abandonment, engagement, etc. But those are metrics you can get from your typical Google Analytics, Omniture, or Webtrends products.
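The metrics Ben lists can be sketched from raw data in a few lines. This is an illustrative computation over a hypothetical log of page views grouped by session; the paths and session structure are made up for the example.

```python
from collections import Counter

# Hypothetical support-site sessions: each is the ordered list of pages viewed.
sessions = [
    ["/support"],                                     # one page only: a bounce
    ["/support", "/kb/article-12"],
    ["/support", "/search", "/kb/article-7", "/kb/article-12"],
]

# Bounce rate: share of sessions that saw exactly one page.
bounce_rate = sum(len(s) == 1 for s in sessions) / len(sessions)

# Exit pages: the last page of each session, tallied to find the top exits.
exit_pages = Counter(s[-1] for s in sessions)

# Engagement proxy: average pages (clicks) per session.
avg_clicks = sum(len(s) for s in sessions) / len(sessions)

print(f"bounce rate: {bounce_rate:.0%}")                # 33%
print(f"top exit page: {exit_pages.most_common(1)[0][0]}")
print(f"average pages per session: {avg_clicks:.1f}")   # 2.3
```

As Ben notes, interpretation depends on the site’s purpose: on a support site, a falling average-clicks number can be a good sign, where on a content site it might signal disengagement.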

Male Speaker:            

Great, thank you. And speaking of metrics, does Keynote collect metrics about the percentage of people by patience categories that are included in the business model that you are referring to in the -

[Crosstalk]

Dave:                          

The question is: do we have metrics about what percentage of people fall into which patience categories? It’s very vertical-specific, and it can be specific to a company. What we typically do is collaborate with companies that have analytics to get a better sense, because in the context of load testing, we want our massive population of virtual users to behave like your real users do. So we work back from your own users and your own analytics to understand the patterns you see. And we can also tease that out.

Then we can allocate bands of users. Essentially we could say it looks like half your users are average, a quarter are patient, and a quarter are very patient. That’s a super simple, obvious default. But we can bias that and say: when we field these users, your users are very, very picky, so 70 percent of them are going to be hair-trigger upset about the smallest delay. We can balance those populations and then ramp them up.
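The banding Dave describes amounts to a weighted draw over patience profiles. Here is a minimal sketch; the band names, population shares, and abandonment thresholds are illustrative assumptions, not Keynote’s actual model.

```python
import random

# (band name, share of population, seconds a user waits before abandoning)
PATIENCE_BANDS = [
    ("average",      0.50, 5.0),
    ("patient",      0.25, 10.0),
    ("very_patient", 0.25, 20.0),
]

def assign_patience(n_users, bands=PATIENCE_BANDS, seed=42):
    """Give each virtual user an abandonment threshold by weighted draw."""
    rng = random.Random(seed)  # seeded for repeatable test populations
    names = [name for name, _, _ in bands]
    weights = [share for _, share, _ in bands]
    thresholds = {name: secs for name, _, secs in bands}
    drawn = rng.choices(names, weights=weights, k=n_users)
    return [(band, thresholds[band]) for band in drawn]

users = assign_patience(1000)
share_average = sum(band == "average" for band, _ in users) / len(users)
print(f"average-band share: {share_average:.2f}")  # close to 0.50
```

Biasing the population – say, toward Dave’s “70 percent hair-trigger” scenario – is then just a matter of changing the shares and thresholds before ramping the load up.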

So do I have a list of the levels of patience by site? No – I wish I could just hand that to you. The fact of the matter is that you have a lot of that data for your own customers hidden away in your own analytics. And going forward, I think Ben and his team are also looking for ways to reasonably and safely aggregate that data without giving away anybody’s store – to be able to say, “in your sector, here’s what we’re seeing.” We’re very excited about that notion: we have a lot of data from a lot of customers, and we generally only publish publicly the numbers we run on our own indices.

So we’re open-minded about ways to give you more of those critical metrics in an aggregated form that isn’t really going to give away anybody’s secrets.

Ben:                            

Yes, definitely.

Male Speaker:            

Great, and one more question here on load testing: can you elaborate on the pieces of consulting that have been added to the product offerings with regard to performance – the slide that mentioned abandonment rate and revenue?

Dave:                          

At one level, we shipped a release a few months back where we took a report we had previously sold as an optional extra – a very detailed Word doc with lots of charts, graphs, and explanations that usually took two days to get – and realized that 85 percent of it, which previously was only available if you asked our consulting organization to create it for you for an additional fee, we could generate programmatically within 10 minutes of the test. So we took all that material, which people would only get a fraction of the time, and put it into every test.

And that’s kind of a no-brainer. But we woke up and said: “We don’t want to hoard this data; we want people to have this data.” If they want specific advice, we’ll still offer that within a day or two as an executive summary. So that’s an example of the trend, and it’s in the product today – it’s not something that’s coming; it’s there now. The graph you saw, where we actually show dollars during the test, was a suggestion from one of our account executives in the field who spends a lot of time with ecommerce customers. He said: “If you would just give me the average value of a shopping cart and how many people are abandoning, and do the math, it would be hugely helpful.”

At some level, the analysis can get more complex than that. But the fundamental part is: what is my average conversion rate, what is my average shopping cart amount, and how many people just left? In a load test, they’re leaving because either there’s an error or you’ve violated their threshold of patience. So rather than have a big, three-day run-up to a board meeting, we wanted to put that data in the portal during the test, and in a downloadable report 10 minutes after the test. That’s where we’re going with it – it’s shipping this season.

Male Speaker:            

Great. And I've got another question here around load testing: how does Keynote’s load testing differ from what we do in QA with LoadRunner?

Dave:                          

Oh, great. Sometimes we get into these turf concerns, and actually what we’ve realized is that the two are extremely complementary. Internally, you stress particular components – you hit hard at a particular thing to see what its theoretical throughput is, or to check for regressions. But you’re almost never going to be testing the CDN, you’re never going to be testing third parties, and you’re not going to be testing your external bandwidth. The outside-in testing we do is, first, holistic: it covers the entire solution under what I would call an ambient load.

You may be concerned about a new module you’ve added, but unless you’re also having people add new accounts, do searches, and do the other things they do, your testing of that new module isn’t really as valid – you need to be hitting it all at once. Sometimes it’s the convergence of load that’s the issue. So outside-in testing focuses on the external components – CDN, third parties, firewall, load balancer, and bandwidth at the boundary – but also on holistic demand on the system. The two approaches are very complementary: you want to test early and often from a profiling perspective in development.

When somebody creates a new module, beat it up and make sure that when you hit it with 10,000 sessions, it doesn’t end up with too many connections to the database. But once you’ve done that, you need to look at it holistically and really do performance regression: you ship something you think is doing well, hit it from the outside with peak-day traffic, and make sure you’re getting the response you want. Does that answer the question?

Male Speaker:            

Yes, definitely. We’re going to switch back to the slides. Thanks, gentlemen, for your time today and the great discussion. For those that were curious on how to get started, or for any questions regarding load testing or Keynote’s comparative rankings or insights, please contact your account manager. You can also submit a request via web form by clicking on either of the links provided. And if you’re unsure who your account manager is, please just drop us a line at info@keynote.com or give us a call at 1-855-KEYNOTE. Thank you for attending our session today.

Duration: 56 minutes
