Inaugurating Tomorrow’s Internet
What January 20 Tells Us About the Future of Streaming
Among the many firsts associated with the inauguration of America’s 44th president on January 20 this year were numerous milestones related to the scope and scale of the spectacle and its audience. From the massive security contingent to the crushing 2+ million crowd that converged on the National Mall, virtually everything about the ceremony set some sort of record. And as it turns out, the electronic tsunami it triggered was no exception.
The sheer volume of data pumped across our mobile and Internet pipelines in the hours surrounding the swearing-in was staggering. With millions of Americans watching the event online; texting, chatting, blogging and tweeting about it; capturing and sharing images of it; and searching for inauguration-related information over the course of the day, our online mettle was tested as never before.
So, how did we do? The answer by most accounts is pretty well, given the extreme circumstances. But the fact is, those circumstances won’t represent the extreme for long. As our consumption of bandwidth escalates and households open up their taps to receive an exploding array of streaming content, what seemed exceptional on January 20 may soon be the norm.
Here, Keynote takes a look at how the Web and key content providers performed on inauguration day; the role streaming content played in that performance; and what it may mean for our electronic future.
Battening Down the Hatches
While the National Park Service, DC Police and others were managing expectations and preparing for worst-case crowds, mobile providers like Sprint Nextel were also getting ready, investing in regional network capacity in preparation for an inevitable spike in wireless use by what was probably the largest gathering of mobile users in history. Online content providers took steps to prepare as well, with many making hardware upgrades to handle the day’s expected traffic.
What couldn’t be upgraded, however, was the core infrastructure of the Internet itself. It offers plenty of agility and capacity under a range of “normal” circumstances, but on that day, everything was pushed to the limit.
“We knew going in this was one of the most significant Web events ever, but we definitely saw a more dramatic slowdown than I think anyone anticipated related to the overall health of the Internet,” notes Shawn White, director of external operations for Keynote Systems. “The Internet’s very resilient – it was designed by the Department of Defense to handle communications in the event of a nuclear attack, so it’s designed to handle outages. What it wasn’t necessarily designed well for is congestion and slowdown.”
Clearly, we’re very capable of a level of bandwidth consumption the Internet’s founding fathers could never have anticipated.
Performance Across the Web
Keynote’s inauguration performance analysis mirrored the experience of millions of Internet users that day. Keynote data point to significant slowdowns leading up to and during the inauguration ceremony at news sites including ABC, CBS, Fox, the Los Angeles Times, MSNBC, NPR, USA Today and the Wall Street Journal. These slowdowns were understandable given that most news outlets were broadcasting a heavy combination of live photos, video images and blogs. The most significant slowdowns were experienced by ABC between 10:20am and 11:00am, with performance 8x slower than normal and 10% unavailability; CBS between 9:50am and 2:50pm (3x slower); and NPR between 11:30am and 12:20pm (10x slower, with 91% unavailability). In all these cases, Keynote data show a slowdown in “Web server response,” which is directly related to load and online traffic to these sites.
News sites weren’t the only ones overburdened. As a public service and part of its normal operations, Keynote Systems also benchmarks the performance of 40 major U.S. Federal Government Web sites from the ten largest U.S. metropolitan areas. While no outages were noted at the White House Web site, Keynote data show that beginning around 10:00am EST, the site began to download 2x slower than normal and dipped to more than 16x slower at 11:30am, returning to 2x slower by 12:00pm.
The U.S. Senate saw massive slowdowns and outages from 9:20am until 2:00pm, while the U.S. Congress site did not. Performance for the Senate site slowed as much as 60x, with availability dropping to as low as 37%. The performance slowdown appeared to be directly related to its Web servers being overwhelmed with traffic.
The National Park Service, which hosted most of the inaugural events in its facilities throughout the Washington D.C. area, experienced a 2x slowdown on its site beginning at 11:00am and lasting until 2:30pm on inauguration day.
As for the business sector, the Keynote Business 40 (KB40) experienced a sharp slowdown beginning around 11:00am and lasting until 1:30pm, with average download speed dropping by 60%. While it is not uncommon to see the KB40 slow down by this much during peak Internet usage, Keynote monitoring typically shows a gradual slowdown as more users access the Internet. In this case, the slowdown was very sudden and lasted much longer than normal.
But it was the bigger picture of overall Internet performance that told the real story. Keynote data indicated that between 11:00a.m. and 1:30p.m. ET on January 20, the Internet as a whole slowed by 60 percent. And all evidence indicates that it was not the number of people who were online that caused the slowdown, but rather the amount of bandwidth each of those people was using. It was all about the streaming – the many gigabytes of video traffic traversing the Internet as Americans sought to witness the transition of power as it happened, many of them exploiting fast workplace Internet connections.
According to content distribution network (CDN) Akamai, the inauguration day peak of more than 7 million active simultaneous streams came at 12:15p.m. ET, just as Obama was launching into his inaugural address, compared with an average of 1.3 million simultaneous streams over the previous 24 hours. That peak represented approximately two terabits per second of data coursing through the nation’s electronic arteries.
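A quick back-of-envelope check makes the Akamai figures concrete. Reading the reported peak as roughly 2 terabits per second spread across 7 million simultaneous streams implies a per-stream bitrate in the high 200s of kbps, which is consistent with typical 2009-era Web video encodings:

```python
# Rough per-stream bitrate implied by the peak figures quoted above.
# These are the article's numbers; the division is just illustrative.
peak_streams = 7_000_000            # simultaneous streams at 12:15pm ET
peak_throughput_bps = 2e12          # ~2 terabits per second

avg_stream_bps = peak_throughput_bps / peak_streams
print(f"Implied average bitrate per stream: {avg_stream_bps / 1e3:.0f} kbps")
```

That works out to roughly 286 kbps per viewer, a plausible bitrate for the Flash-era video players most news sites were serving at the time.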
CNN.com reported that nearly 27 million people watched streaming video on CNN.com Live on Tuesday, more than five times the site's previous record set on Election Day 2008, when 5.3 million people watched streaming video of the day's events. CNN.com tallied more than 18.8 million live video streams from 6a.m. to 1p.m.
According to Keynote performance data, CNN fared best out of all the major online news outlets for overall performance that day. Even so, some viewers were greeted with the message: "You made it! However, so did everyone else. This message means you've got your place in line to join our watch party."
“At least people who got a stream were pretty much guaranteed to maintain a stream throughout the session,” adds White. “At least streaming has that going for it – there’s already a built-in mechanism for people who are already watching a stream. That’s not the case with traditional Web sites, for example.”
Barriers to Unlimited Streaming
While the volume of streaming that day was exceptional, the concern is that America is becoming increasingly addicted to this kind of bandwidth-hungry content. Streaming is the future of entertainment, with companies like Netflix, Blockbuster, Hulu and of course YouTube and Flickr making available an increasingly enticing array of streaming content, and consumers eager for the on-demand accessibility and choice the format offers.
Already, there are households that at any given time are consuming multiple simultaneous streams through various devices within their walls – from on-demand TV programming and films to games and social networking content. It likely won’t be long before the streams experienced on inauguration day pale in comparison to daily traffic demands. “As more people start to adopt streaming video as their main source of entertainment, the Internet needs to be able to keep up,” notes White. “It’s kind of like our highway system that was built 50 years ago — it was never anticipated that the average family would have 2-1/2 cars.”
Unfortunately, the roadwork required to fix the streaming problem isn’t simple, and the barriers to a future of unlimited streaming are many. The prospect of adding capacity to the Internet’s core infrastructure is both costly and economically complex, with content providers, carriers and consumers all sharing in the stakes. There’s also the issue of “last-mile” capacity. While adequate fiber capacity at the last mile of the delivery network is available in some areas, it remains costly. And it will be years before a full fiber-to-the-door infrastructure can be built out.
Some Internet providers have already placed restrictions on the amount of bandwidth individual households can consume. But that approach can’t hold back the floodgates for long. It is likely that major media providers will have to work collaboratively with the AT&Ts, Sprints and Verizons of the world to improve last-mile bandwidth for consumers, sharing in the costs and rewards.
What Can Be Done Now?
As online media stakeholders determine how responsibility, costs and returns for infrastructure investment can best be shared, it’s up to content providers to make the streaming experience as positive as possible for users. The good news is that for prerecorded content in particular, there are a few best practices that can make all the difference. Here are just a few:
- Design videos for streamlined transmission to diverse players and last-mile technologies. It’s wise to provide streams in multiple formats to accommodate a variety of end-user hardware formats and carrier technologies, including DSL, cable and dial-up.
- Deploy caching and distribution systems or engage the services of CDNs like Akamai, Limelight or BitGravity. These solutions offer content providers a way to cache and redistribute prerecorded streaming content to the edge of the Web and closer to the end user, relieving traffic at the ’Net’s core and bypassing critical congestion points.
- Establish firm SLAs with carriers and CDNs based on solid metrics such as Keynote’s StreamQ grading system, which is the de facto industry standard for measuring the streaming user experience.
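The first practice above, providing streams in multiple formats for different last-mile technologies, can be sketched as a simple server-side selection step. The rendition names and bitrates below are hypothetical, but the idea is the standard one: encode the same content at several bitrates and serve the richest rendition the viewer's measured bandwidth can sustain:

```python
# Hypothetical encoding ladder: rendition name -> bitrate in kbps.
RENDITIONS_KBPS = {
    "dialup": 40,
    "dsl": 300,
    "cable": 700,
}

def pick_rendition(measured_kbps: float, headroom: float = 0.8) -> str:
    """Return the richest rendition that fits within the client's
    measured bandwidth, leaving headroom for network jitter."""
    budget = measured_kbps * headroom
    viable = {name: rate for name, rate in RENDITIONS_KBPS.items()
              if rate <= budget}
    if not viable:
        return "dialup"   # fall back to the lowest-bitrate stream
    return max(viable, key=viable.get)

print(pick_rendition(56))    # a dial-up modem gets the 40 kbps stream
print(pick_rendition(1500))  # a cable line gets the 700 kbps stream
```

Modern adaptive protocols automate this switching on the client during playback, but in 2009 a one-time selection like this (often via a manual "high/low bandwidth" link) was common.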
Performance standards are indeed one factor that may help push stakeholders to make necessary investments in streaming infrastructure. “We’re seeing a growing interest in our Streaming Perspective offering as providers look for a way to quantify and differentiate their service levels to customers,” says Ken Vodicka, streaming product manager at Keynote. “Our solutions measure the quality and reliability of streaming media and the way users experience it, analyzing factors such as connect time and buffer time to provide an objective grade on performance.”
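The kind of scoring Vodicka describes can be illustrated with a toy formula. The weights and thresholds below are made up for illustration (Keynote's actual StreamQ methodology is its own), but they show how connect time and buffer time can be folded into a single objective grade:

```python
def quality_score(connect_s: float, buffer_s: float, play_s: float) -> float:
    """Hypothetical 0-100 streaming quality score: penalize slow
    startup and time spent rebuffering relative to time spent playing."""
    startup_penalty = min(connect_s / 10.0, 1.0)     # 10s connect = worst case
    buffer_ratio = buffer_s / (buffer_s + play_s)    # share of session stalled
    score = 100 * (1 - 0.4 * startup_penalty - 0.6 * buffer_ratio)
    return round(max(score, 0.0), 1)

# A 1-second connect and 2 seconds of buffering in a 2-minute session:
print(quality_score(connect_s=1.0, buffer_s=2.0, play_s=118.0))
```

A session that connects instantly and never stalls scores 100; long connect times and frequent rebuffering pull the grade down, which is exactly the behavior an SLA metric needs to capture.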
Ideally, while the issues of core bandwidth, last-mile formats and household consumption levels are being worked out, technical ingenuity will provide some interim answers to the streaming dilemma, generating new and innovative compression algorithms to help more streaming data get through existing pipes.
In the meantime, content providers should stay flexible in how and what they stream, utilize efficient distribution methodologies, and keep pressure on delivery providers to accommodate increasing volumes of streamed data. And while the rest of us are waiting for it to all play out – Puppy-Cam, anyone?