Companies developing latency-sensitive applications to host in their data centers could take a lesson from OnLive's playbook.
The startup company, which just had its coming-out party during the recent E3 gaming confab, runs resource-intensive computer games on its servers and streams the action to players throughout the United States. Despite the technical hurdles of delivering real-time games—many of them first-person “twitch” games—to consumers, the service has quickly become popular, says Steve Perlman, the company’s CEO.
So much so that the company has had to expand faster than planned.
“We are in the fall projections for our business model in terms of subscribers,” he says. “We ran out of floor space in our current colos.”
The service is a different take on game delivery. Some game companies, somewhat controversially, require consumers to stay connected to the Internet at all times to head off piracy. Other game services, such as Steam and Battle.net, deliver entertainment software online. OnLive, however, uses the Internet to deliver the actual game content and to upload players' commands (mouse movements, clicks, and twitches) to servers running in the company's data centers.
The service trades computer games' ever-increasing PC hardware requirements for a single requirement: a stable, high-bandwidth Internet connection. A modest PC can play even high-performance games, but the player's household needs a hefty link to the Internet. For high-definition games, the player needs a 5 Mbps connection. The service degrades gracefully, however: laptop screens require less bandwidth, and even standard definition, at 500 Kbps, is supported.
OnLive calculates that up to 60 percent of households in the United States have the bandwidth necessary for high-definition streams, says Perlman, who has been focused on content delivery since he founded WebTV in the 90s.
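As a rough illustration of that graceful degradation, the logic below picks the best stream a measured connection can sustain. The 5 Mbps and 500 Kbps figures come from OnLive's stated requirements; the intermediate laptop tier and its threshold are hypothetical placeholders, not published numbers.

```python
# Minimal sketch of tier selection based on measured downstream bandwidth.
# The 5 Mbps HD and 500 Kbps SD figures are from the article; the
# intermediate "laptop" tier value is a hypothetical placeholder.

TIERS = [
    ("High definition",     5_000),  # Kbps, per the article
    ("Laptop screen",       1_500),  # hypothetical intermediate tier
    ("Standard definition",   500),  # Kbps, per the article
]

def pick_tier(measured_kbps: float) -> str:
    """Return the best tier the connection can sustain, degrading gracefully."""
    for name, required in TIERS:
        if measured_kbps >= required:
            return name
    raise ValueError("Connection too slow even for standard definition")

print(pick_tier(6_200))  # -> High definition
print(pick_tier(900))    # -> Standard definition
```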
The Latency Issue
Yet, to make it all work, OnLive had to spend much of its time creating a low-latency data center and infrastructure. For interactive games played over the Internet, the biggest hurdle is not bandwidth but latency. Ideally, games should have no more than 80 milliseconds of delay, says Perlman, though delays of up to 150 milliseconds can be tolerated.
“There is a lot of confusion about latency and bandwidth,” Perlman says. “Bandwidth and latency are orthogonal to each other. When you look at CDNs (content delivery networks) they are not there to optimize latency, they are there to optimize bandwidth.”
To tackle the problem, OnLive built its infrastructure around data-center clusters. The company dynamically tests the latency between each of its data centers and the player's computer and chooses the best tier-1 Internet service provider to deliver the content and retrieve gamers' actions.
“One of the things that is different about our connectivity is that (for most companies) if you have a service, even a large online service, you generally have one tier-1 carrier and maybe a backup,” Perlman says. “We have ten different online carriers coming in.”
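The selection step Perlman describes can be sketched as a simple probe-and-pick loop: measure the round-trip time to each candidate data center over each carrier's ingress, then route the session to the fastest path. The endpoint names and the TCP-connect probe below are illustrative assumptions, not OnLive's actual mechanism.

```python
# Hedged sketch of latency-based path selection: probe each candidate
# data-center/carrier endpoint and pick the lowest round-trip time.
import socket
import time

def probe_rtt(host: str, port: int = 443, timeout: float = 0.25) -> float:
    """Measure one TCP connect round trip to an endpoint, in milliseconds."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            pass
    except OSError:
        return float("inf")  # unreachable paths lose the comparison
    return (time.monotonic() - start) * 1000.0

def best_path(endpoints: list[str]) -> tuple[str, float]:
    """Return the endpoint with the lowest measured latency."""
    timings = {ep: probe_rtt(ep) for ep in endpoints}
    winner = min(timings, key=timings.get)
    return winner, timings[winner]

# Hypothetical per-carrier ingress addresses for two data centers.
candidates = ["dc-west.carrier-a.example.net",
              "dc-west.carrier-b.example.net",
              "dc-east.carrier-a.example.net"]
# host, rtt_ms = best_path(candidates)  # pick the lowest-latency carrier/DC pair
```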
However, the Internet is not the main source of latency, the CEO says. A game's own code can add a lot of latency, up to 100 milliseconds in some titles. Monitors are another big source, sometimes contributing as much as 50 or 60 milliseconds.
Software and hardware latency are usually easy to pare down; developers and engineers simply have not worried too much about tens of milliseconds of delay. When told that their monitor had a 45-millisecond delay, for example, one manufacturer cut the delay to 6 milliseconds in the next model, Perlman says.
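Those numbers add up quickly against Perlman's 80-millisecond ideal. A back-of-the-envelope budget shows why every stage has to be pared down; the line items below are assumptions except where the article supplies a figure.

```python
# Illustrative end-to-end latency budget. Only the 80 ms ideal, the up-to-100 ms
# of game code, and the 6 ms optimized display figure come from the article;
# the other line items are assumptions for the sake of the arithmetic.
BUDGET_MS = 80  # Perlman's ideal ceiling

pipeline = {
    "network round trip":   25,  # assumption
    "game simulation":      30,  # assumption; unoptimized titles can reach 100 ms
    "video encode/decode":  15,  # assumption
    "display lag":           6,  # article: achievable once vendors optimize
}

total = sum(pipeline.values())
print(f"total: {total} ms, headroom vs {BUDGET_MS} ms ideal: {BUDGET_MS - total} ms")
# -> total: 76 ms, headroom vs 80 ms ideal: 4 ms
```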
Eliminating latency is far more important for OnLive than for other content streaming networks, such as Netflix's movies-over-the-Internet service.
“When you are talking about video streaming, a half second of latency is pretty good—in gaming, a half second is intolerable,” Perlman says.
Dell’s Solution
To minimize latency in the data center, the company had Dell’s Data Center Solutions (DCS) group create custom ultra-dense servers, pairing each CPU with a high-end graphics processor. Using the custom hardware and proprietary software, OnLive minimizes the time needed to create a high-definition video stream.
“They have to put thousands of these (servers and graphics cards) in the place,” says Andy Rhodes, director of marketing, Dell Data Center Solutions. “They are trying to replace the gaming console in the living room with the servers running in the data center.”
The last piece was tweaking the video compression to tolerate lost packets. Because low latency is the primary goal, the company would rather work around packet loss on the network than spend precious milliseconds waiting for a dropped packet to be retransmitted.
“We don’t have time for a retry,” Perlman says. “If we ask for a retry, the second round of latency makes the game unplayable.”
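In code, that policy amounts to concealing a missing packet rather than requesting it again. The sketch below, with an assumed packet layout, reuses the previous frame's data for any slice that never arrives; OnLive's actual compression-level workaround is proprietary.

```python
# Sketch of the no-retransmit policy Perlman describes: when a video packet is
# missing, conceal the loss instead of asking the server to resend, which would
# add a full round trip. The packet format and concealment are assumptions.

def assemble_frame(packets: dict[int, bytes], expected: int,
                   previous_frame: list[bytes]) -> list[bytes]:
    """Build a frame from whatever packets arrived; never request a retry."""
    frame = []
    for seq in range(expected):
        if seq in packets:
            frame.append(packets[seq])         # slice arrived intact
        else:
            frame.append(previous_frame[seq])  # conceal: reuse last frame's slice
    return frame
```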
While the company has focused on gaming, there is no reason why other types of high-bandwidth applications could not get the same treatment, says Dell DCS’s Rhodes.
“They started with gaming, but with their emulation technology, they could start serving up CAD/CAM applications as well,” says Rhodes. “There are other applicable markets that could use those platforms. Gaming is a great start because we don’t have a console. But there are a lot of other applicable markets that you get beyond gaming.”