In June, Iceland-based CCP Games brought the hammer down on a group of resource hogs that were clogging its data center.
In an operation dubbed internally “Unholy Rage,” the company cut off 2 percent of its subscribers: real customers who had paid to play CCP’s massively multiplayer online role-playing game (MMORPG), known as EVE Online. The small group of players was using software to essentially cheat at the game, automating the collection of resources and the completion of missions to generate in-game currency. These so-called real-money traders would then sell that currency for real-world cash.
Such schemes not only wreak havoc on the virtual world’s economy (CCP Games has its own on-staff economist) but also place a significant load on the company’s real-world data center. Case in point: When the company cut off the devious 2 percent of its users, it regained 30 percent of its computing resources.
“With the reduced load, we have directly impacted our ability to scale our infrastructure to higher user counts,” says James Wyld, virtual world administrator for EVE Online, “so I would say we have saved on the next cycle of infrastructure costs.”
Most enterprises may not think that their operating environment is similar to the virtual fantasy world of World of Warcraft or the digital universe of EVE Online. Yet their data exists in a virtual space all its own, whether that’s a world of financial transactions, health data or sales information.
So for companies that want to shore up their virtual environments, here are some tips from the pros.
Know Your Infrastructure Costs
While EVE Online’s virtual world encompasses nearly 5,000 star systems, Blizzard Entertainment’s immensely popular fantasy MMORPG, World of Warcraft, has the more expansive infrastructure. Where CCP Games has a single data center in the United Kingdom, Blizzard has four data centers in the United States, seven in the European Union and more in the Asia-Pacific region.
“Blizzard is one of the top-10 architectures,” says Anthony Greenberg, founder and principal at RampRate, which advises companies on reducing their infrastructure costs. “They are massive.”
Both companies have similar data-center considerations, however: An expanding subscriber base led them to rapidly build more infrastructure, although not necessarily in the most cost-effective way, says Greenberg.
“Most data centers have an explosively growing relationship with data, but most of them have 20- to 30-percent fluff,” Greenberg says. “You need to know your costs as they compare to the market.”
RampRate has advised both Blizzard and CCP Games, saving each millions of dollars on content-distribution agreements and bandwidth costs by researching market rates and renegotiating service agreements. Mid-term renegotiations also allow companies to add current best practices to their service agreements, Greenberg says.
Confront the 20 Percenters
As CCP Games found out, not all users are alike. Finding ways to deal with the small fraction of users who demand the most resources can yield immense savings.
In CCP’s case, analysis of game use pinpointed accounts that were dominating the game’s resources. The worst offenders were users who traded the in-game currency, called ISK, for real-world cash. These real-money traders ran macros to automate activities (such as mining) that generate money for their in-game characters. Those macros consume an inordinate amount of resources, says CCP’s Wyld.
Combine that with the fact that these exploitative users played nearly around the clock, and CCP Games saw an extreme twist on the 80/20 rule: 2 percent of its user base was responsible for 30 percent of the workload in the data center.
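The kind of analysis CCP describes can be sketched in a few lines: rank accounts by resource consumption and flag the smallest set responsible for a given share of the total load. This is a hypothetical illustration, not CCP’s actual tooling; the account names and CPU-time figures are invented.

```python
from operator import itemgetter

def top_consumers(cpu_seconds_by_account, load_share=0.30):
    """Return the smallest set of accounts responsible for at least
    `load_share` of total CPU time, heaviest consumers first."""
    total = sum(cpu_seconds_by_account.values())
    ranked = sorted(cpu_seconds_by_account.items(),
                    key=itemgetter(1), reverse=True)
    flagged, running = [], 0.0
    for account, seconds in ranked:
        if running >= load_share * total:
            break  # the accounts gathered so far already cover the share
        flagged.append(account)
        running += seconds
    return flagged

# Toy population: 98 ordinary accounts plus 2 heavy macro users.
usage = {f"user{i}": 10.0 for i in range(98)}
usage.update({"bot1": 250.0, "bot2": 200.0})

hogs = top_consumers(usage)
# Two accounts out of 100 carry roughly a third of the total load,
# mirroring the skew CCP observed.
```

Real per-account accounting would come from server telemetry rather than a dictionary, but the cutoff logic is the same.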
“The impact on our data center was that we had to keep expanding the dedicated resources for mission-running areas of the game,” says Wyld, “and eventually, we had a large number of these areas quite clogged and overused.”
Dealing with the resource-hogging users caused server loads on the computers hosting strained sections of the game world to drop from 100 percent to zero overnight.
Virtual Data Exists in a Real World
In 2004, Blizzard learned that the physical world can have an inordinate impact on the virtual when a tornado hit the data center hosting a late-stage beta project.
While the company now maintains data centers all over the world, hosting some 13,000 blade servers with more than 112 terabytes of memory, back in 2004 it had just a single data center hosting its World of Warcraft beta. When IT managers heard about the bad weather, they called the data center managers, who inexplicably told them that everything was fine, J. Allen Brack told attendees in September at the Game Developers Conference. Yet Blizzard had cameras watching its servers, and those cameras showed rainwater pouring into the data center.
Blizzard sent a team out to the facility to help protect the servers and get them back up and running. It took three days, Brack said, and taught them a valuable lesson. “It is important to monitor more than just the hardware,” Brack told attendees. “You also have to monitor the conditions in the data center, and you have to be prepared for disaster recovery.”
Focus on What Is Important
No one wants downtime, but game companies traditionally undergo shutdowns on a monthly, if not weekly, basis. For a game, such downtime is tolerable. For a bank, of course, it’s not.
“The risk of a game shutting down and losing a $10 subscription is a really different problem than a multimillion-dollar infrastructure shutting down,” Greenberg says. Yet, that’s not to say that game companies can take a freewheeling attitude toward their network connectivity, since they have some of the most dedicated users, the analyst says.
“CIOs can take a lesson from MMORPGs and learn that the quality of their service to their client has to be great,” Greenberg says. “Game companies care more because of their rabid user base.”