by Meridith Levinson

Google’s Robust Service on 9/11 Due to Scalability

News
Dec 01, 2001
BPM Systems

At 7:30 a.m. PST on Sept. 11, 2001, Sergey Brin, cofounder of search engine and Web Business 50 winner Google, woke up to a phone call from a colleague who told him about the attacks on the World Trade Center and Pentagon. After hanging up, he went looking for more information, visiting several online news sites, including CNN.com and ABCnews.com. In spite of his hefty 6-megabit DSL Internet connection, none of the sites loaded. Thinking millions of Americans were likely having the same problem as they frantically groped for information about the attacks, he decided it was critical for Google to post cached copies of those developing news stories. (Google was selected as a Web Business 50 winner prior to Sept. 11 for its superior, easy-to-use search engine technology.)

In less than an hour, Google webmasters designed and sent an updated homepage to the company’s 10,000 Web servers nationwide. By midmorning on the West Coast, Google’s timely homepage appeared with links to copies of reports from otherwise inaccessible news sites.

Traffic on Google spiked the Tuesday of the attacks as visitors came to the site typing in queries for inaccessible news outlets as well as topics such as Osama bin Laden, Nostradamus and the FBI. It remained heavy for the next three days, but the site never buckled.

“I don’t think [the news sites] had ever planned for a worldwide news event of this magnitude,” says Jim Reese, Google’s chief operations engineer. “We’ve spent our entire time as a business working on the very issue of scalability and capacity,” he says.

Indeed, Google has 10,000 computers distributed across data centers on the East and West Coasts. The company’s fleet of software engineers wrote code that distributes multiple copies of its data to each data center, and the data centers are linked to one another through multiple Internet connections. The company also performs load balancing between data centers and on individual Web servers. That means when someone sends a query to Google, it is automatically routed to the data center or Web server that is the least busy or has the smallest “load” of queries.
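The routing idea Reese describes can be illustrated with a minimal sketch: each incoming query goes to whichever replica (data center or Web server) currently carries the fewest active queries. The class and variable names here are illustrative assumptions, not Google's actual implementation.

```python
# Least-loaded routing sketch. A query is sent to the replica with the
# smallest current "load" (count of queries it is handling). This is an
# illustration of the idea described in the article, not Google's code.

class Replica:
    def __init__(self, name):
        self.name = name
        self.active_queries = 0  # current load on this replica

    def handle(self, query):
        self.active_queries += 1
        # ...serve the query, decrementing active_queries on completion...
        return f"{self.name} served: {query}"

class LeastLoadedBalancer:
    def __init__(self, replicas):
        self.replicas = replicas

    def route(self, query):
        # Pick the replica with the fewest in-flight queries right now.
        target = min(self.replicas, key=lambda r: r.active_queries)
        return target.handle(query)

balancer = LeastLoadedBalancer([Replica("east-dc"), Replica("west-dc")])
print(balancer.route("osama bin laden"))
```

The same selection rule can apply at two levels, as the article notes: first to pick a data center, then to pick an individual Web server inside it.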

Even after the news sites were back up, traffic was heavy on Google. “Visitors realized immediately that they could now come to Google and do a search, or they could click a link and get a copy of the latest news website,” says Reese.