Before 22,000 silent fans in Arthur Ashe Stadium in Queens, N.Y., the shrill sounds of two women grunting and screaming, their sneakers screeching on the green hard-court surface, fill the evening air. It’s a warm September evening in 1999, and the two women rallying in the quarterfinal match for the U.S. Open championship, Monica Seles and Serena Williams, are two of the fiercest—and certainly the loudest—warriors in the women’s draw.
The little yellow ball speeds back and forth over the net; the crowd’s heads move in rhythmic, side-to-side unison. Williams eventually passes Seles with a grunt and a powerful backhand winner. And almost before anyone in the stands can blink, every detail about the last 11 seconds—the speed of the serve, the length of the rally, the winning stroke, the new score—is captured by courtside USTA officials. From there, the data travels from the head referee’s souped-up ThinkPad to points known—the huge scoreboard atop Arthur Ashe stadium, a server farm in Illinois, www.usopen.org—and unknown—whoever is logging in at home or the office.
The technology to do all this comes courtesy of IBM’s e-Business Services group, a traveling band of techies and sports enthusiasts who broadcast numbers, pictures and other assorted sporting data to the web world from events such as the Open, the Olympics, the PGA Championship and the Ryder Cup matches. This technology for aggregating, processing and disseminating scores and other real-time sports data is also available to the oil corporation looking to synchronize and share information from oil rigs around the globe, the law firm setting up a remote office in Mexico City or Timbuktu, and the government agency providing temporary onsite disaster relief in the floodplains of Arkansas. The content may change, but the difference between processing information at a major sporting event and your company’s next remote IT initiative is simply a matter of scale.
These large athletic events serve as a laboratory for testing and refining the leading-edge products of the Information Age—and a great way for fans who can’t afford the price of admission to get up close and personal with the likes of Anna Kournikova and Tiger Woods.
PGA at Medinah: Teeing Off
On the par 4, 415-yard third hole at tree-lined Medinah Country Club, just outside the melting asphalt streets of Chicago, Davis Love III taps his ball in the cup for a good par. Although it’s just the first round of last August’s PGA Championship, the ’97 winner seems relieved with his score on what is a relatively easy hole for the pros. A serious-looking man holding a clipboard and trailing the group jots down Love’s score, the number of putts, driving distance (and more stats) on a 3-by-5 index card, as well as the scores of his two playing partners. As the threesome leaves the green, he hands the card to a volunteer sitting on a folding chair behind the putting surface. The volunteer, an older woman—one of hundreds of suburban Chicago locals and Medinah club members who donate their time and energy to help the PGA Championship run smoothly—enters Love’s par and other data (using his unique ID number) into a handheld device. Love’s four flies through the air via a wireless LAN connection into a nondescript trailer in the media parking lot—the IDS (Information and Display Systems) scoring trailer. IDS is a Jacksonville, Fla.-based company that records the scoring data for the results system on www.pga.com. It is the conduit for every person, organization and network connection looking for up-to-the-second scores, including the site team and PGA.com. That par score gets confirmed for accuracy by IDS staffers and is simultaneously sent to the scoreboards on the course and the ThinkPad 770s running just a chip shot away in the IBM trailer. Then, things really start to happen.
But before the IBM team uploads Love’s par and routes it to its many virtual destinations, let’s go back in time, over a week prior to the first drive down Medinah’s opening hole, to when the IBM site crew arrives.
First, team members unpack more than one ton of computer and high-tech equipment. (This wired traveling road show winds its way around the globe, not by special Big Blue chartered planes or 18-wheelers. The servers, wires, cables, computers, monitors, mice and other assorted tech paraphernalia are shipped like ordinary luggage.) A pair of ominous-looking $20,000 RS/6000 internal staging web servers, aptly named Godzilla and Mothra, travel with the other gear. Unpacking and setting up takes one to two days. However, the essentials—establishing T1 connectivity, getting the staging servers humming, making sure the publishing system is functional—are ready in about four hours. The site team’s makeshift IT war room—a beige trailer with a discreet blue IBM sign on the side—is parked just outside the media compound and is wedged among other media outlet trailers, including The Associated Press’s, Reuters’ and the Golf Channel’s.
The team, made up of 10 IBMers and two PGA staffers, provides the overall project management, graphic design, systems administration, scoring programming, and publishing system design and programming. In charge of the operations for IBM’s e-business services is Jeff Ramminger, a communications-sector executive who is both a technologist and a business person. He goes to every event he can, but producers at each site run the show.
The Boys in the Trailer
The crew, which is mainly composed of baseball-cap-wearing twentysomethings, is headed by 28-year-old senior producer Patrick Childress. Sitting just beside Childress in the trailer is a graphic designer, who coordinates all of the content being put into the system, making sure every story, photo and graphic is where it should be on PGA.com. Next to her is a Lotus Notes programmer who is checking on the content publishing system. Down at the other end of the trailer is Steve Hammer, 24, the scoring technologist. He’s in charge of the real-time scores coming in from the IDS trailer next door. A content/LAN webmaster pushes all the news articles and interviews from the PGA writers (using a Lotus Domino e-Publisher application) to the staging server while also monitoring the data after it leaves Medinah’s premises, verifying that it’s getting where it’s supposed to go. If any system failures or outages arise, it’s his job to make sure they get fixed. A network services manager ensures that the network and all of its connections are humming along. Two technical webmasters are also in on the show, monitoring the system and the site. And an audio/video specialist (and intern) is capturing and encoding the feeds and footage that the PGA wants out on the web. He’s also verifying that the video feeds from the net cams from holes 11 through 18 are working.
From the IDS scoring trailer the long lines of coded numbers, letters and symbols containing Davis Love’s score come streaming across monitors of ThinkPads and televisions in the trailer. Hammer, sitting in front of his monitor, staring at all these coded lines, makes sure all of those scores coming in go directly to PGA.com and that each golfer’s virtual score corresponds to his real-world score. On PGA.com, these scores are used by golf fans who can create their own customized, Java-based leader boards, tracking what their favorite pros are doing on the course in real-time. Since the beginning of 1999, the scoring application has run on Linux, which works well with the team’s web-based applications. IBM programmers developed the custom software to capture data from the scoring application and format it for the site.
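The wire format IDS actually used isn’t described in the article, but the job Hammer is doing—turning coded feed lines into structured, verifiable score records—can be sketched roughly like this. The delimited field layout, class and player ID below are invented for illustration:

```python
# Hypothetical sketch of parsing a delimited score-feed line into a
# structured update. The pipe-delimited field layout is invented for
# illustration; the real IDS wire format is not public.

from dataclasses import dataclass

@dataclass
class ScoreUpdate:
    player_id: str   # each golfer's unique ID number
    hole: int
    strokes: int
    putts: int

def parse_feed_line(line: str) -> ScoreUpdate:
    """Split a pipe-delimited feed line, e.g. 'P117|3|4|2'."""
    player_id, hole, strokes, putts = line.strip().split("|")
    return ScoreUpdate(player_id, int(hole), int(strokes), int(putts))

# A par four on the third hole, as a feed line:
update = parse_feed_line("P117|3|4|2")
```

Once parsed, each record can be checked against the paper scorecard (the real-world score) before it is pushed to PGA.com and the Java leader boards.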
While Tiger Woods’ birdies keep rolling in from all over the course, new articles and audio/video content from PGA of America staffers, such as a Sergio Garcia interview and a Fred Couples approach shot from the fairway, are also being pushed to the site. For the articles, the team uses a homegrown proprietary template called e-Publisher, which IBM customizes for each event. The e-Publisher tool, a combination of Lotus Notes and Lotus Domino, allows the nontechnical staffers to create new content on their laptops, which is then sent to the site producers for approval and finally to the content webmaster for downloading to the site. Producer Childress collaborates with the PGA staff to pick and choose the news stories to put up and how much attention to give to them.
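The draft-to-approval-to-publish pipeline the article describes for e-Publisher is, at its core, a small state machine. Here is a minimal sketch under that assumption; the class, method and state names are invented, not part of the actual Lotus Notes/Domino application:

```python
# Minimal sketch of a content-approval pipeline like the one described
# for e-Publisher: staffers draft articles, producers approve them, and
# the content webmaster pushes approved items to the site. All names
# and states here are invented for illustration.

class Article:
    def __init__(self, title):
        self.title = title
        self.state = "draft"

    def submit(self):
        """Staffer sends the draft to the producers."""
        if self.state == "draft":
            self.state = "pending_approval"

    def approve(self):
        """Producer signs off on the piece."""
        if self.state == "pending_approval":
            self.state = "approved"

    def publish(self):
        """Content webmaster downloads the piece to the site."""
        if self.state == "approved":
            self.state = "live"

story = Article("Sergio Garcia interview")
story.submit()
story.approve()
story.publish()
```

The guard conditions matter: an article can’t go live without first passing through producer approval, which is exactly the editorial control Childress exercises with the PGA staff.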
Meanwhile the audio-video specialist, John Vogt, is editing a post-round news conference with Tour veteran Jay Haas and using IBM’s HotMedia suite of tools to encode them for the site as well as making photos and video of today’s highlights available for viewing.
Hidden behind all of this are Godzilla and Mothra, 43P model 260 workstations, and the PC systems, connected on the high-speed LAN, that are used by the staffers in the trailer and the media members all over Medinah’s members-only grounds. On the network, the team uses IBM ethernet stackable hubs for LAN connectivity. Multiprotocol intelligent hubs support the fiber-optic and copper connectivity wires and cables that wind their way around Medinah.
Three server farms located in Bethesda, Md., Columbus, Ohio, and just down the street in Schaumburg, Ill., form the backbone for the internet demands—demands that can reach more than 430,000 hits per minute during some events. At the farms, massively parallel systems, or SPs, link together 512 RS/6000 processors that can handle the large galleries of fans that go searching through the massive amount of PGA Championship-related data that PGA.com offers.
As one might expect, caching technology, capacity planning as well as some serious load balancing are needed to handle the sometimes irregular volumes and demand spikes that can happen on the web. In tandem with the SPs are IBM 2216 Multiaccess Connectors that provide memory speed for the web server caching. By adding expiration information to all the content, the webmasters are able to keep the most recent data available to the users—making sure Justin Leonard’s fans are receiving the scores he’s just posted. This also reduces the wear and tear on the servers. Capacity planning takes place long before the sites go live—sometimes as much as a year in advance—and is based on several factors such as the traffic rates at previous events and anticipated advertising. Much like weather forecasters, the team predicts peak traffic times and then rolls out the infrastructure to handle the anticipated demands. Load balancing is handled by IBM’s eNetwork dispatcher, which enables multiple web servers to perform as one. Directing all the traffic (or a percentage of it) to one server farm allows the team to perform maintenance on another farm if need be.
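The expiration trick the webmasters use—stamping every piece of content with a time-to-live so viewers get fresh scores while origin servers are spared repeat requests—can be sketched in a few lines. This is a generic illustration of expiration-based caching, not IBM’s actual caching product or its API:

```python
# Sketch of expiration-based caching: every cached item carries an
# expiry timestamp. A fresh item is served from the cache; a stale
# item is evicted, forcing a refetch of the latest score from the
# origin servers. Hypothetical illustration only.

import time

class ExpiringCache:
    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def put(self, key, value, ttl_seconds):
        self._store[key] = (value, time.time() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.time() >= expires_at:   # stale: evict and refetch
            del self._store[key]
            return None
        return value

cache = ExpiringCache()
cache.put("leonard_score", "-3", ttl_seconds=30)
```

A short TTL on live scores and a long TTL on static pages gives exactly the trade-off described: current data for the fans, less wear and tear on the servers.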
For storage purposes, the team uses a distributed file system (DFS) for content storage along with an Adstar distributed storage manager as a backup. Using DFS allows for replication between DFS servers and makes sure that the content is the same for all visitors—regardless of which node the connection is coming from. Powered by DB2 Universal Database for Linux technology and fitted with IBM’s Net.Commerce application, the PGA Championship Shop allows fans to purchase more “I wish I were there” items—from golf shirts to hats to golf umbrellas.
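The point of replicating between DFS servers is that every node ends up serving byte-identical content, so a visitor sees the same site no matter which farm answers. A simple way to express that invariant is a digest comparison across replicas; the sketch below is a generic illustration, not the DFS implementation itself:

```python
# Sketch of the consistency guarantee replication provides: after
# content is copied to every node, each replica should hold
# byte-identical data, which we can verify by comparing digests.
# The node names and helper functions are invented for illustration.

import hashlib

def content_digest(data: bytes) -> str:
    """Fingerprint a replica's content."""
    return hashlib.md5(data).hexdigest()

def nodes_consistent(nodes: dict) -> bool:
    """True if every replica holds byte-identical content."""
    digests = {content_digest(data) for data in nodes.values()}
    return len(digests) <= 1

replicas = {
    "bethesda":   b"<html>scores</html>",
    "columbus":   b"<html>scores</html>",
    "schaumburg": b"<html>scores</html>",
}
```

If one node drifts (say, a failed replication pass leaves Columbus serving an old page), the digests diverge and the inconsistency is detectable before visitors notice.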
And it is the web visitors who IBM’s temporary IT shops are after—not only for IBM’s marketing efforts but for the partnering organizations. Numbers for the sites are growing steadily. And with each golf tourney that IBM handles, the site team and the variety of web offerings that are pushed to the site viewers become more advanced—much like the guys out on the golf course. Especially that Tiger Woods guy, who shot 11-under par for the tournament and managed to hold off a Sunday comeback by the spry Sergio Garcia and win the 1999 PGA Championship.
U.S. Open at Flushing Meadow: First Serve
Two weeks after Tiger Woods took home the PGA’s Wanamaker Trophy, the IBM team landed in New York City—greeted by the low-flying aircraft and sizzling temperatures. Scoring specialist Steve Hammer is here, seated in his familiar hunch gazing at the letters and numbers flying across the monitor in front of him. In charge of operations this time is 30-year-old Executive Producer David Balcom, who, unlike his coworkers, is not wearing a baseball cap.
Ditching the beige trailers of the PGA, the site team has upgraded to roomy office space tucked beneath Louis Armstrong Stadium, which sits next to Arthur Ashe. Underneath this very loud grandstand, where the noise from the 10,000 fans above stomping their feet at every passing shot and ace shakes the room, the team, slightly larger than that at the PGA, goes to work.
It’s only fitting that, in the city that never sleeps, shut-eye is at a premium for the site team. Unlike the other Grand Slams that IBM covers (Wimbledon, the French and Australian opens), the U.S. Open’s matches are played day and night. Add the extra amount of pressure because it’s the final Grand Slam of the year and the fervor of the New York City crowds, and you have a recipe for some great, five-hour matches and late, late evenings. Barn burners that go until one or two in the morning at the Open are not out of the ordinary—and the IBM team is there until the last five-setter has been decided.
At any given time during the two-week affair, the U.S. Open site team is crunching the numbers streaming in from each of the 18 tennis courts where matches are being held. Aces, unforced errors, serve speeds, double faults—it’s all processed. The data gets to the site team in much the same way it did at the PGA, but because the event has much more going on for a longer period of time, there’s much more to do.
Like at the PGA, humans start the process. USTA tournament referees and data entry teams sitting courtside record the statistical details of each point on their ThinkPads—such as Serena Williams’s growing number of aces. An automated radar gun captures the speeds of the 100-plus mph serves. All the courtside information is transmitted to the scoring and results center—a stark room filled with racks of ThinkPads—and then dumped into a tournament database and replicated to the backup database. Automated processes continuously broadcast new data across the network to the more than 100 clients on the scoring and results system. (This includes the tournament inquiry systems, the internet war room—and then to www.usopen.org—the match update center and the tournament scoreboards.) Each client processes the information for users to view—a media representative logged on to a PC at the Media Center, a fan staring at the scoreboard at Arthur Ashe Stadium, a USA Network or CBS commentator studying the monitor in the broadcast booth. If the technology fails, there won’t be any double-fault statistics for TV commentator John McEnroe to tsk-tsk over during the 138 hours of network coverage.
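The flow just described—courtside entry into a primary database, replication to a backup, then automated broadcast to more than 100 clients—is a classic publish/subscribe fan-out. A minimal sketch, with all class and client names invented for illustration:

```python
# Sketch of the scoring-and-results fan-out: each new point is written
# to the tournament database, mirrored to the backup database, and
# broadcast to every subscribed client (scoreboards, media PCs, the
# internet war room). Classes and names are invented for illustration.

class ScoringCenter:
    def __init__(self):
        self.primary_db = []   # tournament database
        self.backup_db = []    # replicated backup
        self.clients = []      # callables that receive each update

    def subscribe(self, client):
        self.clients.append(client)

    def record_point(self, update):
        self.primary_db.append(update)   # write to the primary
        self.backup_db.append(update)    # replicate to the backup
        for client in self.clients:      # broadcast to all clients
            client(update)

center = ScoringCenter()
scoreboard = []                      # stands in for one of 100+ clients
center.subscribe(scoreboard.append)
center.record_point({"player": "S. Williams", "stat": "ace", "mph": 105})
```

Because clients only consume the broadcast, a new scoreboard or media terminal can be added without touching the courtside data-entry path.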
For all of the long hours and complex work, though, you don’t hear too many complaints from the members of the site teams. In fact, besides the technology, teamwork is the other main ingredient. “We’re careful to pick people who have a strong affinity for the job because good ideas come from people who like this stuff,” says Ramminger, who’s talking about his team during what would be one of the best matches of the tourney, the Williams versus Seles quarterfinal. In assembling his crew, communications sector executive Ramminger identifies several important rules for handling IT at large, remote events:
Preparation. Be ready for anything at any time, from late nights to natural disasters, like the earthquake at the 1998 Nagano Games or the deluge at the 1998 French Open.
Experience. You need people who know their stuff—not just switches and routers, but aces and bogeys too.
Follow-through. “You need a never-ending attitude striving for perfection,” he says.
Flexibility. Be limber, whether it’s adapting to a new application, business partner or time zone.
But with any team working together for long hours in close quarters, producers Balcom and Childress stress that the team can succeed only if the members are having fun doing their jobs.
Back in the site team bunker, a white-walled room dotted with posters of past U.S. Open champions, the team members are eating cheeseburgers and salads and watching the scores. These web monkeys, writers, producers, and video and audio techies live, breathe, eat and sleep (well, not so much sleep) each event they cover. Getting info to the site is job number one; fans and web users alike have come to expect sports data to be real-time. If not, they’ll go somewhere else.
From event to event, new ideas and processes develop. For example, the e-Publisher tool that is now used at each event grew out of processes IBM originally did manually. Demands for the content—from both partnering organizations and users on the website—are high and getting higher. “One of the key things we’ve learned is to plan for peak traffic, and that means having highly scalable, reliable and redundant systems,” says Balcom.
Next Stop: The Ryder Cup
In what proved to be the pivotal match, 17-year-old Williams finished off Seles that autumn evening. She advanced to the finals—narrowly missing a chance to play against her sister Venus—and proceeded to upset Martina Hingis and take the U.S. Open women’s singles championship.
What’s next for Serena? Endorsements, fame, fortune. What’s next for the site team? Long hours, more work on those computer tans and little recognition from millions of fans enjoying the fruits of their labor. After the Open they packed up their ThinkPads, cables and servers, and headed up to Boston to cover the Ryder Cup. Then, it’s down to Sydney, Australia, for the first Olympic Games of the new millennium and the last Games (for a while) for IBM.