The marriage of IT and medical research may be just what traditional pharmaceutical companies need to survive in an increasingly competitive field.

Scientists at Aventis Pharma had just finished animal testing on a promising new drug to treat asthma, and they knew they had to move fast. Two of their biggest competitors, GlaxoSmithKline and Schering-Plough, had similar treatments further along in the drug-testing pipeline. Both companies were already proceeding with human trials to test the effectiveness of the same approach Aventis was studying — an anti-interleukin-5 (anti-IL-5) therapy designed to inhibit a protein thought to foster asthma attacks. So rather than starting late down the same protracted and pricey path, Aventis embarked on an unusual shortcut: it turned to Entelos, a Menlo Park, Calif.-based company, to run a new kind of software that could simulate a clinical trial on two virtual asthmatic patients named Alan and Bill. Results from the computer simulation led Aventis to doubt that the anti-IL-5 therapy would be an effective agent against acute asthma attacks. Its competitors reached the same conclusion only after years of expensive clinical trials.

The biosimulation technology that Aventis used to achieve such cost-effective results is just one in a growing arsenal of new information technologies that offer tremendous potential to streamline and reduce the costs of drug development. Grouped under the umbrella of bioinformatics, these technologies all involve the use of computers to store, organize, generate, retrieve, analyze and share genomic, biological and chemical data for drug discovery. And their use is spawning an entirely new branch of IT.

“It’s a magical time in the history of science now that we have so much computing power and storage capacity,” says Robert Dinerstein, a senior research scientist in the Bridgewater, N.J., laboratories of Frankfurt, Germany-based Aventis Pharma.

It wasn’t always so. The pharmaceutical industry, a conservative bastion of empirically minded scientists, has been slow to embrace new technologies. In fact, for much of the 20th century, little changed in the trial-and-error process of creating new medicines. Historically, the drug discovery process has invariably begun with a theory about the possible cause of a particular disease. “Some idea got us started — either from scientific literature, a particular researcher’s expertise or just someone’s crazy idea,” Dinerstein explains. The preclinical research continues with the synthesis and purification of a compound that appears to affect a certain protein or molecule thought to be involved in the disease process. Using test tubes and petri dishes, scientists then conduct efficacy and safety tests on that compound, or drug candidate. If it passes that hurdle, they move to more extensive preclinical and clinical testing to determine how the body responds to the compound — how it’s absorbed, distributed, metabolized and excreted (pharmacokinetics) — as well as the chemical effects of the drug on the body (pharmacodynamics). Then the testing begins on animals and proceeds to three phases of clinical trials in humans.

But the old-school approach to drug development is expensive, time-consuming and prone to failure. Nearly 75 percent of the 5,000 drug candidates currently being tested in these different phases fall short of expectations and never reach the market.
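For a flavor of what such an in silico model involves at its very simplest, the sketch below simulates plasma drug concentration after a single oral dose using a one-compartment pharmacokinetic model with first-order absorption and elimination. It is a toy illustration only: the function, dose and parameter values are hypothetical and bear no relation to Entelos’s PhysioLab software or to any real compound.

```python
# Toy one-compartment pharmacokinetic model with first-order absorption
# and elimination, integrated with a simple Euler step. All parameters
# are hypothetical illustrations, not values for any real drug.

def simulate_concentration(dose_mg, ka, ke, vd_liters, hours, step=0.1):
    """Return (time, plasma concentration) samples for a single oral dose."""
    gut = dose_mg          # drug still in the gut, awaiting absorption
    plasma = 0.0           # drug amount in the central (plasma) compartment
    times, concentrations = [], []
    t = 0.0
    while t <= hours:
        times.append(t)
        concentrations.append(plasma / vd_liters)   # concentration in mg/L
        absorbed = ka * gut * step                  # first-order absorption
        eliminated = ke * plasma * step             # first-order elimination
        gut -= absorbed
        plasma += absorbed - eliminated
        t += step
    return times, concentrations

if __name__ == "__main__":
    t, c = simulate_concentration(dose_mg=200, ka=1.0, ke=0.2, vd_liters=40, hours=24)
    peak = max(c)
    print(f"Peak plasma concentration: {peak:.2f} mg/L at t={t[c.index(peak)]:.1f} h")
```

Even a model this crude reproduces the rise-and-decay concentration curve that real pharmacokinetic studies measure in animals and humans; commercial biosimulation packages layer hundreds of such interacting equations to represent whole organ systems and virtual patients.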
Factoring in the cost of all the drugs that fail, drug companies spend an average of $880 million and 15 years to develop each new drug that does make it to market, according to a recent study by the Boston Consulting Group. And, of course, they pass those costs on to the consumer (employers, hospitals, insurance plans and patients) with prescription drug markups that have become the target of increasing criticism.

Such high costs (and the ensuing negative publicity) have finally brought the pharmaceutical industry, kicking and screaming, into the IT age. In fact, bioinformatics may be just the shot in the arm drugmakers need to survive in an increasingly competitive and consolidating field. By fully integrating these new technologies, analysts say, pharmaceutical companies could cut the cost of creating a new drug in half and shave two to three years off the development cycle. In addition, informatics holds the promise of uniting what have traditionally been separate research and development efforts within the same companies. “We’re an information-based industry, but we’ve been a bit behind in the extent to which we’ve been using computer-based tools,” Dinerstein says. “They’ve had better computer models for oil drilling than we’ve had for drug discovery.”

While executives in this closely guarded industry won’t say how much they’re investing in research and development informatics, they are definitely hopping on board. Merck & Co. in Whitehouse Station, N.J., recently paid $620 million to acquire Rosetta Inpharmatics, a genomics and technology company based in Kirkland, Wash. And New York City-based Pfizer says it recently spent more than $100 million to create an “integrated system of high-speed discovery technologies.” Pradip Banderjee, a senior partner with Accenture Consulting, conservatively estimates that drugmakers as a whole spend more than $4 billion a year on that kind of technology, not including the cost of hardware. But compared with the cost of a clinical trial — particularly a failed one — it’s not much. “We spend tens of millions of dollars on early clinical trials. If you could use this technology to tell someone early on whether or not to do a clinical trial, that would be significant,” Dinerstein notes.

Gigabyte Glut

Bioinformatics may also be the only way drug companies can deal with the gigabytes of data they produce and receive every day. The pharmaceutical trade organization Pharmaceutical Research and Manufacturers of America predicts that by 2003, scientists will have discovered more than 10,000 potential targets for drug development, resulting in what some call “target glut.” And that number will only get larger thanks to the 30,000 genes and an exponentially greater number of proteins being identified and analyzed in the Human Genome Project. At the same time, combinatorial chemistry allows companies to synthesize more than 100 compounds per chemist per year. “Informatics is how you deal with the amount of data being generated,” says Rick Roberts, global head of discovery research informatics for Pfizer.

In the past five years, most big drug companies have created official informatics departments, either by integrating their research and IT departments or by creating close ties between the two. But the unofficial origins go back further. “It started as an outgrowth of the scientific discipline as opposed to IT,” says Nathan Siemers, group leader of bioinformatics at New York City-based Bristol-Myers Squibb.
“Basically it’s been a research endeavor, but over its evolution it’s become more infrastructure-related. More people need access to this information, and the scale of information we have to disseminate to our clients — the researchers — is growing drastically. So our ties to IT, which originally were almost nonexistent, have become stronger and stronger.”

Today, informatics technologies are popping up to help at nearly every stage of the drug development process. Early in the process, bioinformatics technology allows researchers to analyze the terabytes of data being produced by the Human Genome Project. Gene sequence databases, gene expression databases (which track how genes react to various stimuli), protein sequence databases and related analysis tools all help scientists determine whether and how a particular molecule is directly involved in a disease process. That, in turn, helps them find new and better drug targets.

Using IT analysis tools and genomic databases, for example, Merck researchers were able to compare the entire genome sequences of mice and humans. They not only discovered that the two genomes were 90 percent identical, but they also found a gene in mice that might have the same function as a gene that may be involved in schizophrenia in humans, says Richard Blevins, who has been Merck’s director of bioinformatics for three years. Currently Merck is working with genetically altered, or “knock-out,” mice, in which certain genes are disabled to mimic a specific condition (schizophrenia, in this case) to see how the animals react to drug candidates. This research, still in its very early stages, could eventually lead to a target for a new schizophrenia drug, Blevins says. Similarly, Bristol-Myers Squibb has discovered a novel method for treating epilepsy using gene-sequence mining tools. The particular drug candidate isolated for this research has since shown strong efficacy in knock-out mice and is nearing clinical trials, according to Siemers.

Drug companies also employ a variety of cheminformatics software — tools that can predict the activity of a particular compound by studying its molecular structure. For instance, scientists can use molecular modeling software (tools that rely on interactive 3-D visualization or mathematical algorithms) to discover and design safe and effective compounds. Chemical databases allow researchers to store and retrieve compounds and related data. Robotics makes it possible for chemists to synthesize hundreds of thousands of chemical compound variations from a library of simpler molecules in a short amount of time. High-throughput screening technology (see “The Definitions of Life”) allows researchers to screen thousands of compounds at once, rather than just 10 or 20.

Technology may help at the clinical testing stage too, though it has been a bit slower to catch on there. Virtual patient simulation software, like the Asthma PhysioLab program that showcased virtual patients Alan and Bill, can simulate patients, targets and therapies in order to predict experimental outcomes before companies commit major resources to lab research and clinical trials. Essentially, this software helps predict the effect particular compounds will have on the human body.
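For a sense of the kind of pairwise comparison that underpins work like the Merck mouse-human analysis described above, the sketch below computes percent identity between two short, already-aligned DNA fragments. The sequences are invented for the example, and production genome comparisons rely on alignment tools such as BLAST rather than this naive character-by-character check.

```python
# Toy illustration of pairwise sequence comparison: percent identity
# between two aligned DNA fragments. The fragments are made up for the
# example; real genome work uses alignment tools such as BLAST.

def percent_identity(seq_a: str, seq_b: str) -> float:
    """Return the percentage of positions at which two aligned sequences match."""
    if len(seq_a) != len(seq_b) or not seq_a:
        raise ValueError("sequences must be non-empty and the same length")
    matches = sum(1 for a, b in zip(seq_a, seq_b) if a == b and a != "-")
    return 100.0 * matches / len(seq_a)

if __name__ == "__main__":
    # Hypothetical fragments of a human gene and its mouse counterpart.
    human = "ATGGCCCTGTGGATGCGCCTC"
    mouse = "ATGGCCCTGTGGATACGCCTG"
    print(f"Identity: {percent_identity(human, mouse):.1f}%")
```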
Integrating Informatics

The chief challenge for CIOs is piecing together all those diverse technologies into a fully integrated drug discovery process. Given that hundreds of vendors are jumping on the bioinformatics bandwagon and there are very few standards, that is no small feat. “R&D chiefs and CIOs are looking at all of these immature technologies in a quickly evolving marketplace and are being forced to spend quickly,” explains Craig Wheeler, a vice president at Boston Consulting Group. “But it’s necessary to put it all together — in silico [in the computer], in vitro [in test tubes] and in vivo [in life] — to get any real value out of it.”

In order to do that, they’ve had to call in the IT troops, who have long been isolated from the lab scientists. Whether as part of a new informatics department or working hand in hand with research, the role of IT is becoming increasingly important in the pharmaceutical industry. Peter Loupos, Aventis’s vice president of drug innovation and approval information solutions, studied molecular biology and genetics as an undergraduate and IT as a graduate student. He now sits on the leadership team for drug discovery at Aventis and has watched the role of IT evolve. Just five years ago, IS was a relatively isolated department responsible for providing infrastructure and operational support at Aventis. Today, the company’s far-flung research teams depend on sophisticated software and hardware to do their jobs. The early phases of drug development are often done in silico, Loupos notes. “Similarly, it is impossible to perform global clinical trials and prepare [Food and Drug Administration] submissions unless the process is implemented with an e-business philosophy. This means that as an organization we had to change our strategy, our focus and our skills,” he adds. The role of IT is now central to successful research and development at Aventis. “This visibility has moved the organization from the background to full partnership,” Loupos says. Aventis now employs “a cross-functional team approach, bringing together the skill sets of scientists, informaticians and IS professionals to create new solutions to drug discovery.”

At Merck, the 30-person bioinformatics group is one-third computer scientists, one-third natural scientists and one-third that rare breed — the bioinformatician — with a dual background in science and IT. “To do [informatics] correctly you need to blend scientific skills and IT skills. Many on our staff are scientists with strong IT backgrounds,” Pfizer’s Roberts says. “Otherwise you get technology without a purpose.”

That partnership is particularly vital because most pharmaceutical companies are doing a combination of building some tools in-house, licensing software, and partnering with or buying up companies that already have a piece of the technology. The main reason for these acquisitions is that most drugmakers can’t find enough qualified people to keep up with their demand for new tools, and their core business remains drug, not software, development. “Our internal resources don’t come close to matching our demand for tools,” says Ken Fasman, vice president and global head of research and development informatics at pharmaceutical company AstraZeneca. Fasman works out of the company’s Waltham, Mass., offices and oversees a staff of 55 worldwide.

In 1998, Bristol-Myers Squibb built its own Laboratory Information Management System (LIMS), a database that allows the company to track DNA samples as they are sequenced, stored and analyzed by scores of different scientists. The choice to build rather than buy at that time was born of necessity. “None of the vendors offered the kind of flexibility we needed,” Siemers explains. He needed a system that could be easily accessed by a scientist working at the bench with only three samples as well as by researchers examining 30,000 samples with a high-throughput screening device. Although that LIMS is still in use at Bristol-Myers Squibb, the preference now is to buy informatics software and integrate it into the company’s custom-built systems. “Our philosophy is if we can find a tool from a vendor that will do the task, it’s in our interest to just go out and buy it,” Siemers says. “Our business is drug development, not software engineering.”
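To make the LIMS idea concrete, here is a minimal, hypothetical sketch of the kind of sample-tracking record such a system maintains as DNA moves from receipt through sequencing to analysis. It is not a description of Bristol-Myers Squibb’s actual system, which would sit on an enterprise database with full audit and security controls rather than in-memory objects.

```python
# Minimal, hypothetical sketch of LIMS-style sample tracking: each DNA
# sample carries an ID, a location and a status that changes as it moves
# from receipt through sequencing to analysis.

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DNASample:
    sample_id: str
    source: str                      # e.g. tissue or cell line of origin
    location: str = "receiving"      # freezer, bench or instrument
    status: str = "received"         # received -> sequenced -> analyzed
    history: list = field(default_factory=list)

    def update(self, status: str, location: str) -> None:
        """Record a status change with a timestamp so an audit trail is kept."""
        self.history.append((datetime.now(), self.status, self.location))
        self.status, self.location = status, location

if __name__ == "__main__":
    sample = DNASample(sample_id="SAMPLE-000123", source="hippocampal tissue")
    sample.update(status="sequenced", location="sequencer-04")
    sample.update(status="analyzed", location="bioinformatics")
    print(sample.sample_id, sample.status, len(sample.history), "prior states")
```

In practice the same record would also carry plate and well coordinates, freezer positions and links to sequence files, which is exactly the bookkeeping that becomes unmanageable once scores of scientists handle the same samples.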
But hundreds of vendors in the informatics space, combined with varying standards and platforms, make the integration task tricky. Even Pfizer, known for doing more informatics in-house than most, works with about 15 major vendors in this area. “It’s really become an integration job, and it’s very difficult,” Roberts says. At Pfizer, Roberts uses simulation software from St. Louis-based Tripos, chemical databases from San Leandro, Calif.-based MDL Information Systems, and screening and visualization tools from Cambridge, Mass.-based Spotfire, just to name a few. “We work very closely with all of our partners and try to sway them to build to our standards,” he says. “But frankly none of these systems are perfect, and we have to build a lot of bridges.”

For example, Pfizer’s scientists often use one database to examine the chemical structures of compounds in order to make assessments about their viability as drug candidates. Then they have to transfer that information into a completely different piece of software from another vendor to assess a molecule’s biological properties or safety. As a result, Roberts and his department are constantly faced with the challenge of building application programming interfaces (APIs) between different systems and databases. Although vendors are beginning to offer more modular systems that can plug in to other software, along with better-documented APIs, in an effort to garner a bigger chunk of this billion-dollar business, many are still hawking closed, standalone systems. “We’re able to push vendors more and more to work with open standards, but when someone has a monopoly position, you don’t have a lot of leverage,” says AstraZeneca’s Fasman, who works closely with his IS counterparts to deal with such problems.

The number of mergers and acquisitions in the drug industry further complicates the integration issue. Recent major drug marriages include Warner-Lambert with Pfizer, and SmithKline Beecham with Glaxo Wellcome to produce the Middlesex, England-based GlaxoSmithKline. “Everyone has different databases,” says Dinerstein of Aventis, which was formed by the 1999 merger of Hoechst Marion Roussel with Rhone-Poulenc Rorer. “We have immense amounts of data, but our first task is to figure out how to make that data accessible.”
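The “bridges” Roberts describes are often little more than translation code between incompatible vendor formats. The sketch below shows the flavor of that glue: it renames and retypes the fields of a compound record exported from one hypothetical chemistry system so that a second, equally hypothetical biology tool will accept it. The systems, field names and function are invented for illustration; nothing here reflects the actual Tripos, MDL or Spotfire interfaces.

```python
# Hypothetical glue code of the kind informatics groups write to move a
# compound record from one vendor's chemistry database into another
# vendor's biology/safety tool. Field names and formats are invented.

CHEM_TO_BIO_FIELDS = {
    "cmpd_id": "compound_id",
    "smiles": "structure_smiles",
    "mol_weight": "molecular_weight",
    "assay_ready": "available_for_assay",
}

def translate_record(chem_record: dict) -> dict:
    """Rename fields and coerce types so the downstream tool accepts the record."""
    bio_record = {}
    for src_field, dst_field in CHEM_TO_BIO_FIELDS.items():
        if src_field not in chem_record:
            raise KeyError(f"source record missing required field: {src_field}")
        bio_record[dst_field] = chem_record[src_field]
    # The downstream tool expects a float, not a string, for molecular weight.
    bio_record["molecular_weight"] = float(bio_record["molecular_weight"])
    return bio_record

if __name__ == "__main__":
    exported = {"cmpd_id": "X-0001", "smiles": "CCO", "mol_weight": "46.07", "assay_ready": True}
    print(translate_record(exported))
```

Multiply that by 15 vendors and dozens of data types, and the scale of the integration job Roberts describes comes into focus.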
Culture Shock

One of the thorniest impediments to informatics is not technical integration but cultural assimilation. “The pharmaceutical industry is one of the most hidebound industries in the world,” says Alan Hillyard, senior vice president of cheminformatics research in the San Diego office of Lion Bioscience, which is based in Heidelberg, Germany. “They don’t want to adapt to anything. They had one guy working in a lab in the 1900s, and that’s the way they still want to do it. They have the most sophisticated tools, and they want to be cutting edge, but there’s still that attitude of ’We’ve always done it this way.’”

Some compare the traditional process of drug development to spending millions to build an airplane and then simply sending it off a cliff to see if it flies. But just as the aviation industry slowly began to integrate bits and pieces of computer-aided design and modeling 30 years ago — and Boeing completely changed its development cycle to go straight from in silico design to production on its 777 aircraft in 1995 — some predict a similarly dramatic change in the way pharmaceutical companies develop medicines.

Yet it won’t happen overnight. In 20 years, perhaps, the pharmaceutical industry may be able to develop drugs through a process that relies heavily on genomics, high-throughput screening and computer-aided drug design, but the integration of informatics will require a complete change in the industry’s culture. “It’s going to take a lot of experimental work to convince the researchers to look at, say, a gene expression study and trust what they’re seeing,” explains Blevins, who hopes to eventually use gene expression studies to screen Merck’s entire compound collection.

Leaders in the emerging field stress that they will never be able to fully develop a drug using only computers, as some analysts have suggested. “That’s nonsense,” Fasman of AstraZeneca says. “The reality is that scientific experimentation started out in vivo, in real animals and plants. The big advance was when we moved to test tubes. But in vitro didn’t replace in vivo; it added on to it. It’s the same thing with in silico drug development. It simply gives us another way to approach the problem.” Eventually, however, there should be a symbiotic relationship between science and technology.

Pfizer, which has released eight successful drugs in the past five years, hopes to double the output of its drug discovery efforts as a result of bioinformatics integration. But thus far company officials say there have been no measurable returns. The significant time and cost savings may not come until these companies have an end-to-end, IT-driven research and development solution. Analysts and executives agree that it will be another three or four years before this technology starts to have an effect on drugs already in the clinic and 10 to 12 years before there’s a real effect on the entire drug discovery process. “So far, informatics spending has been piecemeal, in pockets where the company does understand the impact of the technology,” says Vikas Taneja, a project leader at Boston Consulting Group.

But industry leaders are optimistic about the technology’s potential. “We’re already looking to a new future where there is constant give-and-take between what is learned at the bench and what we do at the computer to allow us to conduct more effective experiments faster,” Fasman says. “There’s quite a bit of hope.”