by Allan Holmes

The Role of IT in Managing the FDA’s Enterprise Risk

News
Mar 15, 2006 | 16 mins
Data Center | IT Leadership

On Feb. 8, 2001, a U.S. Food and Drug Administration advisory committee met to discuss a study conducted by the pharmaceutical giant Merck that showed a disturbing increase in heart attacks and strokes among patients taking the arthritis pain reliever Vioxx. It appeared that FDA reviewers, before pushing Vioxx through its fast-track approval process in 1999, may have overlooked data from Merck’s clinical trials pointing to potentially fatal reactions to the drug.

Three years later, in September 2004, public pressure and a high-profile lawsuit forced Merck to pull the drug off the market. Meanwhile, the case unleashed a storm of criticism against the FDA’s approval processes, in particular its ability to assess all the information relevant to a specific medication before clearing it for public use. In the past two years, the agency has watched as drugs it approved for multiple sclerosis, diabetes and cholesterol were recalled by pharmaceutical companies. And the FDA has been forced to issue increasing numbers of warnings for newly approved drugs that have turned out to have dangerous side effects, including most recently three drugs for asthma (Advair, Serevent and Foradil), which the FDA warned in November could cause severe asthma attacks and possibly death.

According to a report by the Government Accountability Office, more than 5 percent of all drugs approved between 1997 and 2000 were withdrawn for safety reasons. All of those drugs got the green light after Congress passed a 1992 law encouraging faster drug approvals, and the withdrawal rate represented a 200 percent increase over previous years (the FDA disputes the study). In November 2004, FDA scientist turned whistle-blower David Graham told a Senate committee that his agency, under pressure to grant fast drug approvals, is “incapable of protecting America against another Vioxx.”

Criticism has been aimed at the FDA’s inability to properly manage the risks inherent in its business processes, but it might as well target the agency’s IT and the workflow those systems support. During the late 1990s, when many of the recently problematic drugs were being approved, the agency was under intense pressure from pharmaceutical companies and patients to fast-track potentially lifesaving medications. But it was saddled with decade-old information systems that made it difficult for FDA scientists to access all the information they needed to make decisions quickly. The disconnect between new business processes and old IT magnified the central risk facing the 100-year-old agency: Moving too slowly or too quickly could cause injury or even death.

The FDA’s high-wire act requires it to balance the demands of manufacturers and patients with competing demands from the general public to make sure the products are safe—while maintaining the trust of all these constituents. In the past, the FDA has hired more scientists and other support staff to keep it in balance, but, increasingly, FDA managers are concluding that IT has a central role to play in managing the agency’s enterprise risks. “This agency is all about risk management,” says acting CIO Fred Farmer. “Everything has a side effect, even an orange. And so when a drug comes through here, it’s always a balance between what good is it going to do versus what harm could it possibly cause.”

Neither FDA officials nor agency critics attribute the Vioxx disaster to ineffective IT. But in 2002, as the controversy over the FDA’s fast-track process was building, the agency embarked on a full-scale modernization of the systems used to accept, store and manage electronic applications for new drugs and medical devices.

Most of the FDA’s effort focuses on consolidating redundant systems throughout the agency’s eight “centers”—the divisions responsible for approving products within the agency’s jurisdiction. These products include not only drugs, medical devices and food, but also so-called biologics (such as vaccines), veterinary medicines, cosmetics and radiation-emitting products (such as mobile phones). The agency also conducts food inspections and issues regulations for food labels. Agency officials believe that decreasing the number of systems the agency must maintain and establishing data standards will help scientists share information more quickly and easily. With better access to information, they will make faster decisions based on better analysis.

Already the new systems—and workflow processes that have accompanied them—are showing results, says Stephen Wilson, acting director of the FDA’s Office of Business Process Support in the Center for Drug Evaluation and Research (CDER). For example, the agency can now track a patient’s reactions to a new drug during a clinical trial, increasing the likelihood that adverse effects will be investigated by the FDA. In the past, it could take weeks, if it was possible at all, to locate an individual’s file. “IT allows you to be more comprehensive and to act more quickly, which lowers the risks all the way around,” Wilson says.

A Legacy of Risk

The FDA’s jurisdiction covers nearly every product that Americans ingest or apply to their bodies; its regulations affect 25 percent of the $12 trillion U.S. economy. Therefore, any delay in getting these products to consumers has a financial impact on manufacturers and sellers, as well as on the health and satisfaction of the public.

On the flip side, FDA officials need to conduct scientifically sound reviews in order to understand a product’s adverse effects. But the thoroughness of these reviews can slow down the approval process. Before Congress mandated fast-track drug approvals, reviews averaged about 15 months for high-priority drugs (those, like Vioxx, believed to offer an immediate and marked improvement in public health) and more than two years for standard drug applications.

For years, the FDA leaned on its reputation for taking things slowly to protect public health. But in 1992, after years of complaints from pharmaceutical companies and the health-care industry that the FDA didn’t move fast enough, Congress told the agency to put a premium on speed. That year, lawmakers passed the Prescription Drug User Fee Act so that the FDA could charge drug companies fees for new drug applications. The agency used the money to hire more reviewers and to invest more in IT to speed up the drug approval process. Paper applications for new drugs—some stacked as high as six feet and brimming with statistics from clinical trials—were part of the problem.

Previously, top FDA managers had not considered IT a central component in balancing the competing imperatives of caution and speed. IT was more of a support service, and scientists didn’t question what the department delivered.

Meanwhile, IT planning and buying decisions were made primarily by the IT directors in each of the eight centers. The higher-profile centers, such as the CDER, which handles drug approvals, got more money and better equipment. Even so, the FDA ran mostly on disparate legacy systems.

The FDA hasn’t tracked how much of the $1.5 billion collected in drug fees went to improving IT, but after 10 years, technology still wasn’t helping FDA scientists with their decision-making process. The agency wasn’t even keeping up with the basics. Former CIO Jim Rinaldi recounts that when he got to the FDA in 2002 he found 1,800 new PCs owned by the CDER that had been sitting in a warehouse for nine months, still in their boxes.

In a 2002 survey by the inspector general of the Department of Health and Human Services, 58 percent of the CDER scientists said they did not have enough time “to conduct an in-depth, science-based review” for drugs that were put on the fast track.

Around the same time, new approaches to treatment were affecting how the FDA did its job. The boundaries between drugs, biologics, medical devices and other FDA-regulated products were beginning to blur. For example, a stent designed to keep an artery open is coated with a medication that keeps blood from clotting. Is it a drug or a medical device? The expertise to review such a product resides in two separate centers, the CDER and the Center for Devices and Radiological Health. But they operated on different IT platforms, used different network technology, ran multiple versions of Oracle databases (with different data standards and nomenclature) and had their own security measures. The systems couldn’t communicate, and thus, the scientists couldn’t create the workflow necessary to do a thorough review of products that fell into multiple categories.

Under Rinaldi, the agency spent the first two years of the modernization project focused on reducing costs by revamping the IT organization and its business processes. In 2003, the FDA created a central Office of Shared IT Services, consolidating 15 contracts covering such services as help desk and desktop support. And in March 2004, the IT function was completely reorganized, so that the CIOs in each center reported directly to the FDA CIO, giving the agency CIO more control over technical standards and purchasing.

While these steps did nothing to directly address the FDA’s workflow problem, Rinaldi (who left the FDA last summer to become CIO at NASA’s Jet Propulsion Laboratory) realized that consolidation could accomplish more than cutting costs. It could also mitigate the risks in the approval process that derived from incompatible systems by enabling new, integrated workflow processes. At the time, the media was hammering the agency about its Vioxx oversight, and from his discussions with the center directors, Rinaldi concluded that the agency’s legacy of incompatible systems was preventing it from taking full advantage of technology to reduce review times and provide more efficient access to the clinical trial data scientists needed to make decisions on a drug’s safety.

“We weren’t thinking of risk management explicitly, [but] we were thinking of it implicitly,” Rinaldi says. “The way we thought about it was, we knew IT was central to getting the right information to the right person at the right time. Only later did we start thinking that what we were doing was actually mitigating risks.”

A More Efficient Workflow

The FDA had the foundation of a new workflow in a series of electronic applications for new product approval that it launched in the 1990s. But the forms weren’t standard across the agency, and many manufacturers still submitted their data on paper. Meanwhile, there was no common system to accept and route the applications to the members of an FDA review team, and no concrete plans to create an automated workflow.

The Center for Biologics Evaluation and Research (CBER) was one of the most advanced, in that it accepted electronic applications in PDF form and had the beginnings of an automated workflow process. Reviewers who worked for the biologics center could access the applications online, but members of the review team from other centers had to manually thumb through printouts of the files to find what they needed. Other centers relied even more heavily on the old manual processes. “That really slowed things down, and made it more likely that [reviewers] were getting out-of-date information” from the paper files they accessed, Rinaldi says.

Given the beating the agency was taking, it was easy to get the center directors to agree in principle that they needed to automate their workflow processes in order to take advantage of electronic applications. The risk mitigation argument carried the day.

“Those in the centers who dealt with reviews every day were the ones who really got risk management, and how to make IT a part of it. It was obvious we had to start with how work flows through the centers,” recalls Rinaldi.

Now came the hard part: devising a workflow that could be used by every center. Rinaldi could not simply transfer the workflow system from the biologics center to the drug evaluation center, because it could not be scaled to handle the drug center’s workload. Forcing the biologics center’s solution onto the drug center risked creating a system with less functionality, one not robust enough to cut review times and provide better information to scientists.

Early in 2003, as the IT department reorganization was proceeding, Rinaldi and Farmer (who was director of IT programs at the time) convened a meeting with representatives from the FDA’s centers to begin defining a common workflow. The group spent the first two hours in “spirited discussion” pointing out how each center’s workflow processes were unique, recalls Leonard Wilson, the chief business enterprise architect at CBER, who is responsible for implementing CBER’s managed review process. At one point, Wilson recalls, he noticed that each center performed the same tasks—they just called them different things.

Once Wilson pointed out the similarities in the centers’ processes, the next hour of discussion focused on identifying the workflow steps the centers had in common, and the common ground kept growing. “We had become inefficient in many areas because we simply had local language barriers,” Wilson says.
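To make the language barrier concrete, here is a minimal sketch in Python of the kind of shared vocabulary the centers were missing. Every step name below is invented for illustration; none is an actual FDA term.

```python
# A sketch of the "local language barrier": the same workflow step went
# by different names in different centers. Every term below is invented
# for illustration; these are not actual FDA step names.
CANONICAL_STEPS = {
    # center-specific term -> shared, agency-wide term
    "intake triage": "receive application",
    "submission log-in": "receive application",
    "scientific assessment": "primary review",
    "discipline review": "primary review",
    "sign-off memo": "final decision",
    "action letter": "final decision",
}

def normalize(step_name: str) -> str:
    """Translate a center's local term into the shared vocabulary."""
    return CANONICAL_STEPS.get(step_name, step_name)

# Two centers describing the same work, now recognizably the same step.
assert normalize("intake triage") == normalize("submission log-in")
```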

That discussion led to two IT initiatives: the development of a single intranet, based on enterprise technology standards, which is currently being rolled out, and the beginnings of an FDA-wide automated workflow system, which is still under development.

The point of the workflow system is to give each center the components to create a workflow of its own, but still be able to integrate them. One component that Wilson is helping to develop is a standard webpage for scientific and regulatory reference information. When it’s completed (it’s now being prototyped), a reviewer will be able to select the type of product the application is for, then determine how to proceed with the review. Having this functionality, Wilson says, will make it easier to train new reviewers because the workflow process will be simpler and standardized. That in itself should enable the FDA to conduct reviews and grant approvals more efficiently, thereby giving scientists more time to do their analysis instead of doing paperwork.
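A rough sketch, in Python, of how such a product-type-driven component might route a review. The product categories, step names and routing rules here are hypothetical; the FDA has not published the system’s design.

```python
# A sketch of a product-type-driven review router. Product categories,
# step names and routing rules are all hypothetical; the FDA's actual
# design was not published.
REVIEW_STEPS = {
    "drug": ["chemistry review", "clinical data review", "labeling review"],
    "device": ["engineering review", "clinical data review"],
    # A combination product (such as a drug-coated stent) pulls steps
    # owned by more than one center into a single workflow.
    "drug-device combination": [
        "engineering review",
        "chemistry review",
        "clinical data review",
        "labeling review",
    ],
}

def route_application(product_type: str) -> list[str]:
    """Return the ordered review steps for a product type."""
    if product_type not in REVIEW_STEPS:
        raise ValueError(f"no workflow defined for {product_type!r}")
    return REVIEW_STEPS[product_type]

print(route_application("drug-device combination"))
```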

Managing What They Can’t Control

It’s too early to measure the impact of the FDA’s latest investments on its approval process. When it comes to drug approvals, the FDA has reduced the median time to review a priority drug from 15 months in 1993 to just under seven months in 2003, but IT contributed little to those gains. Farmer and the top managers in charge of workflow processes in the FDA’s centers believe IT has the potential to shorten approval times further, while reducing the risk of approving products with dangerous side effects.

Meanwhile, the act of pulling together managers in each center has led the FDA to expand its idea of how to manage business risks using IT. Before the Vioxx fiasco, IT was expected to manage only the risks to its own operations—to guard against system failure, data loss and security breaches. The problems with the drug approvals got the agency focused on how technology could mitigate the risks inherent in its own business processes. The next step, which the agency is just beginning to assess, is how it can use IT to address risks to its mission that come from outside its walls.

“This is a very common reaction to managing risks,” says Bob Charette, an enterprise risk management expert with consultancy Itabhi. “You manage the risks that you perceive to be in your control, not the ones you perceive are out of your control.” And yet, good enterprise risk management requires identifying and deciding what to do about external risks.

For instance, the agency is developing an auditing application, through which FDA auditors could match the data filed by a pharmaceutical company to the data the company keeps internally. Such a capability could discourage pharmaceutical companies from hiding or changing data that may hurt the chances for a drug to be approved. (In December, The New England Journal of Medicine posted an online editorial claiming Merck had withheld data on heart attacks in Vioxx trials that would have raised warnings about the side effects of the drug.) “What that gives you is more ability to look at all the information in a more comprehensive fashion and bring forth questions,” according to the CDER’s Stephen Wilson.
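As a sketch of what that matching might look like, the Python below reconciles the adverse-event counts a company filed against those in its internal records. The record layout and event names are hypothetical, not drawn from the FDA’s application.

```python
# A sketch of the matching idea: reconcile the adverse-event counts a
# company filed with the FDA against the counts in its internal records.
# The record layout and event names are hypothetical.
def find_discrepancies(filed: dict[str, int], internal: dict[str, int]) -> list[str]:
    """Flag every adverse-event category where the two sources disagree."""
    flags = []
    for event in sorted(set(filed) | set(internal)):
        filed_n, internal_n = filed.get(event, 0), internal.get(event, 0)
        if filed_n != internal_n:
            flags.append(f"{event}: filed {filed_n}, internal records show {internal_n}")
    return flags

filed = {"myocardial infarction": 12, "stroke": 5}
internal = {"myocardial infarction": 17, "stroke": 5, "hypertension": 40}
for flag in find_discrepancies(filed, internal):
    print(flag)  # e.g., "myocardial infarction: filed 12, internal records show 17"
```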

One promising idea, according to pharmaceutical industry experts, is a system that would track the performance of drugs once they are on the market. Adverse effects from drugs in a clinical trial may seem statistically insignificant in a sample of 3,000 people, or the trial may not test for a particular side effect. But, as was the case with Vioxx, once a drug is on the market and 80 million people begin taking it, new or more pernicious side effects become noticeable.
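The arithmetic behind that gap is stark. Here is a back-of-the-envelope sketch, assuming a hypothetical side effect that strikes 1 in 5,000 patients:

```python
# Back-of-the-envelope arithmetic, assuming a hypothetical side effect
# that strikes 1 in 5,000 patients.
rate = 1 / 5000

# Chance that a 3,000-person clinical trial sees zero cases:
p_zero = (1 - rate) ** 3000
print(f"{p_zero:.0%} chance the trial observes no cases at all")  # ~55%

# Expected cases once 80 million people are taking the drug:
print(f"{80_000_000 * rate:,.0f} expected cases on the market")  # 16,000
```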

Using an automated system, the FDA could more easily collect reports of adverse effects from health-care professionals so that warnings could be issued or drugs withdrawn from the market more quickly. The idea for such a system has been discussed for years, but it wasn’t until 2002 that Congress directed the FDA to dip into the application fees paid by pharmaceutical companies to develop it.

An aftermarket tracking system might also illuminate ways the agency could improve the drug approval process, says Scott Gottlieb, a physician and deputy commissioner for medical and scientific affairs at the FDA. At its most sophisticated, such a system could gather data in real time from electronic health records, flag possible correlations between drugs and adverse effects, and rapidly report those to FDA officials.
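One standard signal-detection technique such a system could use is the proportional reporting ratio (PRR), which flags a drug when an adverse event appears in its reports disproportionately often. The sketch below uses hypothetical report counts, and nothing in this story says the FDA chose this particular method.

```python
# A sketch of the proportional reporting ratio (PRR), a common
# pharmacovigilance signal: how often an event is reported for this drug,
# relative to how often it is reported for all other drugs.
def prr(drug_event: int, drug_other: int, rest_event: int, rest_other: int) -> float:
    """Ratio of the event's share of this drug's reports to its share
    of every other drug's reports."""
    drug_rate = drug_event / (drug_event + drug_other)
    rest_rate = rest_event / (rest_event + rest_other)
    return drug_rate / rest_rate

# Hypothetical counts: 30 of this drug's 1,030 reports mention the event,
# versus 200 of 100,200 reports for all other drugs combined.
signal = prr(drug_event=30, drug_other=1_000, rest_event=200, rest_other=100_000)
print(f"PRR = {signal:.1f}")  # ~14.6; values above roughly 2 are commonly flagged
```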

Because the system has such potential to mitigate risks beyond the FDA’s control, Charette thinks the agency should have made it a priority much earlier. Organizations, he says, need to identify the five biggest risks they face and tackle those first. The biggest risks tend to be systemic ones—risks that affect the entire organization but are perceived to be out of its control. “You don’t build security into a system after it’s [done],” Charette notes. “You build it in on the front end. It’s the same with enterprise risk management. You consider [risk] when deciding what systems to build and what business processes to automate.”

Gottlieb agrees that the FDA could have pushed harder, but cites such obstacles as a fragmented health system (in which there is little automation of health records), the need to protect patient privacy and a lack of funding (five years ago, the FDA estimated the system would cost $200 million) as reasons why the agency has put off this development. Now, however, it is looking into tapping health insurance databases to search for drug effects and has begun to research system requirements. Rinaldi says there was no way he could even begin to convince FDA managers that an aftermarket IT system could help them until he fixed the atrocious state of IT throughout the agency. That was a bigger risk to the enterprise, he says, because it threatened scientists’ ability to do their jobs quickly and thoroughly on every review.

Says the CDER’s Stephen Wilson, “We’re just beginning to look over the hill now to see what is possible.”