A cash-strapped IT manager makes the case for a business intelligence system one data analysis at a time.

Most large IT shops have a discretionary budget for investing in high-risk projects with a potentially big but unpredictable payoff. But in small and mid-market companies, smaller budgets often make justifying the cost of a speculative project an insurmountable obstacle. That's because the rules of project funding are the same, no matter what size your company is.

To properly evaluate any project's feasibility, we must be able to compare its expected cost against its expected value to the organization. With most IT projects, the future value of an investment is easy to predict by considering savings, efficiency gains or the reuse of existing resources. Simple forecasting methodology can then be used to calculate a potential return on investment (ROI) to determine a course of action. It's a tried-and-true method, and one the CFO understands.

In a small organization, there's more to lose if you bet wrong. I was faced with the task of justifying a business intelligence (BI) project for which the rewards were potentially high, but so were the risks. Traditional project planning and ROI measurements are ineffective in calculating the value of BI because estimating the potential value of unknown information is like prospecting for gold: We know there is something of value there, and we have a rough idea what we're looking for, but how much of it there is, and how valuable it is, cannot be determined until after the investment is complete. So although I'm not suggesting that BI tools are the junk bonds of IT investments, the rules of risk and reward that CFOs can relate to put these applications in that same high-risk category. And yet, I had to make the case for them anyway.

I am the technical lead for a large medical practice in the Midwest (I cannot disclose its name due to restrictions on non-medical press coverage). I oversee technology used in 17 locations by a staff of more than 380 (including 45 physicians). One of our key management challenges has been to standardize enough of the practice to gain the same efficiencies achieved by a larger organization. Solving this challenge has so far eluded us, but it's the only way for us to improve our fiscal position. The bulk of our services are paid for by insurers or the federal government, through Medicare and Medicaid. Unlike most practices our size, we are not affiliated with a hospital or university. To grow our patient base, we rely on a network of physicians who refer their patients to us. And so, we must do more with less.

Much of this year's strategic work plan relies on performance measurement, but the practice has limited tools for obtaining meaningful metrics. We needed BI to get a better handle on our productivity. Here's how we got the project funded: We snuck it in.

Start Small

To make the project more palatable financially, we broke up a large BI implementation plan into multiple cascading sub-projects, each of which built on the success of its predecessor. The value delivered by each step would be used to fund each subsequent step. Planned and executed correctly, the end result would be a low-risk, high-reward project delivering positive value from the first success. Because the project could not continue if any step failed to deliver value, the CFO was convinced.
The reason we got the green light is that, as Internet pioneer David Reed concluded (in what has become known as Reed's Law), the value of a network comes not from its capacity, but from the transactions conducted on it. If there is no feasible action the organization can take, then the information the system would uncover has no value and you shouldn't build it. Reed's Law of network value was pivotal in our requirements analysis.

We first identified the areas where we knew we could deliver positive value at low cost, using existing tools wherever possible. For us, that meant building on our existing Microsoft SQL Server and IBM UniVerse platforms, and building client-side views using Access, Excel and basic Web services. As the BI initiative grew, we added the free version of Microsoft Office SharePoint Server (MOSS), and we plan to add MOSS Enterprise features using some of the savings already delivered.

In one of the first steps, we focused on patient wait times. Due to the often urgent nature of our practice, the office can fill with last-minute patients and we occasionally have excessive wait times. Historically, the practice felt this was unavoidable. Executive leadership set as a priority for this year to develop a report card on the problem and suggest corrective action.

Working with the clinical manager of the affected area, we started with simple data discovery from our patient charting system. As a patient moves through the practice, his or her location is entered into this system by the clinical staff to ensure that patients get the proper tests before and after their visit with a physician. The time stamps and locations of these events had been logged for compliance purposes, but we had no report to analyze this data. So we used our BI tools to dive into the data and see what was there.

For every patient, we had a start and end time for each step of the process, from which we could compute a statistical model of the typical patient visit. We learned how long typical patients sit in the waiting room and in the exam room waiting for the physician, how long they wait for an assistant, and how long they wait for certain tests. When we compared this data across physicians, visit types and locations, we learned (among many other discoveries) that the scheduling template we use is a key component of patient wait times, and that our model did not leave enough room to handle the additional traffic from urgent care visits that occurred during the day.
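The underlying computation is straightforward once the charting system's event log sits in a relational table. A minimal sketch of this kind of query, written for SQL Server with hypothetical table and column names (not our production schema), turns the timestamped location events into average and worst-case minutes per step, broken out by physician and site. The LEAD window function requires a newer version of SQL Server; on older releases a self-join on the event table accomplishes the same thing.

    -- Hypothetical table: one row per location event logged by the charting
    -- system as a patient moves through a visit.
    -- dbo.PatientLocationEvents(VisitID, PhysicianID, ClinicSite, StepName, EventTime)
    WITH StepDurations AS (
        SELECT
            VisitID,
            PhysicianID,
            ClinicSite,
            StepName,
            DATEDIFF(MINUTE,
                     EventTime,
                     LEAD(EventTime) OVER (PARTITION BY VisitID
                                           ORDER BY EventTime)) AS MinutesInStep
        FROM dbo.PatientLocationEvents
    )
    SELECT
        PhysicianID,
        ClinicSite,
        StepName,
        COUNT(DISTINCT VisitID)  AS Visits,
        AVG(MinutesInStep * 1.0) AS AvgMinutes,
        MAX(MinutesInStep)       AS WorstCaseMinutes
    FROM StepDurations
    WHERE MinutesInStep IS NOT NULL  -- the last event of a visit has no following step
    GROUP BY PhysicianID, ClinicSite, StepName
    ORDER BY PhysicianID, ClinicSite, StepName;

Slicing the same output by visit type or by scheduling template is simply a matter of adding columns to the GROUP BY, which is the kind of comparison that exposed the scheduling issues described above.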
This analysis provided us with a road map to improve patient care. We have begun retraining our telephone schedulers to recognize more easily the opportunities for patients to use our urgent care clinic rather than our specialized clinics, so that we can keep scheduled appointments running on time.

Build on Early Success

We then compared the costs and benefits of obtaining other valuable deliverables. In this step, we looked primarily for opportunities to reuse solutions. For example, we used data about our IT help desk performance to design a simple workflow solution to streamline that function. Now we're redeploying the same workflow engine to create a partially automated employee on-boarding application, a time sheet solution, a staff skill set knowledge base, an e-forms library and a reporting repository. This became the foundation for later phases of BI, allowing for a modular approach. While many tools can accomplish this, Microsoft's SharePoint made the most sense for us.

Because each phase built on the outcome of the preceding phase, our plan had to remain flexible enough to adapt to changing goals and priorities as well as to developments in the project itself. For example, in the course of deploying the help desk workflow solution, we discovered we could automate a report of call volume to the help desk. This led us to conclude that running the same report for another call center could deliver even greater value to the call center manager and to the individuals focusing on patient wait time.

By carefully evaluating each step and phase for positive and added value, we have been able to maximize the value delivered, minimize the cost, and provide tools for uncovering hidden value in a fiscally responsible way. In financial terms, the marginal, or additional, benefits we gained in each phase had to exceed the marginal costs of building any new functionality.

Using this modular approach, we've been able to stack one need on top of another to open up new opportunities. We quickly reached a point where many of the reports we were generating became key performance indicators for the practice. So now we're starting on a dashboard capability to show these metrics in near real time. To have simply started with the goal of a dashboard report would have been cost-prohibitive.

Starting small also had an unexpected benefit: It gave users more opportunities to provide important feedback on which information was delivering value, and it got them excited about the capabilities of our growing BI options. Now that the cat is out of the bag with our business unit leaders, they're starting to look for more strategically relevant information.

Focus on End Users

To ensure that appropriate and valuable action can be taken on knowledge once it's discovered, the system for delivering and retrieving information must be addressed. Finding the information is not enough to generate value; it must be accessible to the parties who can best use it. This is especially true when you're not sure which information will end up being most needed; targeting the information to specific individuals or roles within the organization could limit its potential value. Yet historically, we e-mailed most reports only to those who requested them. Even as our new tools took shape, breaking this habit was one of the toughest challenges.

Our practice is spread across a large geographic region and divided into business units, including four medical departments and several support teams. As we've grown, our ability to stay in touch across the practice has become strained. But our best solutions to problems have come from sharing our challenges and ideas across business units. We realized early in the project that we would deliver the greatest value by finding a way to foster interactions across organizational boundaries.

We found inspiration in another networking pioneer, Robert Metcalfe, who observed that each additional user makes a network more valuable. Considering the limitations of our legacy reporting solutions, it was easy to conclude that one of the largest obstacles in our quest for greater efficiency was the difficulty of finding and using existing information. And so, in each phase of the project, we also considered the value of sharing information widely. In our patient wait time project, the clinical manager now pulls up her wait time report card, which is updated on demand. Everyone who has a hand in the process knows how well we're doing and can adjust their portion of the flow accordingly.
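Behind a report card like this, the on-demand refresh does not need to be elaborate. One plausible pattern, again sketched with hypothetical object names rather than our actual code, is a stored procedure that rebuilds a small summary table, which the portal (or an Excel Services workbook) then reads; the same procedure can also run from a scheduled job.

    -- Hypothetical refresh procedure: rebuilds the summary table behind the
    -- wait time report card. Run on demand from the portal or on a schedule.
    CREATE PROCEDURE dbo.RefreshWaitTimeReportCard
    AS
    BEGIN
        SET NOCOUNT ON;

        TRUNCATE TABLE dbo.WaitTimeReportCard;

        INSERT INTO dbo.WaitTimeReportCard (ClinicSite, StepName, AvgMinutes, RefreshedAt)
        SELECT
            ClinicSite,
            StepName,
            AVG(DATEDIFF(MINUTE, StepStart, StepEnd) * 1.0),
            GETDATE()
        FROM dbo.PatientStepIntervals  -- staged from the charting system's event log
        WHERE StepEnd IS NOT NULL
        GROUP BY ClinicSite, StepName;
    END;

In this arrangement the report card is just a query against the summary table, so "updated on demand" amounts to executing one procedure rather than rebuilding a spreadsheet by hand.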
Most of the BI tools we developed were made available to the entire leadership and executive teams as well as the physicians; for the most part, anyone in a management or leadership role was given across-the-board access. The searchable reports are available through a portal. Individual and ad hoc reports are now discouraged; if a metric has value, we load it into the SharePoint portal and update it with routine scripts, or on demand. None of this cost a lot: We built most of our reports using Excel macros, pivot tables, SQL stored procedures and SharePoint's Excel Services feature. The development was done primarily in-house, though we had outside help with the more complicated procedures.

The Bottom Line

By using the marginal benefit approach, we have delivered more than $1 million in benefits to date on an investment of $125,000 in hardware and software. We started with a purchased program that facilitated the recovery of nearly $500,000 in underpayments by insurance companies. Since then we've expanded it to secure $300,000 in refunds or compensation from underperforming vendors and identified workflow improvements that have saved us about $450,000 worth of staff time. Meanwhile, we've also found another $1.1 million in insurance underpayments that we should be able to collect.

By using our data to identify opportunities to automate more IT functions, we've also freed up staff time to dig even deeper into the inner workings of our practice, where we have more opportunity to create value. We have turned our BI challenge into a perpetual motion machine, adding value to itself with every turn.

J. Marc Hopkins is a 15-year veteran of IT and IT leadership in small and midsize businesses. For the last seven years, Hopkins has served as manager of information technology at a large medical practice headquartered in Ohio. He recently received his master's in business administration from Indiana Wesleyan University.