by Ken Terry

IT talent shortage hitting healthcare hardest

Jun 30, 2015
Careers | Electronic Health Records | Healthcare Industry

With IT staff shortages a reality for CIOs in most industries, healthcare – driven by federally mandated incentives for such IT-intensive projects as Electronic Health Records – is experiencing even more of an IT labor crunch. Is poaching experienced IT talent from other industries the answer?

[Image: businessman in waiting room. Credit: Thinkstock]

Healthcare continues to experience a shortage of qualified health IT staff that, in the view of some observers, is growing worse. But few healthcare organizations believe that the solution is to lure IT pros away from other industries. In fact, most hospital systems and large physician groups would prefer not to hire any IT person who doesn't have extensive healthcare experience.

“Healthcare organizations are looking for healthcare-experienced people,” says Frank Myeroff, president of Direct Consulting Associates, a health IT staffing firm in Solon, Ohio. 

Ernie Hood, senior research director for the Advisory Board Co., a large healthcare consulting firm based in Washington, D.C., agrees. In fact, he says, healthcare organizations are generally uninterested in graduates of health IT training programs, even if they have IT experience in other industries. “It doesn’t substitute for the actual field experience working in healthcare,” he says. 

It’s not about the Benjamins 

The workforce shortage in healthcare does not seem to be related to the salaries of health IT professionals. According to a recent Computerworld/IDG Enterprises survey, a CIO in health/medical services earns an average of $173,941 annually. [Computerworld is owned by IDG Communications.] The same position is worth $146,111 in computer-related services/consulting, $151,889 in education, $133,972 in government, $191,762 in legal/insurance/real estate, and $192,885 in manufacturing (non-computer related). Lower-level health IT staffers are also paid fairly well compared to those in other industries.

As you would expect, some healthcare organizations pay better than others. Myeroff attributes much of that differential to how much individual organizations know about current information technology. In addition, small, rural hospitals don’t pay as much as large, metropolitan healthcare systems, notes Hood. But overall, he says, “I don’t see health IT staff fleeing to other industries, so that makes me think that the compensation is somewhere in the ballpark.” 

[Related: IT continues to struggle to find software developers, data analysts] 

How bad is the current health IT staff shortage? A third of healthcare managers said they had to postpone or scale back an IT project because of inadequate staffing, according to a 2014 survey by the Healthcare Information and Management Systems Society (HIMSS). But this may not be because healthcare organizations couldn't find the people they needed, Hood says.

“This could be an indication that there’s greater demand [for health IT] than the budget allows for,” he points out. “Is the barrier, ‘I can’t find people with the skills I need,’ or is the barrier, ‘I don’t have the resources from the organization to execute what they’re asking me to do?'” 

Moreover, he says, the availability of outsourcing has to be factored into the equation. While outsourcing was not the preference of most CIOs in a recent Advisory Board Co. survey, three-quarters of the healthcare executives who responded to the HIMSS survey said they had outsourced at least some IT. The top areas for outsourcing were clinical application support, project management, and system design and implementation. 

From bad to worse? 

In Myeroff’s view, however, health IT shortages are substantial and growing. “Technology is moving forward, and we don’t have the staff for it,” he says. “Tens of thousands of jobs are going to be needed and we don’t have the people for it.” 

[Related: In the IT talent wars, businesses need to sweeten the pot] 

One major reason for these shortages, he says, is the government’s incentive program for electronic health records (EHRs). That initiative has resulted in the majority of hospitals and physicians acquiring EHRs in the past several years. Known as the “meaningful use” program for the criteria that providers must meet to obtain the financial incentives, this program is now in the penalty phase: For the next few years, Medicare will cut its payments to organizations that do not show meaningful use of EHRs. 

Another IT-intensive government program requires all healthcare providers to move to a new diagnostic coding system, ICD-10, in October. This shift entails internal and external software testing, not only with health plans and claims clearinghouses, but also with other trading partners.

In addition, doctors and hospitals are grappling with the transition to an entirely new method of payment, known as “value-based reimbursement,” that rewards healthcare providers for quality and efficiency. The data aggregation and analysis needed for success in this game require specialized IT staff such as data analysts, who are in short supply. 

Because of government-mandated time frames, Myeroff interprets the HIMSS survey results differently than Hood does. Whether or not a healthcare organization has the budget to hire more IT people, he notes, it must achieve certain objectives by a specific date. For example, stage 2 of the meaningful use program requires hospitals and physicians to meet its criteria this year. “To meet those stringent deadlines, you need IT staff,” he says. 

Is in-house training the key? 

To close the workforce gap, many healthcare systems are developing additional IT workers internally. These staffers – typically nurses – take IT courses offered by vendors or professional associations. But most of their training occurs on the job. 

Hood says this is not a very sophisticated approach. No healthcare organization that he knows of has a formal in-house training program. Most of the lower-level clinicians who take on IT roles are super-users who often function in a help-desk capacity for other users, he says.

Myeroff says that many healthcare systems have done a good job of developing in-house IT talent. The problem he sees is that clinicians who take on IT roles don’t have all of the competencies required. Health IT professionals who know healthcare but are not clinicians may supply these missing skills. “But neither one of them can do the whole job.” 

A number of medical schools now offer medical informatics courses that train physicians in some areas of health IT. But while these programs produce informaticists and chief medical information officers (CMIOs), they don’t help in other areas such as security and EHR operations, Hood notes. 

[Related: 10 startups that are disrupting healthcare IT] 

The skills and experience gap between IT and healthcare persists, he says, but more and more clinicians are filling it. Hood cites the growing number of CIOs who are physicians with experience working on IT projects. This can be especially important in change management, he notes, where a hospital’s management needs to persuade physicians to support an IT program that may change their workflow and even their practice patterns. 

Both Hood and Myeroff emphasize that health IT pros must understand how technology can be used to improve healthcare. Older professionals who can’t do that may not be able to keep up, notes Hood. To succeed in this new era, IT staff must be very receptive to what physicians and nurses want, Myeroff adds. “You need the clinician mind to tell them what they’re building.”