by Robert Rowley

The next generation of EHRs will be fundamentally different

Mar 29, 2017
Artificial Intelligence | Electronic Health Records | Enterprise Applications

Next generation EHRs will be fundamentally different. The data will be external, shared and universal. The system will be broken into functional pieces. Chart note creation will be automated, and will use multiple inputs combining graphical, text, and voice across multiple devices. AI will be integrated to deliver clinical decision support.

Credit: Thinkstock

Electronic Health Records (EHRs) have come a long way. Over 80 percent of physicians use them in their offices, and nearly all hospitals have implemented EHRs as well. Spurred by the HITECH portion of the 2009 Recovery Act, the Meaningful Use program has put money on the table for physicians and hospitals to adopt and use EHRs.

It also defined the features an EHR must have in order to be certified. Legacy systems took on these new requirements by adding to their offerings (sometimes referred to as “bolt-on solutions”). Some startups that flourished after HITECH were more agile, unburdened by a decade or more of legacy technology. But generally, the systems that saturate today’s landscape all share similar features.

What are these features? To begin with, each EHR keeps the data confined within each enterprise (each doctor’s office, or each hospital). Locally installed EHRs do this, of course – but so do web-hosted EHRs. Even if the hosted data is stored in a single instance overseen by the vendor, the data is segmented into “apartments” for each practice. Sharing data between users of the same system is about as difficult as sharing between different institutions using different systems.

Another feature is that the burden of data input falls largely on the shoulders of the clinicians. Patient-entered information, often still the last piece of information kept on paper in a doctor’s office, is generally stored as a separate document in the chart, and is not used to populate the chart note. Patient check-in, which has seen advances in automation in the past few years, is still primarily an administrative step; it does not capture current-symptom reporting, smartly presented, to start filling out the History of Present Illness (HPI) section of a chart note.

In high-volume settings, such as Emergency Departments, chart note documentation is often burdensome enough that the department needs to hire scribes to follow the clinician around and create the chart notes. Any system that creates the need to hire additional people to function as scribes is fundamentally a poor design from a user interface (UI) perspective.

What would a future EHR look like?

The first fundamental difference from current-day EHRs is that next-generation systems would work from external, universal data. The data that makes up a patient’s story would not be confined to a single enterprise. An external, accessible store of data, housed in a modern way with modern API connections to retrieve and add data, is a fundamental feature of this new era.
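To make the idea concrete, here is a minimal sketch of such a shared store. This is an in-memory stand-in, not a real healthcare API (standards like HL7 FHIR define the real-world equivalent); the class and method names are invented for illustration.

```python
# Illustrative in-memory stand-in for an external, shared record store
# with an API-style interface. Names here are hypothetical.

class UniversalRecordStore:
    """One store of patient data, external to any single EHR."""

    def __init__(self):
        self._records = {}  # patient_id -> list of entries

    def add(self, patient_id, entry):
        """Any connected system appends to the same shared record."""
        self._records.setdefault(patient_id, []).append(entry)

    def retrieve(self, patient_id):
        """Every connected system reads the same data."""
        return list(self._records.get(patient_id, []))

# Two different front ends (a lab system, a clinic EHR) writing to one store:
store = UniversalRecordStore()
store.add("pt-001", {"type": "lab", "test": "HbA1c", "value": 6.1})
store.add("pt-001", {"type": "note", "text": "Follow up in 3 months"})
```

The point of the sketch is that the store, not any one EHR, owns the data; systems come and go as clients of it.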

When data is liberated this way, then the system being used does not matter so much – the data is there, and is the same data used by whatever system or systems that are connected. The data is separate from the EHR, so sharing copies of pieces of a patient’s chart from one practice or hospital with another is no longer a burdensome issue. The data is all there, and what a given clinician can see is governed by access and permissions, not by the structural limitations of not having the data in the first place.
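Governing visibility by access and permissions, rather than by data silos, can be sketched as a filter over the shared record. The record sections, roles, and permission scheme below are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical sketch: one shared patient record, with what each user
# sees determined by role-based permissions rather than by which
# system happens to hold the data.

patient_record = {
    "demographics": {"name": "Jane Doe", "dob": "1980-04-12"},
    "medications": ["lisinopril 10 mg"],
    "mental_health_notes": ["..."],
}

# Each role is granted access to specific sections of the record.
PERMISSIONS = {
    "primary_care": {"demographics", "medications", "mental_health_notes"},
    "pharmacist": {"demographics", "medications"},
}

def visible_record(record, role):
    """Return only the sections of the record this role may see."""
    allowed = PERMISSIONS.get(role, set())
    return {section: data for section, data in record.items() if section in allowed}
```

Both clinicians read the same underlying record; the pharmacist simply sees a permission-filtered view of it.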

Freeing the data from within the EHR also means that the UI does not need to be a massive one-size-fits-all that is so often seen in current systems. The EHR then really becomes a collection of apps which work together easily (after all, they are all working off the same data), and can be assembled as the needs of the practice or hospital may require. They can also be swapped out easily, since it would not result in loss of data in order to exchange a given app for a better one.

Clinical settings and workflows vary tremendously across the healthcare landscape. What a clinician in Family Medicine needs is different from what is needed in Oncology, Radiology, or Ophthalmology. Different apps that address the particular workflows of each setting could be developed and assembled as needed.

The other big change in EHRs will be moving away from clinicians being burdened with data entry. Patient involvement can improve, with consumer-facing apps that can capture patient history, family history, social history, and so on, all of which add to the patient’s universal record. Consumer-friendly ways of looking at one’s own data will become commonplace. Pre-visit interactions with patients, asking smart questions about the reason for the visit, can begin to populate the record of the visit.

And it doesn’t need to be keystroked in. It can use modern voice recognition (a sort of smart “Dr. Siri”, if you will) which can be deployed through smartphones, or devices in the office. The system creates the chart note in the background, and all inputs – pointed and clicked, keystroked, or voice inputs – are used to build the record. Physicians are no longer in the role of being data entry clerks, where the demands of EHR interfaces distract attention away from the fundamental human-to-human interactions that are so important in healthcare.
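Building the chart note in the background from mixed inputs amounts to merging entries from different channels into one document. As a hedged sketch, with an invented input format and section names:

```python
# Hedged sketch: assembling a chart note in the background from multiple
# input channels (patient app, voice, point-and-click). The input format
# and section mapping are invented for illustration.

def build_chart_note(inputs):
    """Merge inputs from any device or modality into one note, keyed by section."""
    sections = {}
    for item in inputs:
        sections.setdefault(item["section"], []).append(item["text"])
    return {section: " ".join(parts) for section, parts in sections.items()}

inputs = [
    {"section": "HPI", "source": "patient_app", "text": "Cough for 5 days."},
    {"section": "HPI", "source": "voice", "text": "No fever reported."},
    {"section": "Plan", "source": "click", "text": "Chest X-ray ordered."},
]
note = build_chart_note(inputs)
```

No single input channel is special here: the patient's pre-visit answers and the clinician's dictation land in the same note, which is the design goal the paragraph describes.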

Further, with the power of aggregated insights from the Medical Knowledge Graph, artificial intelligence (AI) can be used to make care suggestions specific to the individual. This can powerfully assist the clinician through EHR-embedded features, as well as directly assist the patient through patient-facing apps, making specific recommendations that take into account clinical history, lab values, genomic data where available, lifestyle preferences, family history, and everything else relevant.
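A real system would use learned models over aggregated data; as a deliberately simplified stand-in, a rule-based sketch shows the shape of such suggestions. The thresholds and record fields below are illustrative, not clinical guidance.

```python
# Simplified, rule-based stand-in for AI-driven decision support.
# A real system would use learned models; fields and cutoffs here
# are for illustration only.

def care_suggestions(record):
    """Combine lab values and family history into individualized suggestions."""
    suggestions = []
    labs = record.get("labs", {})
    history = record.get("family_history", [])
    if labs.get("HbA1c", 0) >= 6.5:
        suggestions.append("HbA1c in diabetic range: consider diabetes workup.")
    elif labs.get("HbA1c", 0) >= 5.7 and "diabetes" in history:
        suggestions.append("Prediabetic HbA1c with family history: discuss lifestyle changes.")
    return suggestions

record = {"labs": {"HbA1c": 5.9}, "family_history": ["diabetes"]}
```

The key idea the paragraph describes survives even in this toy form: the suggestion depends on the combination of data sources (labs plus family history), not on any one value alone.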


Next generation EHRs will be fundamentally different. The data will be external, shared and universal. The system will be broken into functional pieces – apps – that are collected as needed. Chart note creation will be automated and will use multiple inputs, including patient-facing apps and check-ins, patient-assisted creation of HPIs and other elements of history. Input methods will be a combination of graphical, text, and voice, and can harness multiple devices (laptops, tablets, smartphones) in order to create a cohesive, accurate and shared document.

With AI, clinical decision support, both for physicians and for consumers/patients, can achieve its potential and guide best practices in a systematic way, beyond what is possible with today’s technology.