The Science of Improving the EHR Experience

An Arch Collaborative Impact Report

Authored by: Taylor Davis & Connor Bice, 03/06/2020 | Read Time: 4 minutes

Theories are proven (or disproven) through application. Technologies from the ham radio to the microprocessor were successful applications that ended the debate about theoretical principles. Similarly, today’s EHR experience pioneers prove that they understand the science of EHR success by measuring the feedback of their clinical staff, making improvements, and then measuring again. This report details the experiences of the first organizations in the Arch Collaborative to measure their clinicians' EHR satisfaction, make changes to improve, and then measure again.





54% of Organizations Are Seeing Statistically Significant Improvements

[Chart: Trending Net EHR Experience Scores of Repeat Respondents]
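This report does not restate how the Net EHR Experience Score is calculated, but it is a net metric in the NPS family: the percentage of favorable responses minus the percentage of unfavorable ones, yielding a score from -100 to 100. The sketch below illustrates the idea with an assumed response coding (+1 favorable, 0 neutral, -1 unfavorable); it is not KLAS's published methodology.

```python
def net_experience_score(responses):
    """Net score in the NPS family: % favorable minus % unfavorable.

    `responses` codes each survey answer as +1 (favorable), 0 (neutral),
    or -1 (unfavorable) -- an assumed coding for illustration only.
    """
    favorable = sum(1 for r in responses if r > 0)
    unfavorable = sum(1 for r in responses if r < 0)
    return 100.0 * (favorable - unfavorable) / len(responses)

# A trending improvement is the difference between two measurements:
before = net_experience_score([1, 1, 0, -1, -1])   # 0.0
after = net_experience_score([1, 1, 1, 0, -1])     # 40.0
```

Because the score nets unfavorable responses against favorable ones, a gain can come either from more delighted users or from fewer frustrated ones.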


What Drove Improvements?

Rush University was the largest provider organization to see a dramatic improvement: a combination of repeat and new provider ratings brought the organization’s Net EHR Experience Score up by over 21 points (n = 36 repeats, n = 469 non-repeats; p < .001). Between their two surveys, Rush communicated with providers through several channels, including the flyer below.


[Image: Rush University’s provider communication flyer]

With this flyer, Rush University leadership focused on delivering improvements in the three pillars of EHR user optimization:

Improved proficiency: Communicated expanded support through programs such as the Provider Optimization & Experience Team (POET) and Provider Optimization Workshops (POW), which strengthened EHR education.

Alignment to workflows: Through the POW program, efforts were made to help providers better personalize their EHR to their specific needs. In addition, changes were made to the EHR with regard to the CMS HCC codes to better meet user needs.

Shared ownership: The open involvement of Shafiq Rab, the CIO, and Brian Patty, the CMIO, as two highly engaged parties in Rush’s efforts conveyed that this was not just a training push. Senior leadership made significant efforts to listen to clinical needs and respond.



Significant Improvements from Other Organizations


OrthoVirginia, a focused orthopedic group, saw a Net EHR Experience Score improvement of over 30 points. They created a Provider Support Specialist program to drive proactive service and build a solid relationship between the Provider Support Specialists and the providers themselves. The organization recognized the need for good governance when implementing any EHR changes and updated their governance process accordingly (for more details, see their case study “A Model for Improvement”).


Anne Arundel Medical Center saw their Net EHR Experience Score jump by 14 points after making two significant changes. First, they built a new data center, improved their data infrastructure, and upgraded storage within their area network, increasing the EHR’s reliability. Second, they invested in their clinically integrated network and developed in-house functionality within the Epic platform and Healthy Planet to ensure that all providers in their community, whether on Epic or not, had full data exchange and could provide meaningful care in every encounter. Providers can reliably look at a patient’s medical record and understand the full continuum of care, regardless of which EHR the patient’s primary care doctor uses. This has led to increased satisfaction with EHR integration.

Edward-Elmhurst Healthcare, which had high clinician satisfaction on their first Arch Collaborative measurement, reported making very few changes between their first survey and their second; even so, their Net EHR Experience Score improved by almost 22 points. They achieved this by continuing the efforts they already had in place, based on their philosophy of eliminating or filtering out unhelpful functionality as quickly as possible, along with a dedication to communication through the clinical support team. More changes are coming at this organization: they are planning training sprints (inspired by Dr. CT Lin at University of Colorado Hospital), encouraging clinicians to bring up development ideas, and carving out protected time for analysts and informaticists to dive into these ideas collaboratively.


One large health system saw improvements of 15 points (n = 103, p < .001) after aiming multiple initiatives at their providers and nurses. Physicians received one-on-one training, weekly education snippets, a revamped training website, superuser conferences, and a dedicated committee to review clinical decision support; the latter removed low-value alerts (50% of existing alerts), which in turn decreased documentation time. Nurses received more training, saw improved workflows, got roadshow presentations for upgrades, and benefited from a new enhancement-request process.
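The report cites p-values for repeat respondents (e.g., p < .001) but does not describe the statistical test used. Purely as an illustration of what "statistically significant" means for before/after scores from the same respondents, a paired t-test with a large-sample normal approximation for the p-value might look like the sketch below; the scores are made up, and a real analysis would use the t-distribution (e.g., scipy.stats.ttest_rel).

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

def paired_test(before, after):
    """Paired t-test sketch: is the mean before-to-after change nonzero?

    Uses a standard-normal approximation for the p-value, which is only
    reasonable for large samples; shown here for illustration only.
    """
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    # Two-sided p-value under the normal approximation
    p = 2 * (1 - NormalDist().cdf(abs(t)))
    return mean(diffs), t, p

# Made-up repeat-respondent scores showing a ~15-point average gain
before = [20, 35, 10, 25, 30, 15, 40, 5]
after = [36, 50, 24, 42, 44, 28, 57, 19]
change, t, p = paired_test(before, after)
```

Pairing each respondent's two surveys is what makes the test powerful: it measures each individual's change rather than comparing two unrelated samples.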

Lehigh Valley Health Network saw significant improvements among providers (a 15-point increase in Net EHR Experience Score) after increasing at-the-elbow support and focusing on optimization following an EHR upgrade. They also hired additional informatics staff. Unlike many other organizations that have re-surveyed, this health system also expanded their use of scribes and saw a significant increase in satisfaction among their repeat survey takers.


University of Kansas saw provider scores improve by 7 points, driven by improved communication from the informatics team regarding upgrades and optimizations (for an example, see this video). They also created the Epically Efficient trainer-credentialing program, focused on training and personalization, and implemented a third-party tool to quantify chart fatigue and measure how long it took clinicians to read through an alert, then reduced the alert volume. In addition, they started a new technology governance process to strategically prioritize a road map of EHR areas to work on.


The largest organization so far to see significant improvements rolled out a superuser program, created a change agent role, invested in a communication tool, and made a significant number of EHR enhancements.



Why Did Some Not See Improvements?

Some organizations did not see the improvements they hoped for, often because of the negative impact of another factor within the organization.


One organization saw a slight drop in satisfaction scores. While this organization had invested in more education and training, a series of downtimes before the second survey left providers dissatisfied with their EHR experience and questioning their trust in the organization’s IT and informatics teams.


Another organization made significant investments in EHR education but still saw increased dissatisfaction. Providers reported that they were increasingly asked to work late hours, and burnout spiked. Trust in organizational leadership (not just IT/informatics) dropped overall.

One organization with very high clinician satisfaction in their original measurement (in the top 5%) saw a slight decrease in satisfaction as the result of an upgrade that removed key functionality providers relied on. The decline in satisfaction was small because of broad, clear communication in preparation for the change.


At one academic health system, the only organization to see a statistically significant drop in satisfaction, nurses reported increased frustration with poor support and communication. Providers also reported a decrease in satisfaction because they were receiving too many functionality changes in a short period of time without sufficient assistance.

Other organizations made a specific push to improve the EHR but improved in only one area, or they chose not to focus on making improvements.

One academic health system reported that they made system enhancements and were beginning to focus more on personalization. They did not undertake significant training, workflow, or governance improvements.

Another organization with strong clinician satisfaction was in the process of making significant changes in an EHR upgrade that affected clinicians’ workflows. This organization focused on strong change management and communication so that their clinicians’ experience would not deteriorate. After these significant efforts, this organization achieved their goal of steady satisfaction throughout the changes.


All Four Training Sprint Groups Saw Improvements

Inspired by University of Colorado’s training sprints, four organizations made targeted improvements with smaller groups of clinicians to drive improved EHR satisfaction. In a sprint, an organization targets a specific group (e.g., a single specialty or clinic) and homes in on their training for a short period of time. A sprint usually includes analysts and trainers observing providers as they work, planned time to develop individualized training for those providers, and then one-on-one instruction for a set period (usually 2–4 hours). These sprints rarely involve technical changes at the time; rather, they focus primarily on workflow efficiency and increased personalization training.


Summary

Provider organizations that believe they cannot improve their EHR experience now have scientific data demonstrating that improvement is not only possible but happening. Significantly improving clinicians’ EHR experience is within reach. To drive improvements, organizations must first deliver an EHR foundation that is functional, stable, reliable, and responsive; multiple EHR solutions are capable of being that type of foundation today. Then leaders must engage in user-optimization efforts: aligning workflows through study and personalization, investing in user proficiency, and creating a sense of teamwork through shared EHR ownership.

For more insights, read the Expanded Insights section of this report, which dives deeper into these questions:

  • Which changes made by organizations were most effective at driving improvement?
  • What were the differences (if any) in improvement depending on organization type or size?
  • What were the differences between provider and nurse satisfaction improvement efforts and outcomes?
  • What types of organizations saw the greatest gains in satisfaction? Who was left behind?
  • Which aspects of the EHR have seen the most improvement? What areas continue to lag?


Topics: Clinical Outcomes

This material is copyrighted. Any organization gaining unauthorized access to this report will be liable to compensate KLAS for the full retail price. Please see the KLAS DATA USE POLICY for information regarding use of this report. © 2019 KLAS Research, LLC. All Rights Reserved. NOTE: Performance scores may change significantly when including newly interviewed provider organizations, especially when added to a smaller sample size like in emerging markets with a small number of live clients. The findings presented are not meant to be conclusive data for an entire client base.
