
Clinician Training 2023
May 2023

Best Practices for Effective EHR Education

Authored by: Lauren Manzione, 05/12/2023 | Read Time: 4 minutes

Since the early days of the Arch Collaborative, feedback from clinicians has shown training to be a key pillar of EHR success. The importance of education became even more apparent as methods for delivering training shifted throughout the pandemic. As a collaborative, we continue to ask questions and work with our member organizations—both healthcare organizations and vendors—to identify best practices for EHR education and share success stories that illustrate them (see the recent Collaborative report on vendors who offer EHR education solutions).


This report delves into many of the questions posed to clinicians in the Arch Collaborative’s User Experience and Trainer Quality Benchmark surveys to help further demystify the specific characteristics of EHR education that lead to clinician success with the EHR. The report also provides guidance on how organizations can generate clinician enthusiasm around EHR training to better help them thrive in their EHR environment. Unless specifically stated, all findings relate to both initial and ongoing training.

† The User Experience survey asks clinicians approximately 40 questions about their EHR experience and how it relates to their well-being and ability to care for patients. Key metrics from this survey are used to create an overall Net EHR Experience Score and to generate peer benchmarking. The Trainer Quality Benchmark survey asks clinicians 11 questions about their satisfaction with the EHR training they have received and the trainer who provided it. This data allows organizations to compare their training with that of other organizations and also allows them to benchmark satisfaction across individual trainers at their organization.

Training Best Practices at a Glance 

Train in the context of patient care: Train clinicians on how to use the EHR within the context of caring for their patients. Knowing how to do/access something is not the same as it being a seamless part of the clinician workflow.

Tailor the method to the message: Provide in-person training when possible, and replicate its feel when in-person sessions are impractical. Take advantage of the ease and accessibility of tip sheets and self-guided eLearning for quick, straightforward messages. Utilize one-on-one training for specific content or individuals who require more in-depth guidance.

Protect time for ongoing education: Ongoing education works best as 15- to 60-minute sessions totaling 3–5 hours per year.

Demonstrate the ROI: Share clinician testimonials, leverage usage data from the EHR vendor to demonstrate improved efficiency, and utilize surveys before and after training to gauge clinician-reported time savings.

Prioritize mastery, not just proficiency: Even the most satisfied clinicians still have significant room for improvement. Clinicians at the 80th percentile of EHR satisfaction report an average score of just 58.9 (on a scale of -100 to 100).

Use the Training Method Best Suited for the Message

No one training method is guaranteed to be effective in all situations. It is important for organizations to choose initial and ongoing training methods that are realistic and scalable. At-the-elbow training is the method clinicians are most likely to describe as useful. However, given the time and resources it requires to consistently provide such training across an enterprise, organizations may need to carefully consider which content is best taught at the elbow and which content can be taught via other methods. Indeed, according to clinicians’ self-reported data, classroom training is the most common method by which they receive EHR training. This makes sense as it is a much more scalable approach.

Data from the Trainer Quality Benchmark survey indicates that self-directed eLearning may generate the biggest bang for the buck in terms of time savings for clinicians (see chart on next page). Individual eLearning sessions most commonly last less than 60 minutes (compared to the 3–8 hours for the typical session of classroom training) and can generate a significant ROI in terms of time savings for clinicians—on average, clinicians who participate in self-directed eLearning report saving 20–25 minutes per week in the EHR for every 15 minutes of eLearning. This demonstrates that self-directed eLearning can be a valuable tool for communicating simple, straightforward information that improves clinician efficiency.

[Chart: Training participation rate vs. usefulness of training]

Bellin Health Training Case Study
Bellin Health makes upgrade-specific training as palatable as possible for their clinicians, using different approaches depending on the clinical background of the trainee and the message being shared. See Bellin Health’s case study for more details.

[Chart: EHR minutes saved per week for every one hour of training]

Virtual instructor-led training generates many of the same positive impacts as in-person classroom training while being more realistic and scalable. At the start of the pandemic, organizations were forced to transition to virtual training almost overnight, resulting in a Collaborative-wide dip in training satisfaction as many organizations adjusted or put training programs on hold. However, satisfaction with virtual training has increased since 2020 as organizations have learned how to make it more effective. Some Collaborative members have found success replicating the in-person experience by providing engaging, interactive instructors and curriculum.

[Chart: Percent of clinicians who agree EHR training is helpful and effective]

Guthrie Clinic Case Study
The Guthrie Clinic makes the best of virtual training by using engaging trainers, splitting the screen between the trainer and course content, and ensuring the EHR is available to clinicians during training to apply what they are learning. Learn more about Guthrie Clinic’s approach in their case study.

Clinicians Need More EHR Training, Whether They Realize It or Not

Almost half (46%) of clinicians who have taken the Arch Collaborative survey say they do not need more ongoing EHR training. However, on average, these clinicians don’t report significantly higher EHR satisfaction than peers who do want more training—as measured by the Net EHR Experience Score (NEES), the delta between the two groups is only about 6 points (on a scale of -100 to 100). How can organizations design ongoing EHR training that delivers tangible benefits for their clinicians?

[Chart: Clinician desire for more training]
[Chart: Net EHR Experience Score by desire for more training]

† Each individual clinician’s responses to the Arch Collaborative EHR Experience Survey regarding core factors such as the EHR’s efficiency, functionality, impact on care, and so on are aggregated into an overall Net EHR Experience Score (NEES), which represents a snapshot of the clinician’s overall satisfaction with the EHR environment at their organization. The NEES is calculated by subtracting the percent of negative user feedback from the percent of positive user feedback. A NEES can range from -100 (all negative feedback) to 100 (all positive feedback).
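The NEES calculation described above can be sketched in a few lines. This is an illustrative implementation, not KLAS's actual tooling; the "positive"/"neutral"/"negative" response coding is an assumption made for the example.

```python
def net_ehr_experience_score(responses):
    """Compute a Net EHR Experience Score from coded survey feedback.

    NEES = (percent positive feedback) - (percent negative feedback),
    so it ranges from -100 (all negative) to 100 (all positive).
    Neutral responses count toward the total but neither percentage.
    """
    if not responses:
        raise ValueError("no responses to score")
    positive = sum(1 for r in responses if r == "positive")
    negative = sum(1 for r in responses if r == "negative")
    return 100 * (positive - negative) / len(responses)

# Example: 7 positive, 1 neutral, and 2 negative responses
sample = ["positive"] * 7 + ["neutral"] + ["negative"] * 2
print(net_ehr_experience_score(sample))  # 50.0
```

A clinician body split 70% positive / 20% negative thus scores 50, which helps put figures like the 6-point delta mentioned above in perspective.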

Shift the narrative to focus on EHR mastery: Across clinical backgrounds, the most common sentiment among clinicians who don’t want more training is that they already feel proficient with the EHR. These respondents’ average NEES indicates that this perception is likely accurate. However, even well-performing clinicians have weak spots, and regardless, EHR mastery, not general proficiency, should be the goal.

Enlist the help of the least satisfied clinicians: The low average NEES of clinicians who cite poor training quality as a reason for not wanting more training is of concern. Poor experiences in the past may make these users hesitant to reengage. OrthoVirginia found success enlisting these very users to help improve the training experience for their peers. See their webinar for more details.

Demonstrate proven time savings: Advertising the potential time-saving benefits of additional training can motivate clinicians to make training a priority. This can be done by reporting time-savings data to clinicians gleaned from after-training surveys (such as the Collaborative’s Trainer Quality Benchmark), year-over-year EHR experience data (such as that collected by the Collaborative’s standard User Experience survey), data from pre/post surveys collected around implementation of a new initiative, or clinician usage data provided by the EHR vendor.

[Chart: Clinician reason for not wanting more EHR training]
[Chart: Net EHR Experience Score by reason for not wanting more training]

Intermountain Health and Kaiser Permanente Southern California 
Intermountain Health developed a flexible coaching program that increased their organization's NEES by 40 points and helped clinicians save 63 minutes per week after a 1-hour session. Read more about what Intermountain Health did in their case study. This program was partially modeled after Kaiser Permanente Southern California's ongoing EHR education master course, which 98% of attendees recommend to their peers. Read more about the training program in their case study.

Workflow-Specific Training Is Linked to Higher EHR Satisfaction

Agreement that initial or ongoing training is workflow specific is correlated with higher satisfaction in some hard-to-improve metrics, including the EHR's efficiency, functionality, internal integration, external integration, and ease of learning. In fact, across the Collaborative, external integration, efficiency, and ease of learning are the three NEES metrics with the lowest satisfaction. This suggests that teaching clinicians to use information from the EHR within their workflow may be as important as working to improve the EHR itself. Clinicians who report that training is workflow specific are also less likely to report burnout and less likely to report plans to leave their organization. Higher satisfaction with training on EHR personalization is likewise correlated with a higher overall NEES.

When asked an open-response question about what they found useful about training, clinicians repeatedly mention the importance of getting training that applies to real-life scenarios (see examples in the "Voice of Clinicians" section). Many clinicians say they want scenario-based training and a trainer with specialty-specific clinical knowledge who can answer questions in real time.

[Chart: Percent of clinicians satisfied with each NEES metric, by agreement that training is workflow specific]

The Voice of Clinicians

“For training to be more useful, it really needs to be directed by providers who use the system. There is a big difference in the theory of how it should work and the reality of what it looks like using the system in practice.” —Physician

“In-person training was helpful. Changes to charting that are directed and taught online are difficult to follow and often not directed to what we chart in my clinical setting. In person, the charting can be focused on what I need, and I can get my questions answered right away.” —Nurse

“I enjoyed asking questions that directly related to problems that occurred in the past or ways to be more efficient in my specific workflows. I enjoyed when the trainer had also used the application the way I have used the application (pharmacist to pharmacist). I appreciated when the trainer had the opportunity to see how I utilized the EHR and could understand my frustrations with certain workflows or information-gathering issues.” —Pharmacist

“In-person training helped me spend time playing around in the play environment and asking direct questions to a content expert. Often, the class instructor helped us implement the curriculum into our everyday workflows in a personalized manner. We were able to ask our location-specific questions and play around with our own what-if scenarios instead of following only a prescribed scenario. Discussion with a content expert also helps the end user learn and apply a real-life perspective. Experts taught us several different ways to approach documentation for the same item and spoke to how they integrate each approach into their daily routine.” —Nurse

Clinicians Need Just 3–5 Hours of Quality Ongoing Training Each Year

Many clinicians claim that training takes too much time, but it doesn't have to. Just 3–5 hours of follow-up training per year correlates with a higher NEES than 2 hours or less, and this holds across all clinical backgrounds. Individual training sessions do not need to be lengthy either: responses from the Trainer Quality Benchmark survey indicate that satisfaction with training does not increase as session length extends beyond 30–60 minutes. Keep ongoing training manageable, and demonstrate the value of investing time into it.

[Chart: Net EHR Experience Score by yearly hours of follow-up training]
[Chart: Agreement that training was highly valuable, by length of training session]

What Is the KLAS Arch Collaborative? 

The Arch Collaborative is a group of healthcare organizations committed to improving the EHR experience through standardized surveys and benchmarking. To date, over 300 healthcare organizations have surveyed their end users, and over 400,000 clinicians have responded. Reports such as this one seek to synthesize the feedback from these clinicians into actionable insights that organizations can use to revolutionize patient care by unlocking the potential of the EHR.


This material is copyrighted. Any organization gaining unauthorized access to this report will be liable to compensate KLAS for the full retail price. Please see the KLAS DATA USE POLICY for information regarding use of this report. © 2019 KLAS Research, LLC. All Rights Reserved. NOTE: Performance scores may change significantly when including newly interviewed provider organizations, especially when added to a smaller sample size like in emerging markets with a small number of live clients. The findings presented are not meant to be conclusive data for an entire client base.