Simple Strategies for Improving Arch Survey Response Rate
So you’re taking the leap and you’re going to survey your clinicians about their EHR experience in partnership with the Arch Collaborative. But how do you make sure you get the responses you need to create real change? That’s what we wanted to know at Franciscan Health.
First, here’s a little background on how I got involved in the process. Franciscan Health has surveyed with the Arch Collaborative once a year since 2017, before I came back to the organization in 2018. I got to see the 2017 results as well as the results of the Arch Collaborative surveys that kicked off in 2018 and 2019. Each time we got slightly better, but there was still a lot of room for improvement.
In 2020, I was asked to help put together the survey because we were adding questions about training modalities and tip sheets, and my input as an educator was wanted. In the process, we wondered how to get more people to take the survey. Our 2019 response rate was not what we considered great, and we were really looking to improve it in 2020.
Tracking Progress
To track the response rate, we put together a graph comparing week-by-week responses in 2019 and 2020. In 2019, the deadline had to be extended by three weeks because we just didn't have the response we wanted. In the last week, we got two more responses and then finally decided to close the survey. In 2020, though, we were actually able to end the survey on time. So what made the difference? I have outlined a few strategies that we used.
Three Strategies to Improve Response Rate
Strategy 1: Showing Clinicians the Changes Made from Past Surveys
Often, people take a survey and then never hear about it again. It is not readily apparent that their responses did anything. I wanted to pull back the curtain and show clinicians what their survey results really did.
As an educator, I love infographics. In collaboration with our informatics department, I made two pieces for this effort. One was for advertising the survey. For that piece, the message was simple: we want to know what you want; your voice matters. The other infographic showed the results; its message was along the lines of “you asked, and we listened.” We highlighted changes like the following:
- You asked for fewer changes, so we decreased the number of releases.
- You asked for fewer best practice advisories, so we reduced them by over 13 million between March and October of 2020.
- You asked for fewer clicks and time-saving measures, so we implemented a simplified sign-on to keep you from having to log in every time.
By showing them that their survey responses resulted in change, we got more people interested in making their voices heard.
Strategy 2: Making the Survey Easily Accessible
The survey was previously available in the Epic application, but you had to click the Epic button, and it ended up taking three clicks to get to the survey. We changed that by adding a button to the Hyperspace toolbar that said “KLAS Survey.”
We took this approach because, within the last year, we added an updates dashboard so that everybody could access new-release education in one place. To get to it, you just click the “Updates” button in the toolbar. People were already used to that button. So we just added the KLAS Survey button right next to it. That made it super easy for clinical informaticists on their rounds to click and take the survey.
The button was linked to a focused dashboard. There, we told people how long the survey would be open and gave them both the QR code and a link. (And some subliminal messaging with a smiley face.) That was helpful in getting people to take the survey because the process was much easier than it had been.
The QR code was especially useful. We still gave people a link, but if they took the survey through the link, it was tied to their IP address, and they had to finish it in one sitting. However, if they used the QR code, the survey was available on their phone, so if they got interrupted, they could hop back in and finish it when they had time.
Strategy 3: Advertising and Offering Prizes
To get the word out about the survey, we still sent an email and advertised it on our FRANC page, which is our intranet. To make it fun, we offered small prizes for completion. We had tech-type prizes: the grand prize was a UV box for your phone, and the bottom-tier prizes were charging lanyards. We didn’t spend a ton of money (our budget was around $500), but everyone wants a prize.
Room to Improve
One thing I would do differently for the next survey is focus more on our providers. Our provider numbers didn't go up as much as our general numbers. They did go up, but we didn't hit our goal of improving by 10%. (Nurses blew that goal out of the water; they would have surpassed any goal we set.) We have around 900–1,200 providers, and we got around 400 responses. The return rate wasn’t bad, but it can be improved. We always want to hear what more people have to say.
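As a rough calculation, around 400 responses from roughly 900–1,200 providers works out to a provider response rate somewhere between about 33% and 44%, depending on the exact denominator.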
Hand in hand with that, I think we could advertise the survey earlier for providers and let them know where they will find it. Hopefully, there will be more recognition of the button because we’re going to keep that approach. Adding the prizes to the infographics and advertising them a little more could also be a good strategy. The only other element I’d like to improve is that we promised the survey would take 8 minutes, but we added so many questions that it ended up being more like 10.
Commit to Change from the Start
If you are just starting out with the Arch Collaborative and want to encourage a good response rate, it is important to make a transparent commitment to change. Collecting data for data's sake is worthless. I’m not saying that’s what anyone is doing, but it’s important to make clear to your clinicians that their responses will lead to action. If your users don't know that you are going to do something meaningful with their data, then they won't respond.
An issue with our early surveys was that trust in IT was in the basement. This was a result of misalignment and not delivering on our promises. We'd say we would make a change, and then it never came to fruition for a multitude of reasons. Or we did make some changes, but we made them in a vacuum without seeking the users’ voices. To fix this, we instituted some governance changes and really gave clinicians more of a voice and control over the changes that happened to their system.
Make sure that you can credibly say that you will be using the data to improve and do better. This is a partnership. The catchphrase we used at Franciscan Health was "Operationally led, technology enabled." This highlights the idea that IT is not leading the way. Instead, we are here to serve our users’ needs. It is key that people understand you’re there to serve and that the Arch Collaborative survey is a great opportunity to make their voices heard.