"Alexa, Solve Healthcare." - Cover

"Alexa, Solve Healthcare."

I recently participated in a call with several Arch Collaborative members who were discussing some of their vendor challenges in a productive way. We evaluated the use of scribes in hopes of determining how the EMR documentation process could be improved. One of the physicians quickly steered the discussion toward how voice recognition, paired with natural language processing (NLP), could greatly improve the documentation process.

The example given was placing Amazon’s Alexa in the exam room and having it capture all physician voice commands to generate the required billing, quality reporting, and patient documentation. As my physician friend spoke, a vision came to mind:

“Alexa, start documentation on Mike Davis, male, with a birthdate of MM/DD/YYYY.” The provider could dictate history and physical (H&P) data, and all speech would be captured as text and evaluated by NLP to help drive a working diagnosis and problem list.

The provider could list current medications, and all active medications would be captured and coded with NCPDP codes. The provider could even dictate diagnostic test orders, and lab and imaging orders would be captured and translated to SNOMED CT.

And maybe at the end, the provider could ask Alexa to add eggs to their grocery list. You get the idea.
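
To make the vision a little more concrete, here is a minimal sketch, in Python, of what the back half of that pipeline might look like. Everything in it is a placeholder: the keyword matching stands in for a real NLP engine, and the code values are invented stand-ins for actual NCPDP and SNOMED CT identifiers.

```python
# Hypothetical sketch of a dictation-to-structured-data pipeline.
# The code values below are illustrative placeholders, not real
# NCPDP or SNOMED CT identifiers.

# Toy terminology lookups standing in for real code systems.
MEDICATION_CODES = {          # would come from an NCPDP-based drug database
    "lisinopril": "NCPDP-PLACEHOLDER-001",
    "metformin": "NCPDP-PLACEHOLDER-002",
}
ORDER_CODES = {               # would come from a SNOMED CT terminology service
    "basic metabolic panel": "SNOMED-PLACEHOLDER-101",
    "chest x-ray": "SNOMED-PLACEHOLDER-102",
}


def extract_structured_data(transcript: str) -> dict:
    """Map a free-text dictation transcript to coded entries.

    A production system would use a trained NLP model for entity
    recognition; simple substring matching keeps the sketch readable.
    """
    text = transcript.lower()
    medications = [
        {"term": term, "code": code}
        for term, code in MEDICATION_CODES.items()
        if term in text
    ]
    orders = [
        {"term": term, "code": code}
        for term, code in ORDER_CODES.items()
        if term in text
    ]
    return {"medications": medications, "orders": orders}


if __name__ == "__main__":
    dictation = (
        "Patient continues lisinopril and metformin. "
        "Order a basic metabolic panel and a chest x-ray."
    )
    print(extract_structured_data(dictation))
```

Even this toy version shows where the hard problems live: accurate transcription and reliable terminology mapping, not the plumbing around them.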

How far are we from this? Nuance and M*Modal have been working with speech recognition and NLP for years. 3M has a product that can translate textual documents into encoded data used for billing. How close are Amazon and Google to delivering these capabilities?

Remember the impact the iPad had on physicians when it first came out and was used by vendors to provide mobile interactions with the EMR? Think about the impact an Alexa- or Google-supported EMR-documentation platform could have on healthcare. Physicians wouldn’t need to type.

In fact, if implemented correctly, all EMR navigation and functions could be voice activated! We would turn our frustrated and click-weary physicians using today’s EMRs into Star Trek’s Bones.
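
As a rough illustration (and nothing more), voice-activated navigation ultimately comes down to mapping recognized utterances to EMR actions. The sketch below routes a couple of hypothetical commands to placeholder handlers; the phrases and functions are assumptions for illustration, not any vendor’s actual interface.

```python
# Hypothetical sketch of routing recognized voice commands to EMR actions.
# Command phrases and handlers are illustrative placeholders only.

def open_chart(patient: str) -> str:
    return f"Opening chart for {patient}"

def show_results(patient: str) -> str:
    return f"Showing latest lab results for {patient}"

# Tiny "intent table": phrase prefix -> handler.
INTENTS = {
    "open the chart for": open_chart,
    "show lab results for": show_results,
}

def route_command(utterance: str) -> str:
    """Match an utterance against known intents and dispatch it."""
    text = utterance.lower().strip()
    for prefix, handler in INTENTS.items():
        if text.startswith(prefix):
            return handler(text[len(prefix):].strip().title())
    return "Sorry, I didn't catch that."

if __name__ == "__main__":
    print(route_command("Open the chart for mike davis"))
    print(route_command("Show lab results for mike davis"))
```

The dispatch layer is the easy part; the real challenge is recognizing clinical vocabulary reliably enough to trust it with navigation.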

Speaking of Star Trek, we could take this thought experiment a step further. Imagine a robot that asks a patient for identification information while the Alexa/Google function records the responses.

The robot then places sensors on the patient to detect vital signs and run diagnostic tests. When directed, the robot scans the patient with an advanced MRI or CT scan built into the robot.

Pertinent patient information and diagnostic test results would be recorded in the patient record in a standard, encoded format that would be available to all caregivers and family members with permission to access the patient’s record. No more physician burnout! No more EMR complaints! All of our problems finally solved. Okay, maybe that’s setting the bar a little high.

The intriguing question is: How long will it be before Amazon and/or Google figure out how to apply Alexa/Google Home technologies to the EMR and drive the “next big thing” in EMR capabilities? Beyond the massive capital these consumer giants can pour into innovation lies another advantage most HIT companies lack: consumer experience.

As tough as the competition may seem among healthcare IT vendors, it’s nothing compared to the gargantuan brawl of consumer electronics, in which appealing to the end user is the difference between success and failure. In healthcare, the person buying the software typically isn’t the daily user.

Obviously, that is not the case for Google, Amazon, or even Apple products. These companies’ battle-hardened sense of what consumers want in their IT makes the thought of these companies applying their muscle to HIT all the more appealing.

If you ask me, the companies that can deliver this level of usable technology will see huge gains on their bottom lines.


I’m willing to bet that this change will come faster than many of us might think. Until we get to that futuristic day, there’s a lot still to be learned about our current EMR situation.