North Americans are increasingly surrounded by smart speakers, filling their homes with the sounds of voice technology. More than a passing cultural trend, voice tech is becoming a permanent feature of our increasingly digital world.
For those of us focused on healthcare, important questions have emerged around the possibilities that voice tech is unleashing. While Amazon, Google, and Apple are busy deploying their smart speakers to build brand affinity and teach us new voice skills, the true value of voice tech in healthcare is still waiting to be invented.
Early experiments with Alexa are maturing into first-generation commercial applications. Health institutions (such as hospitals), payers, and pharmas are beginning to offer access to voice-based health information or utilities, like medication reminders or the ability for patients to provide spoken feedback. Voice algorithms are emerging that even promise to expand voice tech into a full-fledged diagnostic tool.
Assistants And Patient Privacy
Beyond Alexa diagnosing people’s colds or a smart scale that talks to patients about their diabetes, two of the biggest issues quietly confronting the future of voice in healthcare are patient privacy and regulatory compliance. So far, the voice tech privacy conversation has focused mainly on HIPAA compliance and how to keep patients’ personal information safe in the cloud. What no one seems to be talking about is patient privacy within the assistant domain.
To provide context: assistant refers to the voice platform, like Alexa, Siri, or Google Assistant; domain refers to the realm of data the assistant can access when interacting with the patient (or on the patient’s behalf). Together, they underscore two basic questions: who exactly is the patient’s virtual assistant, and how much of their data can the assistant access?
On the surface, these questions may sound like the privacy concerns healthcare has been dealing with for decades, but rapid advances in voice tech and artificial intelligence are making them more complex than ever. In the near future, it will be critical to give patients and HCPs the ability to manage which medical information domains these virtual assistants can access. HCP and patient adoption of voice tech will only be built on trust, earned through positive experiences that offer full transparency and user control. Failure to establish that basic trust can undo even the best of intentions.
Managing Health Domains
For the average consumer, it’s a safe bet that privacy concerns have already crossed their mind while using voice tech. Since people primarily interact with these smart devices one-to-one, the complexities of the assistant domain still seem within their control.
Let me give you a personal example: I use Google, Alexa, and Siri — spread across multiple devices, but in completely different domains for the moment:
- Alexa knows my shopping history and musical preferences
- Google knows all my search habits, where I’ve visited geographically, and is connected to my Fitbit
- Siri is on my work phone, so it only has access to my career data
- My Fitbit data resides on my Pixel phone, so my expectation is that Google, and my other Google-connected devices, may also have access
My current smart speaker situation feels manageable and within my personal control, but is it? And for how long? What if I could ask Alexa to access my Fitbit data from Google? That might be really useful if Alexa is managing my medication regimen. But then how will I manage Alexa’s access to Fitbit on Google? How do I control which data Alexa can specifically access? Is it a one-time deal, where Alexa pulls Fitbit data and then disconnects? Will I grant Alexa access for a period of time? How will I know when Alexa’s access has finally been removed?
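To make these questions concrete, imagine a patient-facing “consent ledger” that records which assistant may read which data domain, for how long, and whether that access has been revoked. The sketch below is purely illustrative, assuming hypothetical names like `ConsentLedger` and the domain string `"fitbit.activity"` for the sake of the example; it is not any vendor’s actual API, just one minimal way a scoped, time-limited, revocable grant could be modeled:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class ConsentGrant:
    """One patient-approved grant: an assistant's access to one data domain."""
    assistant: str        # e.g. "Alexa"
    domain: str           # e.g. "fitbit.activity" (hypothetical domain name)
    granted_at: datetime
    duration: timedelta   # access is never open-ended
    revoked: bool = False

    def is_active(self, now: datetime) -> bool:
        # Active only if not revoked and the time window has not expired.
        return (not self.revoked) and now < self.granted_at + self.duration

class ConsentLedger:
    """Tracks every grant so the patient can audit, limit, and revoke access."""
    def __init__(self) -> None:
        self.grants: list[ConsentGrant] = []

    def grant(self, assistant: str, domain: str, duration: timedelta) -> ConsentGrant:
        g = ConsentGrant(assistant, domain, datetime.now(timezone.utc), duration)
        self.grants.append(g)
        return g

    def can_access(self, assistant: str, domain: str) -> bool:
        now = datetime.now(timezone.utc)
        return any(
            g.assistant == assistant and g.domain == domain and g.is_active(now)
            for g in self.grants
        )

    def revoke(self, assistant: str, domain: str) -> None:
        for g in self.grants:
            if g.assistant == assistant and g.domain == domain:
                g.revoked = True

ledger = ConsentLedger()
ledger.grant("Alexa", "fitbit.activity", timedelta(days=30))
print(ledger.can_access("Alexa", "fitbit.activity"))  # True while the grant is active
ledger.revoke("Alexa", "fitbit.activity")
print(ledger.can_access("Alexa", "fitbit.activity"))  # False after revocation
```

The point of the sketch is the shape of the answer, not the code itself: every grant is explicit, scoped to one domain, expires on its own, and can be revoked and audited by the patient at any time.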
When these cross-domain connections eventually extend to data such as hospital records, there will be even more complex issues to sort out. Historically, healthcare has done very little to educate patients on privacy issues, and this has unfortunately left many patients with little experience or skill in managing their own medical data. Meanwhile, an onslaught of intelligent healthcare assistants is entering the picture, hoping to access these personal information domains.
On the positive front, voice tech opens the world to amazing, empowering new opportunities, such as generating personal health data in real time. Many patients already use personal biometric sensors, such as Fitbits or Apple Watches, to track their health, and that data will be aggregated with new voice data to create an even more complete health graph. In the near future, voice assistants will run diagnostic algorithms, quietly listening in the background to detect subtle changes in your speech that may indicate developing medical issues, like depression or Parkinson’s disease. As Klick’s CEO, Leerom Segal, said in his opening keynote at the Voice.Health summit, “The other thing about voice is that it is a much more intimate data set, because it is your real life.”
Where will all this new data go, and will it be secure? How will we grant our assistants permission to access these new domains? These are some of the important questions we all need to keep asking as healthcare creates an exciting new voice tech future.
Here at Klick, we’re working closely with our clients to explore the best applications of voice tech for brand marketers. By staying true to the principles of upfront assistant disclosure, conducting interactions with full transparency, and protecting patient privacy, we’re committed to ensuring patients won’t have to ask, ‘Alexa, are you really my virtual health assistant?’ or be concerned about privacy issues in voice tech.
You can hear more thoughts about voice tech on The Voice of Healthcare podcast, Episode 16, Special Pharma Edition.