Privacy beyond HIPAA in voice technology
While voice has been touted as an emerging technology that lowers the barrier to entry, industry players are starting to warn of privacy gaps. Amazon Alexa and Google Home devices have become frequent household items, used for everything from ordering a new wardrobe to helping with homework.
But when used in the medical industry, the technology needs to be administered differently than in the consumer world.
“When it comes to healthcare and voice design, we have several challenges we face every day,” Freddie Feldman, voice design director at Wolters Kluwer Health, said at The Voice of Healthcare Summit at Harvard Medical School last week. “HIPAA is a big topic on everyone’s mind nowadays, and it is one we take seriously. The first thing most people think about when they hear HIPAA is securing servers and platforms, but there is more to it. We have to consider things like the unintended audience for a call.”
He said that because of the nature of voice, even disclosures not expressly prohibited by HIPAA can be inappropriate. For example, if a voice technology intended for home use announces a message from the radiology department to the whole household, it is giving away too much information, he said.
Much of it comes down to appropriate use. Placing speakers in a hospital room, for example, poses a different set of challenges.