Should You Really Share Your Medical Information With Alexa?

A lot of Americans have welcomed digital assistants into their homes with open arms. But amidst concerns about who exactly is listening to the conversations recorded by Alexa devices, the company is foraying into the medical field — and consumers may be right to think before they speak.

While plastics made through reaction injection molding — a process in which two liquid components are mixed and cured in a mold — are behind countless medical devices today, a very different kind of gadget may soon become a mainstay in healthcare facilities across the nation. Amazon recently gained approval to run an invite-only program for patients at Cedars-Sinai Hospital that granted Alexa the ability to take on healthcare skills. HIPAA-compliant developers created and launched specific skills that allowed participants to book appointments, check on prescriptions, ask for recent blood sugar readings, and review post-discharge instructions from the hospital.

The pilot program was a big step for Amazon, though the company is quick to point out that this and any future iterations will allow only select covered entities and business associates subject to HIPAA to create the medical skills adopted by Alexa. Amazon provides the skill-building environment, which is also subject to HIPAA guidelines, but the individual developers are solely responsible for complying with the U.S. Health Insurance Portability and Accountability Act of 1996.

In a statement, Amazon said: “These skills are just the first step in making it easier for customers to manage their healthcare needs using just their voice — we’re excited to see what developers build next.”

Although the average hospital owns or rents over 35,000 SKUs of equipment at any given time, it may be a while before you see an Alexa in your local clinic or doctor's office. While AI tools can certainly offer patients greater efficiency and accuracy, many Americans have concerns about whether it's a good idea to rely on these devices so heavily. There's even a question of whether adding Amazon devices could increase costs. Healthcare expenses can be a burden for individuals of any generation, so it's understandable that patients might wonder how Alexa could affect the cost of a hospital stay.

In addition, consumers will want to know much more about what these digital assistants are doing with sensitive health data before divulging personal information so freely. Amazon has stated that it applies several layers of security to all Alexa skill data — including access controls, encryption, and secure cloud storage — while HIPAA regulations require further safeguards to ensure information is protected. But after Bloomberg recently reported that Amazon employees around the world are listening to voice recordings captured by the company's Echo devices, consumers may be right to be worried.

Reportedly, those voice recordings are reviewed so they can be transcribed and fed back into the operating software. This helps the device better understand human speech, but it also makes Amazon employees privy to conversations that users might not want anyone else to hear. Workers have allegedly overheard evidence of possible crimes, though they have no way of tracing the voices to identify the individuals involved. The mere fact that these recordings can be accessed could create real problems for patients who choose to share medical information with Alexa, regardless of the safeguards that are supposedly in place. What's more, as The Verge points out, there is no formal certification for HIPAA compliance; the process is self-implemented and the law's language can be vague, which leaves some experts wondering what Amazon can really do with this information and what security steps the company must take to keep it truly protected.

Whether you own an Alexa at home or might be open to using one in a healthcare setting down the line, it may pay to err on the side of caution. Consider deleting old Alexa recordings, turning off the microphone and camera when the device isn't in use, and disabling the "drop in" feature so that other devices can't easily listen in. Until more is known about how Alexa recordings are being used and how these devices could impact the healthcare field, it might be wise to stick to more traditional methods of checking on prescriptions or discussing details of your personal life.
