By now you’ve likely heard that Amazon is moving into the HIPAA space with Alexa. In conjunction with its partners, it’s launching what Amazon calls “HIPAA compliant” apps. If only it were that easy to create a HIPAA-covered app, or as Amazon calls it, skill. As with Amazon Web Services (AWS), it’s ultimately up to the individual developers to honor the law. While Amazon may well be a trusted third party, if developers don’t build apps or “skills” with privacy, security, and HIPAA compliance in mind, I wouldn’t trust Alexa with any of my healthcare data.
Having worked with a number of software development companies in the healthcare space, I can tell you that more often than not developers want to create cool and useful things. The problem is that cool and useful don’t automatically align with security and privacy needs. In fact, in the development process you won’t often find security and privacy at the top of the priority list.
Time, Trust & Alexa Users
Trust but verify, folks. If I were the covered entity or business associate planning to eventually trust the Amazon platform to adequately secure protected health information, I would want assurances and proof that the developers of any app/skill have privacy and security baked in.
Granted, Amazon has indicated that trust takes time to build, and that it will be a while before patients widely trust Alexa with sensitive health information. OK, let’s say we’re down the road a ways. Trust has been earned. Now we face another potential issue that has nothing to do with the platform or any of its available apps/skills. The concern lies with the end users. End users are not always savvy when it comes to protecting sensitive personal information. I think this is where there’s a definite need for some education on the part of partners, Amazon, and healthcare providers.
Right now, when Alexa is on, it listens to all voices in the room, all the time. How awful would it be if an end user thought he or she was taking advantage of a touted HIPAA-compliant solution but instead was airing sensitive information through another Alexa app? Hopefully Alexa will also be smart enough not to broadcast protected health information like “You need to follow up with your [insert-private-condition-here] specialist” to the whole house.
Alexa (Amazon), are you listening?