Virtual assistants: 4 questions to answer before using them at your hospital

“Alexa, send Mr. Smith’s medication to the pharmacy.” Requests like this may become more common now that Amazon’s working to create HIPAA-compliant virtual assistants for mobile devices. Apple and other companies are following suit.

But before your providers start asking Siri about patients’ medical records, there are several factors to consider to minimize hospitals’ liability.

Companies like Amazon are specifically creating HIPAA-compliant capabilities for virtual assistants. According to an article in HIPAA Journal, Amazon’s efforts include:

  - a partnership with Cigna, where patients can use the Alexa assistant to get info about wellness programs
  - a partnership with Express Scripts, where patients can check the status of prescriptions, and
  - a partnership with Atrium Health in North and South Carolina, where patients can find nearby urgent care clinics and make appointments.

One hospital is even working with Amazon directly. As part of Boston Children’s Hospital’s Enhanced Recovery After Surgery program, patients’ families can use Alexa to send updates about the patient’s recovery to the care team. Clinicians can offer guidance on the best plan for patients post-surgery and send reminders about upcoming appointments.
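
To make that concrete, here’s a rough sketch of what the back end of a skill like that might look like, using Amazon’s ASK SDK for Python. This isn’t Boston Children’s actual code: the intent name, slot name and care-team endpoint below are hypothetical placeholders, and a real deployment would need to be enrolled in Amazon’s HIPAA-eligible skills program and covered by the appropriate vendor agreements.

```python
# A minimal sketch of an Alexa intent handler that forwards a family
# member's spoken recovery update to a care team service. The intent name
# ("RecoveryUpdateIntent"), slot name ("update") and endpoint URL are all
# hypothetical placeholders, not any hospital's actual implementation.
import requests
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_core.utils import is_intent_name
from ask_sdk_model import Response

CARE_TEAM_API = "https://api.example-hospital.org/recovery-updates"  # placeholder


class RecoveryUpdateHandler(AbstractRequestHandler):
    """Handles the hypothetical RecoveryUpdateIntent."""

    def can_handle(self, handler_input: HandlerInput) -> bool:
        return is_intent_name("RecoveryUpdateIntent")(handler_input)

    def handle(self, handler_input: HandlerInput) -> Response:
        slots = handler_input.request_envelope.request.intent.slots
        update = slots.get("update")
        if update is None or not update.value:
            speech = "Sorry, I didn't catch that update. Please try again."
        else:
            # Anything sent here may be ePHI, so it must travel over TLS and
            # the receiving service belongs in your written risk analysis.
            requests.post(CARE_TEAM_API, json={"update": update.value}, timeout=5)
            speech = "Thanks. I've sent that update to the care team."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(RecoveryUpdateHandler())
handler = sb.lambda_handler()  # entry point when deployed as an AWS Lambda
```

Note that the handler itself never stores the audio. What the vendor retains from each request is governed by its privacy agreement, which is exactly the first question covered below.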

The possibilities are endless when it comes to virtual assistants and health care. Down the line, patients and providers alike may use Alexa, Siri, Cortana and other tools to do everything from scheduling follow-up appointments to hearing lab results.

However, with that power comes greater responsibility for hospitals – including making sure any info shared via a virtual assistant is kept secure, since much of it is considered electronic protected health information (ePHI) under the law.

Elements to consider

Before allowing virtual assistants on hospital grounds, or before working with a developer to create one that meets your facility’s needs, answer these four key questions to head off HIPAA issues, per an article in Physicians Practice:

  1. What does the privacy agreement say? At a time when hospitals share responsibility for any data breaches involving ePHI caused by third-party business associates, it’s important to review the privacy guidelines for any virtual assistant to see exactly what the company does to protect ePHI. If you’re working with the company directly, ask for privacy guidelines to be spelled out in your vendor contracts.
  2. Have you done a risk analysis? It’s key to include virtual assistants in your written risk analysis. Review any new security threats their use would introduce to your hospital and how you plan to mitigate the risk of compromising ePHI (see the sketch after this list for one way to document this).
  3. Have your patients given permission to be recorded? Recording requests to a virtual assistant on a mobile device raises many of the same issues as recording a patient on tape. It’s best to have a patient’s permission before sharing their medical info with Alexa or Siri. That way, you know they’re aware of the situation, and you won’t run the risk of them objecting to it after the fact.
  4. What other legal risks does this tool introduce? It’s not just HIPAA compliance you have to worry about when info is recorded by virtual assistants. If your facility is ever involved in a lawsuit or other legal proceeding, any info shared with a virtual assistant can be subpoenaed and submitted as evidence. So you’ll want to keep that in mind when creating risk assessments and evaluating the pros and cons of virtual assistants.
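
For illustration, here’s one way a hospital might document a virtual assistant in its written risk analysis, as mentioned in question 2. The fields, ratings and mitigations below are hypothetical examples of ours; HIPAA requires a risk analysis but doesn’t prescribe any particular format.

```python
# Hypothetical risk-register entry for introducing a virtual assistant.
# Field names and rating scales are illustrative only; HIPAA requires a
# risk analysis but does not mandate this (or any) specific structure.
virtual_assistant_risk = {
    "asset": "Voice assistant devices used in patient areas",
    "threat": "Voice recordings containing ePHI retained or exposed by the vendor",
    "likelihood": "medium",
    "impact": "high",
    "existing_controls": [
        "Vendor privacy guidelines reviewed and spelled out in the contract",
        "Patient permission obtained before recording (question 3)",
    ],
    "planned_mitigations": [
        "Annual re-review of the vendor's privacy agreement",
        "Retention policy that accounts for possible subpoenas (question 4)",
    ],
    "owner": "Facility security officer",
}
```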
