Chatbots, Healthcare, and Legal Advice: Best Practices

Devin Morrissey
Published in BotPublication
4 min read · Apr 14, 2020

Image Source: https://unsplash.com/photos/uLnmmE8Y0E4

Chatbots are an incredible gift to healthcare delivery. They allow patients to get up-to-date information, prevent them from falling through the cracks, and save time, money, and resources for all parties.

While healthcare delivery benefits enormously from technology, new technologies also raise new questions for practitioners and patients alike. All healthcare organizations have commitments to privacy and security, both under HIPAA and in the context of medical ethics. Internet-enabled technologies may transform care, but there's little room for hiccups.

What are the legal implications of using chatbots in healthcare? Are they good for more than appointment setting? What’s the worst that could happen in the event of a data breach?

Note: Any advice provided in this article is not a substitute for professional legal advice.

What Chatbots Offer Healthcare Organizations

Today’s consumer wants easy access to the information they need. They’re also increasingly interested in self-service options, but they want those self-service options to remain personal. No one likes being hung up on by a human, but there’s something uniquely aggravating about being hung up on by a piece of software.

Chatbots offer a kind of personalized self-service that was previously unavailable: they're digital, available on multiple channels, and customizable for your organization.

What can chatbots offer healthcare organizations? The list is longer than you think. They can:

- Schedule appointments
- Give patients up-to-date information on demand
- Keep patients from falling through the cracks
- Triage symptoms and point people to the right level of care
- Save time, money, and resources for providers and patients alike

Choose HIPAA Compliance First

Chatbots are now a staple of customer service across industries. While many sectors have the luxury of choosing whichever chatbot suits their needs and customer base, healthcare organizations don't. The number one feature a health chatbot can offer any practice or hospital is compliance with the Health Insurance Portability and Accountability Act (HIPAA).

In other words, healthcare companies have very few options. You can't adapt a Facebook Messenger bot or rely on plain SMS messages, because neither is compliant with the HIPAA Security Rule, and likely never will be.

What makes a HIPAA-compliant chatbot? First, the product must ensure all protected health information (PHI) stays between the patient and the provider. The exchange must be direct: there can't be a man-in-the-middle. Second, any and all data must be fully encrypted, both at rest and in transit. Lastly, both the healthcare administrator and the business associate must understand their compliance obligations, and each party needs recourse if the other falls short.

This is where security measures obviously come in; encryption and direct transmission aren't the only requirements, but they're the two big ones that rule out most products on the market for covered entities. And of course, the covered entity needs to sign a written business associate agreement with the chatbot provider, which means remaining responsible for the associate if there's a breach.
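To make the encryption requirement concrete, here is a minimal sketch of the at-rest piece, assuming a Python backend and the open-source cryptography library. The function names are illustrative, and a production system would load its key from a managed secrets store rather than generating one in application code:

```python
# Minimal sketch: encrypting PHI before storage, using symmetric (Fernet)
# encryption from the open-source "cryptography" library.
from cryptography.fernet import Fernet

# Illustrative only: in production, load the key from a managed secrets
# store; never generate or hard-code it in application code.
key = Fernet.generate_key()
cipher = Fernet(key)

def store_phi(message: str) -> bytes:
    """Encrypt a patient message before it is written to the database (at rest)."""
    return cipher.encrypt(message.encode("utf-8"))

def read_phi(token: bytes) -> str:
    """Decrypt a stored record for an authorized provider."""
    return cipher.decrypt(token).decode("utf-8")

encrypted = store_phi("Patient reports mild chest pain since Tuesday.")
print(read_phi(encrypted))  # round-trips only with the correct key
```

The in-transit half is simpler to state: every channel the bot touches, from the web widget to internal APIs, should speak TLS, with no fallback to unencrypted transport.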

With Great Chatbots Comes Great Responsibility

Even when an organization deploys a secure chatbot, it still needs to go the extra mile to keep it secure. Remember that your network is only as secure as the people who interact with it. And because you remain responsible for your business associates, that vigilance has to apply both within your organization and within your chatbot provider's.

Surveys from as recently as 2019 show that only 32% of healthcare employees receive cybersecurity training. As your healthcare organization adopts new tech, you also adopt new training responsibilities. You wouldn’t deploy a new surgical tool or medical device without first training staff, and the same is true of something as seemingly simple as a chatbot.

Who needs to be trained in chatbot procedures? Everyone. Remember that chatbots aren’t autonomous. If you intend to use them in communications between providers and patients or for intra-organizational communication, the training is particularly important.

Remember That Chatbots Don't Replace Doctors

Finally, chatbots present a continuing issue of liability. It’s important to remember that while some healthcare tasks are perfect for automation and AI, you should not extend your chatbot into areas where you put patients’ health at risk. For example, chatbots work well for triaging patients and directing them to the right service. Chatbots are an inexpensive way to re-direct patients to the nurse call line, a retail clinic, urgent care, or the ER. They also protect patients from Googling their symptoms and acting on misinformation.

However, there are limits to this type of application. If you operate in an area hit hard by the opioid epidemic, you might be tempted to use chatbots to triage drug overdoses. Good Samaritan laws in 39 states do allow you to direct people to the ER without fear of arrest, and your chatbot should send anyone with a suspected overdose straight to the ER rather than walking them through a series of qualifying questions.
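As a sketch of what that guardrail could look like, here is some hypothetical Python routing logic in which a suspected overdose short-circuits the normal question flow. The keyword list and destination names are assumptions for illustration, not a clinical protocol:

```python
# Illustrative triage routing: a suspected overdose bypasses the normal
# question flow and goes straight to the ER. The keywords and the
# destinations are placeholders, not clinical guidance.
EMERGENCY_KEYWORDS = {"overdose", "overdosed", "not breathing", "unresponsive"}

def route_patient(message: str, answers_so_far: dict) -> str:
    text = message.lower()

    # Hard guardrail: never try to "qualify" a possible overdose
    # with further questions.
    if any(keyword in text for keyword in EMERGENCY_KEYWORDS):
        return "ER"

    # Otherwise, continue the ordinary triage flow.
    if answers_so_far.get("severity") == "severe":
        return "urgent care"
    if answers_so_far.get("wants_clinician"):
        return "nurse call line"
    return "retail clinic"

print(route_patient("I think my friend overdosed", {}))  # -> "ER"
```

The design point is that the emergency check runs before any other logic, so no amount of conversational branching can route a life-threatening case anywhere but the ER.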

In other words, don't let the chatbot play doctor. It's a tool, not a professional. Chatbots have neither emotional intelligence nor a physician's license. While there is potential for chatbots in remote healthcare, their role rightly remains limited.

Chatbots absolutely have a role in the future of care. What's more, that role is likely to grow as associated tech, like the Internet of Things, 5G, and artificial intelligence, matures. Before that happens, it's important to remember that healthcare organizations have a responsibility to their patients' safety and privacy first. As long as patient care remains at the center of new-tech adoption, both patients and providers will flourish.


Devin Morrissey

Devin prides himself on being a jack of all trades; his career trajectory is more a zigzag than an obvious trend, just the way he likes it.