House Bill 1806

House Bill 1806 was just signed into law, and it makes one thing very clear: licensed mental health professionals in Illinois can no longer use artificial intelligence to make independent therapeutic decisions.

On the surface, that might sound like a protective measure—and to some extent, it is. Nobody wants a chatbot diagnosing depression or an algorithm prescribing medication. But here’s the problem: this law is sweeping and blunt. It doesn’t distinguish between a fully autonomous AI system and a digital support tool designed to help therapists do their jobs better.

Across Illinois, thousands of patients—many of whom live in rural areas or have limited access to care—have been receiving support from platforms that integrate AI into mental health treatment. So long as the technology is being used appropriately and checked by the therapist, these tools have been helpful.

We’re talking about digital intake forms that speed up assessments, automated journaling tools that track emotional trends, and AI-assisted triage tools that help clinicians prioritize care. They’re often used before the first session even begins, giving therapists a head start in understanding what their patients need.

Practitioners who follow these developments are scrambling to adjust. Clinics that rely on these platforms to manage high patient volumes are being forced to rework workflows overnight. And most importantly, patients may suddenly find themselves without the tools that helped get them in the (virtual) door in the first place.

House Bill 1806 is also ambiguous in scope. It imposes a flat-out prohibition without providing a roadmap for what safe, ethical, and effective use could look like. It stops the conversation rather than guiding it forward. A better approach would have involved the healthcare and tech communities in the process.

The mental health workforce shortage is real. Therapists are booked months out. Clinics are overrun. People are struggling to find help—and giving up when they can’t.

AI, when used wisely, has been a powerful tool for expanding access. It’s helped therapists reach more patients, shorten waitlists, and provide more consistent care. And it’s done all of that while still keeping licensed professionals in control, so long as they fulfill their obligations to confirm AI reports are accurate.

If you’re a patient using an AI-enabled therapy app, you may soon find it unavailable in Illinois. If you’re a provider using these tools to extend your reach, you may be facing some tough decisions.

We hope future growth and innovation in the use of AI in medicine will eventually check all the boxes to ensure accuracy. Mental health professionals, technologists, legal experts, and patients should be brought to the table. Policies should be implemented that protect people and promote innovation. And let's remember that shutting the door on technology without offering a path forward doesn't make care safer. It just makes it harder to reach. At the same time, providers must fulfill their professional obligation to verify AI reports for accuracy if they are to fully benefit from these technologies.

Inside Out Legal is your In-House Extension.

We handle a wide variety of matters that are typically handled by corporate in-house legal departments. We are available to provide additional legal resources directly to the general counsel’s office to handle overflow and specific projects. We are also able to provide services directly to the business team itself. Our team regularly counsels clients on how to comply with federal and state regulations that govern healthcare, higher education, information technology, data privacy and security, commercial real estate and various other highly regulated services. We also have extensive experience creating or revising compliance programs on behalf of our clients.

Learn more or schedule a consultation with one of our expert attorneys at https://inoutlaw.com/
