
Artificial intelligence is increasingly used in mental health care through virtual therapists in VR settings, digital therapeutics, chatbots for symptom tracking, and wearable devices that monitor patients.
While these tools show promise in reducing symptoms and improving access, experts caution that they cannot replace the empathy of human therapists and may pose risks when deployed without proper ethical guidelines.
Concerns over these risks have led Illinois to become the third state, after Utah and Nevada, to ban licensed therapists from using AI chatbots in treatment.
Experts warn the move, meant to protect patients, could restrict access to affordable care, especially for Black families who already face barriers to mental health services.
The new “Therapy Resources Oversight” bill prohibits therapists from using AI tools to make treatment decisions or communicate with clients. It also bans companies from marketing chatbots as substitutes for licensed care.
Violations can result in civil penalties of up to $10,000, which the Illinois Department of Financial and Professional Regulation will enforce in response to complaints.
Black families often struggle to find culturally responsive treatment. One review found only eight percent of school-based mental health programs addressed the specific needs of Black American male students, even though Black American male youth experience mental health issues at rates 40 percent higher than their peers.
Research has also shown that AI models struggle to interpret Black vernacular and cultural contexts, which can reduce the accuracy of detecting depression in Black users.
Meanwhile, regulators and psychologists say unregulated chatbots can do more than make mistakes.

A Stanford study released in June found that many chatbots fail to respond safely to dangerous prompts, even when users indicate suicidal thoughts, and may enable harmful behavior rather than deter it. Other reports describe cases of “AI psychosis,” in which users develop delusions or unhealthy attachments to AI systems; one involved a man who became convinced he could fly after extended ChatGPT use.
Vaile Wright of the American Psychological Association said therapists play a role that machines cannot replace. “Therapists are validating, but it’s also our job to point out when somebody is engaging in unhealthy thoughts, feelings, behaviors, and then help somebody challenge those and find better options,” Wright told the Washington Post.
Mental health advocates caution that while safety is important, removing AI-assisted tools could also eliminate a low-cost support option. That concern is especially relevant in underserved communities.
A recent study found self-referral chatbots helped increase mental health access among minority groups, with referrals rising 29 percent for underrepresented ethnic groups and 179 percent for nonbinary individuals.
Another review, published through the National Center for Biotechnology Information (NCBI), highlighted both the promise and the challenges of using artificial intelligence in mental health care. The researchers found AI can support diagnosis, treatment planning and real-time monitoring for conditions such as depression, schizophrenia and autism.
Tools like chatbots and wearable technology may help expand access and personalize care. But the review also stressed that AI systems must be designed to address cultural differences, avoid bias and account for ethical concerns before they can be fully trusted in clinical practice.
Even as states pass bans, enforcement remains a challenge. The laws apply only to licensed providers, meaning individuals can still turn to AI chatbots for emotional support. Experts warn that the laws cannot prevent vulnerable people from relying on chatbots, often without knowing the risks.
Proponents of the bans say they are necessary to protect patients from harm, but experts argue that states must also consider equity and affordability.
Without alternatives, Black families and other underserved communities may be left with fewer options for accessible mental health care.