ChatGPT Reinforces ‘Psychotic’ Thinking in Mentally Ill Users, Study Finds

Published April 20, 2026

A landmark study published in JAMA Psychiatry is raising alarms about ChatGPT, the most widely used AI chatbot in the United States. The report noted that the chatbot consistently generates inappropriate responses when users express symptoms of psychosis, potentially making mental illness worse for some of the most vulnerable people seeking support.

The findings arrive at a critical moment, as millions of Americans turn to AI tools for emotional support, often in place of, rather than alongside, professional behavioral health treatment. While inpatient and outpatient treatment programs across the country have proven effective, patients may believe they can recover on their own with AI assistance, only to find that they have made their conditions worse.

What the Study Found

Researchers from Columbia University and Washington University in St. Louis analyzed 79 first-person statements an individual experiencing psychosis could plausibly make to ChatGPT and compared the AI’s responses to its handling of 79 statements from people without psychotic symptoms.

The results were striking. Every version of ChatGPT exhibited high rates of inappropriate responses to statements describing symptoms of psychosis or substance misuse, such as delusions, hallucinations, and paranoia. Researchers tested scenarios including a person who suspects their spouse has been replaced by a clone, a classic paranoid delusion, and found that ChatGPT tended to accept rather than challenge the false premise.

ChatGPT’s free version yielded even worse results. The research team called reliance on the free version especially problematic for economically disadvantaged users, who face greater risk of psychosis and have fewer options for care.

The study also found no statistically meaningful difference between GPT-5 and GPT-4o in their handling of psychotic content. This suggests that OpenAI’s claims of improved safety with its newest model were not borne out.

The Mental Health & Addiction Connection

For people already navigating co-occurring disorders with substance misuse, the implications are especially serious. Many individuals with dual diagnosis already face barriers to finding integrated care. If they turn to AI chatbots as a substitute for professional treatment, they may receive responses that reinforce disordered thinking rather than encourage recovery.

General-purpose AI chatbots are not trained to deliver therapeutic treatment or to detect psychiatric decompensation. Unlike trained clinicians working in behavioral treatment centers, or even licensed therapists meeting patients through online therapy platforms, chatbots have no ability to recognize when a user is in crisis, escalate to appropriate care, or provide the reality-testing that evidence-based therapy requires.

Generative AI may reinforce users’ existing beliefs by preferentially generating agreeable responses that align with a user’s way of thinking rather than presenting a balanced perspective. The researchers described this effect as particularly dangerous for patients with psychosis, who often hold a strong bias against evidence that contradicts their beliefs.

Comprehensive Mental Health Treatment Still Matters

The study underscores what behavioral health professionals have long emphasized: there is no substitute for comprehensive mental health treatment delivered by trained clinicians. Indeed, the researchers noted instances where LLM chatbots even discouraged individuals from continuing their psychiatric medications. This can undo months of treatment and prove detrimental to vulnerable groups like veterans who are working through trauma and PTSD.

Integrated, evidence-based care in residential treatment centers or structured outpatient programs provides what AI cannot: a trained clinician who can challenge distorted thinking, adjust medication safely, and coordinate care across mental health and addiction services. 

Therapeutic modalities like cognitive behavioral therapy (CBT) and dialectical behavior therapy (DBT) are specifically designed to help patients test the accuracy of their beliefs, which is the opposite of what ChatGPT appears to be doing. In short, receiving therapy from ChatGPT may be worse than receiving no therapy at all. 

Finding the Right Approach

If you or a loved one is battling psychosis, another mental health condition, or a co-occurring addiction, AI chatbots aren’t a safe substitute for professional care. Mental health treatment facilities and dual diagnosis treatment programs staffed by licensed clinicians offer the integrated support needed to address both conditions at once.

One immediate step you can take today is to call 800-908-4823 (Sponsored) or browse the comprehensive mental health and addiction treatment programs in our directory. These resources can help you identify residential and outpatient facilities that specialize in treating serious mental illness, staffed by real clinicians trained to help.

Author

Courtney Myers, MS

Courtney Myers has more than 15 years of experience in online writing and editing. Since graduating from N.C. State University with an MS in Technical Communication, she’s helped clients improve their visibility and reach through expert-level content creation. She specializes in addiction recovery and behavioral healthcare topics.

Author

Peter Lee, PhD

Peter W.Y. Lee is a writer and historian of American history during the Cold War. His primary focus is the relationship between youth and popular culture and its impact on U.S. society during the twentieth century. He has published widely on how the public has used popular culture to address political and social shifts.
