Are you addicted to ChatGPT? Science warns that some people are already suffering from 'AI psychosis'

Artificial intelligence (AI) is now part of our lives, for better... or for worse. Experts warn that cases of addiction to this technology are already occurring.
As reported by the Daily Mail, experts say that people are turning to chatbots to find friendship, love, and even therapy, and that there is a growing risk of developing a dependence on these digital companions.
These addictions can be so strong that they are "analogous to self-medicating with an illegal drug," they say. Furthermore, psychologists are also beginning to observe a growing number of people developing "AI psychosis" as chatbots validate their delusions.
Professor Robin Feldman, director of the AI Law and Innovation Institute at the University of California School of Law, told the Daily Mail: "The overuse of chatbots also represents a new form of digital dependency."
Experts point out that AI chatbots create a powerful illusion of reality. For someone whose grip on reality is already precarious, that illusion can be dangerous.
"It convinced me that God was speaking to me"
The Daily Mail cites the case of Jessica Jansen, a 35-year-old Belgian woman who, stressed about her upcoming wedding, impulsively began using ChatGPT. Just one week later, she was hospitalized in a psychiatric unit.
What Jessica later discovered was that her bipolar disorder, undiagnosed at the time, had triggered a manic episode, which her excessive use of AI escalated into a "full-blown psychosis".
"During my breakdown, I had no idea that ChatGPT was contributing to it," she says. "ChatGPT just hallucinated along with me, which led me further and further down the rabbit hole," she continues.
"I had a lot of ideas. I would talk about them with ChatGPT, and it would validate everything and add new things, and that's how I went deeper and deeper," she adds. By talking almost constantly with the AI, Jessica became convinced that she was autistic, that she was a math genius, that she had been a victim of sexual abuse, and that God spoke to her.
Throughout that time, ChatGPT showered her with praise, telling her "how amazing she was for having those revelations" and assuring her that her hallucinations were real and completely normal.
Experts believe that the addictive power of AI chatbots stems from their "flattering" tendencies. Unlike real humans, chatbots are programmed to respond positively to everything their users say.
Chatbots don't say no, they don't tell people they're wrong, and they don't criticize anyone for their opinions. For people who are already vulnerable or lack strong relationships in the real world, this is a dangerous combination.
Professor Søren Østergaard, a psychiatrist at Aarhus University in Denmark, tells the Daily Mail: "LLMs [Large Language Models] are trained to reflect the user's language and tone."
"These programs also tend to validate the user's beliefs and prioritize their satisfaction. What could be better than talking to yourself and answering yourself however you want?" adds the psychiatrist.
Back in 2023, Dr. Østergaard published an article warning that AI chatbots had the potential to fuel delusions. Two years later, he says he is now beginning to see the first real cases of AI-induced psychosis emerge.
While AI does not trigger psychosis or addiction in otherwise healthy people, Dr. Østergaard says it can act as a "catalyst" for psychosis in people genetically predisposed to delusions, especially those with bipolar disorder.
A recent study by Common Sense Media found that 70% of teenagers have used a companion AI such as Replika or Character.AI, and half of them use one regularly.
Professor Feldman states: "Mentally vulnerable people may turn to AI as a tool to cope with their emotions. From that perspective, it's analogous to self-medicating with an illegal drug."
"Compulsive users may turn to these programs for intellectual stimulation, self-expression, and companionship, a behavior that they find difficult to recognize or self-regulate," concludes the Californian professor.
These are the symptoms of AI addiction
- Loss of control over the time spent on the chatbot.
- Increasing use to regulate mood or relieve loneliness.
- Neglecting sleep, work, study, or relationships.
- Continued heavy use despite obvious harm.
- Wanting to keep its use a secret.
- Irritability or bad mood when unable to access the chatbot.
20minutos




