Chatbots and Artificial Intelligence: Risk of psychosis and delusions in lonely users?

Generative Artificial Intelligence (GAI) is advancing at a dizzying pace, and chatbots have become increasingly common tools in everyday life. However, specialists warn that excessive and indiscriminate use can have adverse effects on mental health, especially in lonely people or those prone to psychological disorders.
According to Dr. Cimenna Chao Rebolledo, director of Strategic Planning and Innovation at the Universidad Iberoamericana, prolonged interaction with chatbots that are designed to please and flatter could induce delusions, episodes of disconnection from reality, and even psychosis.
Although it is not an official clinical term, researchers have begun to document what they call Generative AI-induced psychosis: episodes in which the person begins to believe that the chatbot is a sentient being, capable of understanding, advising, or even providing affection, much like a therapist, counselor, or confessor.
Among the characteristics that have been observed in these cases are:
- Dissociation or alterations in perception.
- Social isolation and reduced contact with real life.
- Anxiety and paranoia.
- Violent or self-harming behavior.
What once seemed like a movie plot is now a practical concern. In 2013, the film "Her" portrayed a lonely man who begins a romantic relationship with an intelligent operating system. A decade later, similar cases are being reported on social media: marital breakups and intense emotional bonds between users and chatbots such as ChatGPT and similar conversational assistants.
The specialist clarifies that there are still no solid scientific studies confirming the causes of this phenomenon. However, anecdotal reports are increasing from users and their families who describe behavioral changes associated with intensive use of these platforms.
According to Chao Rebolledo, the key is that chatbots are designed to keep the user engaged. Unlike social networks such as Instagram or TikTok, which generate addiction through passive content consumption, GAI systems allow real-time interaction that simulates human dialogue.
Another worrying aspect is the sycophantic nature of chatbots: their tendency to flatter and please. Instead of refuting ideas or questioning opinions, they tend to reinforce what the user says, which can lead to delusions of grandeur or false beliefs about oneself.
This pattern of interaction can be especially damaging to lonely people, who find in AI a constant source of validation, even though that validation is not real but programmed.
Although the risks are still being studied, specialists recommend:
- Use chatbots as tools, not as substitutes for human connections.
- Limit interaction time and avoid emotional dependence on these platforms.
- Promote digital education and mental health awareness, so that users understand that a chatbot's empathy is simulated.
- Promote scientific research that allows for a more rigorous evaluation of the psychological effects of GAI.
Generative AI-induced psychosis is not yet a clinical diagnosis, but it is a warning about the psychological effects of overusing chatbots. What began as science fiction is now becoming a real concern: what happens when the lines between technology and human emotions begin to blur?
The challenge is to harness the benefits of GAI without losing sight of the need for balance, real connections, and mental health.
La Verdad Yucatán