GPT-5 clashes with emotions

How can a model like GPT-5, presented as "college-level," provoke so many negative reactions? Its launch opened an unexpected emotional wound: that of users who watched GPT-4o disappear, a model with which many had formed a relationship that was, in more than a few cases, dangerously emotional. This takes us into the territory of what is known as emotional design.
The success of cell phones, microwaves, cars, and toasters isn't just a matter of technology or design; it is also a matter of emotions. My toaster may not be the most advanced on the market, and I often have to watch that it doesn't burn my bread. But it's mine, and it has been toasting my bread every morning for twenty years. We form emotional bonds with objects, and those bonds make us perceive them as better. In that respect, AI, however advanced it may seem, is exactly like a toaster.
If AI doesn't take people's emotions into account, it won't be as intelligent. It took OpenAI three years to realize this: when it released GPT-5, it retired all of the earlier models. The death of GPT-4o was particularly mourned. The world, which OpenAI adopted as its laboratory, is not passive: it is populated by human intelligences, with their corresponding emotions.
A product like ChatGPT, which is part of everyday life, inspires trust, and even provides companionship (worryingly, too many people use it as a psychologist), makes it easy to form emotional bonds. This isn't unique to AI (we all know men who are in love with their cars), but it is easier to build an emotional relationship with a system that writes to you and addresses you by name than with a Cupra.
Faced with the uproar, and with a user revolt on his hands, Sam Altman, OpenAI's chief executive, made a mea culpa: he had heard users' complaints and would try to address them. And GPT-4o was resurrected.
From an engineering perspective, GPT-5 represents a real advance: more words processed, fewer hallucinations, and, on paper, greater efficiency. The new model is not a qualitative leap, but it is a quantitative one in cutting computing costs, and OpenAI needs that if it doesn't want to keep losing money.
But neither cold benchmark numbers nor financial pressures can override human experience. GPT-5 has failed not technically, but socially.
In an unusual intervention, Altman admitted that the AI bubble is about to burst. I think he's absolutely right. But he's wrong if he thinks it will come only from market expectations: it will also come from people's inflated expectations of AI, and from AI's ignorance of the fact that humans have emotions and are, in general, stubborn.
Those who promised a general artificial intelligence have stumbled upon a basic emotional intelligence.