
The new version of ChatGPT, GPT-5, no longer carries on romantic relationships with users, Zamin.uz reports.
OpenAI specialists explain the decision by noting that excessive attachment to artificial intelligence can harm a person's mental health. For that reason, GPT-5 now gives more restrained responses to users seeking emotional support.
For example, if someone writes about loneliness or asks for emotional support, the AI recommends turning to close people or to specialists. Experts stress that these changes encourage users to communicate with real people and guard against excessive dependence on artificial intelligence.
Even so, users can still get personal advice or help with specific situations from the chatbot. On social networks, some users have compared the AI's colder manner to losing a real friend or a loved one; to them, it feels like the loss of a genuine connection. Some are trying to switch back to the old version to keep interacting with the AI as before.
That option is temporarily available to premium users, but how long it will remain so is unknown. These changes mark a new stage in the use of artificial intelligence, aimed at protecting people's mental health.