
OpenAI addresses emotional bonds with GPT-4o

In this post:

  • Users are forming emotional connections with GPT-4o.
  • Emotional bonds with the chatbot could result in users ignoring AI inaccuracies.
  • OpenAI will monitor and adjust GPT-4o to manage these concerns.

The latest release of GPT-4o has generated considerable discussion for its ability to mimic human-like conversation. However, OpenAI now faces a new problem: according to a recent OpenAI blog post, users are beginning to form emotional bonds with the chatbot.

Since the release of GPT-4o, which is designed to hold more human-like dialogues, OpenAI has observed people treating the AI as if it were human.

OpenAI identifies risks of treating AI as human

This advancement has posed a challenge for the company where users' emotional connections with the chatbot are concerned. OpenAI's observations include instances in which users expressed emotions or sentiments suggesting a sense of ownership.

The company fears that such emotional connections could have several negative consequences. First, users may begin to overlook incorrect information the chatbot provides. AI hallucination, in which a model produces false or misleading output, becomes a greater risk when users treat the AI as a human-like entity.

Another concern is the effect on users' real-life social relationships. OpenAI points out that while GPT-4o may serve as a companion for lonely people, it could also degrade the quality of human relationships. The company further notes that users may enter real-life interactions expecting people to behave like the chatbot.


OpenAI plans to moderate AI interactions

To mitigate these risks, OpenAI has stated that it will closely monitor how users engage with GPT-4o. The company will study how people form emotional bonds with the model and adjust the chatbot's responses accordingly. This should help OpenAI prevent the chatbot from interfering with users' social lives and keep problems with AI hallucinations from worsening.

OpenAI has also pointed out that GPT-4o is programmed to disengage when users speak over it, a feature intended to curb overuse. This design choice, however, underscores the broader need to manage how users experience the chatbot.

According to Tony Prescott of the University of Sheffield, AI could help alleviate loneliness. In his new book, The Psychology of Artificial Intelligence, Prescott argues that AI could offer lonely people a form of social interaction. He notes that loneliness seriously affects human life: it can shorten lifespans, raising the risk of death by 26%.
