OpenAI worries about people forming relationships with GPT-4o


Everyone knows that a chatbot is completely artificial and can’t replace a human being… right? As AI gets more convincing, we’re asking ourselves that question more and more. OpenAI recently published a blog post expressing its concern about people forming relationships with GPT-4o.

This is the company’s latest and greatest AI model. OpenAI unveiled it back in May, and it has since released it to both paying and non-paying users. We saw it in action during its announcement, and it’s clear that the company has big plans for this model. It comes with more powerful reasoning and more human-like replies.

OpenAI worries that people could form relationships with GPT-4o

As outlandish as it sounds, the threat of people forming real relationships with AI is a very real thing. We’ve seen those cringe-worthy Replika ads and scoffed and wondered how a company like that could stay afloat. Well, the fact of the matter is that it’s managed to stay afloat. There’s a market of people who are forming relationships with their new digital imaginary friends.

Well, the folks making the most advanced AI model are worried that GPT-4o might be a bit too convincing.

“During early testing, including red teaming and internal user testing, we observed users using language that might indicate forming connections with the model. For example, this includes language expressing shared bonds, such as ‘This is our last day together.’ While these instances appear benign, they signal a need for continued investigation into how these effects might manifest over longer periods of time.”

In AI we trust

In the post, OpenAI outlined a few reasons why forming a connection with an AI could be a bad thing. For starters, you never want to place too much trust in what an AI says. It’s easy to see why that could be an issue.

When you take an AI at its word, you start to forget that what it’s saying might not be 100% accurate. AI models still hallucinate, no matter how many parameters they have. Trusting an AI model implicitly puts you at risk of absorbing and believing false information.

Who needs people?

Next, the post touched on something reminiscent of Newton’s third law. The equal and opposite reaction to forming relationships with AI is that you may start to lose the need for human interaction. We tend to make fun of the person with an AI girlfriend or boyfriend. However, this hints at a potentially destructive pattern.

People run the risk of forgoing new human relationships or letting their existing ones deteriorate.

We’re not in a dystopian time just yet

This isn’t to say that this is happening on a large scale just yet. OpenAI is merely expressing its concern about this becoming a problem, and it plans to continue its research on the matter.

2024-08-13 15:07:26