OpenAI has loomed ever larger in the public consciousness since it unveiled ChatGPT. And since it's a big company doing big things, it was only a matter of time before it got caught up in some big trouble. According to The Washington Post, OpenAI is now facing a lawsuit over how it obtained the data used to train its LLM (large language model).
AI chatbots (and let's rope image generators into this as well) are like human brains: they have to learn. Companies scrape millions of words from the internet and feed them into the LLM so that it can learn the elements of speech, facts, and so on. It's like being an infant and learning how to speak just by growing up around your family.
But OpenAI is in a lawsuit because of how it obtained its data
It was only a matter of time before someone looked closely at what ChatGPT is and how it learns. A law firm in California claims that OpenAI violated the rights of millions of people around the world by using their written content to train ChatGPT.
This is a controversy we saw not too long ago with image generators, which learned from artists' work. The thing is, ChatGPT was trained on all sorts of content from people, including reports, articles, social media posts, and more. When people posted that content to the internet, they didn't consent to having it scraped to train an LLM. This means that OpenAI could have violated their right to privacy.
For all we know, if you use ChatGPT to generate a poem, it could draw on information scraped from a poem that you wrote. The same goes for any story, article, or social media post you've made.
Sure, some people might not care, but think back to the controversy with artists. If you're an artist who opposes AI art, you wouldn't want a company using your art to train its image generator. Likewise, if you're an author, poet, journalist, or any other kind of writer, you might not want ChatGPT using your content to train its LLM.
Since this is a lawsuit on such a large scale, expect it to go on for a while. At this point, we don't know much about the case, such as the damages being sought. We'll have to wait for more information on this one.