The U.S. elections are over, and we’re all settling in for another four years until the next one. As expected, we saw a ton of misinformation and deepfakes powered by AI technology, and a report from one of the biggest AI companies puts this into perspective: OpenAI announced that DALL-E blocked a quarter-million election-related queries during the election season.
Hundreds of millions of people now have access to tools that let them fabricate events that never happened, and this was an election year on top of it. What could possibly go wrong?! Let’s face it: once we saw the early potential of AI tech back in late 2022 and early 2023, we knew this election was going to be tricky. This one has passed, but another is coming in four years, and there’s no telling where AI technology will be by then.
DALL-E blocked a quarter-million election-related queries
OpenAI had its hands full this election year. Naturally, people from both sides of the political spectrum wanted to generate images that shone a harsh light on Trump/Vance or Harris/Walz. Neither side held the moral high ground; both wanted their candidates in the White House. It got so heated that DALL-E, OpenAI’s image generator, received more than 250,000 requests to generate images depicting those four figures.
According to a report from OpenAI, the requests were blocked by a safety measure that prevents DALL-E from generating images of real people. We’re also sure OpenAI paid special attention to blocking requests involving political figures.
So, OpenAI played its part in reducing the spread of misinformation this election season. The trouble is that DALL-E is just one tool in the shed. There are countless DALL-E alternatives out there, and while many of the top tools have similar safety measures in place, not all of them do. Grok’s image generator, for example, is a loose cannon with no guardrails to speak of. If you don’t want to put up with DALL-E’s restrictions, Elon Musk’s tool is only a few clicks and $16 away.
As much as we’d like to believe that safety measures will help preserve democracy, we’ve already seen deepfakes make the news. People will always find a way around the barriers. Now, just imagine how bad things could be in another four years!
2024-11-12 15:11:57