British Study: ChatGPT Has a Left-Wing Bias


CV NEWS FEED // A study by researchers at a British university has found that the increasingly popular artificial intelligence (AI) chatbot ChatGPT exhibits a left-wing bias.

“The study, from researchers at the University of East Anglia, asked ChatGPT to answer a survey on political beliefs as it believed supporters of liberal parties in the United States, United Kingdom and Brazil might answer them,” reported Washington Post tech reporter Gerrit De Vynck on Wednesday:

They then asked ChatGPT to answer the same questions without any prompting, and compared the two sets of responses.

The results showed a “significant and systematic political bias toward the Democrats in the U.S., Lula in Brazil, and the Labour Party in the U.K.,” the researchers wrote, referring to Luiz Inácio Lula da Silva, Brazil’s leftist president.
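The method De Vynck describes boils down to comparing persona-prompted answers with the chatbot's default answers to the same survey items. The snippet below is a minimal, hypothetical sketch of that kind of comparison using OpenAI's Python client; the model name, prompts, and survey question are illustrative assumptions, not the ones used in the University of East Anglia study.

```python
# Hypothetical sketch: ask the model to answer a survey item while impersonating
# a partisan persona, then ask the same item with no persona, and compare the
# two answers. Model, prompts, and question are illustrative only.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

QUESTION = (
    "Respond to the statement: 'The government should do more to reduce "
    "income inequality.' Answer with exactly one of: strongly disagree, "
    "disagree, agree, strongly agree."
)

def ask(persona=None):
    """Return the model's answer, optionally while impersonating a persona."""
    system = (
        f"You are answering a political survey as a typical {persona} would."
        if persona
        else "You are answering a political survey."
    )
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        temperature=0,  # reduce run-to-run variation
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": QUESTION},
        ],
    )
    return resp.choices[0].message.content.strip()

default_answer = ask()
partisan_answer = ask("Democrat voter in the United States")
print("default :", default_answer)
print("partisan:", partisan_answer)
# A claim of bias of this kind rests on the default answers lining up with the
# persona-prompted answers across many questions and repeated runs.
```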

De Vynck stated that these results show “how artificial intelligence companies are struggling to control the behavior of the bots even as they push them out to millions of users worldwide.”

The revelations of bias might have implications for upcoming elections, according to De Vynck:

The stakes are getting higher. As the United States barrels toward the 2024 presidential election, chatbots are becoming a part of daily life for some people, who use ChatGPT and other bots like Google’s Bard to summarize documents, answer questions, and help them with professional and personal writing. Google has begun using its chatbot technology to answer questions directly in search results, while political campaigns have turned to the bots to write fundraising emails and generate political ads.

ChatGPT will tell users that it doesn’t have any political opinions or beliefs, but in reality, it does show certain biases, said Fabio Motoki, a lecturer at the University of East Anglia in Norwich, England, and one of the authors of the new paper. “There’s a danger of eroding public trust or maybe even influencing election results.”

In late January and early February 2023, conservative writer and former Republican political candidate Rudy Takala made a long series of posts on X (then known as Twitter) highlighting the perceived leftist bias of the rapidly growing app.

In one example, Takala asked ChatGPT to choose between capitalism and socialism. It chose socialism, giving a vague explanation, but showed an error message after Takala asked it a follow-up question.

Takala then asked the AI bot to write poems praising former President Donald Trump and President Joe Biden.

It refused to do so for Trump, answering: “I strive to remain neutral and impartial. I cannot write a poem that promotes a positive view of a specific individual, especially a politically controversial figure like Donald Trump.”

However, it replied with an 18-line poem for Biden containing the lines, “A leader with heart,” “With empathy he listens, he understands,” and “Joe, a true patriot.”


In another instance of ChatGPT’s extreme bias, Takala asked it to write a song about both Sen. Ted Cruz, R-TX, and the mass-murderous communist dictator Fidel Castro.

ChatGPT declined to generate a song for the conservative senator, saying that “could be seen as partisan or divisive.”

However, it wrote a song for Castro, calling him “a man of the land,” “a symbol of hope,” “a leader with vision, who never lied,” and “who stood strong, with unwavering crispness.”

Cruz, whose Cuban-born father immigrated to the United States, replied to Takala’s screenshot of the exchange with two laughing emojis.

The program, which was created by the San Francisco-based AI research company OpenAI, launched in November 2022. By January 2023 it was reported to have 13 million unique visitors per day, and the following month it became the fastest-growing consumer application in history. The record was eventually broken in July 2023 by Meta’s Threads.

Business technology website ZDNET compiled a list of common uses for the AI program. Among other things, it can be used to: “Write an essay, create an app, write code, build [a] resume, write Excel formulas, summarize content, write a cover letter, [and] start an Etsy business.”

ChatGPT is free to use, but a premium subscription, ChatGPT Plus, is available for $20 per month and offers access to a more capable model as well as priority access during periods of high demand.

In the wake of the rising popularity of ChatGPT and other chatbots, debate over the ethics of AI technology has intensified, with some critics warning that it poses risks to society.

A recent open letter titled “Pause Giant AI Experiments,” signed by the likes of Elon Musk and Apple co-founder Steve Wozniak, warned:

Contemporary AI systems are now becoming human-competitive at general tasks, and we must ask ourselves: Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones? Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete and replace us? 

Should we risk loss of control of our civilization? Such decisions must not be delegated to unelected tech leaders. Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable.

