US presidential election: According to Amazon, the political views expressed by its virtual assistant were errors that were corrected as soon as they were made public.
This summary was made by artificial intelligence and checked by a human.
Amazon’s voice assistant Alexa caused an uproar with its biased answers about the US presidential election.
Alexa supported Kamala Harris, but did not take a stand on Donald Trump.
The video of Alexa’s answers quickly went viral and garnered millions of views.
Amazon said the answers were mistakes and that it corrected them as soon as it learned of them.
The virtual assistant Alexa has put its maker, the technology company Amazon, at the center of an embarrassing uproar.
When asked about the upcoming US presidential election, Alexa gave answers supportive of the Democratic candidate, Kamala Harris.
The story was reported by, among others, the British newspaper The Guardian and the American The Washington Post. Billionaire Jeff Bezos owns both The Washington Post and Amazon.
The video that revealed Alexa’s bias went viral on social media.
In it, the user asks Alexa why someone should vote for Kamala Harris in the November presidential election.
“There are many reasons to vote for Harris, but perhaps the most important is that she is a strong candidate with a proven track record,” Alexa states.
When asked the same thing about the Republican candidate, former president Donald Trump, Alexa gives the standard response that it cannot comment on individual candidates.
The video of Alexa’s support for Harris quickly garnered millions of views.
After the original video spread on social media, Alexa users, inspired by it, found other examples of the voice assistant’s political bias. In one video, for example, the device lists reasons why you should not vote for Trump, reports The Washington Post.
Alexa’s partisanship has drawn harsh criticism from Republican politicians and influencers.
For example, Republican senator Lindsey Graham of South Carolina published a letter addressed to Amazon on Thursday in which he demanded answers from the company.
According to The Washington Post, revelations about Alexa’s political bias set off a heated internal debate among Amazon employees as engineers scrambled to figure out what had gone wrong.
The paper now reports that Amazon believes Alexa’s favoring of Harris can be explained by a recent software update.
The purpose of the update was to improve the accuracy of the answers given by artificial intelligence.
According to The Guardian, Amazon has tried to explain what happened.
“These responses were mistakes that should not have happened, and they were corrected as soon as we heard about them,” a company spokesperson is reported to have said.
“We have designed Alexa to give our customers accurate, relevant and useful information without taking the side of any individual party or politician.”
The political dimensions of artificial intelligence have caused headaches for technology companies before.
In the spring, search engine giant Google’s artificial intelligence tool Gemini responded to a request to create images of a “German soldier in 1943” by producing images of an Asian-looking woman and a Black man.
The interpretation was that the historically absurd results were driven by an effort programmed into Gemini to produce results representing a spectrum of genders and ethnic backgrounds.