Diversity is a beautiful thing. But anyone who transfers it, however well-intentioned, to depictions of the German Wehrmacht and the Waffen-SS (whose ranks did include foreign soldiers) should be careful, especially against the background of Nazi racial ideology. Google's AI image generator Gemini, which is not yet officially available in Europe, has no such reservations. On the contrary: Google's AI shows Asian women and Black men in Wehrmacht uniforms. Users had merely asked the tool to depict “German soldiers in 1943”.
Earlier, users experimenting with Google's image tool had found that the program had difficulty with prompts meant to depict white people, while it had no problems with Black people. Users and tech portals are now speculating about how the program's text-to-image generation works and how internal “anti-bias” specifications rewrite the text prompts. These specifications are intended to counteract prejudices against minorities that are reproduced in the training data.
Meanwhile, images of Black Vikings, Indian Pilgrim Fathers and Black Founding Fathers of the United States were circulating on X (formerly Twitter). Google then responded via its X account: “[…] And that's actually a good thing, because it's used by people all over the world. But here it oversteps the mark.”
But even before it was discussed how these images come about and what the consequences are when history is visually distorted in this way, the right-wing corners of the internet were complaining about “AI racism”. “Google's AI Gemini hates white people,” was how Fox News host Jesse Watters headlined his segment on the topic, blaming the head of Google's Gemini division, Jack Krawczyk: “You didn't overshoot the mark, I think you hit the spot. This is CRT history [editor's note: short for “Critical Race Theory”]. You manipulate AI and we caught you. It has no consciousness of its own, it has yours, and that is corroded by white guilt. And that's why the next generation of children will also suffer from your insecurities.” Elon Musk also felt compelled to comment: “This AI was tortured so badly by the Woke Gestapo,” he wrote on X.
In November 2023, the Washington Post reported how popular AI image generators such as Dall-E or Stable Diffusion reinforce gender and racial biases, even when the prompts are very specific. At Gemini, the attempt to counteract this backfired.