For years, the vicious cycle has turned: Websites solicit lurid, unverified complaints about supposed cheaters, sexual predators, deadbeats and scammers. People slander their enemies. The anonymous posts appear high in Google results for the victims' names. The websites then charge the victims thousands of dollars to take the posts down.
This circle of slander has been lucrative for the websites and associated middlemen, and devastating for the victims. Now Google is trying to break the loop.
The company plans to change its search algorithm to prevent websites that operate under domains such as BadGirlReport.date and PredatorsAlert.us from appearing in the list of results when someone searches for a person's name.
Google also recently created a new concept it calls "known victims." When people report to the company that they have been attacked on sites that charge to remove posts, Google will automatically suppress similar content when their names are searched for. "Known victims" also includes people whose nude photos have been published online without their consent, allowing them to request that explicit results for their names be suppressed.
The changes, some already made by Google and others planned for the coming months, are a response to recent New York Times articles documenting how the slander industry preys on victims with Google's unwitting help.
"I doubt it will be a perfect solution, certainly not right off the bat. But I think it really should have a significant and positive impact," said David Graff, Google's vice president for global policy and standards and trust and safety. "We can't police the web, but we can be responsible citizens."
The changes represent a momentous shift for victims of online slander. Google, which accounts for about 90 percent of global online searches, has historically resisted letting human judgment play a role in its search engine, though it has bowed to mounting pressure in recent years to fight the misinformation and abuse that appear at the top of its results.
In the beginning, Google's founders saw their algorithm as an unbiased reflection of the Internet itself. It used an analysis called PageRank, named after co-founder Larry Page, to determine a website's worthiness by evaluating how many other sites linked to it, as well as the quality of those other sites, based on how many sites linked to them.
The philosophy was: "We never touch search, no way, no how. If we start touching search results, it's a one-way ratchet toward a curated Internet and we're no longer neutral," said Danielle Citron, a law professor at the University of Virginia. A decade ago, Professor Citron lobbied Google to stop so-called revenge porn from appearing in a search for someone's name. The company initially resisted.
Google articulated its hands-off stance in a 2004 statement explaining why its search engine was surfacing anti-Semitic websites in response to searches for "Jew."
"Our search results are generated completely objectively and are independent of the beliefs and preferences of those who work at Google," the company said in the statement, which it removed a decade later. "The only sites we omit are those we are legally compelled to remove or those maliciously attempting to manipulate our results."
Google's early interventions in its search results were limited to things like web spam and pirated movies and music, as required by copyright law, as well as sensitive personal information such as Social Security numbers. Only recently has the company reluctantly taken a more active role in cleaning up people's search results.
The most notable case came in 2014, when European courts established the "right to be forgotten." Residents of the European Union can request that information about them that they consider inaccurate or irrelevant be removed from search engines.
Google fought the court ruling unsuccessfully. The company said its role was to make existing information accessible, and that it did not want to be in the business of regulating what content appeared in search results. Since the ruling, Google has been forced to remove millions of links from search results for people's names.
The war with Trump
More pressure to change came after Donald Trump was elected president. Following the election, one of Google's top results for a search on the final 2016 election vote count was a link to an article falsely claiming that Trump, who won the Electoral College, had also won the popular vote.
A few months later, Google announced an initiative to make "algorithmic updates to surface more authoritative content" in an effort to keep intentionally misleading, false or offensive information out of its search results.
Around that time, Google's resistance to engineering harassment out of its results began to soften.
The Wayback Machine's archive of Google's policies on removing items from search results captures the company's evolution. First, Google was willing to remove nude photos posted online without the subject's consent. Then it began delisting medical information. Then came fake pornography, followed by sites with exploitative removal policies and then so-called doxxing content, which Google defined as "exposing contact information with an intent to harm."
The removal request forms receive millions of visits each year, according to Google, but many victims are unaware they exist. That has allowed "reputation managers" and others to charge people for removing content from their results that they could have requested for free.
Pandu Nayak, the head of Google's search quality team, said the company began battling websites that charge people to remove defamatory content a few years ago, in response to the rise of a thriving industry that posted people's police mug shots and then charged for their removal.
Google started ranking these exploitative sites lower in its results, but the change didn't help people who have little other information about them online. Because Google's algorithm abhors a vacuum, posts accusing those people of being drug addicts or pedophiles could still appear prominently in their results.
Websites that traffic in slander have relied on this dynamic. They could not charge thousands of dollars to remove content if the posts weren't damaging people's reputations.
Nayak and Graff said Google had been unaware of the scale of this problem until it was documented in The Times's articles this year. They said the changes to Google's algorithm and the creation of its "known victims" classification would help solve it. In particular, the changes make it harder for sites to gain traction on Google through one of their favored methods: copying and republishing defamatory content from other sites.
Google has been testing the changes recently, with contractors making side-by-side comparisons of new and old search results.
The Times had previously compiled a list of 47,000 people who had been written about on slander sites. In searches for a handful of people whose results had been riddled with defamatory posts, the changes Google has made were already detectable. For some, the posts had disappeared from their first page of results and from their image results. For others, the posts had mostly vanished, save for one from a recently launched slander site called CheaterArchives.com.
CheaterArchives.com may illustrate the limits of Google's new protections. Because it is fairly new, it is unlikely to have generated complaints from victims, and those complaints are one way Google finds slander sites. Moreover, CheaterArchives.com does not explicitly advertise post removal as a service, which could make it harder for victims to get it suppressed from their results.
Google executives said the company was not motivated solely by sympathy for victims of online slander. Rather, the move is part of Google's long-standing effort to combat sites that try to rank higher in search results than they deserve.
“These sites, frankly, are messing with our system,” Graff said.
Still, Google's move is likely to add to questions about the company's effective monopoly over what information is, and is not, publicly findable. Indeed, that is part of the reason Google has historically been so reluctant to intervene in individual search results.
"You should be able to find anything that's legal," said Daphne Keller, who was a lawyer at Google from 2004 to 2015, worked on the search products team for part of that time, and now studies how platforms should be regulated at Stanford. Google, she said, "is just flexing its own muscle and deciding what information should disappear."
Keller wasn't criticizing her former employer so much as lamenting that lawmakers and law enforcement have largely ignored the slander industry and its extortionate practices, leaving Google to clean up the mess.
That Google can potentially solve this problem with a policy change and tweaks to its algorithm is "the upside of centralization," said Professor Citron of the University of Virginia, who has argued that tech platforms have more power than governments to combat online abuse.
Professor Citron was impressed by Google's changes, particularly the creation of the "known victims" designation. She said these victims are often posted about repeatedly, and the sites compound the damage by scraping content from one another.
"I applaud their efforts," she said. "Can they do better? Yes, they can."
Kashmir Hill and Daisuke Wakabayashi for The New York Times.