Extreme content performs well on Facebook. The more a post or opinion piece inflames emotions, the more users share and endorse it, and the more such interactions an entry generates, the more attractive it becomes to advertisers. Facebook profits from this advertising revenue.
The problem: dubious actors on the Internet exploit this dynamic for their own ends, for example by disguising their identity and then spreading fake news, or by deliberately distorting politicians' statements to stir political sentiment.
In its monthly reports, the company tries to present itself as all the more determined. There, Facebook lists the profiles it attributes to so-called disinformation networks. The US company's latest report, which is available to WELT, shows that in December 2020 more accounts, pages and groups were removed than ever before in a single month.
According to the report, 17 networks and “misleading campaigns from almost every continent” were taken offline. Among them was the Russian disinformation network behind the supposed German news portal “Abendlich Hamburg”. At the end of November, it published a purported exposé about the Russian opposition politician Alexej Navalny, which was then picked up by numerous Russian media.
An investigation by WELT together with “netzpolitik.org” uncovered a whole network of such bogus foreign media outlets across various European countries. They gave themselves serious-sounding names, but their supposed journalists did not even exist. The operators, based in the Russian-speaking world, placed disinformation on these platforms. The trail led to the separatist-held areas of eastern Ukraine and to a Russian propaganda channel with connections to Russian intelligence services.
In connection with this network, Facebook has now blocked 23 accounts, 25 pages, eleven groups and 19 Instagram accounts that are said to have violated its guidelines against foreign interference. According to the report, the pages and groups were managed via fake accounts, comments were used to simulate popularity, and users were steered to supposed news sites. In total, the pages had around 23,000 subscribers, the groups 7,000 members, and the Instagram accounts 17,600 followers.
The network, which Facebook traces to Luhansk in eastern Ukraine, an area controlled by pro-Russian separatists, targeted audiences in the Republic of Moldova, Kazakhstan, Great Britain, Spain, Kyrgyzstan, Ukraine, Belarus, Germany and Russia.
In the Republic of Moldova, for example, it published favorable articles about the Soviet Union and pro-Russian politicians while criticizing EU-friendly ones. Facebook was, however, unable to establish a direct connection to Russia.
Over 100 networks discovered in three years
In addition, according to the report, Facebook has uncovered networks from Iran, Morocco, Kyrgyzstan, Kazakhstan, Argentina, Brazil, Pakistan, Indonesia, France and Russia. Facebook security chief Nathaniel Gleicher points out a notable pattern: “In public, there is often talk of covert foreign influence via Facebook. However, many of the operations that we found took place within one country. They come from parties or influential individuals,” he told WELT.
At least twelve of the 17 uncovered operations are said to have sought to influence elections in favor of local actors, according to the report. This blurs the line between healthy public debate and manipulation. Most of the deactivated networks were still in an early stage of development and therefore reached only a small audience. Increasingly, however, some also used tactics similar to those of extremist groups.
Since March 2020, Facebook has reported monthly on which networks have been removed from the platform. Security chief Gleicher said that “over 100 networks worldwide” have been uncovered on Facebook in the past three years. By stepping up its public relations work, the group is trying to counter the accusation that it does nothing against fake news and tendencies toward radicalization.
Most recently, in the summer, Facebook took action against the anti-Semitic conspiracy movement QAnon. By then, however, the movement's theories had already spread via a network of groups and influential figures on the platform and grown into a real threat to domestic security, in the USA but also in Germany.
It is drawing growing criticism, however, that Facebook and other social networks themselves decide which posts are deleted and which accounts are blocked, and which are not. Although the platforms usually have guidelines meant to define exactly that, their application is often inconsistent and opaque. This is especially true when posts come from influential accounts.
Like Donald Trump’s. When the US President wrote in a post in May of last year, “when the looting starts, the shooting starts”, the post stayed up on Facebook. Mark Zuckerberg justified this with freedom of expression; he did not want to be an “arbiter of truth”, he said in a television interview at the time. But after the storming of the Capitol last Thursday, Zuckerberg abruptly blocked Trump’s account.
Other platforms such as Twitter followed suit a short time later. While Trump’s opponents rejoiced, the move drew strong criticism of the tech corporations, because without democratic legitimation they are deciding, on an ever larger scale, how freedom of expression is to be interpreted.
And the debate has long since taken on broader dimensions. It is no longer just about classic social networks. Over the weekend, Apple and Google also took action and banned the Parler app from their stores. The US social network is popular chiefly with Republican politicians and the political right.
A little later, Amazon cut the network off from its cloud services; since then the app has been unavailable. Activists had previously secured large amounts of data about Parler users, which they intend to publish soon.
Critics point to the lack of legally binding rules
According to the non-governmental organization Reset.tech (in German: Neustart für Technologie), this is partly because there are no legally binding rules. Had such rules existed, the tech companies would have been forced to act faster and more decisively against disinformation on their platforms.
In addition, they would have had to enforce and communicate their guidelines more consistently, including for heads of state. According to Reset.tech, that would also have strengthened freedom of expression.
The European Union is currently drafting the so-called Digital Services Act, a legislative package with which the Commission intends to hold platforms to stricter due diligence obligations. How strict these rules will be remains unclear.
According to Transparency International, Google, Facebook, Amazon, Apple and Microsoft declared a combined budget of around 19 million euros for lobbying in 2020.