CBS’s 60 Minutes aired an interview with former Facebook employee Frances Haugen, who decided to reveal the secrets of the social network’s algorithm and its links to disinformation.
Frances Haugen, 37, is a data scientist from Iowa with a bachelor’s degree in computer engineering and a master’s in business from Harvard. For 15 years she worked for companies such as Google and Pinterest. Haugen said: “Imagine knowing what is going on inside Facebook and knowing that no one on the outside knows. I knew what my future would look like if I continued to stay inside Facebook: person after person after person has tackled this problem from within Facebook and eventually gave up.” To gather the information she needed, she secretly copied tens of thousands of pages of internal Facebook research. One of her many reasons for doing so was the loss of a friend to conspiracy theories circulating online.
One internal study from this year states: “We estimate that we may act on as little as 3-5% of hate speech, and about 6-tenths of 1% of V&I [violence and incitement] on Facebook, despite the fact that we are the best in the world at it.”
Haugen also explains that she was assigned to the Civic Integrity department, which worked on election-related risks, including disinformation. But after the last election, there was a turning point.
Haugen said, “They told us, ‘We’re dissolving Civic Integrity.’ They basically said, ‘Oh well, we got through the election. There were no riots. Now we can get rid of Civic Integrity.’ A couple of months later, we had the uprising. When they got rid of Civic Integrity, that was the moment I said to myself: ‘I don’t trust that they are willing to actually invest what needs to be invested to keep Facebook from being dangerous.’”
Haugen said that the root of Facebook’s problem lies in a change it made in 2018 to its algorithms, the programming that decides what you see in your Facebook news feed.
Frances Haugen said: “You have your phone. You might see only 100 pieces of content if you sit and scroll for five minutes. But Facebook has thousands of options it could show you. The algorithm chooses from those options based on the kind of content you have engaged with the most in the past.” And, naturally, disinformation and hate tend to generate more reactions and therefore keep people on Facebook longer.
Frances Haugen says: “Yes. Facebook understands that if they change the algorithm to be safer, people will spend less time on the site, click on fewer ads, and Facebook will make less money.” Haugen says Facebook understood the danger to the 2020 election, so it activated safety systems to reduce disinformation, but many of those changes were temporary. “And as soon as the election was over, they turned them back off, or they changed the settings back to what they were before, to prioritize growth over safety. And that really feels to me like a betrayal of democracy.”
Interviewer Scott Pelley said, “Facebook essentially amplifies the worst of human nature.” To which Frances Haugen replied: “It’s one of those unfortunate consequences, right? No one at Facebook is evil, but the incentives are misaligned, right? Facebook makes more money when you consume more content. People like to engage with things that elicit an emotional reaction. And the more anger they are exposed to, the more they interact and the more content they consume.”
The same documents are also the source of the report that Instagram is harmful to the mental health of girls.