At a hearing of the US Senate’s Subcommittee on Consumer Protection this Tuesday (26), executives from YouTube, TikTok and Snapchat were asked to support changes to US legislation that would increase protection for children and adolescents on social networks. They avoided committing directly to the changes, saying they already adopt their own policies to prevent harm to this audience.
The suggested changes include new privacy rules for underage users and a ban on certain ads and on automatic video playback for children.
According to the Associated Press, Michael Beckerman, TikTok’s vice president and head of public policy for the Americas, said that “sex and drugs are violations of community standards” and “have no place on TikTok.”
He argued that the platform offers tools, such as screen time management, to help young people and parents moderate how much time children spend in the app and what they see. He also said that some features, such as direct messages, are not available to younger users, and that the company tightened its privacy policies for under-18s following recommendations from federal regulators.
When Republican lawmakers emphasized that the company belongs to a Chinese group, Beckerman responded that all TikTok data is stored in the United States, with backups in Singapore.
Leslie Miller, vice president of government affairs and public policy at Google, which owns YouTube, said the company has adopted parental protection and control systems, such as time limits, to restrict viewing to content appropriate for the viewer’s age.
Jennifer Stout, vice president of global public policy at Snapchat, said the platform has implemented measures to detect sellers of products inappropriate for certain audiences, and that its content moderation is done by humans rather than artificial intelligence, unlike other social networks.
The executives’ responses did not convince the senators. “We’re seeing the same conversation, over and over and over and over,” said Democratic Senator Richard Blumenthal, chair of the subcommittee.
At a hearing of the same subcommittee earlier this month, data analyst Frances Haugen, a former Facebook employee, called for the creation of legal mechanisms to regulate the platform and other social media, in order to prevent the spread of content harmful to democracy and to the mental health of adolescents.
Prior to the hearing, she had leaked information to the Wall Street Journal purporting to show that Facebook is aware of the harm caused by its apps but has not taken the steps necessary to address it, contrary to its public statements.
This week, a group of media outlets, including the Washington Post, began publishing a series of stories called the Facebook Papers, which addresses the social network’s “double standard” for content removal.