Parents of Texas children under 18 can now monitor and restrict their children’s activity on digital platforms, including Facebook and Instagram, but only if they know their child uses the service.
Meta, the parent company of Facebook and Instagram, rolled out parental control features in Texas last week to comply with House Bill 18, the Protecting Children Online Through Parental Empowerment Act, which went into effect on September 1.
The Legislature passed it last year to restrict children from viewing harmful material on the Internet, such as content that promotes self-harm or substance abuse, while giving parents more power to regulate what their child does online.
Meta’s tools allow parents who can prove their identity with a valid ID to view and update their teen’s account settings, set time limits for the child’s usage, and even delete a minor’s Instagram or Facebook account entirely.

Parental rights advocates say the new tools are helpful, but they don’t go far enough to protect young people online.
“It will be difficult to intervene unless you know your child is using the product,” said Zach Whiting, policy director and senior fellow at the Texas Public Policy Foundation, who testified in favor of the bill. He said a stricter policy would restrict teens under 18 from creating a social media account unless they first get parental consent. Most social media companies already restrict children under 13 from creating accounts.
“If we treat social media like any other harmful product, there are age verification requirements for it, just like smoking and drinking,” Whiting said. “I think it’s an appropriate extension to do that with social media.”
Texas is among a growing number of states that have passed laws limiting tech companies’ interactions with children, citing research that found a link between social media use and negative psychological well-being among young people. Texas lawmakers have also expressed concern about the vast amounts of data tech companies could be collecting on minors.
But like those other states, Texas faced legal challenges and resistance from the tech industry, which was able to limit the scope of the legislation.
An earlier version of HB 18 would have prohibited minors from creating social media accounts unless their parents gave consent. That version failed to pass the state Senate.
Rep. Shelby Slawson, R-Stephenville, who introduced the bill, told colleagues on the House floor last May that she had hoped to spend more time working with the Senate to amend the bill, but there was not enough time. Still, she said, “this bill is a monumental step in the right direction.”
Days before the law was set to take effect, a federal district judge temporarily blocked a key provision that would have required digital service providers to filter harmful content, such as material promoting self-harm, substance abuse, eating disorders or child pornography, from minors’ feeds.
The judge called such restrictions “unconstitutionally vague” and wrote that they could even prevent children from seeing useful information. “In its attempt to prevent children from accessing harmful content, Texas also prohibits minors from participating in the democratic exchange of opinions online,” Judge Robert Pitman wrote in his opinion. “A state cannot pick and choose which categories of protected speech it wishes to prevent teens from discussing online.”
Attorney General Ken Paxton has filed a notice to appeal Pitman’s decision, which stems from a case brought by tech industry groups. A free speech advocacy group has also filed a lawsuit to block the new law.
“No one with a working knowledge of the First Amendment would say, ‘Oh, this is a bill designed to pass constitutional scrutiny,’” said Ari Cohn, a Chicago-based attorney who specializes in the First Amendment. “It’s obviously overbroad and infringes on First Amendment rights.”
While those lawsuits are being resolved, parts of the law are enforceable, including a requirement that companies create tools for parents to monitor their children’s accounts. The law also prohibits digital service providers from disclosing minors’ data or personally identifiable information, or showing them targeted ads.
Meta does not share or sell personal data, a spokesperson said. The only information used to serve ads to teens is their age and location, which helps the company show them relevant ads for products and services available where they live, the spokesperson said. To comply with the new law, the company will no longer store precise geolocation data associated with teens’ accounts in Texas.
Other companies, including Snap and TikTok, did not respond to inquiries from The Tribune, leaving it unclear whether and how they are complying with the new data and advertising requirements.
Snap offers tools for parents to restrict their teen’s account, but the teen has to opt in to supervision. TikTok has offered a family pairing setting since 2020, which allows a parent or guardian to link their account to a teen’s, manage privacy settings and set screen time limits. This feature also requires the child to consent to pairing.
It’s also unclear how Paxton’s office intends to enforce the law, which his office’s consumer protection division has sole authority to do. Violators could face civil penalties of up to $10,000 per violation, plus attorney fees. His office did not respond to The Tribune’s request for comment.