Cantwell Wants COPPA Update After Facebook Testimony
Thursday’s testimony from Facebook underscores the need for the FTC to update the Children's Online Privacy Protection Rule (see 2105110052), Senate Commerce Committee Chair Maria Cantwell, D-Wash., told us after a Senate Consumer Protection Subcommittee hearing (see 2109240009). Members said Global Head-Safety Antigone Davis evaded questions about the company’s internal research showing a link between youth mental health issues and Instagram activity (see 2109150053). “They had information that they basically said they didn’t, which is a problem,” said Cantwell.
Subcommittee Chairman Richard Blumenthal, D-Conn., said he left Thursday’s hearing with more questions than answers. Facebook won’t hold itself accountable, so Congress will have to do it, he said. The Facebook whistleblower set to testify next week will have a role in creating actual “disclosure and transparency in contrast to the unfulfilled promises we saw this morning,” said Blumenthal.
The Senate Judiciary Committee has learned that the internal research is “pretty damning,” said Sen. Josh Hawley, R-Mo. He introduced legislation Thursday that would amend Communications Decency Act Section 230 and make social media companies “liable for bodily or mental harm their products cause to children.” House Commerce Committee ranking member Cathy McMorris Rodgers, R-Wash., told Cheddar she wants Facebook to cancel plans for Instagram for kids: "The evidence is clear Big Tech is harming our kids."
Facebook released two reports to the committee Wednesday, but it declined to release further internal studies showing the harm caused by its platforms, Blumenthal said during Thursday’s hearing. He cited findings about the correlation between young people’s use of Instagram and mental health issues, specifically body image. Ranking member Marsha Blackburn, R-Tenn., called it a perfect storm of intense social pressure, addiction, body image issues, eating disorders, anxiety and suicidal thoughts brought on by social media.
Content moderation and data collection practices remain largely in the dark, said Senate Commerce Committee ranking member Roger Wicker, R-Miss.: “We are serious about taking action.” Asked after the hearing about specific, concrete action he hopes Congress will take, he said, “We’ll get back to you.”
On the decision to delay launching an Instagram for kids, Davis told the subcommittee that Facebook is consulting experts and parents about product development. She declined to tell Sen. Amy Klobuchar, D-Minn., who would be responsible for deciding when to “unpause” launch of the product. Davis said Facebook will look to a collaborative team of advisers, parents and policymakers before it makes any decisions.
Sen. Ed Markey, D-Mass., asked Davis to commit to not launching any products targeting those younger than 12 on platforms with “like” buttons, follower counts and other means of quantifying popularity. Facebook products enrich lives and help people connect, said Davis, noting it's analyzing age-appropriate features with advisers and experts. The platform doesn’t market to children 12 and under, she said, noting the platform removes underage users from Instagram. In the past three months, Instagram removed more than 600,000 accounts of underage users, she said. Davis told Blackburn the company can enforce this through age verification screens, access restriction and reporting from other users. Facebook is also investing in artificial intelligence to identify younger users, she said.
Markey asked Davis if Facebook supports his legislation with Blumenthal, the Kids Internet Design and Safety Act. That bill would ban autoplay features, push alerts and badges earned for screen time, and prohibit “manipulative marketing” (see 2003050023). Facebook has made it well-known that it agrees it’s time to update internet regulation, said Davis. She said she would happily follow up about the specific legislation.
Members share this “deep” concern about lack of transparency for social media companies, said Senate Minority Whip John Thune, R-S.D. He asked if users should be able to engage on platforms without “algorithmic manipulation,” an issue he has attempted to legislate. Facebook’s news feed is designed to connect users with friends and family and promote meaningful connection, said Davis. That the company turned to an oversight board shows it values transparency, she said.