After grilling the same companies and their guarded, media-trained executives time and time again, Congress is turning its attention to two fresh but important faces of the tech industry: TikTok and Snap.
On Tuesday, lawmakers on the Senate subcommittee on consumer protection, product safety and data security will question policy executives from those two companies and YouTube on how their platforms affect young, vulnerable users. Facebook whistleblower Frances Haugen testified before the same subcommittee on related issues in early October, shortly after revealing her identity.
The hearing streams Tuesday at 7 a.m., with testimony from Snap Vice President of Global Public Policy Jennifer Stout, TikTok Vice President and Head of Public Policy Michael Beckerman and Leslie Miller, who leads government affairs and public policy at YouTube.
Subcommittee chair Senator Richard Blumenthal (D-CT) will lead the hearing, which will focus on the harmful effects of social media on children and adolescents. “The explosive reports on Facebook and Instagram – their toxic impacts on young users and the lack of truth or transparency – raise serious concerns about Big Tech’s approach to children at all levels,” said Blumenthal, connecting reports on Instagram’s dangers for teens to social media more broadly. The subcommittee’s ranking member, Marsha Blackburn (R-TN), has indicated that she is particularly interested in the privacy concerns surrounding TikTok.
We expect topics such as eating disorders, harassment, bullying, online safety and data privacy to come up as subcommittee members take turns pressing the three executives for answers. The lawmakers also plan to discuss legislation that could help protect children and teens online, though it remains to be seen how solution-focused the hearing will actually be. One of those potential solutions is the Kids Internet Design and Safety (KIDS) Act, which would create new online protections for people under the age of 16. Blumenthal and fellow Democratic Senator Ed Markey reintroduced the bill last month.
Child and adolescent mental health isn’t the only pressing societal crisis that social platforms are embroiled in right now, but it is one that Republicans and Democrats are rallying around. It is a rare arena of criticism with substantial political overlap. Both sides seem to agree that the biggest tech companies need to be reined in one way or another, though they usually emphasize different parts of the why: for Republicans, it’s that these companies have too much power over decisions about what content gets removed from their platforms. Across the aisle, Democrats are typically much more concerned with the kind of content left up, such as extremism and misinformation.
Tuesday’s hearing will also likely dive into how algorithms amplify harmful content. Because social media companies play their cards close to the chest when it comes to how their algorithms work, hearings are a rare opportunity for the public to learn more about how these companies serve personalized content to their users. Ideally, we would learn a lot about this stuff in the often long and repetitive tech hearings Congress has held over the past two years, but between lawmakers lobbing uninformed or irrelevant questions and evasive tech executives with hours of media training under their belts, the best we can usually hope for is a few scraps of new information.
While Facebook isn’t appearing at this particular hearing, expect the recent revelations about that company and Instagram to shape what happens on Tuesday. The three social media companies set to testify have been watching the public response to the leaked Facebook documents closely, and further reporting on those documents arrived just Monday.
Soon after the first reports that Instagram was aware of the risks it poses to teenage users, TikTok introduced a new set of safety measures, including a well-being guide, improved search interventions, and pop-ups that appear for sensitive search terms. Last week, Snap announced a new set of family-focused safety tools to give parents more visibility into what their kids are doing on the platform. Both social networks skew heavily toward younger users compared with platforms like Facebook, Instagram, and Twitter, making robust safety tools even more necessary. Ahead of the hearing, YouTube announced its own changes to the kinds of children’s content eligible for monetization, while highlighting its other child-focused safety measures.