Major tech CEOs, including Meta’s Mark Zuckerberg, are once again facing congressional scrutiny over the potential harms social media platforms pose to teenagers. Concern is growing that these platforms contribute to depression and suicidal thoughts among young users.
Lawmakers in Washington, D.C., are demanding concrete action from tech giants that goes beyond the companies’ usual promises to empower teens and parents to make responsible online choices. With a presidential election on the horizon and state lawmakers taking the lead, Congress is pushing for more substantial efforts to address these concerns.
Scheduled to testify alongside Mark Zuckerberg at the Senate Judiciary Committee hearing are CEOs from TikTok, Snap, Discord, and X. For some, including X CEO Linda Yaccarino, Snap CEO Evan Spiegel, and Discord CEO Jason Citron, this marks their first appearance before Congress.
These tech CEOs plan to highlight their platforms’ tools and policies designed to protect children and give parents more control over their kids’ online experiences. Companies like Snap and Discord are distancing themselves from Meta’s approach by emphasizing that they do not rely on the kind of algorithmically recommended content that critics call addictive and harmful.
However, critics, including parents and online safety advocates, argue that these tools fall short, placing too much responsibility on parents and young users. They believe tech platforms can no longer be trusted to self-regulate.
Experts urge the congressional committee to press for significant changes, such as disconnecting advertising and marketing systems from services aimed at young users. The rise of generative artificial intelligence tools has added urgency to calls for safety features to be enabled by default on tech platforms.
Several major platforms, including Meta, Snapchat, Discord, and TikTok, have introduced oversight tools that let parents monitor and set limits on their teenagers’ online activity. Some, like Instagram and TikTok, have also added features such as “take a break” reminders and screen-time limits intended to curb overuse and shield teens from harmful content.
Meta recently proposed federal legislation that would require app stores, rather than social media companies, to verify users’ ages and enforce age restrictions. The company also announced a range of youth safety measures, including hiding “age-inappropriate content” from teen feeds and encouraging stricter security settings.
Snapchat expanded its parental oversight tool, Family Center, offering parents more control over their teens’ interactions with the app.
This hearing is the latest in a series of congressional appearances by tech leaders, driven in part by the revelations of Facebook whistleblower Frances Haugen in late 2021. While some of the companies’ safety updates have been welcomed, critics say the industry’s slow pace in rolling them out is further proof that self-regulation is not working.
Tech companies say they aim to balance safety and empowerment for young users while avoiding heavy-handed restrictions on content. Meanwhile, momentum for social media regulation is growing outside Congress. Several states, including Arkansas, Louisiana, Ohio, and Utah, have passed laws restricting teens’ use of social media, some of which require parental consent before minors can create accounts. The tech industry has challenged these laws in court, arguing they threaten First Amendment rights and user privacy.
State-backed and consumer lawsuits against tech companies are increasing, adding pressure for stricter regulation. The hearing offers lawmakers an opportunity to question smaller industry players, like X and Discord, about their youth safety efforts.
As calls for industry-wide solutions intensify, Wednesday’s hearing stands as a pivotal moment in shaping the future of child safety on social media platforms.