Lawmakers Duel With Tech Execs on Social Media Harms to Youth Mental Health
During fiery Senate hearing, Meta CEO Mark Zuckerberg apologized to families who said their children suffered, or died, because of social media.
During a hostile Senate hearing Wednesday that sometimes devolved into bickering, lawmakers from across the political spectrum accused social media companies of failing to protect young people online and pushed rules that would hold Big Tech accountable for youth suicides and child sexual exploitation.
The Senate Judiciary Committee hearing in Washington, D.C., was the latest act in a bipartisan effort to bolster federal regulations on social media platforms like Instagram and TikTok amid a growing chorus of parents and adolescent mental health experts warning the services have harmed youth well-being and, in some cases, pushed them to suicide.
In an unprecedented moment, Meta founder and CEO Mark Zuckerberg, at the urging of Missouri Republican Sen. Josh Hawley, stood up and turned around to face the audience, apologizing to the parents in attendance who said their children were damaged — and in some cases, died — because of his company’s algorithms.
“I’m sorry for everything you’ve all gone through,” said Zuckerberg, whose company owns Facebook and Instagram. “It’s terrible. No one should have to go through the things that your families have suffered.”
Senators argued the companies — and tech executives themselves — should be held legally responsible for instances of abuse and exploitation under tougher regulations that would limit children’s access to social media platforms and restrict their exposure to harmful content.
“Your platforms really suck at policing themselves,” Sen. Sheldon Whitehouse, a Rhode Island Democrat, told Zuckerberg and the CEOs of X, TikTok, Discord and Snap, who were summoned to testify. Section 230 of the Communications Decency Act, which allows social media platforms to moderate content as they see fit and generally provides immunity from liability for user-generated posts, has routinely shielded tech companies from accountability. As youth harms persist, he said those legal protections are “a very significant part of that problem.”
Whitehouse pointed to a lawsuit against X, formerly Twitter, filed by two men who claimed a sex trafficker manipulated them into sharing sexually explicit videos of themselves over Snapchat when they were just 13 years old. Links to the videos appeared on Twitter years later, but the company allegedly refused to take action until after it was contacted by a Department of Homeland Security agent and the posts had generated more than 160,000 views. The lawsuit was dismissed in May by the Ninth Circuit, which cited Section 230.
“That’s a pretty foul set of facts,” Whitehouse said. “There is nothing about that set of facts that tells me Section 230 performed any public service in that regard.”
In an opening statement, the committee's Democratic chair, Sen. Dick Durbin of Illinois, offered a chilling description of the harms inflicted on young people by each of the social media platforms represented at the hearing. In addition to Zuckerberg, the executives who testified were X CEO Linda Yaccarino, TikTok CEO Shou Chew, Snap co-founder and CEO Evan Spiegel and Discord CEO Jason Citron.
“Discord has been used to groom, abduct and abuse children,” Durbin said. “Meta’s Instagram helped connect and promote a network of pedophiles. Snapchat’s disappearing messages have been co-opted by criminals who financially extort young victims. TikTok has become a, quote, ‘platform of choice’ for predators to access, engage and groom children for abuse. And the prevalence of [child sexual abuse material] on X has grown as the company has gutted its trust and safety workforce.”
Citron testified that Discord has “a zero tolerance policy” for content that features sexual exploitation and that it uses filters to scan and block such materials from its service.
“Just like all technology and tools, there are people who exploit and abuse our platforms for immoral and illegal purposes,” Citron said. “All of us here on the panel today, and throughout the tech industry, have a solemn and urgent responsibility to ensure that everyone who uses our platforms is protected from these criminals both online and off.”
Lawmakers have introduced a slate of regulatory bills that have gained bipartisan traction but have failed to become law. Among them is the Kids Online Safety Act, which would require social media companies and other online services to take “reasonable measures” to protect children from cyberbullying, sexual exploitation and materials that promote self-harm. It would also mandate strict privacy settings when teens use the online services. Other proposals include a bill that would compel social media companies to report suspected drug activity to the police — some parents said their children overdosed and died after buying drugs on the platforms — and a bill that would hold the companies accountable for hosting child sexual abuse material.
In their testimony, the tech executives each said they have taken steps to protect children who use their services, including features that restrict certain types of content, limit screen time and control whom young users can communicate with. But they also sought to distance their services from the harms in a bid to stave off regulation.
“With so much of our lives spent on mobile devices and social media, it’s important to look into the effects on teen mental health and well-being,” Zuckerberg said. “I take this very seriously. Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes.”
Zuckerberg pointed to a recent analysis by the National Academies of Sciences, Engineering and Medicine, which concluded there is a lack of evidence to confirm that social media causes changes in adolescent well-being at the population level and that the services could carry both benefits and harms for young people. While social media websites can expose children to online harassment and fringe ideas, researchers noted, the services can be used by young people to foster community.
In October, 42 state attorneys general filed a lawsuit against Meta, alleging that the social media giant knowingly and purposely designed tools to addict children to its services. U.S. Surgeon General Vivek Murthy issued an advisory last year warning that social media sites pose a “profound risk of harm” to youth mental health, stating that the tools should come with warning labels. Among evidence of the harms is leaked internal research from Meta which found that Instagram led to body-image issues among teenage girls and that many of its young users blamed the platform for increases in anxiety and depression.
Republican lawmakers devoted a significant amount of time during the hearing to criticizing TikTok for its ties to the Chinese government and for collecting data about U.S. citizens. The Justice Department is reportedly investigating allegations that ByteDance, the Chinese company that owns TikTok, used the app to surveil several American journalists who report on the tech industry.
In response, Chew said the company launched an initiative — dubbed “Project Texas” — to prevent its Chinese employees from accessing personal data about U.S. citizens. But employees claim the company has struggled to live up to its promises.
YouTube and TikTok are by far the platforms where teens spend the most hours per day, according to a 2023 Gallup survey, although Neal Mohan, the CEO of Google-owned YouTube, was not called to testify.
Mainstream social media platforms have also been exploited for domestic online extremism. Earlier this month, for example, a teenager accused of carrying out a mass shooting at his Iowa high school reportedly maintained an active presence on Discord and, shortly before the rampage, commented in a channel dedicated to such attacks that he was “gearing up” for the mayhem. Just minutes before the shooting, the suspect appeared to capture a video inside a school bathroom and upload it to TikTok.
Josh Golin, the executive director of Fairplay, a nonprofit devoted to bolstering online child protections, blasted the tech executives’ testimony for being little more than “evasions and deflections.”
“If Congress really cares about the families who packed the hearing today holding pictures of their children lost to social media harms, they will move the Kids Online Safety Act,” Golin said in a statement. “Pointed questions and sound bites won’t save lives, but KOSA will.”
The safety act, known as KOSA, has faced pushback from civil rights advocates, who argue on First Amendment grounds that the proposal could be used to censor certain content and violate the privacy of all internet users. Sen. Marsha Blackburn, a Republican from Tennessee and KOSA co-author, said last fall the rules are important to protect “minor children from the transgender in this culture” and cited the legislation as a way to shield children from “being indoctrinated” online. The Heritage Foundation, a conservative think tank, endorsed the legislation, stating on X that “keeping trans content away from children is protecting kids.”
Snap’s Evan Spiegel and X’s Linda Yaccarino both agreed to support the Kids Online Safety Act.
Aliya Bhatia, a policy analyst with the nonprofit Center for Democracy and Technology, said that although lawmakers made clear their intention to act, their directives could end up doing more harm than good. She said the platforms serve as “peer-to-peer learning and community networks” where young people can access information about reproductive health and other important topics that they might not feel comfortable receiving from adults in their lives.
“It’s clear that this is a really tricky issue, it’s really difficult for the government and companies to decide what is harmful for young people,” Bhatia said. “What one young person finds helpful online, another might find harmful.”
South Carolina’s Sen. Lindsey Graham, the committee’s ranking Republican, said that social media companies can’t be trusted to keep kids safe online and that lawmakers have run out of patience.
“If you’re waiting on these guys to solve the problem,” he said, “we’re going to die waiting.”