Meta CEO Mark Zuckerberg and the chief executives of X, TikTok, Discord and Snap appeared before U.S. lawmakers Wednesday over the dangers social media platforms pose to children and teens.
Technology industry leaders were asked about the impact of social media in a session called “Big Tech Companies and the Online Child Sexual Exploitation Crisis,” convened by the U.S. Senate Judiciary Committee.
The hearing was certain to be a tough one for executives facing political ire for not doing enough to stop sexual predators and others from endangering children online.
“These CEOs have a shameful track record on child safety and a long history of trying to hide these failures from public view,” said Imran Ahmed, founder and CEO of the Center for Countering Digital Hate. “We have serious questions that need to be answered about any attempt to do so.”
Testifying to senators were Zuckerberg, X's Linda Yaccarino, TikTok's Shou Zi Chew, Snap's Evan Spiegel, and Discord's Jason Citron.
“I'm proud of the work our team has done to improve the safety of children online, not just on our services but across the internet,” Meta's Zuckerberg told the committee, according to prepared testimony obtained by AFP.
Under U.S. law, web platforms are largely exempt from liability for content shared on their sites.
Lawmakers want to enact more rules to improve online safety, but new legislation has been blocked by political divisions in Washington and intense lobbying by big tech companies.
One existing proposal is the Kids Online Safety Act (KOSA), which aims to protect children from algorithms that can cause anxiety and depression.
Another idea would require social media platforms to verify the age of account holders and completely ban children under 13 from signing up.
Ahead of the testimony, Meta and X, formerly known as Twitter, announced new measures to respond to the political backlash.
Meta, which owns the world's leading platforms Facebook and Instagram, announced it would block direct messages sent to teens by strangers.
By default, users under 16 can now receive direct messages and group chat invitations only from people they already follow or are connected to.
Meta also tightened content restrictions for teens on Instagram and Facebook, making it harder to see posts that talk about suicide, self-harm, or eating disorders.
Zuckerberg told lawmakers that Meta has about 40,000 employees working on online safety and has invested some $20 billion since 2016 to keep its platforms secure.
He also said he supports legislation that would provide age verification and clear parental controls.
X also announced last week ahead of the hearing that it would establish a content moderation team in Austin, Texas.
The initial goal of the Austin Trust and Safety Center of Excellence is to hire 100 content moderators, with a focus on eliminating child sexual exploitation content and other violations of the platform's rules.
Ahead of the tech executives' appearance on Capitol Hill, X's Yaccarino was in Washington to meet with lawmakers from both parties on topics including child protection.
According to the company, X currently has more than 2,000 content moderators, both full-time and contract employees.
But when Elon Musk first took over Twitter in late 2022, he imposed massive layoffs and decimated the company's trust and safety team.
Musk, who calls himself a “free speech absolutist,” also vowed to remove content restrictions and allow many banned individuals to return.
For now, America's main deterrent to the online giants is litigation brought by U.S. states, with about 40 states jointly suing Meta over alleged missteps involving children.
The lawsuits allege that Meta knowingly allows users under the age of 13 on Instagram and disables only some of their accounts.
The lawsuits also accuse Meta of concealing internal research showing that Instagram and Facebook were harming users.
ARP/MLM