Zuckerberg apologises to families in heated Senate hearing


Mark Zuckerberg was one of several social media bosses accused of having “blood on [their] hands” at a hearing where companies were criticised for not doing enough to protect children from being exploited on their platforms.

Mr Zuckerberg, the chief executive of Meta, which owns Facebook and Instagram, faced a sea of people holding pictures of their dead children, all affected by online harms.

Also at the Senate Judiciary Committee hearing were X's Linda Yaccarino, Snap Inc's Evan Spiegel, TikTok's Shou Zi Chew and Discord's Jason Citron.

Image: Mark Zuckerberg stands and faces the public, some holding placards, during the Senate Judiciary Committee hearing on online child sexual exploitation. Pic: Reuters

Image: (L to R) Discord's Jason Citron, Snap Inc's Evan Spiegel, TikTok's Shou Zi Chew, X's Linda Yaccarino and Meta's Mark Zuckerberg. Pic: Reuters

All were grilled by US senators about inadequate protections online for children who, some politicians and activists argue, are susceptible to sexual predators, eating disorder content, unrealistic beauty standards and bullying on the platforms.

The room was first shown a video of children speaking about their victimisation on social media, and senators recounted stories of young people taking their own lives after being extorted over photos they had shared with sexual predators.

Senator Lindsey Graham said: “Mr Zuckerberg, you and the companies before us, I know you don’t mean it to be so, but you have blood on your hands.”

Image: Mark Zuckerberg returns to his seat after standing and facing the public. Pic: Reuters

Referring to the founder of Facebook specifically, Mr Graham said: “You have a product that’s killing people.”

Mr Zuckerberg apologised to the families present, saying: “I’m sorry for everything you have all been through.

Image: Zuckerberg apologised to the families present. Pic: Reuters

“No one should go through the things that your families have suffered and this is why we invest so much and we are going to continue doing industry-wide efforts to make sure no one has to go through the things your families have had to suffer.”

Instagram, which is operated by Meta, drew further criticism because one of its features alerts users that an image might show sexual abuse but still allows them to view it.

Mr Zuckerberg responded that it can be helpful to direct users to resources rather than simply blocking content. He reiterated that the company had no plans to pursue an earlier proposal to create a children's version of the app.

Meta has said it will block harmful content from being viewed by under-18s, and will instead share resources from mental health charities when someone posts about their struggles with self-harm or eating disorders.

The 39-year-old chief executive has faced congressional committees before, the first time over the Cambridge Analytica privacy scandal in 2018.


It is only the second such appearance for Mr Chew and the first for Ms Yaccarino.

X has faced heavy criticism since Elon Musk's takeover of the platform and this week was embroiled in a deepfake scandal after sexually explicit pictures appearing to show Taylor Swift went viral.

Her name was temporarily unsearchable as the platform sought to redress the situation.

Video: Online victims write to tech bosses

What did the other chiefs have to say?

The boss of X said the company did not cater to children and the firm supported the STOP CSAM Act, a bill which facilitates restitution for victims of child exploitation.

It is one of several bills aimed at addressing child safety, but none have become law.

Meanwhile, TikTok’s chief executive was grilled on the app’s potential detriment to the mental health of children.

Mr Chew insisted his platform made “careful product design choices to help make our app inhospitable to those seeking to harm teens”, reiterating the enforcement of a policy that would ban children under 13 from using the app.

He also said TikTok would spend $2bn (£1.57bn) on trust and safety measures.

Discord's boss said safety tools already existed on its platform, adding that it had worked with NGOs and law enforcement to protect children.

Before the hearing, Mr Spiegel, the chief executive of Snap Inc, which operates Snapchat, said the company would back a bill to hold apps and social media platforms legally accountable if they recommended harmful material to children.

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK
