The former Meta employees were shockingly calm as they laid it all out: The company that owns Facebook and Instagram knows its platforms are exposing children to serious dangers.
It knows the videos its services distribute to children have damaged their self-esteem and left many of them anxious, depressed, even suicidal.
And it knows children make up the majority of users in Meta’s virtual-reality rooms, where they are regularly stalked, harassed and sexually abused.
The whistleblowers, Jason Sattizahn and Cayce Savage, explained this month to a Senate Judiciary subcommittee how the company routinely placed profits over kids’ safety.
They revealed the company erases evidence of sexual abuse and manipulates data to obscure harm to users.
To avoid creating an incriminating paper trail, they said, Meta’s lawyers won’t let its own researchers ask questions that might produce answers they don’t like.
I wasn’t surprised by that testimony because I’ve heard many horrifying tales of Big Tech’s practices damaging children and teenagers.
My family, unfortunately, has one of those stories.
My daughter Becca’s life unraveled because of social media.
She was a lot like other kids — she hung out with her friends, texted them all the time and followed them on social platforms. When she was 15, however, she and some friends met a group of 18-year-old boys online.
They agreed to meet at a party, where one of the boys drugged and raped Becca.
Becca was never the same. In a different era, she might have healed. But because someone posted a compromising photo of her on Snapchat, she was forced to endure cyberbullying, and her misery intensified.
Becca had a supportive family and access to therapists and school counselors, yet it wasn’t long before she turned to drugs. We discovered the extent of her problem when she suffered an overdose.
To get her away from nearby drug dealers, we moved from our Massachusetts home to my sister’s house in Maine, thinking it was safer.
But social media undermined all our efforts to keep her safe.
In Maine, she and a friend went on Facebook to connect with someone who could get them drugs. Both girls were set to enter residential drug treatment the next day, and they were looking for one last high before they went. The drugs they purchased online were laced with fentanyl.
The next morning, I found her friend, overdosed but alive. My daughter, 18, was dead.
There’s nowhere you can go to escape the platforms or the access to illicit drugs they provide. Buying drugs online is as easy as ordering a pizza or requesting an Uber.
My daughter’s death is just one reason I wish we could get kids off social media. If that’s not possible, we at least need guardrails to protect children online.
Big Tech companies need to be held accountable when they provide a haven for illegal activity, particularly when that activity leads to children’s exploitation and death.
We also must be able to hold these companies legally responsible for the consequences of their design choices, including what they distribute through their algorithms.
Both Republican and Democratic senators were appalled by what they heard from the Meta whistleblowers. They declared the time for talking over, saying they understand the urgency of passing the Kids Online Safety Act.
The bill would force Big Tech companies to take the same kinds of reasonable steps to protect their users that are required of every other industry supplying products to children.
KOSA would establish a duty of care, compelling online platforms like Facebook and Instagram to design their products to prevent and mitigate specific dangers to young users, such as addiction, sexual exploitation and bullying.
It would also provide parents with the tools to opt out of personalized algorithms, protect their kids’ privacy and avoid addictive usage patterns.
The Senate overwhelmingly approved KOSA last year, but House leaders refused to bring it to a vote. Big Tech is spending millions on lobbyists to stop the bill.
Meta even announced plans last year to build a $10 billion artificial-intelligence data center in Louisiana, the home state of House Speaker Mike Johnson and Majority Leader Steve Scalise.
Clearly, there’s a lot of pressure on congressional leaders to cave to Big Tech’s demands.
It would be a mistake to let KOSA die or weaken it so substantially that it doesn’t actually force change.
Every time there is a new congressional hearing, we hear more heartbreaking stories of parents who have lost children.
A press conference before this month’s hearing included Maurine Molak, whose son David died by suicide at 16 after months of being bullied on Instagram.
Brian Montgomery spoke of his son Walker, who died three years ago, also at 16. One evening, Walker fell victim to an online predator on Instagram posing as a teenage girl. He was goaded into exchanging sexual photos, and the predator then demanded $1,000 to keep them offline. A few hours later, Walker killed himself.
American teenagers spend an average of nine hours a day online, and most parents have no idea what’s happening in that virtual world.
Social-media companies know how to capture attention, feed insecurities and create social anxiety. Unlike other industries, tech companies have a legal safe harbor to act irresponsibly, even maliciously.
The whistleblowers who addressed the Senate weren’t random employees who were briefly at Meta.
Sattizahn told the committee he wanted to use his PhD in integrative neuroscience to make these technologies better.
But over the six years he spent at Meta, he saw the company consistently choose profits over safety. “Product teams are afraid to do anything that could decrease engagement,” he said. “We were directed to write reports to limit harm to Facebook. Lawyers were in control of my research.”
He said Meta controlled what topics could be researched and how, and it monitored studies in progress so it could halt any that seemed to be heading in a negative direction. To further protect itself, Sattizahn said, “Meta demanded third-party contractors be used to house reports of user harm so Meta could claim ignorance to ‘knowing’ these findings.”
So, he was asked, why do research at all? Sattizahn’s answer exposed the company’s cynicism. “Some research is necessary,” he said, “to create a paper trail to show you were doing it.” It was all for show.
Savage, who has a graduate degree in experimental psychology, said her four years heading youth-safety research at Meta convinced her that user engagement is the company’s only real priority.
Meta is aware children are being harmed and does nothing about it. “It was not uncommon in virtual reality for children to experience bullying, sexual assault, to be solicited for nude photographs and sexual acts by pedophiles, to be regularly exposed to mature content like gambling and violence and to participate in adult experiences like strip clubs and watching pornography with strangers,” she said.
“I wish I could tell you,” Savage told senators, “what percentage of children using VR experience these harms,” but Meta would not allow her to conduct this research.
Unable to collect the data, she gauged users’ ages the only way she could: by listening to their voices in VR rooms. “Every time,” she said, “the majority have been audibly under the age of 13.”
But if Meta acknowledged the presence of underage users, it would have to kick them off — which it doesn’t want to do because that would decrease the number of active users it’s reporting to shareholders. “It is more profitable,” Savage said, “to pretend to have no way of better identifying the real ages of their users.”
If the Kids Online Safety Act were law, Meta wouldn’t be allowed to turn a blind eye to the virtual assaults and other harms that are all too common on its platforms.
And it couldn’t bury its research or silo its researchers to avoid the truth: the bill also requires social-media platforms to report the frequency of harms minors experience and the steps they are taking to mitigate evils like sexual predation and cyberbullying.
I had no idea what my daughter had access to through her phone.
Parents need to understand that the parental controls on their kids’ phones will not protect them.
The Meta whistleblowers said fewer than 10% of parents use the controls, and they don’t work well anyway because kids know how to get around them.
Parents can’t keep up with this. The only solution is for the tech companies to bear legal responsibility for the harm they cause, just as other companies do.
Maybe they’ll make a little less money, but is that such a tragedy?
As the mother of a child I will always love and miss every day, I don’t think so.
Deb Schmill is the founder and president of the Becca Schmill Foundation, a nonprofit established in memory of her 18-year-old daughter, Becca.