“You have blood on your hands.”
Republican Senator Lindsey Graham spoke these words at a U.S. Senate hearing on January 31. He wasn’t addressing a murderer or a terrorist. He lobbed this charge at Mark Zuckerberg, CEO of Meta and founder of Facebook.
This week, leaders from Meta, TikTok, X, and other social media platforms testified before the Senate Judiciary Committee. They gathered on Capitol Hill in Washington, D.C. The hearing lasted for hours, but it focused on one question: Are these companies doing enough to protect kids and teens?
The hearing began with testimony from parents. They recounted how social media harmed their children. Some kids faced bullying. Some saw harmful or violent content. Algorithms recommended videos promoting unrealistic beauty standards and worse. Parents say these dangerous influences led to some children’s deaths.
Then the CEOs took the stand. They touted the safety features of their apps. At one point, Senator Josh Hawley pointed Zuckerberg to the gathered families.
“Would you like to apologize to them?” said Hawley.
Zuckerberg did so. Some family members held photos of their children.
“I’m sorry for everything you have all been through,” Zuckerberg said. “No one should go through the things that your families have suffered.”
But for many families—and many senators—an apology doesn’t cut it. Throughout the hearing, Republican and Democratic senators found themselves agreeing with one another.
Senator Graham promised to work with Democrats on the problem. “After years of working on this issue . . . I’ve come to conclude the following,” he said. “Social media companies, as they’re currently designed and operate, are dangerous products.”
Several proposed bills, such as the Kids Online Safety Act, could require increased protections for children and annual reviews for companies. For now, no such legislation has passed.
But should the blame fall on the companies? Shouldn’t parents bear responsibility for the products they let their children use?
There’s no simple answer. For example, Meta apps don’t allow users under 13 years old. But young people might (and sometimes do) lie about their age. State officials say Meta knowingly allows millions of underage users. They also accuse social media companies of creating addictive experiences for kids. According to one study, such companies made $11 billion from users 17 years old and under in 2022. Researchers claim these businesses rely on teenage users for profit.
Basic safety features, such as age verification, allow companies to say they’re doing something to help. But is that enough to protect real people?
Little children, let us not love in word or talk but in deed and in truth. — 1 John 3:18