Lawmakers target social media harms to children

Mar 3, 2024

Originally published in the Spring 2024 issue of California Publisher.

By Jason M. Shepard

What legal duty do tech companies have to protect children from the harms of social media?

Facebook, Instagram, Snapchat, TikTok, X (formerly Twitter), and other platforms use algorithms and other design features to hook children into endless hours of usage, exposing them to exploitative content that causes bullying, eating disorders, depression and anxiety, according to critics.

“Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk,” Illinois Democratic Senator Dick Durbin, chair of the U.S. Senate Judiciary Committee, said at a tense hearing in January.

As parents in the gallery held up photos of their children lost to suicide, the CEOs of leading social media companies faced a barrage of bipartisan criticism about the harms their platforms are causing children.

“Your products are killing people,” Republican Senator Josh Hawley of Missouri told the CEOs. Senator Lindsey Graham, Republican of South Carolina, said they had “blood on your hands.”

The hearing underscored an intense period of legal wrangling, not seen since the early days of the internet, over tech companies’ liability for harms to children online.

In 1997, the U.S. Supreme Court in Reno v. ACLU struck down provisions of the Communications Decency Act, which criminalized indecent speech on the internet if children could access it. The law violated the First Amendment, the court ruled in a landmark decision that paved the way for the internet we have today.

“It is true that we have repeatedly recognized the governmental interest in protecting children from harmful materials. But that interest does not justify an unnecessarily broad suppression of speech addressed to adults,” the Supreme Court wrote. “As we have explained, the Government may not ‘reduc[e] the adult population … to … only what is fit for children.’”

In the decade following the Reno decision, Congress passed several laws aiming to protect children from explicit content, but many of them were struck down by the courts as violations of the First Amendment. The Child Online Protection Act, passed in 1998 as a more narrowly tailored law than the CDA, spent a decade in litigation until a permanent injunction against it was issued in 2009.

Some laws aimed at protecting children from the harms of the internet were ultimately upheld as constitutional, including the 1998 Children’s Online Privacy Protection Act (COPPA), which limited some tech practices aimed at children under 13.

Now, with a new generation of children growing up with ubiquitous social media, a new inflection point seems to have been reached.

Congress is debating several bills creating new limits for social media companies. One bill, known as COPPA 2.0, would extend the protections of COPPA to children under 16 years old.

Another bill, the Kids Online Safety Act (KOSA), would require social media companies to prevent and mitigate harms to children through their design and algorithms, limit the collection of minors’ personal data, and give guardians tools to supervise minors’ usage. The bill also would require disclosure of how personal information is used, require annual reporting on foreseeable risks of harm to minors, and fund studies on those risks. It would empower greater enforcement by both state attorneys general and the Federal Trade Commission.

Both COPPA 2.0 and KOSA passed the Senate Commerce Committee unanimously in July 2023. As of February 2024, KOSA boasts 62 bipartisan Senate cosponsors.

At the state level, lawmakers in at least 35 states in 2023 introduced bills to mitigate harms to children on social media. Several states, including Utah, Arkansas and Ohio, passed laws requiring age verification and other limitations before someone could create a social media account.

All of those laws now face lawsuits challenging age verification as unconstitutional.

Age-verification requirements “violate the First Amendment,” the American Civil Liberties Union said in a statement challenging the state laws. “They rob users of anonymity, pose privacy and security risks, and could be used to block some people from being able to use social media at all.”

The ACLU noted that while social media platforms present risks, they also open up a world of learning.

“A wealth of communication and expression takes place online. Children and adults use social media to share news, opinions, and ideas; participate in social movements; interact with government representatives; explore their spirituality; and express themselves creatively,” the ACLU wrote.

Numerous civil lawsuits also seek to hold social media companies liable for failing to protect children from harm.

Attorneys general from 33 states, including California and New York, filed a lawsuit in U.S. District Court for the Northern District of California against Meta, the parent company of Facebook and Instagram, accusing the company of designing and deploying harmful features that “addict children and teens to their mental and physical detriment.”

“Meta has repeatedly misled the public about the substantial dangers of its social media platforms,” the lawsuit alleges. “It has concealed the ways in which these platforms exploit and manipulate its most vulnerable consumers: teenagers and children.”

The lawsuit alleges Meta is in violation of the federal COPPA and several state laws, including California’s False Advertising Law and Unfair Competition Law.

“Meta has been harming our children and teens, cultivating addiction to boost corporate profits,” California Attorney General Rob Bonta said in a statement. “We must protect our children and we will not back down from this fight.”

In at least nine states, attorneys general have filed individual lawsuits alleging similar violations, according to the Associated Press.

As these cases make their way through the courts, a potential early precedent may come from a lawsuit over California’s AB 2273, the California Age-Appropriate Design Code Act (CAADC), signed into law by Gov. Gavin Newsom in September 2022.

The CAADC aims to protect the privacy of users under the age of 18 and requires companies to use “age-assurance” tools to estimate the age of their users to a reasonable level of certainty. The law limits algorithms and “dark patterns,” or design features that nudge children to spend more time on the platforms by “subverting or impairing user autonomy, decision-making or choice.”

The law also requires companies to create a “Data Protection Impact Assessment” of their platforms, which the state argues will prompt companies to proactively assess risks and take greater action to reduce preventable harms.

NetChoice, a nonprofit national trade association representing tech companies, filed a lawsuit in the U.S. District Court for the Northern District of California, alleging, among other things, that the law violates the First Amendment. The case is NetChoice v. Bonta.

The law acts as a prior restraint on speech, is overbroad and vague, and fails to meet the stringent “strict scrutiny” standard that applies when laws regulate speech based on its content, NetChoice argued.

The law is a “content-based restriction on speech that will subject a global communications medium to state supervision and hobble a free and open resource for ‘exploring the vast realms of human thought and knowledge,’” NetChoice said in its complaint.

U.S. District Court Judge Beth Freeman granted a preliminary injunction against the law in September 2023, finding that NetChoice is likely to prevail on the merits.

California is appealing Judge Freeman’s ruling to the Ninth Circuit Court of Appeals.

The case has generated significant attention from outside groups filing amici curiae briefs.

A dozen briefs were filed in support of California, arguing that the district court misapplied First Amendment principles and discounted the harms children face on social media.

For example, the American Federation of Teachers and its California chapter filed a brief outlining children’s “unprecedented decline in their mental health and wellbeing,” arguing that among the causes are social media practices that exploit young people’s vulnerabilities.

A group of privacy scholars filed a brief arguing that the First Amendment does not prohibit data privacy laws and that the district judge applied the wrong standard of review in the case.

Meanwhile, others filed briefs urging the Ninth Circuit to uphold the district court’s ruling.

The Reporters Committee for Freedom of the Press and 14 other press organizations said the vague law could restrict access to lawful content, including public interest journalism.

The ACLU also opposes the law. Its brief argues that strong data and consumer privacy laws are essential, but they need to be designed carefully and narrowly, and that California’s is not. It urges the Ninth Circuit to strike down the law on narrow grounds, keeping the door open for a better law in the future.

“Legislators around the country are understandably concerned with the privacy, wellbeing, and safety of children. But restricting and burdening speech, including speech that addresses difficult and critical concepts like parental abuse or depression, is not the right policy solution,” Vera Eidelman, senior staff attorney with the ACLU Speech, Privacy and Technology Project, said in a statement.

“The government cannot regulate speech solely to protect kids from ideas it thinks are unsuitable or harmful. Nor can it limit adults’ access to speech in the name of protecting children.”

Jason M. Shepard, Ph.D., is a media law scholar, and professor and chair of the Department of Communications at California State University, Fullerton. Contact him at jshepard@fullerton.edu or on Twitter at @jasonmshepard.
