Juries find social media platforms knowingly harmed children

Juries in California and New Mexico have ordered social media companies to pay hundreds of millions of dollars for harming children with addictive algorithms in two unprecedented verdicts this week.

Jurors deliberating in Los Angeles on Wednesday found Meta and Google’s YouTube were negligent in turning over their platforms to algorithms that hooked children into staying online for as long as possible, without regard to their sleep or emotional health.

They awarded $3 million — $2.1 million from Meta and $900,000 from YouTube — in actual damages to a 20-year-old woman known only by the initials KGM for a childhood addiction to social media that aggravated her mental illness as she spent “all day long” on the platforms.

Jurors later recommended an additional $3 million in punitive damages after finding the companies had acted with malice, though the judge has the final say on that award.

Snap and TikTok settled KGM’s lawsuit out of court before the trial began on undisclosed terms.

In a separate case Tuesday, a New Mexico jury imposed a $375 million penalty on Meta for violating state consumer protection laws.

New Mexico Attorney General Raul Torrez built the case on investigators who posed as children on social media and then received sexual solicitations from pedophiles, thanks to the algorithms that connected them.

These verdicts are the first of several state and federal trials brought by families and school districts that have soured on social media amid reports of increased youth anxiety, depression and suicidal thoughts.

Meta, which owns Facebook and Instagram and had $201 billion in revenue last year, has pledged to appeal both verdicts. YouTube did not immediately respond to a request for comment.

“We respectfully disagree with the verdict and are evaluating our legal options,” a Meta spokesperson said Wednesday.

‘Game-changer’

Attorney Ed Howard, a senior counsel for the Children’s Advocacy Institute at the University of San Diego School of Law, called the verdicts “a game-changer” that opens a legal path to holding companies accountable for “knowingly harming children.”

“This is a watershed moment in the history of consumer relationships to these platforms,” Mr. Howard said. “The common theme in the New Mexico and Los Angeles cases is that it’s the AI-written, user-engagement-at-all-costs algorithms that make them liable.”

Algorithms are the mathematical rules that determine what social media users see on their screens and when they see it.

The increased addictiveness of AI-enhanced algorithms has led to dozens of states banning smartphones in K-12 schools over the past three years, citing harmful effects on student performance and mental health.

Several school districts have filed lawsuits against social media companies, noting that the average teen now spends at least four hours a day online.

“Rising social media usage among young people is correlated with increases in the rates of depression, anxiety, and isolation,” said Jonathan Butcher, a senior fellow of education policy at the conservative Heritage Foundation. “Big Tech did not have the foresight to protect young people from the potential harms of this technology.”

New York Gov. Kathy Hochul, a Democrat, hailed the verdict Wednesday in a statement as “a victory for kids and parents across the nation.”

The Foundation for Individual Rights and Expression, a Philadelphia free speech group, called the verdicts a threat to free speech and a free press.

“These cases are part of a troubling, dangerous trend of treating speech as a product,” said Ari Cohn, the foundation’s lead counsel for tech policy. “The clear First Amendment rule has long been that speakers cannot be sued because protected speech has harmful effects on listeners or readers. Otherwise, as the courts have noted, nobody would be able to say much of anything at all.”

Legal precedents

Historically, federal law has shielded social media companies from lawsuits by distinguishing user-generated content on their websites from a product sold by the company.

Child safety advocates said they hoped the increased threat of lawsuits after this week’s verdicts would inspire the social media companies to limit autoplay and infinite scroll features blamed for flooding developing brains with stimulation before they can digest it.

“These verdicts open the door to more lawsuits arguing that features like algorithms or infinite scroll can harm users, especially minors,” said Allison Bonacci, director of education for Cyber Safety Consulting, which works with schools to develop technology policies.

An Aug. 1 study in JAMA Network Open found that students ages 10 to 17 averaged nearly an hour each school day on smartphones. More than 70% of that was devoted to “wasting time” on social media, with TikTok the most popular platform.

Psychologist Keith Humphreys, a Stanford University addiction researcher, predicted that social media platforms will become more eager to “protect themselves from liability” by easing up on the algorithmic triggers that encourage compulsive use.

“Children were given access to these technologies without much consideration of the harm that could result,” Mr. Humphreys said.

Future trends

New Mexico’s attorney general has called on Meta to implement stricter age verification requirements and take stronger action to remove predators from platforms such as Facebook, Instagram and WhatsApp.

The jury sided with state prosecutors who argued that Meta concealed what it knew about child sexual exploitation to favor profits over public safety.

In Los Angeles, the jury sided with the argument that Meta and YouTube designed their websites to be addictive.

Farther north in Oakland, California, U.S. District Judge Yvonne Gonzalez Rogers is scheduled to hear in June a similar lawsuit brought by six public school districts from around the country.

Social media platforms have repeatedly insisted that they take steps to keep children safe and pointed out that digital addiction is not a recognized mental disorder.

Kris Perry, executive director of the nonprofit research network Children and Screens, said the Los Angeles verdict “marks a significant shift in how society understands responsibility for the digital environments that children grow up in.”

“A growing body of research has shown that certain platform features, including algorithms, persistent notifications, and engagement-driven design, can shape how children think, feel, and behave on and offline,” Ms. Perry said in a statement Wednesday.

“They are designed to interact with developing brains in ways that increase vulnerability for some young users,” she said.

Psychologist Matthew Mulvaney, a parenting researcher at Syracuse University, said this week’s verdicts set the stage for social media companies to offer “meaningful regulation and transparency” for the first time.

He compared social media platforms to cigarette companies that evaded legal responsibility for their products’ cancerous effects for decades before litigation and legislation forced them to change.

“Major tech companies have been responsible for so much harm to kids, not only in failing to protect children from predators on their platforms but in a wide array of areas whenever child safety comes up against profit and market share,” Mr. Mulvaney said.
