An 11-year-old dies by suicide after she is sexually exploited on Instagram and Snapchat. Two teenagers are killed in a crash following a race using a Snapchat speed filter. A sexual predator uses Facebook to lure a 15-year-old girl into trafficking.
Social media companies have for decades been shielded from legal consequences for what happens on their platforms. But a sharp shift in public opinion and a turn in recent court rulings have the industry nervous that this could change — especially when the damage is done to children online.
And for the first time in nearly 30 years, lawyers for grieving families see an opening.
Lawsuits blaming social media platforms for teen suicides, eating disorders and mental health crises have picked up in the months since Facebook whistleblower Frances Haugen told Congress that the company knew its products were addictive to kids and that their mental health was suffering as a result. And a bill moving through the California statehouse would make companies liable for addicting children, drawing comparisons to a strategy once used against the tobacco industry.
The nation’s largest federal appeals court ruled last year that the legal shield — known as Section 230 of the U.S. Communications Decency Act — didn’t apply to a Snapchat filter blamed for the car crash deaths of two teenagers. Texas’ Supreme Court let a sex trafficking case against Facebook proceed, citing Congress’ 2018 changes to federal law, and an appellate court recently rejected Facebook’s attempt to sidestep that lawsuit. Georgia’s Supreme Court in March likewise ruled that a separate complaint over Snapchat’s speed filter can move forward because the plaintiffs plausibly allege the company designed a risky product.
“I am pretty optimistic that tides are turning and we are going to see a backlash on Section 230 from the courts,” said Carrie Goldberg, a New York-based trial attorney who used product liability law to challenge Grindr’s federal shield. Powerful social media companies, she added, “were never supposed to be immune from liability.”
Tech companies and their lawyers are watching with trepidation. Cathy Gellis, an internet attorney, says the industry is increasingly turning to the First Amendment — rather than Section 230 — as the first line of defense in content moderation lawsuits.
“It’s all on fire,” she said.
The tech industry’s legal protections, enshrined in 1996, came from the thinking that companies trying to create a free marketplace of ideas online shouldn’t have to worry about getting shut down because of something someone said or did that the website couldn’t control. But that was when Netscape reigned supreme, email arrived via dial-up modem and “apps” weren’t yet gleams in a techie’s eye.
Nearly three decades later, trial attorneys are testing that shield with a battery of cases brought by parents of kids and teens whose deaths or mental health crises they blame on social media. The roster includes a claim by the mother of 11-year-old Selena Rodriguez, who alleges her daughter was addicted to Snapchat and Instagram for two years and was pulled into sending sexually exploitative messages. The lawsuit details a downward spiral of depression, eating disorders and self-harm that ended in suicide.
Goldberg’s case against Grindr alleged that the hookup app eased the way for her client Matthew Herrick’s abusive ex-boyfriend to set up a false profile that disclosed Herrick’s location — and said, falsely, that he was HIV-positive and liked violent, unprotected sex. Stalkers began shadowing Herrick, who sent cease-and-desist letters and filed police reports even as Grindr said it couldn’t block the profile, the lawsuit alleged.
Herrick’s claim ultimately failed in 2019, with the U.S. 2nd Circuit Court of Appeals citing the Section 230 shield. But Goldberg’s argument — that the case was a question of product safety and liability rather than one of content — was later used in a key case against Snapchat.
Gellis and others in the tech industry argue that any dent in the federal shield can have far-reaching consequences for the internet, and that unfavorable rulings could come back to haunt internet companies trying to fight state laws. Texas and Florida have enacted laws that bar platforms from censoring content, which the industry is challenging in court, while a pair of bills aimed at making the internet safer for kids is advancing in California.
“It's a problem to have any language on the books that Section 230 is supposed to block,” she said. “As a litigator, I'll look to using prior precedents upholding Section 230 to protect people from these sorts of bad laws being enforced, but it's playing with fire if that's the only thing protecting them.”
One case in particular has been widely cited by California lawmakers who want to make social media companies liable for addicting children. The lawsuit, known as Lemmon v. Snap, alleged that Snap was at fault for the high-speed car crash that killed two teenagers while they were using Snapchat’s “speed filter” for a virtual race, since the filter was a feature the company designed itself.
The San Francisco-based U.S. 9th Circuit Court of Appeals ruled last year that the company’s design of the speed filter wasn’t covered by Section 230.
California lawmakers and advocates see the Snapchat ruling as a green light for a state bill that would explicitly authorize lawsuits against social media companies if they’re shown to hook kids with their products. Assemblymember Jordan Cunningham, the bill’s Republican lead co-author, says the decision shows that his proposal won’t violate federal law.
Trial attorneys also see it this way. Matthew Bergman, who six months ago founded the Social Media Victims Law Center to bring major product liability cases over kids’ and teens’ social media addiction, compares the push to his years of suing companies over asbestos poisoning. He launched his new crusade after Haugen’s testimony and warnings from the U.S. surgeon general about the mental health harms of teen social media use. One of his most recent cases invokes Haugen’s leaks to allege that Instagram purposefully targets youngsters.
He sees signs everywhere that the tide is turning. He pointed to the recent court decisions, and even to the Supreme Court’s surprisingly close 5-4 vote to block Texas’ social media law on censorship.
“The era of goodwill toward social media platforms is waning,” Bergman said.
Some lawyers in tech’s corner say they don’t see judges making a big swing away from longtime orthodoxy on the law.
“Courts are certainly scrutinizing 230, but courts are bound by precedent,” said Adam Sieff, a lawyer who represents tech companies in Section 230 and First Amendment claims. “Virtually without exception,” he said, courts are finding that those precedents uphold the Section 230 defense.
Eric Goldman, co-director of Santa Clara University’s High Tech Law Institute, argues the 9th Circuit decision in the Snapchat case is fairly narrow. Still, he is alarmed at the state policymaking that he sees as meddling in private companies’ operations and know-how.
“Legislatures are enacting laws that they know are garbage,” he said. “They don’t care about actually implementing policy — it’s all about the press releases and tweets. When states pass laws that are garbage we hope that the courts will fix the obvious problems that the legislatures have created.”
In California, the tech industry and internet freedom groups like the Electronic Frontier Foundation are working feverishly to kill Cunningham’s bill, which would explicitly create a cause of action for liability lawsuits against social media companies — a red line for the industry. But the proposal has advanced with significant momentum and could get a final vote in August.
The proposal was narrowed last month to allow only public prosecutors, rather than all Californians, to bring cases. But supporters still see it as a huge step toward making social media companies liable for features with documented risks — and opponents see it as a major threat to tech companies’ autonomy.
That said, Sieff and Eric Goldman suggest that lawyers in the business of suing tech companies may be exaggerating the significance of the recent decisions.
“The prevailing and uniform interpretation of Section 230 is squarely on the side of the platforms,” Sieff said, “and the plaintiffs’ bar is definitely stretching, or willfully misreading, decisions like Lemmon well beyond their application.”
By Susannah Luthi
“A legal shield for social media is showing cracks”
Politico, July 14, 2022
www.politico.com/news/2022/07/14/legal-shield-social-media-facebook-00045226