Thirty years of legal protection under threat: Meta and Google face a crisis
The legal shield that tech giants have relied on for three decades to dodge liability is now facing unprecedented challenges.
Last week, Meta and Google-owned YouTube lost two separate jury trials, with combined damages of about $400 million.
At the same time, a wave of new lawsuits has been filed, with plaintiffs' attorneys systematically dismantling the long-standing legal immunity of tech platforms by finding ways around Section 230 of the U.S. Communications Decency Act.
The Communications Decency Act was passed by the U.S. Congress in 1996 and signed into law by then-President Bill Clinton. Section 230 allows websites to moderate content without being held liable for the user-generated content they choose to leave up.
Over the past three decades, platforms such as Meta, Google, TikTok, and Snap have all benefited from this provision, positioning themselves as neutral platforms and thereby avoiding a flood of potential lawsuits.
As the tech industry moves from the era of traditional search and social networks into a new landscape led by artificial intelligence, the nature of legal risk is also quietly changing. Platforms are no longer just passively carrying users’ content; instead, they actively shape user experiences through algorithmic recommendations, autoplay, and even AI-generated content.
Two defeats at trial: product design becomes the breakthrough point
Last week, a plaintiff using the pseudonym Jane Doe filed a class-action lawsuit against Google, alleging that the company's AI model generated summaries and links that leaked personally identifying information of Jeffrey Epstein's victims, including names, phone numbers, and email addresses.
According to CNBC, plaintiffs' attorney Kevin Osborne said the lawsuit was filed after Google refused the plaintiffs' request to delete victims' contact information from the AI model, and that because such information spreads extremely fast, the case must move quickly.
Osborne added that, given Meta's courtroom defeat last week, the timing was "pure coincidence," but the common thread in these cases is that plaintiffs are looking for ways around Section 230.
Last week, a jury in New Mexico found that Meta, Facebook's parent company, was liable in a case involving child safety, while a jury in Los Angeles found Meta negligent in a separate personal injury case.
Both companies said they plan to appeal last week’s rulings.
Legislative gridlock and judicial outlook
In the U.S. Congress, lawmakers from both parties have previously proposed various reforms to Section 230 of the Communications Decency Act, but none has been enacted.
During his first term, Trump supported imposing more restrictions on social media companies; during his 2020 campaign, Joe Biden also said publicly that the provision should be repealed.
Nadine Farid Johnson, policy director at the Knight First Amendment Institute at Columbia University, attributed the legislative difficulty to “these issues being extremely complex.”
Farid Johnson is calling on Congress to take a more cautious reform path, proposing that technology companies qualify for Section 230 protection only if they meet specific conditions on data privacy and platform transparency.
Legal experts say that, on appeal, these cases may ultimately reach the U.S. Supreme Court, which could deliver a definitive ruling on whether such platforms are entitled to legal protection.
David Greene, senior legal counsel at the Electronic Frontier Foundation, also noted that there is currently no consensus in the legal community on whether product features are protected by Section 230 of the Communications Decency Act, or even by the First Amendment.