This week, over 40 US states sued Meta, the parent company of social media platforms Instagram, WhatsApp, Facebook, Messenger, and Threads. The lawsuit alleges that Meta knowingly designed its social media sites to be highly addictive to young people, inflicting mental health damage.
The lawsuit homes in on Facebook and Instagram specifically, alleging that the platforms "exploit and manipulate children," according to the attorneys general.
So, why is almost every US state collectively suing Meta? And what outcome do the suing parties anticipate? Here's everything you need to know.
Thirty-three states filed a federal lawsuit against Meta Platforms, Instagram, Meta Payments, and Meta Platforms Technologies. Eight other states filed separate lawsuits making similar claims.
The lawsuit alleges that Meta runs a "scheme to exploit young users for profit" by increasing engagement, harvesting data, falsely advertising safety features, and promoting unhealthy social expectations, sleep habits, and body image.
Most importantly, the lawsuit alleges that Meta knows how its platforms affect young people and fails to act to protect them. Additionally, the plaintiffs claim that Instagram and Facebook do not comply with the Children's Online Privacy Protection Act (COPPA).
The lawsuit states that Instagram and Facebook collect children's personal information without obtaining or verifying parental consent, practices that violate COPPA because the platforms are marketed toward children.
The lawsuit points to Meta's algorithms and claims they are exploitative and predatory. There's not much public information available about how Meta's algorithms work, but we do know that multiple algorithms are responsible for the content users see on Meta's platforms.
Another aspect of the lawsuit claims that Meta knowingly markets its platforms to children despite evidence that Instagram can have a direct impact on mental health and body image issues, especially for teenage girls.
In 2021, Facebook whistleblower Frances Haugen shared internal Facebook documents and research with members of the US Congress. Haugen alleged that the documents, collectively called the Facebook Papers, proved that Meta repeatedly and knowingly prioritized profits over the public good.
The documents also suggested that Meta knew that Instagram's content, filters, and features fueled teenagers' body image issues, anxiety, and depression.
Since then, the US Surgeon General released a social media health advisory for American teens, and lawmakers and lawyers have been grappling with how to keep children safe online in the era of big tech.
The federal lawsuit includes 33 US attorneys general. Instead of individually suing the company, the 33 plaintiffs are combining their resources and legal expertise to present a united front against online harms to children. Many legal experts and news publications liken this lawsuit to those brought against Big Tobacco and Big Pharma, both of which resulted in severe ramifications and costly payouts.
"Just like Big Tobacco and vaping companies have done in years past, Meta chose to maximize its profits at the expense of public health, specifically harming the health of the youngest among us," Phil Weiser, Colorado's attorney general, said in a statement.
The plaintiffs in the federal lawsuit are seeking financial penalties from Meta, while the eight individual states are seeking injunctive relief to stop Meta from using certain features named in the lawsuits that allegedly harm children.
The lawsuits are expected to be a lengthy legal battle, as Meta is likely to fight them. According to The New York Times, Weiser said in a news conference that he filed the lawsuit because he was unable to reach a settlement agreement with Meta out of court.
The last few years have been pivotal for those seeking extra child protection online. Tech companies like Amazon, Google, YouTube, Microsoft, and Meta have been hit with lawsuits in the US and the European Union for failure to comply with online child safety laws.
TikTok is also embroiled in ongoing legal issues regarding child safety, as an investigation was launched in 2022 by 46 US attorneys general to find out if TikTok violated consumer protection laws.
Although research is still limited, many researchers and mental health advocates blame social media for a decline in teens' social skills, socialization, and emotional and mental well-being.
Many parents agree with this sentiment, too, but are caught between a rock and a hard place: they don't want to deprive their children of online interaction, yet they want to preserve their kids' real-life relationships, mental health, and social skills.
Lawmakers seem to understand the importance of children staying connected online in the digital age, but insist it must happen within proper boundaries. The issue is that parents work, raise multiple children, care for aging parents, maintain a household, and keep up their own personal and social lives. Most parents do not have the bandwidth to constantly and accurately monitor their children online, which leaves their kids vulnerable to Big Tech.
But what happens when Big Tech prioritizes money over impressionable and vulnerable children? Attorneys and lawmakers agree tech companies have a responsibility to improve the safety measures on their platforms to keep kids connected yet safe and healthy.