Facebook, Instagram algorithms intentionally harm kids, but Meta won't stop. We're suing.


Letitia James and Matthew Platkin
Special to USA TODAY Network

Meta — the parent company of Instagram and Facebook — has been fueling what the U.S. surgeon general calls “a national youth mental health crisis.” Features like curated algorithms, infinite scrolling, haptic notifications that tap or vibrate, quantified popularity via likes or follows, and face-enhancing filters are contributing to higher rates of suicide, self-harm, low self-esteem, depression, anxiety, sleep disturbances and more among children and teens. Despite these demonstrated harms, Meta refuses to do better. 

Companies that deny responsibility for serious safety issues with their products rather than risk their profit margins are nothing new. But neither is putting a stop to them. Echoing efforts against Big Tobacco in the 1990s, our bipartisan coalition of 42 state attorneys general sued Meta last month to hold it accountable for prioritizing profits over young users’ mental health. Filed in both federal and state courts, these coordinated lawsuits are the culmination of years of investigation by hundreds of public servants from across the country and the political divide.


Since 2022, in response to state subpoenas or demands, Meta has sent our offices tens of thousands of pages of internal records that illustrate how Meta’s core business model relies on maximizing the time users spend on its platforms — without due regard to the resulting harms. Although most of the details in these records are still under court seal, what they reveal is no secret to any parent of a young teen.  


Meta knows Instagram harms teens' self-esteem

If anything, even Meta’s public records confirm our worst nightmares. Meta’s algorithms regularly stoke youth engagement by pushing provocative content to their feeds, including about eating disorders, violence, bullying and negative self-talk. Girls have been especially susceptible to the negative impact of Meta’s social media platforms, with rates of suicide, self-poisoning and depression skyrocketing as Instagram gained popularity in 2012 — the same year Meta purchased Instagram.

We do not have to look far for evidence of this crisis within our own states, where stories of kids suffering as a result of their addiction to Instagram and its peer platforms have only grown in recent years. In recent weeks, New York announced two bills to regulate social media for users under age 18 after spikes in teen suicides, depression and anxiety related to social media. Earlier this year, New Jersey enacted legislation establishing a 19-member commission to study the impacts of social media on youth and recommend guidelines to improve student health and academic performance.  


None of this should come as a surprise to Meta. The company has long tracked its impact on youth, with its own research showing that Instagram makes many feel worse about themselves. Instead of reversing course as a cascade of social science research confirmed what the company already knew, Meta doubled down while publicly denying any wrongdoing. In 2018, Mark Zuckerberg claimed that Meta — then still called Facebook — was focused on being “good for people’s wellbeing.” The company now issues regular reports to try to create the illusion that it is making good on this claim, but our investigation has revealed just how rife with misrepresentations Meta’s reports really are. 

In fact, Meta’s own employees have become so fed up with the company’s public misinformation efforts that they have started speaking out. In September 2021, after leaving her role on Meta’s civic integrity team, Frances Haugen filed a whistleblower complaint and testified before Congress that “Facebook became a $1 trillion company by paying for its profits with our safety, including the safety of our children.” The next month, Facebook rebranded as Meta. 


Throughout, Meta has evaded both moral responsibility and legal accountability. It has consistently attempted to convince the public that its social media platforms actually help, rather than hurt, youth. Time and again, it has disavowed any connection between its design choices or product policies and the crisis embroiling its young users — including those under 13. 

But parents know Meta’s role in this crisis. We know its role. And we know that Meta knows its role. Just watch us prove it. 


Letitia James is attorney general of New York. Matthew J. Platkin is attorney general of New Jersey.