
Meta downplayed risks to children and misled the public, court filings allege

In this post:

  • A new court filing accuses Meta of exposing teens to harm and misleading the public.
  • Plaintiffs claim Meta did not do enough to protect teens across its platforms.
  • Plaintiffs’ brief paints an uncomfortable picture of Meta’s internal organization.

A new court filing alleges that sex trafficking was difficult to report and widely tolerated on Meta's platforms. The brief, unsealed on Friday, is part of a broader lawsuit filed against four social media companies.

In the plaintiffs' brief, Instagram's head of safety and well-being, Vaishnavi Jayakumar, testified that when she joined Meta in 2020, she learned the company had a 17-strike policy for accounts engaged in sex trafficking. This meant a user could commit the violation 16 times, with the company suspending the account only on the 17th occurrence. “By any measure across the industry, [it was] a very, very high strike threshold,” she said.

Meta accused of downplaying risks to children and misleading the public

The allegations against Meta stem from a brief filed in an unprecedented multi-district litigation. More than 1,800 plaintiffs, including children, parents, school districts, and state attorneys general, have joined the suit. It claims that Meta and the parent companies behind TikTok, Snapchat, and YouTube “relentlessly pursued a strategy of growth at all costs, recklessly ignoring the impact of their products on children’s mental and physical health.”

According to the brief filed by the plaintiffs in the Northern District of California, Meta is alleged to have been aware of serious harms on its platforms and to have engaged in a pattern of deceit to downplay risks to young users. The plaintiffs claim that internal company documents corroborate the testimonies. They further allege that Meta knew millions of adults were contacting minors on its platforms.


The plaintiffs also claimed that Meta knew its products worsened mental health issues in teens, and that content related to eating disorders, suicide, and child sexual abuse was frequently detected yet rarely removed. The brief further alleges that the company failed to disclose these harms to the public or to Congress, and refused to implement safety fixes that would have protected young users from exposure.

“Meta has designed social media products and platforms that it is aware are addictive to kids, and they’re aware that those addictions lead to a whole host of serious mental health issues,” says Previn Warren, the co-lead attorney for the plaintiffs in the case. “Like tobacco, this is a situation where there are dangerous products that were marketed to kids,” Warren adds. “They did it anyway, because more usage meant more profits for the company.”

Brief paints an unflattering picture of Meta's internal structure

The plaintiffs’ brief, first reported by TIME, is based on sworn depositions of current and former Meta executives, internal communications, and company research and presentation materials obtained during discovery. It includes quotes and excerpts from thousands of pages of testimony and internal company documents. TIME was unable to view the testimony or research quoted in the brief at the time because the materials were under seal.


Still, the brief paints an unflattering picture of the company’s internal research and deliberations about problems that have affected its platforms since at least 2017. The plaintiffs note that since then, Meta has courted young users even as its internal research suggested its social media products could be addictive and dangerous to kids. According to the brief, Meta employees proposed ways to reduce these harms but were often blocked by executives.

Meanwhile, in the years since the lawsuit was filed, Meta has rolled out new safety features designed to address some of the problems the plaintiffs describe. Last year, the company unveiled Instagram Teen Accounts, which set accounts to private by default for users between 13 and 18, limit sensitive content, turn off notifications at night, and block messages from unconnected adults.

