New allegations against Meta, which operates Facebook, WhatsApp, and Instagram, claim, with piles of evidence, that the company downplayed its products’ harmful side effects on children. In short, Meta executives knew that the social media apps they run hurt kids and did nothing to change course. Charlotte Alter reports in Time,
The following allegations against Meta come from the brief filed in an unprecedented multidistrict litigation. More than 1,800 plaintiffs — including children and parents, school districts, and state attorneys general — have joined together in a suit alleging that the parent companies behind Instagram, TikTok, Snapchat, and YouTube “relentlessly pursued a strategy of growth at all costs, recklessly ignoring the impact of their products on children’s mental and physical health,” according to their master complaint. The newly unsealed allegations about Meta are just one small part of the sprawling suit.
The allegations get worse than that. The complaint alleges that Meta targeted young users as part of this “reckless” growth strategy, going so far as to dismiss proposals from company employees who suggested ways to protect kids from getting trapped in social media addiction. Alter further reports,
Plaintiffs claim that since 2017, Meta has aggressively pursued young users, even as its internal research suggested its social media products could be addictive and dangerous to kids. Meta employees proposed multiple ways to mitigate these harms, according to the brief, but were repeatedly blocked by executives who feared that new safety features would hamper teen engagement or user growth.
The New York Post reported that Meta employees referred to Instagram as a “drug” in text exchanges and described the company as “pushers.” A Meta representative has said the allegations are overblown and rest on selective interpretation of the documents.
The Post also reported on Meta’s frightful “17x strike policy,” under which accounts could violate its sexual exploitation standards as many as 17 times before being suspended. Meta is also accused of suppressing its own research showing that stepping away from Facebook was good for users’ mental health:
The legal brief also unpacks Meta’s alleged handling of research codenamed “Project Mercury,” a 2020 study that examined what happens when users stopped using Facebook and Instagram for a month compared to those who continued normal usage.
To Meta’s alleged “disappointment,” the study showed “[p]eople who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness, and social comparison,” according to documents cited in the filings. Social comparison refers to people gauging their self-worth by measuring themselves against others.
We’ve seen stories like this before in connection with Zuckerberg’s social media empire, but this new litany of complaints looks far harder for the tech behemoth to fight off. Still, it’s a version of the same tale: Social media companies often prize engagement and growth over everything else, including the mental health of their users. Mental and spiritual wellbeing are afterthoughts next to business growth.
