Introduction
In October 2023, more than 40 state attorneys general filed lawsuits against Meta in federal and state courts. The lawsuits allege that Meta knowingly used features on Instagram and Facebook to maximize revenue with no regard for the negative mental health impact these platforms had on minors. The plaintiffs argue that these manipulative features led teens to develop an addiction to social media, resulting in depression, anxiety, eating disorders, self-harm, and, in severe cases, suicide.
The following month, November 2023, California Attorney General Rob Bonta released a less-redacted version of the complaint filed by the 33 attorneys general, revealing previously withheld information. The new filing indicated that Meta knew millions of children under 13 were using its platforms and still collected their data.
Meta’s awareness of the millions of underage children on its platforms contradicts previous testimony given by the Head of Instagram, Adam Mosseri. The complaint also highlighted Meta’s prioritization of increasing teen users’ time spent on the platform while downplaying the harm teen users experience.
Rampant Bullying and Unwanted Sexual Advances
Arturo Bejar, who worked in online security, safety, and protection at Facebook, testified before the Subcommittee on Privacy, Technology, and the Law on November 7, 2023. Bejar’s testimony included statistics from the company’s internal research, figures he had previously sent in an email to senior leadership, including Mark Zuckerberg. According to Bejar:
Approximately 21.8% of 13-15-year-olds reported bullying in the past week.
39.4% of children in the same age group experienced negative comparisons in the previous seven days.
Initially, 24.4% of 13-15-year-olds reported receiving unwanted advances in a week, but this figure was later revised to 13%, still a concerning number.
Sheryl Sandberg expressed empathy toward Bejar, whose daughter had been on the receiving end of unwanted sexual advances, but offered no concrete action plans. Adam Mosseri requested a follow-up meeting, while Mark Zuckerberg did not respond.
One week after the hearing, a bipartisan group of US Senators demanded communications from the company in a letter to Mark Zuckerberg, Facebook’s founder.
“Meta’s representations to the public and in response to Congressional inquiries concealed and misrepresented its extensive knowledge about the threats to young people on its platforms,” wrote US Senators Richard Blumenthal, Marsha Blackburn, Dick Durbin, Lindsey Graham, Elizabeth Warren, and Josh Hawley.
Child Sex Abuse Material and Predators on Instagram
New Mexico’s Attorney General Raúl Torrez sued Meta in state court in December 2023, alleging that design flaws in Facebook and Instagram allowed child sexual abuse material (CSAM) to spread and led the platforms to recommend underage accounts to child predators.
Investigators in New Mexico created decoy child accounts and found that Meta’s platforms exposed these accounts to sexually explicit content and unmoderated groups related to commercial sex. The platforms also enabled adults to contact minors for explicit images.
New Mexico’s Attorney General accused Meta of prioritizing profit over child safety. Meta, while not directly addressing these allegations, stated its commitment to protecting young users through technology, expert hiring, reporting to the National Center for Missing and Exploited Children, and disabling accounts violating child safety policies.
Exploitation by Design
Facebook, as detailed in the less-redacted court documents, stands accused of practices that exploit and manipulate children. These practices can be categorized into several key areas:
Monetization of Young Users’ Attention: Meta’s business model focuses heavily on monetizing the attention of young users through data harvesting and targeted advertising. This involves designing and deploying features that capture and prolong young users’ time on social media platforms.
Targeted Features and Misrepresentation of Safety: State Attorneys General accuse Meta of designing psychologically manipulative product features that induce compulsive and extended platform use among young users.
One strategy weaponized FOMO (fear of missing out) with disappearing content available through push notifications. Users enabled more notifications to avoid missing these posts, increasing time spent in the app. Dopamine-manipulating recommendation algorithms incentivized “Likes” and social comparison, while audiovisual and haptic alerts lured them back into the app. Despite knowing the potential harm of these features, Meta misrepresented them as safe and suitable for young users.
Concealment of User Harms and Misleading Public Reports: The complaint alleges that Meta concealed internal data showing high user harm rates on its social media platforms, instead publishing misleading reports suggesting a low incidence of such harms. This concealment and misrepresentation extended to the public discourse, where Meta downplayed the impact of its features on mental and physical health.
When researchers at Facebook found interventions that benefited young users at only a modest cost to engagement and revenue, the interventions were rejected. Facebook then downplayed how effective those options had been.
Noncompliance with COPPA: Meta has been found noncompliant with the Children’s Online Privacy Protection Act (COPPA), particularly in handling Instagram and Facebook users under 13. The company allegedly collected personal information from children without obtaining verifiable parental consent and designed its registration processes in a way that allowed underage children to use Instagram. Internal documents revealed Meta’s awareness of its user base, including children under 13, and its decisions regarding this demographic.
Negative Psychological and Physical Impacts: The complaint details the significant negative impacts of Meta’s platforms on young users’ mental and physical health. This includes associations with depression, anxiety, insomnia, interference with education, and other adverse outcomes. Meta was aware of these harms but continued to promote its platforms as safe for young users.
The practices revealed in the complaint suggest a systematic approach to exploiting young users for profit, prioritizing engagement and revenue over the safety and well-being of children and teenagers.
Although Meta had viable options for reducing harm that it estimated would have about a 1% negative effect on revenue, Meta claimed the options were ineffective. Facebook’s own scientists were adamant that they helped.
How to Counter Tactics That Distract
Our brain’s “salience network” prioritizes what seems important based on alerts like notifications. Social media exploits this with vibrations, red dots, and banners that are often less about delivering crucial information than about pulling us back into the app. This constant stimulation tugs at our attention, making it seem like something important is always happening.
Educate yourself and your children about the different types of notifications and their purposes. Explain how some notifications are designed to attract attention and increase platform engagement, often at the cost of real-life interactions and focus.
Identifying Non-Essential Notifications: Work with your children to identify non-essential notifications. This might include alerts from social media apps, game notifications, or any other alerts that are not critical for their day-to-day activities and well-being. Especially with older children, invite them to take ownership of these decisions.
Setting Specific Times for Notifications: Establish times when notifications should be minimized. Consider not having a device in the child’s room overnight.
Using Do Not Disturb Features: Teach children to use the ‘Do Not Disturb’ or similar features on their devices. These can automatically silence notifications during designated times, such as when they are at school or asleep.
Turning Off Push Notifications for Social Media: Turn off push notifications for social media. These are designed to increase engagement and time spent on the platform. Disabling them can help reduce distractions. Try summary notifications that deliver alerts at pre-determined times.
Regular Review and Adjustment: Regularly review and adjust notification settings with your child. Revisit periodically to gauge whether they notice a difference or whether you may need to limit alerts entirely.
Balancing Notifications with Real-World Activities: Encourage and model a lifestyle where digital notifications do not override real-world activities and interactions. Kids often learn their phone habits from adults. Make these same changes to your devices and model a healthy relationship with social media. Discuss the importance of being present in the moment, whether during family activities, studying, or hobbies.
Infinite Scroll and Unit Bias
Have you ever opened your phone to check social media, and before you know it, an hour or even hours have gone by? This may result from social media’s design. The concept of ‘infinite scroll’ on these platforms can be explained through the psychological principle of ‘unit bias.’
This bias can lead us to consume whatever is presented as a single unit. For example, in a study, individuals given a larger scoop to serve M&M’s consumed more than those given a smaller scoop due to the perception of the larger scoop as a single serving.
Similarly, the infinite scroll on social media is like the larger scoop — there’s no clear end to the feed, leading us to scroll longer than we intended. This design is deliberate.
Understanding how ‘infinite scroll’ encourages users to spend more time on the app may empower us to make more conscious choices about our time on social media.
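To make the “no natural stopping point” idea concrete, here is a minimal Python sketch contrasting a bounded page with an endlessly replenished feed. The function names, page size, and post labels are invented for illustration; this is not any platform’s actual code.

```python
import itertools

def finite_page(posts, page_size=10):
    """A bounded feed: one clearly delimited 'serving', like the smaller scoop."""
    return posts[:page_size]

def infinite_feed(recommend_next):
    """An unbounded feed: every scroll fetches another item, so the 'unit'
    never ends and the reader never gets a natural cue to stop."""
    while True:
        yield recommend_next()

# With a finite page, the feed itself signals when you are done.
print(finite_page([f"post {i}" for i in range(100)], page_size=3))

# With infinite scroll, only the reader can decide to stop.
counter = itertools.count(1)
endless = infinite_feed(lambda: f"recommended post #{next(counter)}")
print(list(itertools.islice(endless, 3)))
```

The design choice is the point: a finite page hands the stopping decision to the interface, while an infinite feed hands it entirely to the user’s willpower.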
The Goal of Maximizing Engagement: Explain that the primary goal of these algorithms is to keep users engaged on the platform as long as possible.
Infinite Scroll and Its Effects: Explain how ‘infinite scroll’ provides a never-ending content stream that can lead to spending more time on the app than intended, as there is no natural stopping point.
Understanding Dopamine and Reward Systems: Educate about the brain’s reward system, particularly dopamine, which is released during enjoyable activities, including social media use. Explain how features like ‘Likes’ and ‘Shares’ are designed to exploit this system, creating a loop of seeking more engagement and validation; see the sketch after this list.
Psychological Vulnerabilities of Youth: Discuss how developing brains are more susceptible to these tactics. Highlight how prolonged time on the platform may affect their self-esteem, anxiety, attention span, and real-world relationships. Compulsive social media use can cause measurable changes in the brain, and a concrete example like that can help you understand and explain the real consequences of use.
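As referenced in the dopamine item above, here is a toy simulation of intermittent reinforcement, the unpredictable reward pattern that ‘Likes’ and notifications approximate. The 30% payoff probability and the function name are assumptions invented for this sketch, not measurements of any real platform.

```python
import random

def check_app(p_reward=0.3):
    """One check of the app. It only 'pays off' (a new Like, comment, or
    notification) some of the time -- an unpredictable, intermittent schedule,
    a pattern known to strongly reinforce repeated checking."""
    return random.random() < p_reward

random.seed(42)              # fixed seed so the example is reproducible
checks, rewards = 0, 0
while rewards < 5:           # keep checking until five rewards have arrived
    checks += 1
    rewards += check_app()   # True counts as 1, False as 0
print(f"It took {checks} checks to collect 5 rewards")
```

Because the payoff is unpredictable, stopping never feels safe: the next check might be the one that delivers.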
Understanding Algorithms
The design of social media algorithms, particularly on platforms like Facebook, plays a significant role in creating a personalized user experience. These algorithms analyze a vast array of personal data to tailor the content in each user’s feed. Understanding how this works and the privacy concerns it raises is crucial for adolescents and their guardians.
The brain’s mechanism for ‘wanting’ is more powerful than its mechanism for ‘satisfaction.’ Social media platforms use this to their advantage by offering endless scrolling, recommendations, and swiping features. This design keeps users constantly seeking new content, capitalizing on the brain’s desire for novelty and stimulation. Yet no matter what the feed serves up, people who turn off social media report feeling better.
By educating children about how social media algorithms influence our attention and exploit psychological vulnerabilities, parents and guardians can empower them to make more informed decisions about social media use and develop healthier online habits.
Algorithm and Content Distribution: Facebook’s algorithm changed significantly in 2018, when it began prioritizing content for Meaningful Social Interactions (MSI). MSI-driven content selection favored divisive, sensational, and sometimes harmful content because such content typically generates more user engagement in the form of likes, comments, and shares. This incentive can create an online world that distorts reality; a sketch of the underlying idea follows this list.
Impact on Youth and Mental Health: Multiple studies, including those from Facebook itself, reveal a concerning link between Instagram use and issues like body image concerns, social comparison, and mental health challenges among teenagers, particularly teenage girls. Facebook was aware of these impacts and failed to address them or disclose them to the public.
Content Moderation Challenges: Facebook has faced challenges moderating harmful content, including hate speech and content promoting violence. The internal documents suggest that while Facebook has attempted to combat these issues, it has been a persistent struggle, worsened by the platform’s refusal to risk profits.
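Here is the sketch referenced in the algorithm item above: a minimal illustration of engagement-weighted ranking. The weights, field names, and example posts are invented; this is not Meta’s actual MSI formula or its real values.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    comments: int
    reshares: int

# Hypothetical weights: active interactions (comments, reshares) count far
# more than passive likes. The numbers are made up for this example.
WEIGHTS = {"likes": 1.0, "comments": 15.0, "reshares": 30.0}

def engagement_score(post: Post) -> float:
    """Score a post by weighted engagement, the core idea behind MSI-style ranking."""
    return (WEIGHTS["likes"] * post.likes
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["reshares"] * post.reshares)

feed = [
    Post("Quiet photo of a family hike", likes=200, comments=4, reshares=1),
    Post("Outrage-bait hot take", likes=80, comments=60, reshares=25),
]

# Ranking by engagement pushes the argument-provoking post to the top,
# even though more people quietly liked the hiking photo.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  {post.title}")
```

The particular numbers do not matter; the incentive does. Whatever provokes the most reactions rises to the top, regardless of how it makes young users feel.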
Personalized Manipulation
To counteract the adverse effects of social media algorithms and content manipulation, especially on platforms like Facebook, parents and guardians can take several proactive steps:
Educate About Algorithmic Manipulation: Explain to children how social media algorithms work to create addictive patterns, such as endless scrolling and personalized feeds based on user interactions. Discuss how these algorithms prioritize content that generates strong emotional responses, often leading to exposure to sensational or divisive material.
Discuss the Impact of Social Comparisons: Have conversations about the impact of “Likes,” comments, and social comparison features on social media. Features like filters and masks that dramatically alter one’s appearance can create unrealistic standards and affect self-esteem and mental health, especially in adolescents.
Teach Critical Thinking Skills: Encourage children to analyze the content they see on social media critically. This includes questioning the authenticity and motive behind posts and recognizing that much of the content is curated and may not represent reality.
Promote Digital Literacy: Educate children about digital footprints and privacy concerns. Explain how platforms like Facebook use their data to tailor content and advertisements and the importance of being mindful about what they share online.
Encourage Healthy Social Media Habits: Guide children to develop healthier online habits, such as taking regular breaks from social media, engaging in offline activities, and limiting time spent on these platforms.
Model Positive Behavior: Parents and guardians should model positive social media use. This includes being mindful of their social media habits and demonstrating a balanced approach to online and offline activities.
Powered by Your Data
Facebook’s business model relies heavily on collecting and analyzing vast amounts of user data to drive its advertising revenue. This involves tracking user behaviors, preferences, and interactions within its platforms and, in many cases, across the web.
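As a rough illustration of what “tracking behaviors, preferences, and interactions” can mean in practice, here is a hypothetical example of the kind of event record an analytics pipeline might log for a single action. Every field name and value below is invented for illustration; it is not Meta’s actual schema.

```python
import json
import time

# Hypothetical event a platform's analytics pipeline might record when a
# teenager lingers on a post. All fields are invented for illustration.
event = {
    "user_id": "anon-12345",
    "event": "post_view",
    "post_category": "fitness/appearance",
    "dwell_time_seconds": 48,
    "referrer": "notification_push",
    "device": "ios",
    "timestamp": int(time.time()),
}

# Aggregated over thousands of such events, records like this reveal
# interests, routines, and vulnerabilities that can be used to target
# content and advertising.
print(json.dumps(event, indent=2))
```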
Concerns about user privacy have been a recurring issue for Facebook. The revelations about Facebook’s data practices, like those highlighted by the Cambridge Analytica scandal and other incidents, have led to a decline in user trust and increased calls for regulatory oversight. This includes demands for more stringent data protection laws and reforms in how social media platforms manage and share user data.
Facebook has shown a keen interest in attracting younger users, as evidenced by its plans for products like Instagram Kids. This initiative highlights the company’s intent to engage children under 13 in its ecosystem.
Not unlike Big Tobacco, the company expends considerable effort determining how to reach kids sooner and keep them using its products across the lifespan, despite in-house research showing those products may harm these young users.
Data Collection from Younger Users: Any engagement by younger users on platforms like Instagram or Facebook generates data that can be collected. This data could include usage patterns, preferences, social interactions, and the other standard metrics Facebook collects.
Compliance with COPPA: In dealing with younger audiences, Facebook must navigate stringent privacy laws like the Children’s Online Privacy Protection Act (COPPA) in the United States, which restricts the collection of personal data from children under 13 without parental consent. According to court filings, Meta collects data on users who are under 13, and it is aware of millions of potentially underage accounts.
Predatory Data Harvesting
This list provides practical and effective strategies to enhance online privacy for families. From discussing the inherent risks of data sharing to utilizing tools that enhance privacy, these guidelines aim to empower parents and children to navigate the digital world safely and responsibly. Whether it’s adjusting privacy settings on social media, using privacy-focused browsers, or staying updated on data protection laws, each point offers a step towards a more secure online presence for children.
Educate About Online Privacy: Teach children the importance of online privacy. Explain how platforms like Facebook can collect and use their data, such as preferences, interactions, and location.
Discuss the Risks of Data Sharing: Have open conversations about the risks of sharing personal information online. This includes discussing incidents like the Cambridge Analytica scandal to illustrate how data can be misused.
Use Privacy-Enhancing Tools: Consider using privacy-enhancing tools and software that can limit data tracking. Firefox offers a Facebook Container extension that makes it harder for Facebook to track you and your family across the web. Other options that may improve your data privacy include:
Virtual Private Networks (VPNs): VPNs can encrypt internet traffic, making it harder for platforms to track online activities.
Privacy-Focused Browsers: Firefox or Brave offer enhanced privacy features to prevent trackers from collecting data.
Ad Blockers: Tools like uBlock Origin or AdGuard can help block online trackers and ads. Ghostery is another tracker blocker; it has also built a privacy-focused browser based on Firefox and offers a web-based private search engine.
Check Privacy Settings: Regularly review and adjust privacy settings on Facebook and other social media platforms. Ensure that the least amount of personal data is being shared or collected.
Encourage Safe Social Media Practices: Talk with teens about being careful of what they post online, emphasizing not to share sensitive information like their location, contact details, or personal schedules. Many of these can be inferred from surprisingly little information.
Stay Informed About Data Protection Laws: Pay attention to changes in data protection laws and regulations, such as COPPA, GDPR, or any new legislation that might affect how children’s data is handled online. This can get technical, so consider following an advocacy organization like the Center for Countering Digital Hate, which regularly informs parents about the risks of online platforms.
Appendix
Less-Redacted Court Documents
This section contains links to specific locations in the court documents from the State of California v. Facebook case. See the full document here: Original Document.
Insomnia, anxiety, depression, and interference with education p. 8
Meta deceived the public and lawmakers about its goals p. 30
Personalized algorithms manipulate young users to encourage compulsive use p. 33
Researchers describe how the algorithm manipulates users p. 34
Meta confident in the causal link between like count and social comparison p. 46
“we know…social comparison is linked to multiple negative well-being outcomes” p. 47
66% of teen girls and 40% of teen boys experience negative social comparison p. 47
Meta knows about users under 13, and it’s collecting identifiable data p. 142
Citation
@article{li2023,
author = {Li, E. Rosalie},
publisher = {Information Epidemiology Lab},
title = {Penny for {Their} {Harm}},
journal = {InfoEpi Lab},
date = {2023-12-20},
url = {https://infoepi.org/posts/2023/12/20-penny-for-their-harm},
langid = {en}
}