This week, the spotlight turned to tech giants Meta and Google as they faced a landmark legal challenge over the potentially addictive nature of their platforms. Adam Mosseri, the head of Instagram, firmly stated that social media does not constitute a “clinical addiction,” sparking a heated debate about the ethical implications of design choices that keep users engaged—especially children. The trial in Los Angeles has drawn comparisons to historical lawsuits against tobacco companies, raising critical questions about corporate responsibility in an increasingly digital world.
The Case Against Meta and Google
At the heart of this six-week trial were claims made by attorney Mark Lanier, who argued that the companies are guilty of “addicting the brains of children” through design elements such as infinite scrolling and autoplay features. Lanier’s assertions were met with firm denials from the tech firms, with Meta asserting that providing a “safer, healthier experience” for young users is central to their mission.
The ramifications of this trial could redefine the relationship between tech companies and their users, particularly in how they design their platforms. The trial’s closing arguments have ignited a broader societal conversation about the balance between user engagement and ethical responsibility.
Understanding Infinite Scrolling
Once, social media feeds had a definitive end; now, they are designed to keep users scrolling indefinitely. Arturo Béjar, a whistleblower who previously worked in child online safety at Meta, explained that infinite scrolling creates a cycle in which users are constantly seeking the next dopamine hit. Internal communications presented at the trial revealed concerns among Meta employees about rising “reward tolerance” among users, with one stating, “Oh my gosh y’all IG is a drug.”

The mechanics of infinite scrolling rely on the promise of endless rewards. This is a deliberate design choice aimed at keeping users engaged, but it raises ethical questions about the psychological impacts, especially on younger audiences.
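As a rough illustration only (not Meta’s or Google’s actual implementation), infinite scrolling typically works by fetching another page of content whenever the user nears the bottom of the feed, so the feed never reaches a “you’re done” state. The sketch below uses invented names (`fetch_page`, `Feed`) to show that pagination loop in miniature:

```python
# Hypothetical sketch of the logic behind an infinite-scroll feed.
# All names here are invented for illustration; real platforms use
# ranking services and client-side scroll observers, not this loop.

def fetch_page(cursor: int, page_size: int = 5) -> list[str]:
    """Pretend backend: it always has more content to serve."""
    return [f"post-{i}" for i in range(cursor, cursor + page_size)]

class Feed:
    def __init__(self, threshold: int = 2):
        self.items: list[str] = fetch_page(0)
        self.threshold = threshold  # how close to the end before loading more

    def on_scroll(self, visible_index: int) -> None:
        # When the user nears the last loaded item, silently append
        # another page -- the scrollbar never hits the bottom.
        if len(self.items) - visible_index <= self.threshold:
            self.items.extend(fetch_page(len(self.items)))

feed = Feed()
feed.on_scroll(4)       # user is near the end: another page is appended
print(len(feed.items))  # 10
```

The key design point is that loading is triggered by proximity to the end rather than by an explicit “load more” action, which is what removes the natural stopping cue the trial testimony describes.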
The Impact of Autoplay Features
Autoplay videos have become a ubiquitous feature across various platforms, from Netflix to Instagram. Béjar noted that, while users initially disliked these disruptive features, they ultimately led to increased viewership. This feedback loop benefits advertisers but may come at the cost of user satisfaction.
Lanier drew an apt analogy, likening the experience to endless free tortilla chips at a restaurant, where the inability to stop eating can lead to overindulgence. This comparison underscores the concerns that many experts have regarding the design choices made by social media platforms.
The Psychology of Likes and Engagement
Another contributing factor to user engagement is the fear of missing out (FOMO), which is amplified by notifications and likes. According to Mark Griffiths, a behavioural addiction expert, the competition for likes can give users a pleasurable rush driven by dopamine and adrenaline. However, he cautioned that this experience does not equate to the addictive properties found in substances like nicotine or cocaine.

While some individuals may develop problematic patterns of use, Griffiths emphasised that most social media engagement falls into the realm of habitual use—an issue that can affect productivity and relationships without leading to true addiction.
In his testimony, Mosseri reiterated that social media should not be viewed as clinically addictive, drawing parallels to the emotional attachment many people have to their favourite television shows. This distinction is crucial as jurors consider their verdict in this landmark case.
Why It Matters
The outcome of this trial could significantly influence how tech companies design their platforms and pursue user engagement. As the line between engagement and addiction continues to blur, the verdict may usher in a new era of accountability in the tech industry, reshaping societal norms around digital consumption and defining the ethical responsibilities platforms bear in safeguarding their users, particularly vulnerable populations like children. At stake is the balance between innovation and responsibility in a rapidly evolving digital landscape.