First published on: Mar 27, 2026 at 06:05 AM IST
The verdict delivered by a California court this week holding Meta and YouTube accountable in a social media addiction case could be a bellwether for the larger movement towards accountability for Big Tech. The 20-year-old plaintiff, who claimed that the platforms run by these companies led to anxiety and depression, was awarded substantial damages: Meta is to pay $4.2 million and YouTube $1.8 million. The verdict is being called technology’s Big Tobacco moment. The analogy may be imperfect, but the underlying shift it points to is real: A diffuse, widely felt harm has finally been named and, importantly, attributed.
The idea that social media is engineered to be addictive has long hovered at the level of common sense. The verdict lends greater weight to what insiders like former Google design ethicist Tristan Harris and Justin Rosenstein, creator of the Facebook “like” button, have long argued: Social media platforms are built to capture and hold attention through design choices that exploit human psychology. From the allure of “likes”, described by Rosenstein as the “bright dings of pseudo pleasure”, to the pull-to-refresh feature, which has been likened to a slot machine, from Snapstreaks to the subtle coercion of WhatsApp read receipts, these choices have shaped behaviour, recalibrated social expectations, and blurred the boundary between choice and compulsion.
The challenge now is to work out the most sustainable ways of minimising harm. Governments across the world, from Australia to France, and, within India, states such as Karnataka and Andhra Pradesh, are experimenting with bans and age restrictions, driven by legitimate concerns about children, whose neuroplasticity makes them especially vulnerable. But in a world where education, work, and social life are deeply entangled with digital platforms, prohibition cannot be the full or only solution. The harder path lies in shared responsibility. Public pushback in recent years has already prompted some rethinking in Big Tech, leading to measures like the removal of beauty and “plastic surgery” filters on Instagram and the implementation of age verification on multiple platforms. But it must go further, factoring in care, not compulsion, at the level of conception and design. Children, too, must be equipped with the cognitive and emotional tools to navigate the attention economy. It may not be possible to push the social media genie back into the bottle. It can, however, be made less predatory.
