Every weekend, I have a standard routine. Friday evening, after I wrap up my work, I open Instagram and YouTube. Throughout the weekend, I doomscroll on Instagram, and I lie in bed at night watching YouTube videos of people remodelling their houses. My wife can stop herself from wasting time on these apps without removing them from her phone, but even after repeated attempts, I still can’t.
And I know that I’m not alone in this either — I’ve sat in autos where the auto driver was doomscrolling and commenting on Instagram Reels, sat next to people in public transport who spent their entire time on Facebook, and I know of people who obsessively check and respond to WhatsApp statuses.
Given my experience in the software development ecosystem, I know that these systems have been designed to make us addicted. Just a few days ago, an American court ruled that Meta and Google intentionally built addictive social media platforms.
Since 2020, there have been whistleblower reports from Meta employees that the Instagram algorithm was designed to be addictive. Discussions about the “algorithm” have become surprisingly commonplace, even outside the software industry, because every single one of us who uses social media platforms has experienced the “algorithm” changing what we see and getting us more addicted.
But what is the “algorithm”? An algorithm is simply a set of instructions that a programmer gives a computer to perform a specific task. Software engineers write algorithms every day, and it was software engineers at Meta, Google, and other social media companies who wrote the algorithms that made us addicted.
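In concrete terms, the core of such an algorithm can be as simple as a scoring function that orders content by predicted engagement. The sketch below is a deliberately simplified, hypothetical illustration: the signal names and weights are invented for this article and are not taken from any real platform, whose actual ranking systems are far more complex and proprietary.

```python
# A toy illustration of an engagement-ranking algorithm.
# All signals and weights here are hypothetical, chosen only to
# show the shape of the idea, not any company's actual system.

def rank_feed(posts):
    """Order posts so that the most 'engaging' ones appear first."""

    def engagement_score(post):
        # A weighted sum of engagement signals. The invented weights
        # reward content that holds the viewer's attention longest.
        return (1.0 * post["likes"]
                + 3.0 * post["shares"]
                + 0.5 * post["watch_time_seconds"])

    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "cat_video", "likes": 120, "shares": 4, "watch_time_seconds": 30},
    {"id": "news_story", "likes": 80, "shares": 40, "watch_time_seconds": 10},
    {"id": "house_remodel", "likes": 50, "shares": 2, "watch_time_seconds": 600},
]

# The post that keeps people watching longest rises to the top.
print([p["id"] for p in rank_feed(posts)])
```

Even this toy version shows the incentive at work: whatever signal the weights reward is what the feed surfaces, and when that signal is time spent watching, the feed optimises for exactly the doomscrolling described above.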
Leadership at these for-profit organisations decided that getting us addicted to these applications was in their best financial interest, and they directed the software engineers to create and fine-tune the necessary algorithms. These algorithms didn’t appear out of nowhere — people you know likely wrote them.
Simon van Teutem coined the term “Bermuda Triangle of talent” to describe the fact that the smartest college graduates we know end up in banking, consulting, and law because of the allure of wealth and power. Big Tech has also, unfortunately, become such a black hole of talent. Instead of democratising access to information, social networking sites have become cesspools of misinformation.
Despite efforts by organisations such as the Association for Computing Machinery (ACM) and the Institute of Electrical and Electronics Engineers (IEEE), most software engineers don’t know what a code of ethics is. It doesn’t help that most software engineers don’t have the relevant educational training or vocabulary to understand how their work affects people’s lives, and the few who do often have “golden handcuffs” that prevent them from enacting change.
Where do we go from here?
Legal precedent acknowledging the addictive nature of social media applications is a good start, but we need to do better. Alcohol and nicotine are addictive, and prolonged use can cause significant negative health effects. Would it help to have health warnings for addictive social media applications when we attempt to install them from a mobile app store? How about a pop-up displayed every time we open a social media application, warning us of depression and other mental health side effects?
And the next time we talk to someone who works at a software company, let’s not dissociate from the conversation when they start talking about “data analysis”. Let’s ask them what data they are analysing. Are they analysing patterns of app usage to understand changes in user behaviour? The next time someone mentions machine learning or AI, ask them if the algorithms they are developing are making people addicted to social media applications. And tell your governments that banning social media applications for minors isn’t sufficient.
These applications have a devastating impact on the growing minds of kids, on adolescents, and on our elders, who are still coming to terms with our fast-paced technological world. Algorithmic transparency and accountability are what we need to solve this problem once and for all.
The writer is CEO, FOSS United
