In India, people, society, politics, and economics operate under the utilitarian principle of “maximum benefits” for all. In recent times, however, surveillance technologies, techniques, and processes have been disrupting democracy and the inclusive idea of Digital India. A few months ago, Delhi Police announced that they would implement a state-of-the-art Integrated Command, Control, Communication, and Computer Centre (C4I) across all regions of Delhi by June 2026 to control crime and identify “criminals” by scanning faces on the streets and flagging suspects through CCTV cameras.
For this, the Delhi Police drew inspiration from an Israeli facial recognition software, deployed from a surveillance van equipped with cameras inside and out, a full data and computer setup, and police operators, which has been in use in North and Northeast Delhi since 2018. Initially, it was used to match pictures of lost and found children. It has since been used on several occasions: during the Prime Minister’s rally at Ram Leela Maidan in 2019, the North and Northeast Delhi riots of 2020 and 2022 (Jahangirpuri), and the Republic Day celebrations in 2023.
What was a temporary solution is quickly becoming the norm. As a result, certain areas and people are facing ghettoisation. Surveillance technologies (CCTVs, webcams, etc.) and newer tools, especially those that track and record human activity through mobile applications and software, are intensifying this process.
Data is a deeply political entity
The Delhi Police have stated that the C4I system will work with CCTV cameras and national database agencies, including e-challan, telecom, and banking records, as they did earlier with the Israeli technology. A person’s image will be analysed in real time against existing recorded data from government agencies. The data already fed into the technology, especially in North and Northeast Delhi, will now be used for C4I. If a face matches these real-time and national-agency records with at least 60 per cent confidence, that person will be perceived as a “potential threat”, even if they committed only a petty offence and the case was quashed by the judiciary. Unfortunately, a judicial dismissal is not the end of it: the data is not automatically deleted, and the accused has to approach the police to have it removed. In the long term, this data harms people in multiple ways.
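The flagging logic described above can be reduced to a simple threshold rule. The sketch below is purely illustrative and hypothetical, not the actual C4I implementation; the function name, score scale, and 0.60 cut-off are assumptions based on the reported 60 per cent figure.

```python
# Hypothetical sketch of threshold-based flagging; NOT the actual C4I system.

def flag_potential_threat(similarity_score: float, threshold: float = 0.60) -> bool:
    """Flag a person as a 'potential threat' when the face-match
    similarity score meets or exceeds the threshold."""
    return similarity_score >= threshold

# A 62% match is flagged regardless of whether the underlying case
# was quashed; the record persists unless deletion is requested.
print(flag_potential_threat(0.62))  # True
print(flag_potential_threat(0.55))  # False
```

The danger the article points to is visible even in this toy rule: the decision depends entirely on the stored records and the chosen threshold, neither of which the flagged person controls.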
AI and law professor Ifeoma Ajunwa notes that, as with minorities in the US, marginalised communities are more susceptible to harm because of biased data and preconceived notions. This bias has also been evident in hiring: AI-based recruiting systems, shaped by human managers’ biases, did not shortlist CVs if certain key terms, such as the name of a prestigious school or college, were absent. Political scientist Partha Chatterjee calls such people “entitled citizens”: they have rights but are not recognised as “proper citizens”, living at the mercy of the state without ever attaining “legitimate citizenship”. We therefore need “responsible technologies”, with multiple checks, as we did in finance by introducing UPI (Unified Payments Interface). Is it possible to scale AI tech the same way in the field of law and order?
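The CV-screening bias described above can be sketched as a toy keyword filter. This is a deliberately simplified, hypothetical illustration, assuming a made-up required term; real recruiting systems are far more complex, but the failure mode is the same: anyone missing the privileged keyword is silently dropped.

```python
# Illustrative sketch of keyword-based CV screening (hypothetical).
# The required term stands in for a prestigious school name that
# encodes the hiring managers' historical preferences.

REQUIRED_TERMS = {"prestigious school"}

def shortlist(cv_text: str) -> bool:
    """Shortlist a CV only if every required term appears in it."""
    text = cv_text.lower()
    return all(term in text for term in REQUIRED_TERMS)

print(shortlist("Graduate of Prestigious School, 5 years experience"))  # True
print(shortlist("Self-taught engineer, 5 years experience"))            # False
```

The second candidate may be equally qualified, but the filter never evaluates qualifications at all; it only reproduces the bias baked into its keyword list.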
Space, livelihood, and identity
Delhi is known for its migrant population, with a large section coming from economically weaker sections of rural India. Most people from North and Northeast Delhi are employed in construction, manufacturing, retail, and services, according to the Economic Survey of Delhi, 2022–23.
In Delhi’s service economy, private firms often check candidates’ residential addresses and require police verification, even for low-paid domestic call-centre jobs. This sometimes opens the door to bribery: individuals may pay to obtain a No Objection Certificate (NOC) from the local police station. Ajunwa calls such processes “hidden biases”, and surveillance scholar David Lyon calls them “social sorting”, both rooted in human prejudice. In India, AI-based recruitment is still in its early stages and is not widely used because of its high cost and technical requirements. However, it could soon become the norm, given the rapid growth of AI across job sectors.
Data and technologies, especially surveillance-based ones, are turning some people and communities into “second-class citizens”. Technologies are acting like a “deep state”, and weaker sections are becoming ever more “suspicious” in its eyes. Digital India can be an inclusive India only if AI-based technologies are trained on comprehensive data rather than historical, biased data that has been dominated by powerful castes and classes for centuries.
The writer is assistant professor at Lal Bahadur Shastri Institute of Management, Dwarka, Delhi
