As 2026 begins, India’s education system faces a major shift. The government’s decision to introduce Artificial Intelligence (AI) from Class 3 under the National Education Policy (NEP), alongside large-scale teacher training via the NISHTHA platform, signals a growing role for technology in everyday schooling.
AI is being presented as a tool that will improve access, make teaching more efficient, and help personalise learning. But the deeper question is this: Are we reshaping education, or simply making it more automated?
Teachers are already noticing the shift. In many schools, they find themselves managing AI platforms more than guiding students. With AI now capable of producing essays, giving feedback, and even responding to emotional cues, students often turn to machines for help and reassurance. The concern is not whether AI can replace teachers, but whether the space for teachers to do what only they can do — build relationships, guide reflection and notice what goes unsaid — is slowly being reduced or redefined.
AI tools often promise “personalised learning.” Algorithms adjust lessons based on how quickly or accurately a student completes tasks. While this can help with practice and pacing, it often depends on tracking how students behave — recording clicks, screen time, facial expressions, and how they engage with material. This kind of tracking changes how students relate to learning. It can turn a classroom into a monitored space, where attention is shaped more by what the system can measure than by what the learner is curious about.
The Digital Personal Data Protection Act (DPDPA), 2023, does try to protect minors by limiting behavioural tracking and targeted ads. But Section 9(5), the "Verifiably Safe" clause, allows exceptions for schools certified as safe. These schools can still work with vendors whose systems collect data. Without clear oversight, this exception risks becoming a loophole in the very protections the Act was meant to create. Families and educators need clarity about how these tools operate and what data they use.
AI offers speed. It can generate lesson plans, check assignments, and handle routine tasks. This seems useful, especially in overloaded school systems. But faster is not always better. Indian education policy has often equated progress with more infrastructure — digital tools, smart boards, devices — without focusing as much on how students actually learn and make meaning.
AI fits into this pattern. It allows for measurable results, quick feedback, and scalable solutions. But real learning takes time. It involves trying, failing, reflecting, and slowly building understanding. This kind of learning cannot be rushed or reduced to a score. It is shaped by relationships, attention, and the ability to sit with uncertainty.
Immersive learning approaches build this kind of capacity. They give space for slow thinking, reflection, and making sense of ideas through dialogue and experience. These methods depend on teachers — not just to explain, but to support the learner’s journey. AI can assist with delivery, but it cannot replace the role of the teacher in creating a space for growth and questions. That space, full of meaning and emotional safety, is where real education grows.
Not all schools can adopt AI in the same way. UDISE+ 2024–25 data shows that while many private schools have strong digital infrastructure, government schools often do not. In states like West Bengal, internet access and digital readiness remain limited in public schools.
This difference creates another problem. Well-resourced schools can afford AI platforms that are tested and secure. Others may rely on free tools that collect more data and offer fewer protections. The result is unequal exposure: some students get tools that support them, while others get tools that track them.
This is not just a gap in technology. It shapes how students are seen and treated. It affects privacy, control, and how much say a student has in their own learning. If left unaddressed, it will widen the gap not just in outcomes, but in dignity and autonomy.
Currently, most AI tools come from outside the school system. Teachers are asked to use them, not help design them. But teachers know their students and communities. They are best placed to decide how AI should be used — and when it should not be used at all.
Teachers need to be part of the process. The government could set up district-level working groups where teachers evaluate tools, suggest changes, and share what works. Schools should also have data governance committees that include teachers and parents. No AI tool should be used without independent review. Consent must be informed, flexible, and student-focused.
This is not just about efficiency. It’s about trust. When teachers are involved in shaping how technology enters the classroom, it leads to better use and fewer risks. More importantly, it shows that we value their judgment and everyday experience of teaching.
The way AI is brought into education will shape how students understand learning itself. If tools are used only to improve scores and speed up delivery, then success will be defined by metrics. But education can be more than that. It can be a space where students think deeply, ask questions, and build habits of attention and care.
We already have strong foundations — a forward-looking NEP, a legal framework around data, and teachers willing to adapt. What we need now is clear direction. Will AI be used to support every student — or just make some things easier? Will it strengthen the teacher’s role — or make it less central?
These are not just technical questions. They are about what kind of education we want. The time to decide is now — openly, carefully, and together.
The writers are affiliated with BML Munjal University. Views are personal.
