Special | Watched But Unprotected: How Lucknow’s Safe City Project Fails Women
Astha Savyasachi
Lucknow: It was just past midnight on March 19, 2025, when Sarika*, a 32-year-old woman, arrived in Lucknow after a day’s trip to Varanasi. The job interview she had gone for was intended to be the first step towards her own earnings – something she hoped would ease her family’s financial struggles.
At around 1:30 am, she stepped off the bus at the Alambagh terminal and flagged down an autorickshaw to take her to Chinhat, where her brother lived. She had plans to go back the same evening to Ayodhya, where she lived with her 10-year-old son and husband.
She never made it to her brother’s house that night. Her body was discovered in a mango orchard on the outskirts of Malihabad.
Around 2:30 am, Sarika had called her brother in panic, saying an auto driver was taking her on a strange route. The call ended with a scream. Her location was traced to Malihabad, 35 km off the route they were supposed to take, where she was allegedly raped and murdered by the driver and his brother.
She was rushed to the hospital at King George's Medical University (KGMU), where she was declared dead by doctors.
Sarika’s death added to the growing list of women, girls, teenagers and even toddlers who have fallen through the cracks of Lucknow’s Rs 195-crore Safe City Project – a project heavily reliant on Artificial Intelligence that promises safety but often fails when it matters most. With massive spending on AI tools, SMART cameras and control rooms, the project has devoted little attention or resources to strengthening the human oversight needed to act on those alerts.
The Wire has reached out to Director General of Police, Uttar Pradesh Rajeev Krishna, Commissioner of Police, Lucknow Amrendra K. Sengar and Additional Superintendent of Police (PRO) Rahul Srivastava to ask about lapses in the project’s functioning, including in Sarika’s case. This article will be updated when a response is received.
Lucknow Safe City Project
Launched in 2018, the Safe City Project aimed to “prevent and curb all forms of crimes against women and girl children in public places by providing safer urban infrastructure and efficient access to law enforcement agencies”. Lucknow was among eight cities selected for the pilot, with the Union Ministry of Home Affairs approving Rs 194.44 crore for its implementation.
Led by the Uttar Pradesh Police, the project included an Integrated Smart Control Room, ‘Pink Outposts’ and Patrols led by women police, Women Help Desks in all police stations, safety measures in buses, improved street lighting in hotspots, and integration of women’s helplines with emergency number 112. It said it would transform Lucknow into a safe city “by the implementation of a video surveillance system in accordance with the highest standards available for monitoring the activities of citizens of Lucknow City”.
In April 2022, Allied Digital Services bagged a Rs 85-crore contract as a System Integrator for the Lucknow Safe City project. System Integrator is the organisation/agency/company responsible for the “Design, Implementation and Maintenance of Integrated Smart Control Room”. The company was tasked with setting up a city-wide surveillance system, AI-based video analytics, drones, mobile device terminals, integrated smart control room, a data centre and cloud-based disaster recovery.
Illustration: Pariplab Chakraborty
The Wire spoke to Jayant*, a technical expert from Allied Digital Services, who oversees the Integrated Smart Control Room in Lucknow. He explained that the project has officially been handed over to the Lucknow police, with Allied Digital now responsible solely for its technical support operations and maintenance.
At the heart of the project is the Integrated Smart Control Room, which relies heavily on “an intelligent video surveillance system” comprising digital security cameras installed across key locations in Lucknow. This system’s biggest selling point is the deployment of 700 ‘AI-enabled SMART cameras’, alongside conventional CCTV units and Integrated Traffic Management Cameras. Nearly half of the project’s total budget has reportedly been channelled into building the Integrated Smart Control Room and its vast network of AI-powered cameras.
Jayant said that around 1,061 cameras have been installed, covering 221 locations across Lucknow. While all cameras can technically be AI-enabled, only 700 – about 66% – are currently operational with AI functionalities to detect specific actions such as hand-waving, fighting, stalking, public smoking or drinking, accidents, harassment and violent behaviour. Of these, 100 are equipped with facial recognition systems.
Video feeds from the entire camera network are routed to the control room, where they are continuously monitored and analysed. For monitoring and analysis, Jayant explained, the Smart Control Room relies on CogniTec’s facial recognition system (FRS) and Graymatics’ video analytics system.
According to the tender documents for System Integrators accessed by The Wire, FRS and other AI analytics are designed to “enhance the system's capability to identify people, objects and characters that can enable faster and efficient decision support and ensure a preventive security mechanism”.
Although the surveillance system is designed to be linked with multiple government databases, authorities did not disclose any details about these integrations in response to The Wire’s RTI queries. The Director General of Police (DGP) transferred our RTI request (dated June 3, 2025) to the Special Task Force Headquarters, which has not yet responded.
Tender documents reviewed by The Wire show that the system is meant to integrate with several government databases for law enforcement purposes, including those of registered sexual offenders, missing persons (maintained by the Anti-Human Trafficking Units and Child Welfare Committees), suspicious vehicles, and criminals involved in crimes against women. It is also expected to connect with the Crime and Criminal Tracking Network & Systems (CCTNS), the 1090 women’s helpline, vehicle tracking databases, and other state government criminal databases through an “Integrated C4i [Command, Control, Communications, Computers, and Intelligence] Platform” equipped with data analytics and data mining capabilities.
However, it remains unclear how many of these integrations have actually been implemented. The Wire has reached out to the UP Police to ask about the status of these integrations. This article will be updated when a response is received.
How does the alert system work?
The system was designed with the understanding that operators cannot be expected to manually review hours of video footage, especially without knowing what to look for. To address this challenge, an AI-based video analytics system was integrated with an alert engine. This engine uses deep learning algorithms to analyse footage and generate proactive alerts for relevant stakeholders.
Since AI relies on datasets and machine or deep learning techniques, the police designed specific scenarios to train the model on what to detect. These scenarios form the basis for generating automated alerts under the Lucknow Safe City Project.
With insights from human rights and privacy defenders – Hyderabad-based hacktivist Srinivas Kodali, writer, journalist and political activist Amita Shireen, and human rights activist and media coordinator of the Indian National Congress Sadaf Jafar – The Wire found that several of these “scenarios” are shaped by patriarchal assumptions and risk enabling unwarranted surveillance of citizens.
Kodali and Shireen pointed out that S. No. 21 and S. No. 22 violate the human rights of individuals released on bail. A 2024 Supreme Court ruling stated, “If a constant vigil is kept on every movement of the accused released on bail by the use of technology or otherwise, it will infringe the rights of the accused guaranteed under Article 21, including the right to privacy.” These use cases directly contravene that judgment.
Shireen also noted, “Among the 45 scenarios listed, two – S. No. 36 on ‘road blockages’ and S. No. 44 on ‘crowd formation’ – directly relate to protests, demonstrations and rallies. Using these scenarios to trigger alerts risks curbing dissent, restricting freedom of assembly, and expanding surveillance on activists and ordinary citizens.”
Illustration: Pariplab Chakraborty
None of the listed use cases specify generating alerts for vehicles without number plates, a gap Jayant acknowledged. In Sarika’s case, police confirmed that footage along the route revealed the auto was missing a number plate and had visible damage to its front.
The Wire also spoke to Ram*, Sarika’s brother, who said, “The biggest lapse was on the part of the police. An auto was moving without a number plate right next to two police booths at the Alambagh bus terminal, yet no action was taken.” He added that the accused had 23 prior cases and was released from jail just six months earlier.
The Wire has asked the UP Police why a vehicle without a number plate did not trigger an alert. This article will be updated when a response is received.
What happens when an alert is generated?
Jayant told The Wire that when the system generates an alert, it first appears as a silent notification on a software application. On average, the Control Room receives 5,000-6,000 alerts every day, each accompanied by images or video footage from the site. IT operators review these alerts to determine whether they are genuine or false. Jayant, along with other technical experts and support staff working at the Smart Control Room, told us that only around 10% of alerts are ultimately found to be valid.
Until September 2025, the Control Room operated only from 8 am to 8 pm, staffed by about 10-15 operators per shift responsible for monitoring and verifying alerts – an overwhelming workload given the sheer volume of notifications generated each day.
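The scale of that workload can be checked with simple arithmetic using the figures quoted by control room staff (the midpoints below are our illustrative estimates, not official numbers):

```python
# Back-of-envelope estimate of operator workload, based on figures
# quoted by Smart Control Room staff (midpoints are illustrative).

alerts_per_day = 5500       # midpoint of the 5,000-6,000 range
shift_hours = 12            # the control room ran 8 am to 8 pm
operators_per_shift = 12    # midpoint of the 10-15 range

alerts_per_operator_per_hour = alerts_per_day / (shift_hours * operators_per_shift)
print(round(alerts_per_operator_per_hour, 1))  # prints 38.2

# Each operator must triage one alert roughly every 94 seconds
# for the entire 12-hour shift just to keep pace.
seconds_per_alert = (shift_hours * 3600) / (alerts_per_day / operators_per_shift)
print(round(seconds_per_alert))  # prints 94
```

At roughly 38 alerts per operator per hour, sustained attentive review is implausible, which is consistent with staff estimates that only about 10% of alerts are ultimately found valid.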
When an alert is verified, it is escalated to the nearest Pink Patrol vehicle. Each patrol unit is equipped with a Mobile Data Terminal (MDT) – a tablet-like Android device – which receives the alert along with an image, the camera’s identification and the nearest landmark. The patrol team then proceeds to the location and takes the necessary action, forwarding a case to the police station only if a suspect is arrested. According to control room staff, more than 100 police personnel have been deployed across Lucknow for mobile patrolling – a number that remains low given the city’s size and population. The Wire could not independently verify this figure.
The Wire has asked the Director General of Police, Uttar Pradesh Rajeev Krishna, Commissioner of Police, Lucknow Amrendra K. Sengar and Additional Superintendent of Police (PRO) Rahul Srivastava about the manpower deployed to track and respond to alerts. This article will be updated when a response is received.
Blind spots in the Safe City Project: Unmonitored alerts, unprotected citizens
In September this year, two minors were abducted in broad daylight from BG Colony, Alambagh. Station House Officer Subhash Chand told the media, “CCTV footage shows the children cycling at 3 pm. Then a man approaches them, speaks to them briefly, and takes them with him. Later, he is seen moving towards Charbagh.” According to the police, the kidnapper first took the children to Charbagh, where he left their bicycles, before moving on to Qaiserbagh and boarding a bus to Lakhimpur Kheri.
When asked why the SMART cameras along the route failed to trigger an alert, Jayant argued it was not a system failure. He said that cameras can only capture incidents within their effective range; beyond that, they become “blind”. Jayant and Uday*, another member of the technical support staff, noted, “Despite the large number of cameras installed, they cannot cover every space. There will always be blind spots. Also, a camera can only monitor one location at a time.”

During on-ground inspections, we found that cameras at multiple locations covered all angles – left, right, front and back – with several also equipped with pan-tilt functions. When we pointed this out, the technical experts agreed that the general vicinity was indeed covered.
Uday also clarified that not all cameras function the same way. Some are equipped with facial recognition software, while others are designed to detect specific actions, such as hand-waving. If a camera can’t recognise an action, it won’t trigger an alert. Jayant further disclosed, “Electricity and network disruptions keep occurring, and as a result, the cameras go offline almost every day.”
Illustration: Pariplab Chakraborty
The senior police official overseeing Sarika’s case confirmed that there were 163 cameras along the route her auto took, several of which are AI-enabled SMART cameras. Our ground inspection also verified the presence of these SMART cameras not only at the site of Sarika’s abduction but also at multiple points along the auto’s route.
An investigating officer in Sarika’s case, speaking on the condition of anonymity, told The Wire that neither AI-enabled cameras nor traffic management cameras generated any real-time alerts that could have helped in the case. He said that they were used only after the crime, like ordinary CCTV cameras, to trace the auto’s route. The officer added, “Many cameras under the Lucknow Safe City Project are non-functional, but authorities do not disclose this publicly, fearing it would undermine the perceived deterrent effect of the surveillance system.”
Uday admitted, “You see, AI is AI. Even humans can’t always make sense of what is happening, or identify individuals in a crowd – how can AI? The cameras only do what they are programmed to do. Right now, they generate a high number of false positives. Facial recognition, in particular, is not functioning effectively. Testing is still underway, and improvements will be gradual.”
Insufficient human oversight
Sarika’s brother, Ram, told The Wire that the police only began checking CCTV footage after he and his wife arrived at the Alambagh station, despite his repeated calls to 112. “By the time we got there, the person responsible for monitoring the cameras was unavailable and took a very long time to show up,” he said. “If the CCTV operator had been on duty and actively monitoring real-time feeds, the auto could have been tracked immediately.”
His account highlights a critical flaw in the Safe City Project: the overreliance on AI systems without sufficient human oversight. “AI alerts are useless if officers aren’t present to respond,” Ram said. “The spot where my sister was found is just 5-6 kilometres from Malihabad police station – a five-to-ten-minute drive if the patrol car had responded immediately after I called 112. Prompt police action could have saved her.”
He concluded, “AI can only assist if the police act promptly. But if officers remain idle, nothing will work – AI won’t run to the crime scene on its own. When human vigilance is sidelined in favour of technology, even the most advanced AI cannot prevent tragedies.”
Is AI surveillance failing the most vulnerable?
The Wire interviewed over 15 women and trans individuals across Lucknow – including Charbagh, Mahanagar, Aliganj, Aishbagh, PGI, Kaiserbagh, Hazratganj, Chowk and Gomtinagar – to understand their experiences with AI-based SMART cameras deployed under the Safe City Project and assess whether these initiatives genuinely make public spaces safer, as claimed by the authorities.
The responses overwhelmingly highlighted the ineffectiveness of the Lucknow Safe City Project – its inability to guard against imminent danger, its propensity to intensify gender biases, and in some cases even exacerbate privacy and safety risks through intrusive and unwarranted surveillance.
Most interviewees reported frequent sexual harassment – ranging from verbal abuse and lewd comments to stalking, groping and physical assault in public spaces. Many noted that incidents of harassment and violence continue unabated even in areas under constant surveillance, with several interviewees pointing out that most such incidents have occurred after the launch of the Safe City Project – often in locations already monitored by AI-enabled SMART cameras.
According to respondents, the police made no official announcement about the installation of AI-based cameras in Lucknow. Most learned of them through news reports and acquaintances, or by spotting them in public.
Nearly all respondents expressed skepticism about the effectiveness of these cameras, largely rooted in a deep lack of trust in the police’s ability to respond promptly to gender-based crimes. When asked if they felt safer in areas monitored by AI cameras, only one interviewee said yes.
None of the interviewees had ever received any help from the AI-enabled cameras, with most saying they inspire little confidence. Many emphasised that the real issues lie in police attitude and responsiveness, describing the cameras as mere eyewash and a waste of public funds.
Illustration: Pariplab Chakraborty
Trans persons and sex worker respondents further noted that surveillance has worsened existing biases in policing, deepening rather than reducing their vulnerability in public spaces.
A 22-year-old trans woman who begs on the streets to survive alleged that she was recently assaulted by the police in Charbagh. “The police were patrolling when they suddenly assumed all the trans women there were creating trouble. They must have received a wrong tip. I was just standing quietly when one officer, holding a thick stick, asked, ‘Why are you here?’ Before I could respond, he started hitting me. Soon, several others joined in and beat me in front of everyone at the main bus adda. I’m still in pain – the swelling hasn’t gone, and my hand hurts a lot. I believe cameras were present at the main bus adda in Charbagh; they would have recorded everything,” she said.
She asked, “Do male police officers have the right to hit us? Don’t we have the right to walk freely without being judged?” Fearing retaliation, she chose not to file a complaint since the perpetrators were police officers themselves.
What is the real motive?
For women, public scrutiny is not new, but most respondents believed that the gaze of AI-enabled SMART cameras – operating without legal or privacy safeguards – has heightened fears of misuse and mass surveillance. For sex workers and trans people, already vulnerable to police harassment, such intrusive monitoring has made their livelihoods even more precarious.
Under the Safe City Project, the police are authorised to use a variety of surveillance techniques, including patrol-mounted Mobile Data Terminals (MDTs) and drones. MDTs allow “patrol vehicles to capture the photos of suspected persons and transmit the data back to the ISCR [Integrated Smart Control Room] for verification and comparison with the criminal database”.
Srinivas Kodali, a hacktivist and tech transparency expert, explained that such a system gives police unchecked power to label anyone a suspect, heightening risks of profiling, misuse and a lack of accountability. He noted that while such authority has always existed, technology has amplified its reach, effectively weaponising policing. “The police system already carries deep social, religious and caste biases,” he explained. “These prejudices aren’t built into the technology itself, but when police use tools like facial recognition, those institutional biases get embedded and magnified through the technology.”
Drone-based surveillance is intended to “act as an active deterrent to unscrupulous elements during challenging law and order situations like rallies, rasta roko, and similar mob / crowd-led activities”. However, the guidelines do not clarify whether public protests fall under this definition.
Illustration: Pariplab Chakraborty
All the activists, journalists and civil society members we interviewed believed that these tools – especially facial recognition technology – are being used to monitor human rights defenders and suppress lawful dissent. They noted that the Uttar Pradesh government has faced repeated accusations of spying on activists and argued that the deployment of AI-enabled cameras has only deepened these concerns, especially since the 2019-2020 Citizenship (Amendment) Act-National Register of Citizens protests. Political activist Sadaf Jafar, who was arrested during the December 2019 demonstrations, said, “The surveillance system is designed to keep tabs on politically active citizens and members of civil society, creating a constant sense of being watched.”
None of the interviewees were comfortable with their images, videos, or personal data being stored or accessed by the police or private companies.
Rethinking safety
The interviews conducted across Lucknow reveal a significant gap between the promises of the “Safe City” initiative and the lived realities of those it claims to protect. For women, trans people and sex workers, these projects have done little to ensure safety or justice. Instead, they have intensified surveillance, reinforced existing biases within policing and failed to foster trust between citizens and the state.
At the core of these concerns lies a deeper question of accountability and consent – who decides when, how and why people are monitored? Respondents cautioned that without transparency, safeguards or oversight, AI surveillance risks targeting women, trans people and dissenters, reflecting the biases of those operating it.
The responses stand as a stark reminder, especially as India expands AI-based policing: surveillance cannot equal safety. Until privacy, gender sensitivity and accountability become central to such initiatives, these projects will continue to watch – but fail to protect – the very people they claim to serve.
*Name changed to protect anonymity.
With inputs from Ashma Izzat.
This article went live on October 31, 2025, at 8:00 am.
