INTRODUCTION
A man walks through a bustling railway station, looking for his train. A child sleeps on a chair. A young girl reunites with her family for the holidays. And they are all scrutinized through a lens unbeknownst to them. Numerous cameras scan their faces, working overtime to identify, analyse, compare and log them into a database. This unnerving scenario could very soon be our reality, if it is not already. Welcome to the world of Facial Recognition Technology (FRT). This revolutionary tool promises enhanced security and better policing, and can even reunite long-lost families. But there might be a catch: would we have to hand over our right to privacy? This technological tug-of-war requires us to find the sweet spot between an individual’s rights and public safety.
WHY IS FRT OUR NEW BEST FRIEND?
Automatic Facial Recognition Technology is a development of the digital era that aids in identifying a person. It employs biometric scanning to identify individuals based on their features, much like fingerprint and iris scans. The technology analyses photographs and videos to extract distinctive facial features and converts them into a mathematical sequence, known as a face template, that is unique to each person. Identification and authentication are its two primary uses. It relies on machine learning techniques such as deep learning to train itself to “learn” to recognize characteristic features. It is imperative that images are captured in optimal lighting with minimal variation in facial expression, as even a minute change could hamper the system’s effectiveness.
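The matching step described above can be sketched in miniature. The snippet below is a hypothetical illustration, not any vendor’s actual pipeline: it assumes face templates are fixed-length numeric vectors produced by a deep-learning model, and that two templates are compared by cosine similarity against a tuned threshold. The vectors and the threshold value here are invented for illustration only.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face templates (fixed-length numeric vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(template_a, template_b, threshold=0.8):
    """Declare a match when similarity exceeds the threshold.

    Raising the threshold reduces false matches but increases false
    rejections (e.g. poor lighting, changed expression); lowering it
    does the opposite."""
    return cosine_similarity(template_a, template_b) >= threshold

# Toy 4-dimensional templates; real systems use far higher-dimensional
# embeddings extracted by a deep network.
enrolled    = [0.90, 0.10, 0.40, 0.20]
probe_same  = [0.88, 0.12, 0.41, 0.19]  # same person, slight variation
probe_diff  = [0.10, 0.90, 0.20, 0.70]  # a different person

print(is_match(enrolled, probe_same))   # True
print(is_match(enrolled, probe_diff))   # False
```

The threshold choice is exactly where the accuracy concerns discussed later come in: a system tuned on one demographic may sit at the wrong operating point for another.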
When one mentions the term ‘facial recognition’, what usually comes to mind is unlocking smartphones or how Google Photos ‘recognises’ faces and sorts them into albums. However, FRT has more sophisticated applications, the most crucial of which is public safety. Growing crime rates in urban cities have inspired increased FRT usage; indeed, it could be one of the most effective weapons in crime prevention and investigation. FRT can be used to identify missing persons and suspects in criminal investigations, to monitor public spaces for known offenders, and to prevent potential terrorist attacks by comparing captured images or CCTV footage with pre-existing data on known offenders or wanted criminals. For instance, in 2019, the Delhi police used FRT to identify over 3,000 missing children, reuniting them with their families.
FRT is also becoming a popular surveillance tool at large-scale events and in public areas. Airports in Delhi, Varanasi, Hyderabad and Bangalore have implemented FRT as part of the Digi Yatra initiative, aimed at improving passenger verification and security and doing away with traditional boarding passes. The technology has also been deployed to counter cross-border terrorism in regions such as Jammu and Kashmir, identifying persons of interest in sensitive areas. It played a role in public health when Kerala introduced its first thermal and optical imaging camera with AI-powered facial recognition software during the pandemic, aimed at monitoring people while they maintained social distancing. FRT has several more distinguishable use cases across sectors, from ensuring student attendance in education to confirming voter identity in elections.
CAN SAFETY COEXIST WITH PRIVACY AND ETHICS?
The general consensus is that people value their privacy, and no one appreciates being watched 24/7. Analysed in light of the Puttaswamy judgement, the technology feels like a double-edged sword. The risk of data leakage or misuse is ever present, and there is no guarantee or limitation preventing data collectors from turning India into a surveillance state. The technology is also far from foolproof: studies in the West have shown that it is not advanced enough to accurately identify individuals belonging to certain demographics, particularly dark-skinned people. Such a bias could be lethal in a country like India.
A major concern associated with FRT is its ethicality. When public safety is weighed against the ethics of watching people constantly, it is hard to discern which way the scale tips. The fear of constant surveillance might deter people from public gatherings, protests or other public activities, which would be a blow to the institution of democracy. For instance, the use of the technology during the 2020 anti-CAA protests in Delhi prompted criticism for potentially stifling free expression. Public awareness of FRT usage is next to none, and India has exploited this by deploying the technology without obtaining proper consent. Indian consent and privacy protection laws are also still struggling to catch up.
INDIA’S LEGAL BLACK HOLE
While the technology has been deployed in pilot projects, no legal recognition has been conferred upon it. Unlike the USA or the EU, India is yet to address its legal implications and develop a dedicated code. However, India is not entirely fumbling in the dark; we have some basic laws in place to build upon. The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (SPDI Rules) define biometrics in a way that encompasses facial patterns, classifying them as sensitive personal data. These rules apply to body corporates in India and regulate the collection, storage, use, disclosure and transfer of such data, along with the associated security practices and procedures. Further, the broad definition of personal data under the Digital Personal Data Protection Act, 2023 allows biometric data to be interpreted within its ambit. The Act, however, exempts processing carried out for the prevention, detection, investigation or prosecution of any offence from its rights and requirements. Conversely, the Information Technology Act, 2000, India’s parent legislation on electronic records, falls short of addressing facial recognition technology.
Additionally, the Supreme Court of India recognised the right to privacy as an inalienable right in the landmark judgement K.S. Puttaswamy v. Union of India. The right is subject to three reasonable restrictions, namely, (i) the existence of a law, (ii) a legitimate state aim, and (iii) proportionality. A strict interpretation of this test would spell trouble for FRT, as the technology infringes on the right. Nevertheless, the National Crime Records Bureau (NCRB) advocates for the introduction of a National Automated Facial Recognition database, citing public safety. Uncontrolled surveillance would bring its legality into question, necessitating a list of narrowly defined use cases.
BALANCING THE SCALE
The merits of Facial Recognition Technology far outweigh its challenges. Thus, the debate boils down to a simple question: can we use FRT responsibly without turning into a surveillance state? The most crucial immediate step would be conferring adequate legal protection on all involved stakeholders. This would include removing ambiguities in related terminology, clarifying the boundaries of surveillance, providing precise rules for obtaining consent and for managing, storing and using collected data, and imposing stringent penalties for contraventions. Given the sensitive nature of FRT, establishing a special regulatory body to oversee all FRT applications, their adherence to the legal code and its enforcement would add an additional layer of protection and accountability. This body could establish best practices and protocols, such as annual FRT audits and privacy and ethical standards, and field public grievances.
The efficient management and application of this technology depend largely on the public, whose consent is paramount to its effectiveness. Public awareness campaigns can be undertaken both to boost transparency and to educate. Securing the people’s trust is the key to the success of FRT, which implies that its implementation must go hand in hand with improvements in data protection. The data collected must be encrypted, stored safely, and retained only as long as necessary.
CONCLUSION
As with any new development, the challenges presented by FRT are but stepping stones towards its success. With other nations stepping up their game and swiftly adapting to the evolving technological landscape, it is high time India joined the bandwagon. With solid foundational laws backing any future changes, it remains to be seen how India maneuvers around the obstacles surrounding FRT implementation.
Facial Recognition Technology is a powerhouse of potential for enhancing public safety. It would be remiss of policymakers not to leverage this technical advancement for the betterment of our nation. However, this hinges on their ability to balance surveillance with privacy. By implementing comprehensive regulations, fostering transparency and encouraging active public discourse, we can progress as one towards a future where technology serves the public without compromising their individual rights.