The ugly face of the crime-fighting move

Implementation of National Automated Facial Recognition System in India lacks adequate safeguards

Why could no meaningful debate take place in the monsoon session of Parliament? Because of the controversy over the Pegasus spyware: some Indian journalists, civil society activists, political leaders and a top election strategist were probably under surveillance. The government has made no outright denial that the Israeli software was purchased. But beyond this lies the larger issue of the privacy of every citizen, which has not received much public attention. On June 23, 2021, the Joint Parliamentary Committee examining the Personal Data Protection Bill, 2019 was given its fifth extension by Parliament. While informational privacy is not a priority for the government, it is simultaneously exploring the possibilities of facial recognition technology.

An intrusive technology

To empower the Indian police with information technology, India approved the implementation of the National Automated Facial Recognition System (NAFRS) to facilitate 'crime investigation and detection of criminals' in a quick and time-bound manner. Once implemented, it will serve as a national-level search platform that uses facial recognition technology to facilitate crime investigation or to identify a person of interest (for example, a criminal), regardless of masks, makeup, plastic surgery, a beard or hair extensions.

The technology is intrusive: computer algorithms map unique facial landmarks (biometric data) such as the shape of the cheekbones and lips and the distance from forehead to chin, and convert these into a numerical code called a faceprint. For the purposes of 'verification' or 'identification', the system then compares the generated faceprint with a larger existing database of faceprints, generally available to law enforcement agencies through databases of driver's licences or police mugshots. But the real problem is that facial recognition does not return a definitive result: it only 'identifies' or 'verifies' in probabilities (for example, a 70% likelihood that the person in the image is the person on a watch list). Although the accuracy of facial recognition has improved over the years due to modern machine-learning algorithms, the risk of error and bias still exists. For example, there is the possibility of a 'false positive': a situation where the algorithm finds a match even though there is none, which can result in a wrongful arrest. In addition, research shows that facial recognition software relies on pre-trained models; if certain types of faces (for example, women, children or ethnic minorities) are under-represented in the training dataset, this bias negatively affects its performance.
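To make the probabilistic nature of this matching concrete, the sketch below shows, in simplified form, how a system of this kind might compare a probe faceprint against a stored gallery. It is a minimal illustration, not the actual NAFRS pipeline; the embedding size, the 0.7 threshold and the function names are assumptions for the example.

```python
# Minimal illustration of a facial recognition 'identification' (1:N) query.
# A probe faceprint (an embedding vector) is compared against a gallery of
# stored faceprints; the output is a ranked list of similarity scores, never
# a definitive yes/no. All names, sizes and the 0.7 threshold are assumptions.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two faceprint vectors, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, gallery, threshold=0.7):
    """Return gallery identities whose similarity to the probe exceeds the threshold.

    Lowering the threshold produces more 'false positives' (wrong matches);
    raising it produces more 'false negatives' (missed matches).
    """
    scores = [(name, cosine_similarity(probe, vec)) for name, vec in gallery.items()]
    return sorted((s for s in scores if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)

# Hypothetical 128-dimensional faceprints for 1,000 people.
rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
probe = gallery["person_42"] + rng.normal(scale=0.3, size=128)  # a noisy capture
print(identify(probe, gallery)[:3])  # a candidate list with scores, not a verdict
```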

Because NAFRS will collect, process and store sensitive personal information (facial biometrics) for a long period, if not permanently, it will affect the right to privacy. Accordingly, it is important to examine whether its implementation is arbitrary and therefore unconstitutional: is it 'legitimate', 'proportionate to its need' and 'least restrictive'? What is its potential for misuse and abuse, given the pending status of the Personal Data Protection Bill (PDPB) and the absence of clear guidelines for its deployment? How does it affect other fundamental rights, such as the right to dissent? Should NAFRS be banned or merely regulated?

The Federal Bureau of Investigation in the United States uses facial recognition technology to search for potential leads; police forces in England use facial recognition to deal with serious violence. In other cases, countries like China use facial recognition for racial profiling and mass surveillance, to track Uighur Muslims. With police and law and order being state subjects, some Indian states have started using this new technology without fully appreciating the dangers involved.

Test of ‘Proportionality’

Being an intrusive technology, facial recognition has implications for the right to privacy. The right to privacy is not explicitly mentioned in the Constitution of India. However, a nine-judge bench of the Supreme Court, in Justice K.S. Puttaswamy v. Union of India (2017), recognised it as a valued fundamental right. Since no fundamental right can be absolute, even with respect to privacy the state can impose reasonable restrictions on grounds such as national integrity, security of the state and public order.

In the K.S. Puttaswamy judgment, the Supreme Court laid down a three-fold requirement (reiterated in Anuradha Bhasin, which examined the deprivation of the right to the Internet for the people of Kashmir) as a safeguard against arbitrary state action. Accordingly, any encroachment on the right to privacy requires the existence of a 'law' (to satisfy the legality of the action); a 'need', in terms of a 'legitimate state interest'; and the measure adopted must be 'proportionate' (there should be a rational nexus between the means adopted and the objective pursued) and 'least intrusive'. Unfortunately, NAFRS fails each of these tests.

First, NAFRS lacks 'legality'. It is not backed by any statutory enactment (such as the DNA Technology (Use and Application) Regulation Bill, 2018, in the context of identifying criminals) or by an executive order of the central government; rather, it was merely approved by the Cabinet Committee on Economic Affairs in 2009. Second, and more importantly, even if we assume that NAFRS is needed to tackle modern-day crime, the measure is wholly disproportionate. To meet the test of 'proportionality', the benefits of deploying this technology must be great enough to outweigh its harms. But achieving those benefits would require tracking people at large (CCTV in a public place is notoriously difficult to avoid), with the result that everyone becomes a subject of surveillance: a disproportionate measure. In the absence of a strong data protection law or clear guidelines on where this technology may be used, who can be put on a watch list, and for how long the system will retain the sensitive personal data of those surveilled, NAFRS will in fact do more harm than good.

Impact on rights

From a technical standpoint, facial recognition technology can be tasked with 'identification', among other use cases. Here, one faceprint is compared with many other faceprints stored in a database (known as a 1:N match). In some cases, the identified person is known to exist in the database; in other scenarios, this is not known (for example, when individuals are checked against watch lists). This is where its deployment becomes deeply worrying. With an element of error and bias, facial recognition can result in the profiling of groups that are already over-represented in the criminal justice system (such as the downtrodden and minorities).
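A rough calculation illustrates why 1:N searches against large watch lists are especially error-prone; the sketch below uses illustrative numbers, not NAFRS figures.

```python
# Back-of-the-envelope sketch: even a small per-comparison false positive rate
# compounds in a 1:N search, because the probe face is compared against every
# entry on the watch list. The 0.1% rate and the list size are illustrative only.
def prob_of_any_false_match(per_comparison_fpr: float, watchlist_size: int) -> float:
    """Probability that an innocent probe wrongly matches at least one entry."""
    return 1 - (1 - per_comparison_fpr) ** watchlist_size

# With a 0.1% false positive rate per comparison and 10,000 people on a watch
# list, an innocent person is flagged almost with certainty (~0.99995).
print(prob_of_any_false_match(0.001, 10_000))
```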

Furthermore, as anonymity is vital to the functioning of a liberal democracy, the uncontrolled use of facial recognition technology would discourage independent journalism, the right to assemble peaceably and without arms, and other forms of civil society activism. Because of its adverse impact on civil liberties, some countries have become cautious about the use of facial recognition technology. The Court of Appeal in the United Kingdom declared the use of facial recognition technology by the South Wales Police unlawful for lack of clear guidelines. In the United States, the Facial Recognition and Biometric Technology Moratorium Act of 2020 was introduced in the Senate to prohibit biometric surveillance without statutory authorisation. Similarly, the privacy watchdog of the European Union has called for a ban on facial recognition.

An uncontrolled route

At present, the Information Technology Act, 2000, and the rules made thereunder give the central government broad powers to breach confidentiality in the name of the sovereignty, integrity or security of the state. The Personal Data Protection Bill, 2019 is not much different: it gives the central government unchecked power for surveillance purposes, as it can exempt any government agency from the application of the proposed law in the name of legitimate state interest.

Without adequate safeguards that are punitive and sufficiently deterrent, police personnel may use facial recognition technology routinely. In short, even if facial recognition technology is needed to tackle modern-day criminality in India, without accountability and oversight it has a strong potential for misuse and abuse. In the interest of civil liberties, and to prevent our democracy from sliding into authoritarianism, it is important to halt the use of facial recognition technology until NAFRS is given statutory authority and clear deployment guidelines, and until we enact a strong and meaningful data protection law. If the government so desires, it can pass such a law with great speed, as it did with the OBC Bill or the 20 recently passed Bills, including the three agriculture Bills.

Faizan Mustafa is the Vice Chancellor of the National Academy of Legal Studies and Research (NALSAR) University of Law, Hyderabad. Utkarsh Leo is Assistant Professor at NALSAR, Hyderabad. Views expressed are personal
