No facing away: Why India’s facial recognition system is bad news for minorities
Facial recognition technology is severely limited by racial and gender bias. As India seeks to build one of the world’s largest facial databases, activists fear the impact for minorities beyond black and white.
On a hot May 2021 afternoon in India, when Hyderabad activist SQ Masood stepped out to get some paperwork done, he didn’t expect the police to stop and question him. Nor did he expect them to photograph him without consent.
At the time of the incident, Masood was passing through Shahran Market, a predominantly Muslim neighbourhood of Hyderabad, the capital of the south Indian state of Telangana. His father-in-law was with him.
“There were 8 to 10 police officers there randomly stopping people and questioning them. Quite a few people saw them and turned back. I knew I wasn’t doing anything wrong and wasn’t afraid. So I didn’t turn back.”
A constable stopped Masood and asked him to remove his mask so he could be photographed. Masood wasn’t violating lockdown restrictions; he was well within the 6 am to 3 pm window during which Hyderabad authorities had eased the lockdown. So he found this odd.
In 2020, the Indian government approved a plan to build the National Automated Facial Recognition System (NAFRS or AFRS). Led by the National Crime Records Bureau (NCRB), this system originally aimed to be able to extract facial biometrics from videos and CCTV footage, matching it with photographs present in existing databases.
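At a technical level, systems of this kind typically convert a face image into a numeric embedding and compare it against stored embeddings, reporting the closest candidates above a similarity threshold. The NCRB plan does not specify an algorithm, so the sketch below is purely illustrative: it uses cosine similarity over hypothetical, hand-made embedding vectors (real systems derive 128- to 512-dimensional vectors from a neural network).

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe: np.ndarray, database: dict, threshold: float = 0.8):
    """Return the database entry most similar to the probe, or (None, threshold)
    if nothing clears the threshold -- a "possible, not positive" match."""
    best_id, best_score = None, threshold
    for person_id, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id, best_score

# Hypothetical embeddings standing in for entries from linked databases.
db = {"record_1": np.array([0.9, 0.1, 0.2]),
      "record_2": np.array([0.1, 0.95, 0.3])}
probe = np.array([0.88, 0.12, 0.19])
print(best_match(probe, db))
```

Note that the threshold is a policy choice, not a technical given: lowering it produces more "leads" at the price of more false matches, which is where the civil-liberties concerns below begin.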
Aiding and abetting rights violations
Across the world, facial recognition technology has played a role in wrongful arrests, intrusive surveillance and crackdowns on protests. It is now outlawed in 13 US cities, including San Francisco and Boston. Regulators in Europe are also rethinking the indiscriminate use of facial recognition systems in public spaces. However, India is moving ahead.
Most privacy activists argue that the use of this technology violates human and digital rights.
Anushka Jain, associate counsel (surveillance and transparency) at Internet Freedom Foundation (IFF), an independent organization that advocates for digital privacy and rights in India, told Unbias the News that the Central government of India appears to be leaning heavily on facial recognition for law enforcement and other purposes, all in the absence of a data privacy law.
In December 2019, hundreds of thousands of Indian citizens came out to protest the Citizenship Amendment Act, a new citizenship law believed to be discriminatory towards Muslims and other marginalized communities. During the riots that followed (dubbed a targeted pogrom against the Muslim minority), Delhi Police used a facial recognition system for over 100 of the 1,818 arrests they made.
At another protest at Delhi’s historic Red Fort, against agricultural reforms accused of favouring corporations over farmers, facial recognition technology was used along with CCTV footage to arrest over 200 protestors.
Erroneous exclusion, coerced inclusion
Other instances include the use of facial recognition to authenticate identity for public food grain distribution systems and other welfare programs. Access to subsidized food rations, fertilizers, cooking gas, cash transfers and other social welfare benefits is also governed through a digital, biometric identification system known as UID (unique identification) or Aadhaar.
Internet Freedom Foundation’s Rohin Garg pointed out that this can lead to exclusion from state-funded welfare programs when the system fails to match a person’s biometrics.
For Garg, “coerced inclusion” is also an issue raised by this tech. “What if I don’t want to provide my face to the government? A lot of airports are using facial recognition technology to verify passengers instead of the usual flight tickets they used to hand out. What if you want to opt out of something like that? There’s no alternative available, and I would call this coerced inclusion,” he concluded.
There is strong evidence of various facial recognition systems displaying racial and gender bias, leading to false matches and exclusion.
As Masood put it: “Usually, these practices [of stopping or questioning citizens] have been happening in Old Hyderabad, where you will find more poor citizens, Muslims, and Dalits. This sort of thing doesn’t happen in the new city [of Hyderabad]. They wouldn’t dare ask the people there to take off their masks.”
As MIT’s Gender Shades project showed, facial recognition systems developed by IBM, Microsoft and Face++ displayed relatively high accuracy overall but faltered on certain genders and skin tones. The systems, with overall accuracy of 87.9% to 93.7%, identified male faces 8.1 to 20.6 percentage points better than female faces, and lighter-skinned faces 11.8 to 19.2 percentage points better than darker-skinned faces. IBM has since worked on improving its system, after digital activist Joy Buolamwini pointed out the bias.
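Gaps like these surface only when results are broken down by subgroup, since a single overall accuracy figure averages them away. The snippet below computes per-group accuracy on toy records; the group names and numbers are invented for illustration, not the study’s data.

```python
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, true_label, predicted_label) tuples.
    Returns accuracy per group, exposing disparities that an overall
    accuracy figure would hide."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, truth, prediction in records:
        total[group] += 1
        correct[group] += (truth == prediction)
    return {group: correct[group] / total[group] for group in total}

# Toy data: overall accuracy is 75%, but the groups fare very differently.
toy = ([("lighter_male", "m", "m")] * 9 + [("lighter_male", "m", "f")] * 1 +
       [("darker_female", "f", "f")] * 6 + [("darker_female", "f", "m")] * 4)
print(accuracy_by_group(toy))  # {'lighter_male': 0.9, 'darker_female': 0.6}
```

This kind of disaggregated audit is essentially what Gender Shades did at scale across commercial systems.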
Basis of the bias
“It is important to look at how this bias develops,” said IFF’s associate counsel Anushka Jain.
“This bias develops because the data sets on which these algorithms are developed are themselves biased. So if you get a technology that has been developed on data sets that are predominantly white, in that situation also, they will not be able to identify people of color in India.”
“Possible, not positive” matches
Not everyone agrees, however, that facial recognition systems are biased. Michael Furia, a law enforcement detective and analyst certified in Adobe Photoshop, told Unbias The News, “To say facial recognition technology is biased is ridiculous to me. I would never claim to have a positive match on someone if I wasn’t positive and, even then, we say it is possible, not positive.”
“The idea that Asians or women are more likely to be wrong is not accurate. It is just more difficult to distinguish facial identifiers between Asians and women because of make-up and beauty alterations in mugshots such as eyebrows tattooed on, unusual haircuts, and so on. Let’s face it: you don’t see blonde Asians with freckles and sunburn too often. You don’t see the majority of women sporting beards and trimmed facial hair as often as men,” he said.
Law enforcement worldwide has largely resisted the idea of regulating or banning facial recognition technology.
Targeting the vulnerable
Global investigations, reports and documentaries on technology indicate minority groups are the worst affected by invasive tech in the hands of majoritarian governments.
Talking about the time when cops photographed him on the streets, Masood shares, “My apprehension is that if they didn’t use my photograph to generate a challaan [a document issued upon a violation of traffic rules by the police], why did they take a photograph at all? Have they deleted it or stored it in some database? If they’ve stored it, what will they do with it? Which other databases will they link it to, and who will they share it with? For what purpose will they use it?”
Building a database is critical to the National Crime Records Bureau’s plan to deploy the Automated Facial Recognition System (AFRS) across the country.
In a 172-page Request for Proposal released by the NCRB in 2019 (and since removed but obtainable in cache), the originally desired specifications for such a system are laid out in detail: “The system shall be able to broadly match a suspect/criminal photograph with database created using photograph images available with Passport, CCTNS, ICJS and Prisons, Ministry of women and child development (KhoyaPaya) State or National Automated Fingerprint Identification System or any other image database available with police/other entity. Match suspected criminal face from pre-recorded video feeds obtained from CCTVs deployed in various critical identified locations, or with the video feeds received from private or other public organization’s video feeds.”
In response to legal demands by IFF, the original RFP was removed and replaced with one that excludes references to using facial images from CCTV video feeds. However, the vagueness of the new language still leaves the door open for video footage to be used, according to IFF.
The potentially broad scope of the program is chilling, particularly in a country where law enforcement has openly committed atrocities against Muslims, and has repeatedly displayed a communal and casteist attitude.
Not surprisingly, Masood expressed concern over where his photograph could end up: “Telangana police have developed an app called TSCOP that every constable has, and they’ve linked different databases to it. Databases of prisoners, history sheeters and others I don’t know of. If my photograph ends up in that database, what category will it fall under? I don’t know that and it worries me, especially since I’m Muslim and being a Muslim in India . . . well, you know how it is.”
Investigative tool or surveillance weapon?
Civil and digital rights activists globally have been calling for a complete ban on facial recognition systems. Anushka Jain from India’s Internet Freedom Foundation told Unbias The News that even though the organization is calling for a complete ban, it understands why that might not be feasible in India, and is willing to work with the government on building safeguards and regulating the use of this technology.
But not everyone wants a ban.
“Facial recognition is never telling or giving you permission to lock someone up. It is merely a lead telling you that the person of interest looks almost identical (in most cases) to the suspect of the crime and you may want to speak to that person or keep an eye on him until you develop probable cause. I can’t see how that is a violation of anyone’s rights and am very angered to think the public wouldn’t want to have such a valuable tool at law enforcement’s hands when there are so many shootings and even terroristic acts in the modern world,” Furia added.
Law enforcement and governments remain at loggerheads with privacy and digital rights advocates over the use of facial recognition systems, but repeated human rights violations across the world show that this technology comes at a steep cost.
The question is: can we afford it?