[Illustration: a collage of a woman in a headscarf taking a selfie and a man in glasses looking at a screen, both their images distorted in reflection]

Ways of Seeing

Behind the Scenes of "No Facing Away: Why India's Facial Recognition System is Bad News for Minorities"

My first assignment as a journalist was to report on the lives of Africans in Delhi, who at the time had been accused of running a drug and sex racket. A pamphlet distributed in the neighbourhood read: “All landlords are requested not to let their properties to Nigerians or other such disruptive elements.”

The basis of this profiling was deep-seated racism, going back to brown Indians feeling inferior about their complexion because of their history of colonisation by the British, and now trying to feel superior by believing they were better than black people. Many Indians who internalised the colourism they received from their colonisers began passing the discrimination on to anyone darker than themselves, so that they could take solace in not being on the lowest rung of this conveniently concocted hierarchy.

In the past few years, humans have made huge advances in technology while transferring their biases to it. We outsourced our prejudice and called technology neutral because of its non-human form. The lenses were supposed to be new, but those holding the camera, and their vantage point, had not changed much.

Facial recognition technology has since demonstrated this transfer of bias many times over. From wrongful arrests to death threats, the tech has led to the targeting of innocent people, especially those with vulnerable social identities. Such concerns have fuelled resistance against this identification system in many countries. In Europe, Belgium and Luxembourg took an official stand against the technology.

The latest addition to the Unbias the News repository of stories, written by Aishwarya Jagani and illustrated by Victoria Shibaeva, examines the use of facial recognition technology in India and how it particularly threatens minorities. The system’s errors and inaccuracies do not only lead to unfair persecution; they also end up excluding people from state-sponsored benefits when welfare programmes tie their distribution structures closely to such identification systems.

On the other hand, law enforcement in various parts of the world continues to argue in favour of the tech. But how can one know for sure that, given all its drawbacks, this technology won’t add to the long trail of “collaterals” historically created by the criminal justice system? It would not be the first time that technology hailed as revolutionary in its early days led to unjust convictions and was ultimately deemed unreliable.

Technology might be biased, but we will keep trying to unbias the news, and our team deeply appreciates your contribution in helping us do that. If the story resonates with you, please share it in your circles. If you have feedback, critique or suggestions to offer, please send them our way. We always love to hear from our readers and carefully consider your responses.

Please consider a donation to support the work of our all-women newsroom. We create a space for journalists facing structural barriers, working towards a more equitable, inclusive world of journalism. Join our mission today!

