When Technology is the Witness – Facial Recognition Software

Most people discuss the CCTV and digital cameras police use to surveil the public as a privacy issue. And those who are committing, or wish to commit, a crime are careful to avoid cameras for exactly that reason.

Don’t get me wrong, privacy IS an important issue. But would you believe it’s actually not the ONLY reason these shots aren’t the best idea for courts and police to use?

The bigger issue is that the algorithms are inaccurate, to varying degrees, for just about anyone who isn’t a middle-aged white man…

And put that into the hands of a police force that has been shown to be racist/bigoted? ‘Houston, we have a problem!’

How can this be corrected? Well, changing the sampling of people whose faces are used to develop the algorithms in the first place might help (sketched below). You’d also have to improve the cameras and the lighting at each site. None of that will change a racist or biased officer or department, but at least they won’t be able to use the technology to brand someone a criminal and warehouse them in pretty horrific conditions like jails, prisons, and forensic hospitals.
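
To make the sampling fix a little more concrete, here is a minimal Python sketch (not anyone’s actual pipeline) of rebalancing a training set so every demographic group contributes equally before an algorithm learns from it. The record format and group labels are invented purely for illustration.

import random
from collections import defaultdict

def balance_by_group(records, group_key, seed=0):
    """Downsample every demographic group to the size of the smallest,
    so no single group dominates what the algorithm learns from."""
    rng = random.Random(seed)
    buckets = defaultdict(list)
    for rec in records:
        buckets[rec[group_key]].append(rec)  # group training photos by label
    smallest = min(len(bucket) for bucket in buckets.values())
    balanced = []
    for bucket in buckets.values():
        balanced.extend(rng.sample(bucket, smallest))
    rng.shuffle(balanced)
    return balanced

# Invented example: a 90/10 training set gets cut back to 50/50.
photos = ([{"group": "middle-aged white men", "file": f"a{i}.jpg"} for i in range(900)]
          + [{"group": "everyone else", "file": f"b{i}.jpg"} for i in range(100)])
print(len(balance_by_group(photos, "group")))  # 200 (100 from each group)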

As of now, the tech giants are no longer willing to make or sell this tech to police departments, due to its misuse during the BLM protests. So maybe something good did come of it?

…..

The NIST study evaluated 189 software algorithms from 99 developers — a majority of the industry. It focuses on how well each individual algorithm performs one of two different tasks that are among face recognition’s most common applications. The first task, confirming a photo matches a different photo of the same person in a database, is known as “one-to-one” matching and is commonly used for verification work, such as unlocking a smartphone or checking a passport. The second, determining whether the person in the photo has any match in a database, is known as “one-to-many” matching and can be used for identification of a person of interest. (NIST)
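
For the technically curious, here is a rough Python sketch of those two tasks, under the common assumption that a model first turns each photo into an “embedding” vector and matching is just comparing vectors. The function names and the 0.6 threshold are mine, not NIST’s or any vendor’s.

import numpy as np

def cosine_similarity(a, b):
    """How alike two face embeddings are, on a scale up to 1.0."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, enrolled, threshold=0.6):
    """One-to-one matching: does this photo match one enrolled photo?
    The pattern behind unlocking a smartphone or checking a passport."""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, gallery, threshold=0.6):
    """One-to-many matching: does this photo match anyone in a database?
    Returns the best-scoring identity above the threshold, or None."""
    best_id, best_score = None, threshold
    for identity, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

Notice where the danger lives: in the one-to-many case, a threshold that behaves worse for one demographic group produces false “hits” against real, named people in the database.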

…..

A growing body of research exposes divergent error rates across demographic groups, with the poorest accuracy consistently found in subjects who are female, Black, and 18-30 years old. (Harvard)

…..

In June of 2020, IBM, Amazon, and Microsoft announced they would pause or end sales of their face recognition technology to police in the United States. (ACLU)

…..

The systems generally work best on middle-aged white men’s faces, and not so well for people of color, women, children, or the elderly. (ACLU)

…..

Algorithms developed in the US were all consistently bad at matching Asian, African-American, and Native American faces. Native Americans suffered the highest false positive rates.
In one-to-many matching, systems had the worst false positive rates for African-American women, which puts this population at the highest risk of being falsely accused of a crime. (MIT)
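
To make “false positive rate” concrete, here is a small Python sketch of the per-group measurement behind findings like these: a false positive is the system declaring a match between two different people. The numbers below are invented purely to show the arithmetic.

def false_positive_rate(trials):
    """trials: list of (predicted_match, actually_same_person) pairs.
    A false positive is predicting a match for two different people."""
    impostor_calls = [pred for pred, same in trials if not same]
    return sum(impostor_calls) / len(impostor_calls) if impostor_calls else 0.0

def fpr_by_group(trials_by_group):
    """Report the rate per demographic group; one overall average
    would hide exactly the disparity the studies found."""
    return {group: false_positive_rate(trials)
            for group, trials in trials_by_group.items()}

# Invented numbers, just to show the shape of the comparison:
print(fpr_by_group({
    "group A": [(False, False)] * 99 + [(True, False)],        # 1% FPR
    "group B": [(False, False)] * 90 + [(True, False)] * 10,   # 10% FPR
}))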
