Body Cam Ethics

Axon, the biggest supplier of police body cameras and software in the U.S., just called it quits on using artificial intelligence-powered facial recognition systems for its devices, The New York Times reports.

"Face recognition technology is not currently reliable enough to ethically justify its use," the company’s independent ethics board concluded in a review.

Formerly known as Taser, Axon supplies police agencies with a lineup of body cameras, non-lethal weapons, and even law enforcement drones. The decision is yet another sign that facial recognition technology may not be ready for prime time, especially in light of the technology's inherent biases.

"At the least, face recognition technology should not be deployed until the technology performs with far greater accuracy and performs equally well across races, ethnicities, genders, and other identity groups," reads the ethics board's review.

"Whether face recognition on body-worn cameras can ever be ethically justifiable is an issue the Board has begun to discuss, and will take up again if and when these prerequisites are met."

A Growing Trend

The news comes after San Francisco became the first city in the U.S. to ban the use of facial recognition software by police and other municipal agencies back in May.

Even Microsoft turned down a request from law enforcement to use its facial recognition tech. "If we move too fast with facial recognition, we may find that people's fundamental rights are being broken," wrote the company's president Brad Smith in a public letter back in July 2018.

Despite the backlash, the concept is catching on outside the U.S. as well. Police officers in Zhengzhou, China, were spotted wearing sunglasses equipped with facial recognition software early last year. And in 2017, it took just seven minutes for China's nationwide CCTV camera network to identify a BBC reporter.

Biometric Bias

At the root of the controversy surrounding the use of facial recognition in law enforcement is the fact that the systems, at least in their current state, are far less accurate at identifying people of certain ethnicities and genders than others.

The technology "does not perform as well on people of color compared to whites, on women compared to men, or young people compared to older people," concluded Axon's ethics board.

And there's plenty of evidence to back that claim up. Retail giant Amazon's Rekognition system, software that is used by police and Immigration and Customs Enforcement (ICE), has drawn plenty of criticism in the past. MIT researchers found that Rekognition had far more trouble identifying an individual's gender if they were female or dark-skinned.

The same piece of software even falsely matched 28 members of Congress to mugshots from a publicly available database, the American Civil Liberties Union (ACLU) found in an experiment last year.

Nip It in the Bud

Axon has yet to deploy any facial recognition software in its body cameras, and for now it intends to keep it that way.

"Consistent with the board's recommendation, Axon will not be commercializing face matching products on our body cameras at this time," reads Axon's press release.

But that doesn't mean the company's willing to give up on the technology altogether. By working "in tandem with leading independent researchers to de-bias training data and algorithms," Axon could still eventually bring the technology to its lineup of cameras.

READ MORE: A Major Police Body Cam Company Just Banned Facial Recognition [The New York Times]

More on facial recognition: Police Catch Woody Harrelson Look-Alike Using Facial Recognition

