Even as big tech companies such as Amazon limit sales of facial recognition software to law enforcement, one company has not: Clearview AI, whose facial recognition search engine contains three billion images scraped from the internet.
More than 3,000 U.S. law enforcement agencies employ the software, which uses an advanced algorithm to identify and match faces, the company says.
“The way it works is very similar to Google, but instead of putting in words, you're putting in photos of faces, and it will find anything publicly available on the internet that looks like that face,” said Hoan Ton-That, chief executive and co-founder of the company.
Police argue that facial recognition software is an important tool in fighting and solving crimes. But its increasing use has raised concerns that there are too few rules in place for when and how police can use it.
Limiting the scope of software
Police typically have image search engines at their disposal that pull drivers’ license pictures or other photos from police records.
Clearview AI, in contrast, has gathered billions of images from social media sites and other websites, which internet firms say were obtained by breaking their rules.
Clearview AI’s Ton-That says that the company only pulls publicly available information.
In one case, federal agents identified a man suspected of sexually abusing a girl by taking a single image from the “dark web,” an area of the internet accessible only with special software, and matching it through Clearview AI.
“He was in the background of someone else's photo at the gym, in the mirror,” said Ton-That. “They were able to identify where the gym was, identify the person, he ended up doing 35 years in jail and they saved a seven-year-old.”
A tool for law enforcement
The software was also instrumental in helping federal, state and local law enforcement identify suspects who stormed the U.S. Capitol in January, according to Ton-That.
Because Clearview AI built its database from people’s social media accounts and other public parts of the internet, it was well suited to help with this massive investigation of people whose mug shots wouldn’t necessarily be in police databases, he said.
Police were able to use Clearview AI, which takes about a second per search, to find matching photos of some suspects online, he said.
“So they were able to quickly identify them, and reduce a lot of false-positives, and also speed up the investigative process,” he said.
What about privacy?
When protests against police violence swept the U.S. last year, Amazon and other tech firms suspended sales of their facial recognition technology to law enforcement, a suspension they have said is indefinite.
Clearview AI continues to sell to law enforcement, while internet firms such as Facebook, Google and Twitter, along with civil rights advocates, are raising the alarm about the software’s power and its potential to violate people’s privacy.
The American Civil Liberties Union (ACLU) has sued the company in Chicago and California.
Kate Ruane, senior legislative counsel for the ACLU, said that facial recognition technology raises the specter of the government “being able to surveil us throughout every single aspect of our lives.”
Federal, state and local governments, she says, “do admit that they use it, but they don't tell us how, when or how often.”
There needs to be oversight and regulation, she said, but until then, she is calling for a total moratorium on law enforcement use of facial recognition technology.
Legislation & regulation
In recent months, congressional leaders have introduced bills that would limit police use of purchased data that was “illegally obtained” via deception or breach of contract.
Clearview’s Ton-That agrees that there needs to be more transparency and even regulation around the technology’s use. But as for banning police use of Clearview?
“Given the success of our technology in solving crimes, especially crimes against children, it would be counterproductive and inappropriate to enact a moratorium or ban of facial recognition or Clearview AI’s product,” he said.
Ton-That says the company has a code of conduct for customers and has built prompts into its software to help law enforcement customers prevent misuse.
Repressive governments’ use of facial recognition tech
The ACLU and other civil rights groups are also concerned about the implications of this technology in the hands of repressive governments such as China’s.
“Because the implications are terrifying,” said the ACLU’s Kate Ruane, “especially what is going on in China, where it is trying to track citizens across every single aspect of their lives.”
Ton-That says his company does not sell its software to foreign governments and is focusing for now on law enforcement in the U.S.
“We've worked occasionally with some other private entities for investigative purposes, but we've decided just to focus on law enforcement,” he said. “It's the easiest, most explainable and best use case of our technology.”
Words in this Article
Facial recognition – n. a technology capable of matching a human face from a digital image or a video frame
Law enforcement – n. the activity of some members of government who act in an organized manner to enforce the law
Accessible – adj. easy to approach or use
Well-suited – adj. appropriate for a particular purpose
Indefinite – adj. unlimited, no specified limit
Specter – n. ghost, some object of terror
Moratorium – n. a suspension of activity
Counterproductive – adj. tending to go against one’s purpose
Comprehension Questions
What is the main concern about police use of facial recognition software?
What example was given of the technology being used to solve a crime?
What is the ACLU concerned about?