A Closer Look at Proposed Dallas Police Facial Recognition Software Use

The DPD and Clearview AI are entering into a three-year, roughly $89,000 contract to be paid from grant money, meaning it didn’t require city council approval.

Proponents call facial recognition an important crime-fighting and security tool. Critics, however, warn the growing use of the technology brings with it the danger that flawed identification could put innocent people in prison. (TNS)
The Dallas Police Department will soon use facial recognition software that has been mired in privacy concerns to help identify people in criminal investigations.

The program, Clearview AI, bills itself as the largest law enforcement database in the world with more than 40 billion public photos from the Internet. Police submit a photo of an unidentified person, and, within seconds, the software can produce possible matches based on photos scraped from public social media profiles, media outlets and other websites.
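
Under the hood, a search like that is conventionally built as an embedding lookup: a model converts each face into a numeric vector, and a probe photo is matched against the gallery by vector similarity. Clearview has not published its internals, so the following is only a minimal sketch of the general pattern, with a hypothetical embed() model and placeholder gallery rather than anything the company has confirmed.

```python
# Illustrative sketch of the search pattern described above: a probe photo
# is reduced to an embedding vector and compared against a gallery of
# embeddings precomputed from scraped public photos. The embed() model and
# the gallery here are hypothetical placeholders, not Clearview's system.
import numpy as np

def embed(photo_bytes: bytes) -> np.ndarray:
    """Placeholder for a proprietary face-embedding model; assumed to
    return a unit-length vector for one detected face."""
    raise NotImplementedError

def top_matches(probe: np.ndarray, gallery: np.ndarray,
                source_urls: list[str], k: int = 10) -> list[tuple[str, float]]:
    # For unit-length vectors, cosine similarity is just a dot product.
    scores = gallery @ probe
    best = np.argsort(scores)[::-1][:k]
    # Results are candidate leads with source links, not identifications.
    return [(source_urls[i], float(scores[i])) for i in best]
```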

Dallas Police Chief Eddie García told The Dallas Morning News the software can expedite investigations and help detectives amid the staffing shortages plaguing the department.

“It will be a game changer,” García said in a sit-down interview at police headquarters. “We’re not going to snap our fingers and get 400 more detectives in the next year or two. We have to look at technology to leverage the amazing work that our detectives do every day.”

The technology is used by law enforcement agencies across the U.S. but has been criticized by the ACLU and data privacy experts. They describe Clearview as a secretive company that wields enormous, unchecked power and collects photos without consent, and they say the technology has led to wrongful arrests in other cities.

Dallas police officials plan to begin using the software this year, with policies that strictly limit who will have access to the program and how it can be used.

The software will only be used to generate investigative leads, said Maj. Stephen Williams, who oversees DPD’s Fusion Center, also known as its intelligence hub. Only 15 DPD analysts will have access to the software, and they’ve already undergone 32 hours of FBI facial identification training and implicit bias training that will be required annually, according to Williams.

He emphasized the results will not be used as positive identification or to carry out an arrest without additional corroboration.

“Even if there’s a 100 percent match, that is not even close enough for a detective to go kick down a door or get an arrest warrant,” García said. “You still need to corroborate everything.”

Police will only run the software for certain investigations at first, Williams said, citing murders, aggravated assaults, robberies, sexual offenses, kidnapping, human trafficking, arson, deceased people, crimes with a terrorism nexus and threats to homeland security.

Although the technology itself is new, the chief said, the technique is not. He said detectives have for decades entered descriptions of a suspect into databases to try to find a match. With Clearview AI, detectives can do that in “a much, much quicker time frame,” he said.

Nate Wessler, deputy director of the speech, privacy and technology project for the national ACLU office, said Clearview is nothing like what police used to do because “it’s an incredible power to put in police’s hands” that has been “assembled in an incredibly abusive and privacy invasive way.” He said it’s distinct from other facial recognition technologies because it brings its own matching database to police, rather than relying on police data like mugshots.

“Will this technology make true matches some of the time?” Wessler said. “Yeah, I’m sure it will, and that may seem helpful to police. But there are lots of things that would be helpful to police that we as a society decided are just too invasive or too risky.”

Clearview — led by CEO Hoan Ton-That — has “been scored by the National Institute of Standards and Technology (NIST) as the most accurate algorithm for recognizing faces in the country and second most accurate in the world,” according to its website.

After it launched in the late 2010s, the software was available to private companies and individuals in addition to the government, spurring concerns that people could use it to identify anyone at any time.

Then, in 2020, the ACLU sued Clearview under an Illinois law that prohibits private companies from using people’s unique biometric identifiers, such as their faces, without notice and consent.

The lawsuit was settled with a nationwide prohibition on Clearview providing its services to private companies or people. The company could continue selling to government contractors.

“This technology is too dangerous for police use,” Wessler said. “Both because of the ways it doesn’t work well — the misidentifications leading to wrongful arrests — and dangerous because of its potential to give police the power to identify people anywhere and everywhere.”

HOW IT WILL WORK IN DPD


At DPD, Williams said, the software will not be used proactively. Detectives must already have a criminal offense to investigate, “the case has to be there,” before they can request the use of Clearview.

According to Williams, a detective will send a request to run a photo or photos in Clearview to their supervisor, who will determine whether it fits DPD’s criteria.

Once approved by a supervisor, the request will go to the Fusion Real Time Crime Center — which will house the 15 analysts, including three supervisors, Williams said. Two analysts will separately run the photo in Clearview’s system to return a positive or a negative result indicating whether there’s a match.

If both come back positive, the results will be sent to the detective as an investigative lead. If the analysts get two different results, a Fusion supervisor will decide.
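
Taken together, the process Williams describes amounts to a two-stage gate: a supervisor screens the request against the eligible offense categories, then two analysts run the photo independently and must agree. Below is a minimal sketch of that decision logic; the offense list paraphrases the categories cited earlier, and every name in it is illustrative rather than DPD's.

```python
# Sketch of the approval-and-double-review flow described above. The
# offense categories and function names are illustrative, not DPD's code.
ELIGIBLE_OFFENSES = {
    "murder", "aggravated assault", "robbery", "sexual offense",
    "kidnapping", "human trafficking", "arson", "deceased person",
    "terrorism nexus", "homeland security threat",
}

def review_request(offense: str, analyst_a: bool, analyst_b: bool,
                   supervisor_tiebreak: bool = False) -> str:
    if offense not in ELIGIBLE_OFFENSES:
        return "rejected: outside DPD criteria"
    if analyst_a and analyst_b:          # both analysts found a match
        return "investigative lead"
    if analyst_a != analyst_b:           # split result: supervisor decides
        return "investigative lead" if supervisor_tiebreak else "no lead"
    return "no lead"                     # both negative
```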

Clearview AI also returns links to where the photo appears online, according to its website. Williams declined to specify whether detectives would reach out to social media users if Clearview AI links to their posts, and whether DPD would have policies about that outreach, saying only: “Without getting into specific investigative techniques, the department currently utilizes social media during investigations and in following up on leads.”

Asked if the department will mandate that detectives disclose in arresting documents when Clearview is used to identify someone, he said the policies are “currently being finalized.”

Parameters will be laid out in DPD’s policies, he said. Police are prohibited from using the technology on First Amendment activities, like protests, or on any livestream, such as video events on Twitch, Williams said.

Police are still in the procurement process, but Williams said it should be in place well before the end of the year. Williams said it’s a three-year contract that costs about $89,000 total and will be paid for with grant money, meaning it didn’t require city council approval.

He declined to estimate how large the workload might be in the initial months, saying they need to walk before they run. He said they researched best practices and policies before moving forward on the technology and will do monthly audits in the Fusion Center.

Police will return to the city council’s Public Safety Committee every six months, he added, to detail how many searches they ran and what the results were. The committee signed off on the use of the technology with few questions at its monthly meeting Monday.

Williams said the Fusion Center won’t store the photos; it will keep track of the case number, the offense, what the request was, who asked for it and who approved it. Policies will make clear the technology is only meant to supplement investigative work, he emphasized.
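
That retention scheme maps naturally onto a metadata-only audit record. The following is a hedged sketch with field names assumed for illustration; DPD's actual schema has not been made public.

```python
# Sketch of the audit entry described above: request metadata is logged
# for the monthly audits, but no photos are stored. Field names are
# assumptions; DPD's real schema is not public.
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class ClearviewAuditEntry:
    case_number: str       # the underlying criminal case
    offense: str           # must fall within the eligible categories
    request_summary: str   # what was asked for (no image data retained)
    requested_by: str      # detective who made the request
    approved_by: str       # supervisor who signed off
    logged_at: datetime
```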

“AI is now superseding the human brain at a hundred times the ability,” Williams said. “Everything is captured on video ... there’s so many sorts of avenues that if we can follow the investigation, it will help us try to identify who these people are because sometimes it is just a challenge.”

PRIVACY AND ACCURACY CONCERNS


Facial recognition technology has become increasingly popular among law enforcement agencies, said Caitlin Chin-Rothmann, a research fellow at the Center for Strategic and International Studies. In 2022, she co-authored a report about Clearview that found more than 3,000 law enforcement agencies were using the software.

The software “has really dramatically expanded the possible scope of facial recognition,” Chin-Rothmann said, but there isn’t much accountability for how law enforcement uses it. She said there’s no federal law that applies to how all agencies should employ it.

Historically, facial recognition has been “mostly accurate” but is known to be less accurate for women and people with darker skin, she said. The technology has improved over time, she noted, but because of historical biases and policing patterns, Black and Latino people are more likely to already appear in law enforcement databases. That means facial recognition can set up a reinforcing cycle that can lead to false or disproportionate arrests of non-white people, she said.

Transparency is paramount, according to Chin-Rothmann. If someone is arrested in part because of a facial recognition match, investigators need to provide that information so defendants can challenge it, especially with the possibility of a bias or mismatch, she said.

There needs to be an accountability measure, she said, like assessments or public disclosures about how facial recognition is used and its general outcomes and statistics. She said DPD’s decision to present to the Public Safety Committee is “a step in the right direction.”

“It is good the Dallas Police Department has put thought into policies around facial recognition,” she said. “Especially limiting the context in which facial recognition can be used, requiring human oversight and also limiting the number of people who have access to these services.”

Clearview AI has said on its website that it has a more than 99 percent accuracy rate. But Wessler, of the ACLU, said it’s unclear how often the software misidentifies people under real-world conditions, as opposed to tests in a controlled laboratory. As an example, he said, police could pull an image from a gas station camera where the lighting is poor and the person’s face is turned away.

“In those conditions, face recognition in general is much, much less accurate,” he said.

Trained analysts are “still humans and still are going to make mistakes,” Wessler said. He said the technology is designed to identify faces that look like the face entered in the system, not to provide a positive identification of a sure match.

In a statement Tuesday to The News, Clearview CEO Ton-That said the company’s tests have been done using “many algorithms,” including images from the real world, such as mugshots, webcam stills and photos taken from different angles with varying resolutions and lighting quality.

“The tests are conducted with millions of different images from a diverse set of demographics,” Ton-That said, adding the company’s accuracy was greater than 99 percent across all demographics.

He said the algorithm also “correctly matched the correct face out of a lineup of 12 million photos at an accuracy rate of 99.85 percent, which is much more accurate than the human eye.” He did not comment when asked about assertions that Clearview has led to wrongful arrests in some cities.
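
Simple base-rate arithmetic helps explain why both sides insist that results are leads rather than identifications: even a tiny per-comparison false-match rate, multiplied across a gallery of tens of billions of photos, can surface multiple lookalikes in every search. The rate below is an assumption chosen purely for illustration, not a figure published by Clearview or NIST.

```python
# Back-of-the-envelope illustration of the base-rate problem with very
# large galleries. The false-match rate is assumed for illustration only.
gallery_size = 40_000_000_000      # photos Clearview says it has indexed
false_match_rate = 1e-9            # assumed chance a non-match scores as a match

expected_false_candidates = gallery_size * false_match_rate
print(expected_false_candidates)   # 40.0 -> dozens of lookalikes per search
```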

Wessler said more than 20 cities across the U.S. have banned the use of facial recognition, and the ACLU knows of at least seven cases, including one in Houston, in which police arrested innocent people after relying on an incorrect face recognition match. In some of those cases, policies were in place that required training, limited how the technology could be used and warned against treating a match as probable cause on its own, but they were insufficient to prevent a wrongful arrest, he said.

‘WE’RE BEHIND THE CURVE’


Dallas officers made headlines in the past for their unauthorized use of Clearview AI.

In 2021, the news outlet Gizmodo revealed that Dallas police officers used the software without department approval for hundreds of searches to identify people based on photographs.

Dallas police hadn’t entered into a contract with Clearview at the time, but officers downloaded the app by visiting the company website, Gizmodo reported, noting that some officers had installed it on their personal phones. Department leaders ordered it deleted from all city devices.

Williams said Clearview won’t be used by patrol officers; it is solely for investigators working to identify someone. Dallas’ policies, he noted, are meant to avert misuse.

García said Dallas was behind on implementing facial recognition technology because of long-held concerns about it. DPD is not naive about those worries, the chief said, adding the department has learned from mistakes made elsewhere and that the program has matured since its inception.

If implemented properly with checks and balances, it could be a major asset, the chief said.

“Nothing’s ever going to replace more detectives,” García said, “But if there’s technology that can make our detectives more efficient, can make their day-to-day lives better, that’s something we have to do.

“I’ll be honest with you, in this realm, we’re behind the curve. We need to get into this game.”

©2024 The Dallas Morning News. Distributed by Tribune Content Agency, LLC.