Video analysis in a recent violent crime investigation exemplifies why Lincoln police want to pay about $21,600 for new software from a California company, Chief Jeff Bliemeister said. 

His staff needed to watch more than 162 hours — almost seven days of film — to pinpoint times on surveillance footage where they might find skateboarders or bicyclists they wanted to talk to, the chief said. 

New video processing software from Vintra, the sole bidder for a Lincoln police contract, could save time by isolating moments on the video when a face or object, such as a car or bike, appears.


As more video surveillance has become available, reviewing that evidence has eaten up more and more time for Lincoln police officers in the last decade. 

Lincoln police reviews of digital evidence, such as video, for cases ranging from routine vandalism to murder grew from 170 in 2009 to 531 in 2018, a 212% increase, according to the department.

Earlier this year, the department added a video analyst and announced it will rededicate veteran officers to help review video evidence in follow-up investigations.

The chief said he hopes the Vintra Fulcrum AI software the department is considering buying will free up time for officers investigating crimes.

“What we’re really trying to do is narrow the manual review of any existing video,” Bliemeister said. 

But in addition to the motion-detection search function, the software would expand the department's facial recognition capabilities, a law enforcement practice that continues to concern civil liberties and privacy advocates.

The department wants the software program to be capable of checking faces of suspects in video against LPD’s internal mugshot database, too.

The software needs to be capable of working with video captured on surveillance cameras, body cameras, dashboard cameras and cellphones. It would then deliver an analytical report on any matches to police in seconds, according to the police department’s request for proposals. 

Facial recognition is a secondary software feature that his department would use with care, Bliemeister said.

The American Civil Liberties Union, among others, says the use of facial recognition technology by police departments can put innocent people under investigation because of inaccurate algorithms.  

In May, San Francisco became the first major U.S. city to ban the technology's use by its police force. Other cities have followed. Last week, Michigan legislators introduced a bill to outlaw its use in the state.

“In 2019, I think our society’s enthusiasm for technology often obscures a more deliberate and contemplative effort to look at where technology succeeds and where it fails,” said Chad Marlow, senior advocacy and policy counsel with the ACLU’s national office in New York. 

Facial recognition technology is commonly used to tag people in photos uploaded to Facebook, and Google uses it to help people identify faces in their photos.

“When you’re talking about police departments using it, that’s no joke,” Marlow said.

Privacy concerns aside, the faulty technology isn’t “ready for prime time” and is so inaccurate police can’t use it responsibly, Marlow said. 

In 2013, Lincoln police began submitting suspect images for facial recognition analysis by the Nebraska Department of Motor Vehicles. The DMV used its driver’s license photo database to determine if there were matches.

The ACLU has called on the department to stop using facial recognition technology. 

Video evidence can be crucial to cases, Bliemeister said, but analyzing it can be laborious for the department’s two technicians. The department has tasked injured officers unable to be on the street with watching hours and hours of video to isolate relevant snippets for investigations. 

The software could help police link repeat crimes, such as serial shoplifting cases, to culprits unknown to investigators, he said.

“I understand people’s hesitancy about facial recognition,” he said. “But it’s not just a facial recognition match that is going to lead to the arrest. It has to be taken within the context of the totality of the investigation.”

Marlow said the software has shown itself to be inaccurate analyzing people of color, women, older people and children. 

What’s more, the software’s problems analyzing the faces of racial minorities are magnified when its algorithms are checking against databases of mugshots, he said.

Those databases often include disproportionate numbers of people of color, Marlow said. And he worries that inaccurate facial recognition software puts an already over-policed population at risk of being pinpointed as a suspect because of a false positive.

There’s further risk that officers who evaluate possible matches may be more likely to side with the software on close calls, Marlow said.

A recent report on police use of the technology highlighted problems at the New York Police Department, including instances where a suspect's facial features had been digitally altered or sketch artist renderings were input into the program.

Photo quality also poses an issue, the report’s authors noted.

“Photos that are pixelated, distorted, or of partial faces provide less data for a face recognition system to analyze than high-quality, passport-style photos, increasing room for error,” the report said. 

Bliemeister said the quality of video images police receive continues to improve, and this software won’t be used as a substitute for police investigation. 

Even a seemingly clear photo of a face doesn't always yield clear matches with that technology, the chief acknowledged.

That’s why analysts in the police department would examine any leads generated by the software, he said. 

His staff follow a protocol directing use of the state's software, and a 2013 City Council agreement does not allow Lincoln police to use a search's findings as the sole basis for an arrest.

In the request for bids, LPD also asks that the software be able to search real-time video feeds, such as those from city traffic cameras.

Asked whether the department plans to employ such a practice, Bliemeister said “absolutely not.”

“We are not trying to create this network of cameras that is actively searching the mugshot database to find out who’s got warrants that is walking down the street,” Bliemeister said.

To Marlow, the police use of this technology in real-time is “all those problems on steroids.”

“It’s enough to make George Orwell shudder,” he said. 




