Public agencies using facial recognition software without oversight

An anonymous source supplied BuzzFeed News with usage data from Clearview AI, the facial recognition service that has been banned by cities and police departments around the country. Many agencies still used or at least tried it:

The data, provided by a source who declined to be named for fear of retribution, has limitations. When asked about it in March of this year, Clearview AI did not confirm or dispute its authenticity. Some 335 public entities in the dataset confirmed to BuzzFeed News that their employees had tested or worked with the software, while 210 organizations denied any use. Most entities — 1,161 — did not respond to questions about whether they had used it.

Still, the data indicates that Clearview has broadly distributed its facial recognition software to federal agencies and police departments nationwide, offering the app to thousands of police officers and government employees, who at times used it without training or oversight. Often, agencies that acknowledged their employees had used the software confirmed it happened without the knowledge of their superiors, let alone the public they serve.

BuzzFeed News also made a searchable table so you can see if your local agencies are on the list.


Inadequate hate crime statistics

For ProPublica, Ken Schwencke reports on a poor data system that relies on local law enforcement to voluntarily enter data:

Local law enforcement agencies reported a total of 6,121 hate crimes in 2016 to the FBI, but estimates from the National Crime Victimization Survey, conducted by the federal government, pin the number of potential hate crimes at almost 250,000 a year — one indication of the inadequacy of the FBI’s data.

“The current statistics are a complete and utter joke,” said Roy Austin, former deputy assistant attorney general in the Department of Justice’s civil rights division. Austin also worked at the White House on data and civil rights and helped develop an open data plan for police data.

Garbage in, garbage out.
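
To put that gap in perspective, here's a quick back-of-the-envelope calculation using the two figures cited above. The NCVS number is an annual estimate, so treat this as order-of-magnitude only:

```python
# Reported vs. estimated hate crimes, using the figures cited above.
reported = 6_121      # hate crimes reported to the FBI in 2016
estimated = 250_000   # NCVS estimate of potential hate crimes per year

print(f"Share captured in FBI data: {reported / estimated:.1%}")
# roughly 2.4% of the estimated total
```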


Tracking what happens to police after use of force on protesters

You’ve probably seen the videos. ProPublica is tracking what happens afterward:

ProPublica wanted to find out what happens after these moments are caught on tape. We culled hundreds of videos to find those with the clearest examples of officers apparently using a disproportionate level of force against protesters and reached out to 40 law enforcement agencies about the 68 incidents below. For each incident, we inquired about any disciplinary action, investigations and whether the department would disclose the officer or officers involved. While some departments provided details or relevant public records, others leaned on state laws to withhold information.

See also ProPublica’s recent release of NYPD civilian complaints against police officers.


Algorithm leads to arrest of the wrong person

Even though there was supposedly a human in the decision-making process, and the person in the surveillance photo wasn’t actually Robert Julian-Borchak Williams, he still ended up handcuffed in front of his own home. Kashmir Hill, reporting for The New York Times:

This is what technology providers and law enforcement always emphasize when defending facial recognition: It is only supposed to be a clue in the case, not a smoking gun. Before arresting Mr. Williams, investigators might have sought other evidence that he committed the theft, such as eyewitness testimony, location data from his phone or proof that he owned the clothing that the suspect was wearing.

In this case, however, according to the Detroit police report, investigators simply included Mr. Williams’s picture in a “6-pack photo lineup” they created and showed to Ms. Johnston, Shinola’s loss-prevention contractor, and she identified him. (Ms. Johnston declined to comment.)


Police perception vs. public perception

The numbers are from a survey by the Pew Research Center conducted in 2016. I suspect the percentages are higher right now, but the gaps between police and public perception seem to say a lot. It’s easy to see where “one bad apple” comes from.


How police use facial recognition

For The New York Times, Jennifer Valentino-DeVries looked at the current state of facial recognition in law enforcement:

Officials in Florida say that they query the system 4,600 times a month. But the technology is no magic bullet: Only a small percentage of the queries break open investigations of unknown suspects, the documents indicate. The tool has been effective with clear images — identifying recalcitrant detainees, people using fake IDs and photos from anonymous social media accounts — but when investigators have tried to put a name to a suspect glimpsed in grainy surveillance footage, it has produced significantly fewer results.

Not quite CSI levels yet, huh.


Data for 200M traffic stop records

The Stanford Open Policing Project just released a dataset for police traffic stops across the country:

Currently, a comprehensive, national repository detailing interactions between police and the public doesn’t exist. That’s why the Stanford Open Policing Project is collecting and standardizing data on vehicle and pedestrian stops from law enforcement departments across the country — and we’re making that information freely available. We’ve already gathered over 200 million records from dozens of state and local police departments across the country.

You can download the data as CSV or RDS, and there are fields for stop date, stop time, location, driver demographics, and reasons for the stop. As you might imagine, the data from various municipalities comes in varying degrees of detail and covers varying timespans. I imagine there’s a lot to learn here, both from the data and from working with the data.
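
As a starting point, here's a minimal sketch in Python of a first pass at one state's file. The file name is a placeholder, and the column names are assumptions based on the fields listed above, so check the project's codebook for the actual schema:

```python
import pandas as pd

# Load one state's stops file. "state_stops.csv" is a placeholder name,
# and the column names below are assumptions based on the fields the
# project describes; verify them against the codebook.
stops = pd.read_csv("state_stops.csv", parse_dates=["stop_date"])

# Share of stops by driver race
print(stops["driver_race"].value_counts(normalize=True))

# Stops per month, to see the timespan and any gaps in coverage
print(stops.set_index("stop_date").resample("M").size().head())
```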


Inflated counts for cleared rape cases

Newsy, Reveal and ProPublica look into rape cases in the U.S. and law enforcement’s use of exceptional clearance:

The designation allows police to clear cases when they have enough evidence to make an arrest and know who and where the suspect is, but can’t make an arrest for reasons outside their control. Experts say it’s supposed to be used sparingly.

Data culled from various police departments shows the designation is used more often than one would expect.
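
Part of the problem is that the headline clearance rate lumps arrests and exceptional clearances together, so heavy use of the designation makes a department look better at closing cases than its arrests alone would suggest. A toy example with made-up numbers:

```python
# Made-up numbers showing how exceptional clearance inflates the
# headline clearance rate, which counts both kinds of clearance.
cases = 100
cleared_by_arrest = 20
cleared_exceptionally = 25  # no arrest made, but still counted as cleared

headline = (cleared_by_arrest + cleared_exceptionally) / cases
by_arrest = cleared_by_arrest / cases

print(f"Headline clearance rate: {headline:.0%}")    # 45%
print(f"Cleared by actual arrest: {by_arrest:.0%}")  # 20%
```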


The Crime Machine

I’m behind on my podcast listening (well, behind in everything tbh), but Reply All covered the flaws of CompStat, a data system originally employed by the NYPD to track crime and hold officers accountable:

But some of these chiefs started to figure out, wait a minute, the person who’s in charge of actually keeping track of the crime in my neighborhood is me. And so if they couldn’t make crime go down, they just would stop reporting crime. And they found all these different ways to do it. You could refuse to take crime reports from victims, you could write down different things than what had actually happened. You could literally just throw paperwork away. And so that guy would survive that CompStat meeting, he’d get his promotion, and then when the next guy showed up, the number that he had to beat was the number that a cheater had set. And so he had to cheat a little bit more.

I sat in on a CompStat meeting years ago in Los Angeles. I went into it excited to see the data system that helped decrease crime, but I left skeptical after hearing the discussions over such small absolute numbers, which in turn made for a lot of fluctuations percentage-wise. Maybe things are different now a decade later, but I’m not surprised that some intentionally and unintentionally gamed the system.
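
For a sense of what I mean about small numbers, a difference of just two incidents reads as a dramatic swing:

```python
# Hypothetical burglary counts for one small area, month over month.
last_month, this_month = 4, 6
print(f"{(this_month - last_month) / last_month:+.0%}")  # +50%
```

A chief staring down a +50 percent line item in a CompStat meeting is really looking at two extra burglaries.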

See also: FiveThirtyEight’s CompStat story from 2015.


Predictive policing algorithms used secretly in New Orleans

Speaking of surveillance cities, Ali Winston for The Verge reports on the relationship between Palantir and the New Orleans Police Department. Under the guise of philanthropy, the department used predictive policing, a practice loaded with social and statistical pitfalls, and Palantir gained access to personal records in return:

In January 2013, New Orleans would also allow Palantir to use its law enforcement account for LexisNexis’ Accurint product, which is comprised of millions of searchable public records, court filings, licenses, addresses, phone numbers, and social media data. The firm also got free access to city criminal and non-criminal data in order to train its software for crime forecasting. Neither the residents of New Orleans nor key city council members whose job it is to oversee the use of municipal data were aware of Palantir’s access to reams of their data.

False positives. Over-policing. Bias from the source data driving the algorithms. This isn’t stuff you just mess around with.
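
That last point deserves unpacking. Here's a toy simulation, with entirely made-up numbers, of how a feedback loop can form when patrols go where recorded crime is highest and recorded crime depends partly on where the patrols are:

```python
import random

random.seed(0)

# Two districts with identical underlying crime rates. District A starts
# with one extra recorded report, a historical blip. All numbers are
# hypothetical; this only illustrates the feedback-loop mechanism.
true_rate = {"A": 0.5, "B": 0.5}  # same chance of a crime each period
recorded = {"A": 11, "B": 10}     # A has a one-report head start

for _ in range(50):
    # Send the patrol wherever recorded crime is highest
    patrolled = max(recorded, key=recorded.get)
    for district, rate in true_rate.items():
        if random.random() < rate:  # a crime occurs
            # It's far more likely to be recorded where the patrol is
            if random.random() < (0.9 if district == patrolled else 0.2):
                recorded[district] += 1

print(recorded)  # A's recorded count pulls away from B's
```

Both districts are equally risky by construction; the only difference is a one-report head start that the allocation rule keeps amplifying.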
