Erica Flores

The NYPD is using a new pattern recognition system …

The New York City Police Department is using a new software system called Patternizr, which helps officers search “hundreds of thousands” of case files, according to a report in The Washington Post.

The report says the software was developed in-house and allows analysts to search a wide range of files for similar patterns or crimes; previously, they would have had to go through physical files. In one example, officers used the system to link two crimes committed by a man who used a syringe to steal a drill at two different Home Depots in New York City. Rebecca Shutt, the crime analyst who solved the case, told the Post that the system “brought back complaints from other venues that I would not have known about.”

This is not a Minority Report-style system that seeks to predict where crimes will occur, nor is it a system that uses artificial intelligence to analyze CCTV footage. Rather, it looks for patterns in NYPD databases, allowing detectives to search a much larger set of data in the course of an investigation. The system can pull in additional sources of information from across the NYPD, making it easier to spot patterns of crime that may have occurred elsewhere.

The NYPD says the department implemented the software in 2016, but it first disclosed the system's existence in an issue of the INFORMS Journal on Applied Analytics. According to NYPD Assistant Commissioner for Data Analysis Evan Levine and former chief analyst Alex Chohlas-Wood, the department spent two years developing the software, and they say the NYPD is the first department in the US to use such a system.

Chohlas-Wood and Levine told the Post that they used 10 years of previously identified patterns to train the system, and in testing, “they accurately recreated old crime patterns a third of the time and returned parts of the patterns 80 percent of the time.” The Post says the system does not take a suspect’s race into account in the course of its search, as a precaution against racial bias.
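To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of similarity search the article describes: scoring how alike two complaint records are while deliberately leaving race out of the comparison. The record fields, weights, and function names are all assumptions for illustration; they are not drawn from the NYPD's actual system or data schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical complaint record. Field names and contents are
# illustrative only, not the NYPD's real schema.
@dataclass
class Complaint:
    crime_type: str      # e.g. "grand larceny"
    method: str          # e.g. "syringe"
    precinct: int
    occurred: date
    suspect_race: str    # present in the record, but deliberately unused below

def similarity(seed: Complaint, candidate: Complaint) -> float:
    """Toy similarity score in [0, 1] between two complaints.

    Race is intentionally excluded from the comparison, mirroring the
    precaution the article describes. The weights are arbitrary.
    """
    score = 0.0
    if seed.crime_type == candidate.crime_type:
        score += 0.4
    if seed.method == candidate.method:
        score += 0.4
    if seed.precinct == candidate.precinct:
        score += 0.2
    return score

def rank_candidates(seed, candidates, top_n=3):
    """Return the top_n complaints most similar to the seed, best first."""
    return sorted(candidates, key=lambda c: similarity(seed, c), reverse=True)[:top_n]
```

An analyst-style query would then feed one seed complaint (say, the syringe theft) against the full complaint database and review the highest-scoring matches, rather than manually leafing through files precinct by precinct.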

The result appears to be a tool that reduces some of the legwork required of analysts, partially automating a process that until now has been done manually.