POST SCRIPT: SINCE THIS BLOG WAS POSTED, THIS KEYNOTE PRESENTATION WAS PUBLISHED BY THE ICA HERE.
by Gregory Saville
A man walks through a public plaza on a pleasant Sunday afternoon and passes a CCTV camera. Minutes later he is arrested by police on suspicion of a crime that, in fact, he did not commit. The man is African American and, unfortunately, the facial recognition software behind the CCTV is vulnerable to false positives.
A predictive policing algorithm sends police patrols to the same neighborhood for the sixth week in a row to prevent crimes that have not yet occurred. Based on mathematics from earthquake prediction, this algorithm is hardly the best model for predicting human behavior and crime. It has no way of knowing that residents of this disenfranchised neighborhood are utterly fed up with over-policing, especially when the police don't actually do anything except show up in their patrol cars.
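The earthquake-prediction mathematics behind such tools is widely reported to be a self-exciting point process, in which each past crime temporarily raises the predicted rate of new crimes nearby, the way an earthquake raises the expected rate of aftershocks. A minimal sketch of that idea, with illustrative parameter values that are not any vendor's actual fit:

```python
import math

def hawkes_intensity(t, event_times, mu=0.5, alpha=0.8, beta=1.2):
    """Conditional intensity of a self-exciting (Hawkes) point process:

        lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))

    mu is the background rate; each past event at t_i adds a bump that
    decays exponentially. This 'aftershock' logic is what earthquake-style
    crime prediction borrows. All parameter values here are made up for
    illustration.
    """
    return mu + sum(
        alpha * math.exp(-beta * (t - ti)) for ti in event_times if ti < t
    )

# Long after the last recorded events, the predicted rate has decayed
# back toward the background level mu.
quiet = hawkes_intensity(10.0, [1.0, 2.0])

# Right after a burst of events, the predicted rate is elevated, so the
# model keeps sending patrols back to the same place.
elevated = hawkes_intensity(2.1, [1.0, 2.0])

print(quiet, elevated)
```

The feedback problem critics point to follows directly from this structure: if patrols are sent where the intensity is high, more incidents get recorded there, which raises the intensity further, regardless of whether the underlying neighborhood has actually changed.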
I blogged on these stories earlier this year.
The stories are real and they reflect real events. Unfortunately, according to experts, predictive policing algorithms have serious problems with over-policing minority areas. The Los Angeles Police Department is the latest agency to abandon its PredPol program (it claims this is due to Covid). Similarly, scientists specializing in evaluation have criticized facial recognition software, claiming it cannot accurately read the facial characteristics of Black men.
These stories reflect the threat of introducing Artificial Intelligence into crime prevention. Thus far, at least with CPTED, things in the AI world are not going well.
THE 2021 ICA CPTED CONFERENCE
On Nov 3, I will deliver a keynote address to the 2021 International CPTED Association virtual conference, hosted by Helsingborg, Sweden, the Safer Sweden Foundation, and the International CPTED Association. It will be the first ICA conference since the last pre-COVID event a few years ago. The topic of my keynote is Artificial Intelligence, Smart Cities, and CPTED – An existential threat to the ICA.
Based on my own experience with a tech start-up company a decade ago, and an experiment with some predictive critical-infrastructure CPTED software, I came upon some fascinating books on AI. One in particular, AI 2041 by Kai-Fu Lee and Chen Qiufan, describes how AI will infiltrate all aspects of urban life – health, transport, schools, entertainment, crime prevention, and safety. They tell us there will be no part of the future city without AI. This is especially the case with the Smart City movement, in which scientists and planners envision a city embedded with AI.
THE SORCERER'S APPRENTICE
What happens when AI systems go wrong? Artificial Intelligence is at the apex of new technologies and the implications for CPTED are significant.
AI is a potential threat of a higher order. It is a case of the Sorcerer's Apprentice: an independent system that analyzes problems and makes decisions using machine learning, independent of us. But when things inevitably go wrong, we end up scrambling like mad to stop the damage from unintended consequences (e.g., false arrests and over-policing).
If you’re interested in this topic in more detail, come to the 2021 ICA CPTED CONFERENCE, which runs from Nov 2 – 4 as a virtual conference. The dynamic conference program has dozens of sessions on crime prevention and CPTED from around the world. My keynote runs on Nov 3 at 9:20 – 9:45 AM Central European Time (1:20 – 1:45 AM Mountain Time). A recording of the conference will be made available to registrants for later viewing, for those who are asleep in their time zones.