Typically when we talk about artificial intelligence, we're talking about algorithms that streamline supply chains or unlock insights hidden in big data. But AI is now being put to another use, and the consequences could be dire: law enforcement agencies are starting to use historical crime data to predict future crimes and deploy officers where they're supposedly most needed.
The problem with applying artificial intelligence to crime is that implicit bias infects the historical data and carries through to the AI's predictions. Poor and minority communities have historically been overpoliced, so algorithms intended to remove human bias from policing end up being trained on the biases of past generations.
In one famous example of this programmed bias, a test of Amazon's AI facial recognition software incorrectly matched 28 members of Congress against a database of criminal mugshots. What's more, 39% of the false matches were people of color, even though people of color made up only 20% of Congress at the time.
In addition, data shows that many crimes go unreported for various reasons, one being that victims assume the crime will never be solved and reporting it is pointless. This means the data on which predictions are based is incomplete and therefore cannot produce accurate predictions. With existing bias programmed into the algorithms, predictions can become self-fulfilling and deepen rifts between different parts of society.
Using Crime Data To Predict Crime
Unfortunately, crime-stopping AI is already in use by law enforcement around the world. PredPol uses crime data to predict where crimes will occur and tells police where to step up patrols. ShotSpotter, used by police departments worldwide, listens for gunshots and automatically alerts police to their location. Knightscope makes an autonomous 500-pound robot that patrols the streets of Huntington Beach, California, looking for "blacklisted" individuals and alerting police when they are spotted.
Nationwide, violent crime is falling, but amid the constant stream of news about crime, hardly anyone has noticed. Between 1993 and 2018, a majority of Americans believed crime had increased in nearly every year, yet over that same period violent crime fell by 51-71% and property crime fell by 54-69%, depending on the measure. Predictive policing programs implemented in places like Los Angeles and Chicago seemed to work at first but have ultimately proven ineffective.
Learn more about the pros and cons of crime-stopping artificial intelligence from the infographic below.