The use of proactive police patrols to deter crime in targeted areas is a core concept of police operations. The reasoning behind this is that the presence of a police officer increases the risk for a potential offender and therefore has a deterrent effect. Predictive policing is simply the use of data analytics to make patrol recommendations that are more accurate than traditional techniques – and therefore more effective at reducing crime.
There are lots of ways to use data to guide patrol operations, ranging from putting pins on a paper map on the wall to creating heatmaps of historical data. With the advent of artificial intelligence and data mining techniques, there is a seemingly unlimited number of ways an agency could apply technology to create a predictive policing model.
Are you REALLY up for the challenge?
There are several technical challenges in developing a model to identify the times and locations most at risk for future crimes. So we narrowed it down to four of the most significant ones:
Determining which data to use. In other words, what data best predicts the likelihood of future crimes? Some theories and models have proposed using weather, unemployment rates, housing prices, educational attainment, income disparity, the prevalence of illicit drug use, days of the week, the placement of “crime attractors” such as bus stations or shopping center parking lots, and so on. Finding the data you want to use is the first half of the data problem; the other half is getting continuous access to a timely, accurate and granular (location-specific) source for that data. Weather forecasts, for example, are generated on a daily basis but not always accurate or granular enough. Unemployment data may be accurate, but only generated once a year and not at a useful level of granularity. Housing prices, while granular, are generated only periodically and therefore not very timely.
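The three criteria above – timeliness, accuracy, and granularity – can be treated as a simple screening checklist for candidate data feeds. The sketch below is purely illustrative: the sources and their pass/fail ratings are assumptions drawn from the examples in the text, not real evaluations.

```python
from dataclasses import dataclass

# Hypothetical scorecard for candidate data sources, applying the three
# criteria from the text: timeliness, accuracy, and granularity.
# The sources and their ratings are illustrative assumptions.

@dataclass
class DataSource:
    name: str
    timely: bool      # refreshed often enough to drive daily patrol plans?
    accurate: bool    # reliable measurements at the point of use?
    granular: bool    # resolved to a patrol-relevant location?

    def usable(self) -> bool:
        # A feed must satisfy all three criteria to feed a daily model.
        return self.timely and self.accurate and self.granular

candidates = [
    DataSource("weather forecasts", timely=True, accurate=False, granular=False),
    DataSource("unemployment rate", timely=False, accurate=True, granular=False),
    DataSource("housing prices", timely=False, accurate=True, granular=True),
    DataSource("crime reports", timely=True, accurate=True, granular=True),
]

usable = [s.name for s in candidates if s.usable()]
print(usable)  # only crime reports pass all three tests in this toy example
```

In this toy screening, only victim-reported crime data clears all three bars at once – which foreshadows the data choice described later in this piece.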
Finding a model that can make accurate predictions. These first two challenges are to some degree intertwined. A range of predictive models is available today, some with exotic names like naïve Bayes classifiers, multivariate adaptive regression splines, and the k-nearest neighbors algorithm. The model you select depends on the kind of data you feed into it, so selecting your preferred predictive model goes hand in hand with selecting the kind of data you want to use. A robust model should be able to predict a variety of crimes ranging from property crimes to violent crimes, and it should have a consistent level of accuracy across the crime types it predicts. It should work as well on weekdays as on weekends, during the day as well as at night, and in winter as well as in summer.
Time to turn this into a mission-critical platform. Ok, so you think you’ve found a data source that is accurate, timely and granular enough. You’ve matched it with an algorithm that seems to work with that data and produce accurate forecasts of crime by crime type, time and location. Now, you need resilient methods of getting data into your system, with error flagging and fallback scenarios for when your data sources go down. You need audit capabilities to ensure the data quality remains consistent. It needs to run in an environment where it is available on a 24/7 basis and can scale up when lots of users are accessing the system. And don’t forget the security aspects! You will want to ensure that data is encrypted in transit and at rest. You will need to set role-based access and transaction levels for the different people in your organization, and keep audit logs of all transactions on your system down to the user level. Depending on the kind of data you store, you may also be subject to CJIS (Criminal Justice Information Services) compliance standards if you operate in the US.
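Two of the requirements above – role-based access and per-user audit logging – can be sketched in a few lines. The role names and permission table below are illustrative assumptions, not a real agency configuration; a production system would back this with an identity provider and tamper-evident log storage.

```python
import logging
from datetime import datetime, timezone

# Sketch of role-based access control with per-user audit logging.
# Roles, users, and the permission table are illustrative assumptions.

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")

PERMISSIONS = {
    "patrol_officer": {"view_recommendations"},
    "crime_analyst": {"view_recommendations", "view_raw_reports"},
    "administrator": {"view_recommendations", "view_raw_reports", "manage_users"},
}

def authorize(user: str, role: str, action: str) -> bool:
    allowed = action in PERMISSIONS.get(role, set())
    # Every decision (allow or deny) is logged down to the user level,
    # with a UTC timestamp, supporting the audit requirement.
    audit_log.info("%s user=%s role=%s action=%s allowed=%s",
                   datetime.now(timezone.utc).isoformat(),
                   user, role, action, allowed)
    return allowed

print(authorize("officer42", "patrol_officer", "view_recommendations"))  # True
print(authorize("officer42", "patrol_officer", "manage_users"))          # False
```

Denied requests are logged just like allowed ones – an audit trail that only records successes is of little use in a compliance review.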
Still think you can manage all of this? Well, we’ve yet to cover your final – and perhaps most difficult – challenge! You need to find a way to deliver your predictions as patrol recommendations to the ultimate end user: the patrol officer. To do this effectively, you need to make a number of policy and design decisions. Are you providing heat maps, numeric scores, rankings? Do you cover the entire city, or just highlight specific high-risk areas? Do you make predictions at the city/county level or make recommendations for each officer’s patrol area? Which crimes – or combination of crimes – are you predicting, and will that vary by day of week, time of day, and/or patrol area? Whatever you decide, you will need to find a way to deliver your recommendations to patrol officers in a clear, unambiguous, actionable manner. Access to the relevant information should be secure yet simple, without the need to master a new technology. The more friction you introduce, the less chance you have of officer adoption. And without officer adoption, your sophisticated new predictive policing platform will go unused.
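One of the design choices above – highlighting only a few high-risk areas rather than scoring the whole city – can be sketched as a simple ranking step. The beat names and risk scores here are made up for illustration; in practice the scores would come from the predictive model.

```python
# Sketch of one delivery design: rank areas by a model's risk score and
# surface only the top N as patrol recommendations. Beat names and
# scores below are invented for illustration.

risk_scores = {
    "Beat 1 / Elm & 3rd": 0.82,
    "Beat 2 / Riverside Park": 0.35,
    "Beat 3 / Transit Center": 0.91,
    "Beat 4 / Oak Hill": 0.12,
}

def top_areas(scores, n=2):
    # Highlight only the n highest-risk areas instead of the whole map,
    # keeping the recommendation short and actionable for an officer.
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [area for area, _ in ranked[:n]]

print(top_areas(risk_scores))  # ['Beat 3 / Transit Center', 'Beat 1 / Elm & 3rd']
```

Capping the list at a handful of areas is one way to reduce the friction the text warns about: an officer can act on two named locations far more readily than on a city-wide score surface.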
Too much? Don’t worry, we’ve got you.
Geolitica uses only three data points in our model: crime type, crime location, and crime date/time. We do not use or store any personally identifiable information regarding the victim or suspected perpetrator of these crimes. We never use arrest data, nor do we predict crime types that could be initiated at the discretion of the patrol officer (for example, drug sales or prostitution). By focusing only on data as reported by victims to law enforcement, we direct officers to the locations where the risk of future victimization is highest. We have published our algorithm and research in peer-reviewed studies (and patented the algorithm), and it has proven to be the most accurate on the market.
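A record limited to those three data points can be expressed as a tiny, fixed schema. The field names and sample values below are illustrative assumptions, not Geolitica's actual data model; the point is what the record deliberately omits – there is no field for any victim or suspect information.

```python
from dataclasses import dataclass
from datetime import datetime

# Minimal incident record holding only the three data points named in
# the text: crime type, crime location, and crime date/time. Field names
# and the sample values are illustrative assumptions. Deliberately, no
# victim or suspect fields exist on this record.

@dataclass(frozen=True)
class CrimeEvent:
    crime_type: str       # e.g. "burglary"
    latitude: float
    longitude: float
    occurred_at: datetime

event = CrimeEvent("burglary", 34.0522, -118.2437,
                   datetime(2023, 5, 1, 22, 15))
print(event.crime_type)  # burglary
```

Making the record frozen (immutable) is a small design touch that fits the audit requirements discussed earlier: an ingested event cannot be silently altered after the fact.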
The Geolitica platform represents approximately 10 years of research and development and the equivalent of 70 research-years from PhD-level mathematicians, computer scientists and criminologists. Our platform has been tested and proven in dozens of departments in multiple countries over the last 6+ years, and it now incorporates feedback from over 200 million officer-hours of usage in the field. We are heartened to see that agencies believe in the concept of predictive policing so much that they are considering developing their own models – but why not leave it to the professionals? Geolitica continues to provide the most unbiased, accurate and cost-effective approach available in the predictive policing market.