Paging Sgt. Software

Can we trust computers to detect crime without violating our rights?

By Kristine de Leon

Senior Crime Analyst Brian Hoepner pulls up a map of West Hollywood on his screen.

"Here, here, here," he begins, pointing to seven blue boxes in his office at the West Hollywood station of the Los Angeles County Sheriff’s Department. “Each marks a robbery probably committed by the same individual or gang.”

"And here," Hoepner continues, pointing to a dot just northwest of West Hollywood, "is where I predict the next activity would be."

He then pulls a thick stack of paper from a cardboard box next to his office chair and lays it on his desk: crime reports and documents detailing the area's crime statistics. He takes out a stapled report listing week-by-week crime statistics for West Hollywood in March 2017 and March 2018.

"Look this over and based on what you find there, look over at this map," says Hoepner, as he scoots his chair over to a large map of West Hollywood displayed on the wall by his desk. "Can you tell me where you think, say robberies, are likely to occur?"

His point is that it doesn't take the world's smartest crime analyst or police officer to predict where the next robbery might occur. Cops typically know from experience and street sense where to expect future thefts within their jurisdictions at certain times.

"Sometimes, predictive policing looks a lot more like common sense," says Hoepner, adding that it's nothing magical. Crime prevention has always been to some extent about prediction. Law enforcement officers usually try to identify several factors about why a particular crime event occurred to prevent it from happening in the future.

Each of those factors is a variable. That means each can change, or vary, and some variables can affect each other. If the occurrence of a particular crime, say bike theft, is dependent on only one variable, such as distance from a university, a cop might be able to quickly gauge the likelihood of bike theft occurring on a college campus.

But humans, in general, aren't always the best at making predictions. When more variables come into play, the human brain will find it more and more difficult to compute an outcome. However, a computer can do the work, sometimes even better. For example, the predictive policing software called PredPol has been shown to be more accurate than human crime analysts in making predictions about when and where crimes will occur.
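The jump from one variable to many is exactly what statistical models handle well. As a purely illustrative sketch (the variables, weights and model form below are invented for this example, not taken from PredPol or any real system), a multi-variable risk estimate might look like a simple logistic model:

```python
import math

# Hypothetical weights: how strongly each variable pushes risk up or down.
# A real system would learn these from historical crime data.
WEIGHTS = {
    "distance_from_campus_km": -0.8,  # farther from campus -> lower risk
    "bike_racks_nearby": 0.5,         # more parked bikes -> more targets
    "past_thefts_last_90_days": 1.2,  # recent history dominates
}
BIAS = -1.0

def theft_risk(location):
    """Combine weighted variables into a 0-1 risk score (logistic model)."""
    z = BIAS + sum(WEIGHTS[k] * location[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

campus_rack = {"distance_from_campus_km": 0.1,
               "bike_racks_nearby": 3,
               "past_thefts_last_90_days": 2}
suburb = {"distance_from_campus_km": 5.0,
          "bike_racks_nearby": 0,
          "past_thefts_last_90_days": 0}

# The campus location scores far higher than the distant suburb.
print(theft_risk(campus_rack) > theft_risk(suburb))
```

With one variable, a cop's intuition does this arithmetic instantly; with dozens of interacting variables, the weighted sum is where software takes over.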

When it comes to predictive policing, the assumption is: what matters most isn’t the future—it’s the past.

Using predictive software in the criminal justice system is nothing new. Statistical models and software programs have been helping judges and prosecutors assess the “risk” of criminal offenders for over 30 years.

"We are all creatures of habit," explains Hoepner, "The same people are doing all the crimes all the time… and that’s the reality of our business."

Hoepner adds that some types of people are more likely to commit crimes than others, so there are areas where he anticipates crime will occur.

"We have to keep an eye on the homeless because they’re involved in a lot of general crimes, theft, shoplifting, auto theft, vehicle burglaries,” Hoepner explains in a soft voice. “I just know when I see them, most of them are drug addicts, most of them have mental health issues… and generally, if you’re on the street, you have to be involved in some crime just to survive -- that’s how they get by. So it’s a good idea to keep an eye on what’s going on with the homeless population around you.”

Map of Los Angeles County. Click the buttons on the left to toggle between viewing population and poverty in the county. Population and poverty data are based on the 2010 Census.

Predictive policing is a paradigm shift that is sweeping the nation's law enforcement agencies. It relies on the collection of big data typically by private companies to help police officers identify where and how their interventions can be most effective in preventing future crime.

Over the last 15 years, major cities in California, South Carolina, Washington, Tennessee, Pennsylvania, Florida and New York, among others, have started gathering and analyzing more data on their jurisdictions, and on the people who live in them.

In some cities, the lack of transparency around private partnerships with local police departments has raised red flags, as well as questions about how departments use proprietary algorithms. Take New Orleans, where the police department was recently exposed for using Palantir's predictive policing algorithms without the city's knowledge.

New Orleans, New York, Chicago, Los Angeles and Atlanta are among the growing number of cities weighing legislation about how to hold privately owned algorithms accountable, especially those that are used by government agencies in making decisions.

Many police departments credit the location-based predictive policing software PredPol and other similar systems as instrumental in reducing crime efficiently, using fewer resources. Some departments even say that data-driven policing can help validate police hunches or replace human bias with real data.

But is that true? Civil-rights groups like the American Civil Liberties Union don’t think so. The organization, along with the Brennan Center for Justice and the Data & Society Research Institute, argues that little evidence supports the claim that such predictive technologies work, and points to the lack of transparency around the data and algorithms powering them.

The Human Rights Data Analysis Group and the Electronic Frontier Foundation, two nonprofit organizations researching data and human rights issues, are concerned the formulas are unfairly concentrating police forces in communities of color.

"Predictive algorithms are only as good as the data they are trained on, and social data carries a complex history."
Jim Bueermann

They say that predictions based on previous crime data funnel more officers into already over-policed areas, resulting in a self-fulfilling cycle.

Especially in a post-9/11 world where data is being shared more widely across federal, state and local lines, privacy advocates are troubled by the prospect of centralizing law-enforcement data.

And while algorithms can uncover and interpret relationships between persons, places and events, correlation does not imply a direct causal connection. Moreover, the method by which an algorithm analyzes data and predicts outcomes is often not well understood.

"Algorithms are only as good as the data they train on," says Jim Bueermann, president of the National Police Foundation, "and social data carries a complex history."

Histories of discrimination can live on digital platforms, and if they go unquestioned, they become part of the logic of conventional algorithmic systems.

With over 245 million professionally-installed video surveillance cameras in the world, law enforcement officials and companies have more data on people than ever before. (Photo montage of surveillance screenshots by Kristine de Leon)

Responding to Crisis with Technology

Budget cuts. The 2008 financial recession in the United States gutted police forces.

State taxes dried up. County and municipal budgets shrank. Hiring froze. Specialized training stopped. Overtime ended. And services were reduced. From 2008 to 2013, local police agencies across the country had to do more with less — sometimes dramatically less. The federal government began cutting, too, making matters worse. Sequestration meant reductions to federal grants for special task forces, crime scene investigations, community policing projects and juvenile diversion, among many other initiatives.

Police agencies needed more effective, efficient and economical models for policing. For this, they turned to data-driven approaches that leveraged more technology.

But the national conversation about police-community relations took a turn after a controversial grand jury decision not to indict a police officer involved in the fatal police shooting of an unarmed 18-year-old in the St. Louis suburb of Ferguson. Race became the center of a national debate. The Black Lives Matter movement rose up and rallied social media and communities of color to protest patterns of racism and brutality.

There was a need for more police accountability and reform to mend the trust between police and communities. Aggressive policing creates resentment, distrust and fear in many minority communities, says Jim Bueermann, whose foundation's mission is "to advance democratic policing through innovation and science."

Bueermann, who served as the chief of police in Redlands, California, championed a holistic approach to community policing as police chief. He recognizes that the police carry some of the blame for people's distrust of law enforcement. After retiring from the Redlands force in 2011, he began working for the National Police Foundation and now works from home, researching policing and serving as an expert for the media whenever a significant crime dominates the news.

“[Police] shouldn’t offend people they serve to a level where they damage their relationship with the community,” says Bueermann, who supports criminal justice system reform in California. "The problem is [sometimes] you can make things worse by enforcing the law."

The backlash to the Black Lives Matter movement refocused attention on policing and the danger of patrolling high-crime neighborhoods. Police officers voiced frustration at staffing levels, training and unrealistic expectations when having to confront homelessness, anger and mental illness. They felt that their lives and the risks they took had been unfairly devalued.

“People and the media don’t always understand the difficult job that we’re called to do,” says Santa Cruz Police’s Elizabeth Howard-Gibbon. “[Officers] have been going through some bad times. You know, every day we go out into the community, knowing that our physical safety is at risk.”

The worsening tensions between the community and police and police administrators’ frustration with the lack of accountability all combined to create the demand for data-driven policing as a remedy to racially discriminatory practices, as well as a way to prevent crime.

At the center of this change in policing philosophy was then-Los Angeles police chief William Bratton.

"If you had to pick a single law enforcement visionary for the birth of predictive policing, Bill Bratton would be top on your list," recounts Sgt. DeBrabander, as he talks about Bratton's influence on the LAPD and surrounding law enforcement agencies, including the Long Beach Police Department.

First, as commissioner of the NYPD in the '90s, Bratton pioneered a data-centric approach to police management using the CompStat (computer statistics) system. Police reported crime statistics every week, and commanders evaluated benchmarks for crime reduction and arrest rates in every precinct. Accountability for crime reduction based on statistics became the driving focus of police management on the East Coast.

"Bratton, you know, was also the guy known for the 'broken windows' policing strategy," DeBrabander continues, explaining how Bratton was brought to Los Angeles in 2002 when the LAPD was "having lots of trouble."

In 2001, allegations of widespread fraud, corruption and criminality among the LAPD's gang unit officers at the Rampart Division resulted in a federal lawsuit against the city of Los Angeles for civil rights violations. As a result, the Department of Justice placed the LAPD under a federal consent decree until it made significant reforms. The debacle tarnished the department, and the legal and media scrutiny of “rogue cops” produced a defensive and counterproductive police culture.

In response, Bratton instituted the creation of an LAPD data-driven system of police management that evolved into the building blocks of predictive policing.

"I remember when predictive policing just started and Bill Bratton was in his last week as LAPD chief," says Bueermann, recalling a conference hosted by the Department of Justice in Los Angeles about predictive policing. "We [police departments] all wanted this technology, but I said the community might be concerned about it if they don't know how it all works."

The Board of Police Commissioners meets every Tuesday morning at the Los Angeles Police Department's Headquarters in downtown LA.

The Lure of Big Data Surveillance and Predictive Policing

In the early 2000s, the mindset of policing shifted when researchers promoted the idea that data could lead the way to smarter and cheaper law enforcement. Partnerships between academics and law enforcement agencies to study violence and gang networks began to change policing on the ground. Also, the federal government began providing millions of dollars in grants to fund data-driven research and adopt new technologies, allowing local police agencies to experiment without much cost. Adding to the excitement, these technological advancements allowed law enforcement to capture more and more personal data, with the hope of making their jobs easier and more efficient.

The Santa Cruz-based start-up PredPol began in 2006 as a research project within UCLA's Institute for Pure and Applied Mathematics. A group of researchers looking at crime data saw that it contained predictable mathematical patterns. With support from the National Science Foundation, they built algorithms around those patterns. That's when they formed an informal partnership with Los Angeles Police Department Captain Sean Malinowski.

Screenshot of the place-based predictive policing software, PredPol, which is used by the LAPD and Santa Cruz Police Department, among many other departments across the country.

The team first tested a prototype of the software when Malinowski was stationed at the Foothill Division of the LAPD in 2011. According to PredPol, crimes in the Foothill area went down 13 percent during the four months the algorithm was implemented.

After the successful pilot test in Los Angeles, the Santa Cruz Police Department became the first law enforcement agency in the country to institute PredPol in its operational activities.

Currently, over 60 police departments across the nation use the software, according to PredPol's website.

PredPol extols the software's success at the LAPD. Between February 2013 and February 2014, the LAPD’s Foothill Division saw 150 fewer crimes and reported zero crimes on February 13, 2014.

“Out of those 21 [LAPD Divisions], in 2013-14, Foothill Division was the number one division with crime reduction, and I believe a lot of it had to do with this predictive policing model,” says Dominic Choi, the Commander of Operations at LAPD's Central Bureau.


The chart shows that crime went down significantly from 2013 to 2014 in the Foothill Division, while crime in other divisions either decreased slightly or increased significantly (e.g., the Pacific and Central Divisions). To isolate the Foothill Division (or any other division), click on the colored circles to hide the other divisions.

Choi, who served as Foothill Division's Patrol Commanding Officer from 2014 to the end of 2016, explained that predictive policing models can identify patterns in historical data and determine where crime is likely to occur. The PredPol software creates maps of where future offenses might occur during a 12-hour period, which can act as a guide for officers on patrol.
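PredPol's actual model is proprietary (it is reported to be based on a self-exciting point process borrowed from earthquake aftershock modeling), but the simplest version of a place-based forecast can be sketched in a few lines: bin recent incidents into 500-foot grid cells and flag the busiest cells for patrol. Everything below, including the coordinates, is hypothetical:

```python
from collections import Counter

CELL_FT = 500  # PredPol-style 500-foot-by-500-foot grid cells

def top_hotspots(incidents, n=3):
    """Bin incident coordinates (in feet) into grid cells and
    return the n cells with the most recent incidents."""
    cells = Counter((x // CELL_FT, y // CELL_FT) for x, y in incidents)
    return [cell for cell, _ in cells.most_common(n)]

# Hypothetical incident coordinates from the last few weeks
incidents = [(120, 480), (450, 130), (300, 300),  # all in cell (0, 0)
             (980, 2010), (1040, 2490),           # cells (1, 4) and (2, 4)
             (60, 250)]                            # cell (0, 0) again

print(top_hotspots(incidents, n=1))  # -> [(0, 0)]
```

A system like PredPol goes further by weighting recent incidents more heavily and modeling how one crime raises the short-term risk of another nearby, but the output is the same kind of ranked cell list that ends up as boxes on an officer's map.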

"Random patrol produces random results, so we need to be smart about where to patrol," says Choi, who also serves as the department's Homeless Coordinator in addition to his operations duties. Having to deal with the city's rising homeless population with an understaffed police force, Choi thinks that predictive tools like PredPol can help allocate resources better.

"It’s a very focused approach to crime fighting and we can do it with fewer officers, in a more efficient way," he says. The LAPD uses PredPol in 16 of its 21 divisions.

Whether PredPol's deployment directly causes lower crime rates has yet to be studied rigorously. Still, predictive policing software can be an attractive substitute for in-house crime analysis at smaller, budget-strapped local police departments.

"If you can’t put many people on the street, what are you going to do?" says Officer Howard-Gibbon from the Santa Cruz Police, which operates without an in-house crime analyst. "Santa Cruz is now down to just around 78 officers. I think that using technology like this, leveraging technology is essential to do exactly that."

She takes out a folded sheet of paper from a front pocket of her uniform: a map of Santa Cruz marked with boxes, with a list of predicted crime areas at the bottom. "The algorithm provides 500-foot-by-500-foot zones," Howard-Gibbon says, pointing out the little red squares on the map where she devotes extra patrol time.

The 30-year-old officer joined the Santa Cruz police force five years ago. As a relatively new officer, she believes that the predictive policing software, PredPol, is tremendously helpful for officers like herself. Although she doesn't know how the backend of the software works, she believes it works and that it adds objectivity to policing decisions.

The use of surveillance technology in conjunction with data gathering and predictive analytics grants authorities a level of insight into people’s lives that previously would have required a warrant or one-on-one surveillance. Image manipulation of the California State Capitol Building by Kristine de Leon.

"Dragnet" Surveillance: Collecting Data on Everyone

The computer systems at the Los Angeles County Sheriff's Department (LASD) are linked interdepartmentally and with other local, state and federal agencies through a high-performance computing platform by Palantir Technologies, according to a Request for Information report published by the LASD. As a result, law enforcement agencies, both local and national, have access to more data than ever before.

"But there are times when we don’t want it wide open to every single law enforcement person," says Sgt. DeBrabander from the Long Beach Police. "For one, that information in there is also subject to FOIA, so someone can do a request and get active investigation information."

"Second, I don’t know that person, and there have been cases of bad cops," continues DeBrabander, who often leads drug investigations for the LBPD. "That’s not the norm, but all it takes is one, and if that officer has access to my investigation, then that can compromise my entire investigation."

DeBrabander describes a case in which an Automated License Plate Reader in Chula Vista detected a suspicious vehicle, whose information was then fed into the Palantir system shared across law enforcement agencies.

"Some information that was uploaded into Palantir was tying things together that shouldn’t have been happening," continues DeBrabander. He says that the system automatically links and computes disparate information together, a feature that DeBrabander finds problematic.

"We didn’t want this case, and every case, tied together," DeBrabander says with frustration.

But DeBrabander and his investigative department are not alone in expressing concerns about Palantir's data management system and predictive software. Citizens also are concerned about the massive data gathering, sharing and 'profiling.'

Jamie Garcia and Hamid Khan from the Stop LAPD Spying Coalition frequently attend the LAPD's Board of Police Commissioners meetings on Tuesday mornings to express their concerns about the LAPD's use of surveillance technology and predictive policing software.

"LAPD’s policing strategies infringe on human and civil rights, and violate privacy," says Garcia.

Khan adds that the coalition is also seeking information about the types of databases the LAPD's (and LASD's) computer platform searches through in creating a "chronic offender bulletin," which lists targeted individuals.

During a police commissioners meeting on February 13, 2017, at the LAPD's headquarters, Garcia announced that the coalition had filed a lawsuit against the City of Los Angeles and the LAPD. She alleged that the LAPD violated California sunshine laws by refusing to provide records about the algorithms used in the Palantir database system, as well as details about the department's "Operation LASER" program.

Jamie Garcia (left) and Hamid Khan (center) from the Stop LAPD Spying Coalition protest against the LAPD's use of drones outside the LAPD headquarters in downtown Los Angeles. (Photo by Kristine de Leon)

Operation LASER (Los Angeles' Strategic Extraction and Restoration program) is a software program used by the LAPD to inform its patrol officers where crime and potential offenders are likely to be. Dubbed LASER for its 'laser-like' techniques for targeting individuals and hotspots, it uses CIA-developed technology and fuses data from 15 separate sources — such as license plate readers and cellphone trackers — to quantify civilians according to risk.

LASER was first launched at the Newton Division in 2011. As of the end of 2017, 12 of the LAPD's 21 patrol divisions use the program, despite years of civilian protest against what some consider highly intrusive surveillance. LAPD's top brass claim it works, but only one study — authored by the designers and creators of the program — has evaluated the efficacy and success of LASER.

Michael White, a senior subject matter expert for the Bureau of Justice Assistance's Smart Policing Initiative that sponsored Operation LASER at the LAPD, says "crime control and crime prevention is most effective when you target people and places."

White, who is also a professor of criminology at Arizona State University, explains that the LAPD introduced the LASER program in 2011 to control gun violence in the Newton Division. He adds that the program does not target everybody, but specific people.

"It's very, very targeted with the removal or disruption of a community that's causing the crimes in the area," White explains. "LASER uses crime intelligence to focus on prolific offenders or people likely to offend [...] the goal is focused deterrence."

Privacy advocates like Garcia allege that LAPD's LASER program lacks transparency and has no clear oversight or regulation, and unjustly targets ex-convicts who've served their time.

"It also takes away from seeing people holistically, seeing people in the conditions that they're living in, that they're growing in and hearing the story. Hearing their story," Garcia said. "Instead they're trying to essentially say, 'If we could break everybody up into a data point and quantify them, we'd remove bias and prejudice by doing that.' But it ends up not happening because the same tools that have always been institutionally racist and oppressive are the ones that are gathering that information themselves."

What if a computer could help predict who might be violent? What if a policing system could be redesigned to target those who are at risk in a neighborhood before a shooting occurs?

Person-based Predictive Targeting

Roughly 2,000 miles northeast of Los Angeles is one of the most heavily surveilled cities in the world: Chicago. Thousands of cameras line the city, and a network of automatic license plate readers, cell site simulators and many other surveillance devices are used by the Chicago Police Department and its sister agencies. Perhaps it's all for a good reason.

But citizens often don't know about the scope of the surveillance networks, their cost, and the privacy implications of their use.

Robert McDaniel was 22 years old when the former Chicago police commander Barbara West dropped by his West Side home unannounced with a team of social workers in July 2013.

After explaining that she and her colleagues had his files back at the Chicago Police Department (CPD), West told McDaniel that she knew his best friend had been killed the previous year in their notorious Austin neighborhood. She warned that he could face similar consequences if he didn't change his ways.

McDaniel was shocked to learn that he had made the Chicago Police Department’s Strategic Subject List (SSL), or what some refer to as the "heat list.” At the time, he had multiple arrests on suspicion of minor offenses but only one misdemeanor conviction on his record. Even so, he was stunned to find out that the police were watching him.

What if a computer could help predict who might be violent? What if a policing system could be redesigned to target those who are at risk in a neighborhood before a shooting occurs? These are the ideas behind person-based targeted policing.

The SSL is a rank-order list of potential victims and subjects with the highest risk of violence. With the help of surveillance, big data collection, statistics and data analysis, Chicago police could pinpoint people who are at risk of shooting others or being shot themselves.

The idea behind the SSL started with a $3 million experiment through a partnership between the Chicago Police and a team of researchers under the direction of Miles Wernick at the Illinois Institute of Technology. The algorithm that Wernick's team developed uses eleven variables to create risk scores from 1 to 500. A higher score means that the person has a greater risk of being a victim or perpetrator of gun violence.

Part of the strategy with big data is to warn those on the list individually against engaging in any more criminal activity, particularly gun violence. Hence the Chicago Violence Reduction Strategy, in which a detective or police officer shows up at a "strategic subject's" front door with a social worker and a community representative to tell the person, typically a young black man, that he is one of a few thousand men who may die. The team approaches the individual with a truce of sorts, offering help finding a job or connecting with social services.

A Freedom of Information request from the Chicago-based nonprofit Lucy Parsons Labs found that in 2016, the Violence Reduction Strategy program attempted 1,024 notifications. Among those attempted notifications, the program completed 558, but only 26 people attended a call-in, where police officers, social workers, and others offer support services.

What does that mean? According to an email statement from the office of the program's director, Christopher Mallette, each of the notification attempts could involve a visit to someone’s house. A completed notification, meanwhile, would include a face-to-face meeting.

According to the CPD's Investigatory Stops Report (ISR) data released in 2017, the Chicago police assigned scores to nearly 400,000 people on the Strategic Subjects List. Scores were assigned using variables including:

  • age
  • the number of times the person has been shot
  • the number of times the person has been a victim of battery or assault
  • gang affiliation
  • the number of arrests for violent offenses
  • the number of arrests for drug or weapons offenses
  • the overall frequency of arrests
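The CPD has not published the actual SSL formula, so any reconstruction is guesswork. As a purely illustrative sketch, a rank-order score built from variables like those above might combine them with weights and clamp the result to the 1-to-500 range (all weights here are invented):

```python
# Hypothetical weights -- the CPD has not released its actual model.
WEIGHTS = {
    "times_shot": 60,
    "times_assault_victim": 25,
    "violent_arrests": 40,
    "weapons_or_drug_arrests": 20,
    "gang_affiliated": 50,
}

def ssl_style_score(person, max_score=500):
    """Combine weighted risk variables and clamp to the 1-500 range."""
    raw = sum(WEIGHTS[k] * person.get(k, 0) for k in WEIGHTS)
    return max(1, min(max_score, raw))

person = {"times_shot": 2, "violent_arrests": 1, "gang_affiliated": 1}
print(ssl_style_score(person))  # -> 210
```

Whatever the real weights are, the mechanics explain the criticism that follows: a score built largely from past police contact will rank highest in the neighborhoods that are already the most policed.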

Young adults with SSL scores tend to be minorities

Description

Stacked bar chart showing the number of individuals ages 20-29 with SSL scores in Chicago, as a share of their total population representation according to the 2016 American Community Survey.

Twenty-eight-year-old Jamal Cain, from the West Side Chicago neighborhood of Austin, is one of the 1,400 people on the SSL whom the CPD visited between 2013 and 2016. Cain is also one of the hundreds of thousands of people on the list with a score over 250 -- the threshold at which the Chicago police start paying attention to individuals.

"I didn't trust no police. I don't think they ever help with nothing."
Jamal Cain

"I see police every day," affirms Cain during one of several phone and FaceTime interviews.

In March 2017, a Chicago police officer and two social workers visited Cain's grandparents' home while Cain hid in a closet in the next room.

"I didn't trust no police," relents Cain. "I don't think they ever help with nothing. So when they came, I hid. I wasn't going to talk to nobody."

Cain was born and raised in Austin. Just 8 miles from Chicago’s downtown, the community, prosperous in the 1930s, is now littered with boarded-up storefronts, empty lots and under-enrolled schools. Today, of its nearly 89,000 residents, more than 30% live in poverty.

With Chicago's violence at its highest since the drug wars of the 1990s, Austin has been center stage for many of the shootings and homicides. More than 4,300 people were shot citywide in 2016, and 88 of the city’s roughly 780 gun deaths that year occurred in Austin.

SSL Demographics Treemap

Description

Each rectangle represents individuals grouped by race, sex and age group. The size of each rectangle is determined by the number of individuals with an SSL score in that demographic group. Individuals listed as race "Unknown" are grouped with American Indian/Alaskan Native due to very low representation. Fewer than 100 individuals were listed with sex "X" instead of "M" or "F," so they are not included in this chart.


Police Stops and Arrests of individuals with SSL scores

Description

The map shows the locations of police-initiated stops in Chicago against the distribution of individuals with SSL scores, by the census tract of their most recent arrests. Dot areas are scaled by the number of police stops, while census tracts are shaded by the number of individuals on the SSL arrested in that tract.

Cain never left the West Side, nor did he think he could. At the tender age of 8, he joined the Almighty Vice Lord Nation, a gang with deep roots in Chicago that traces its heritage back to the infamous Henry Horner Homes projects. For as long as he can remember, violence, guns, drugs, death and racism have terrorized his neighborhood.

“This is everyday life, 100 percent,” stresses Cain during a FaceTime interview, after chronicling the events of the past week on his street alone. His tired-looking eyes are halfway open as he walks his pit bull mix in the frigid cold along Trumbull Avenue.

The carnage that rips apart some of Chicago’s West and South Side communities has many causes. Most gun violence is gang-related, attributed either to drug-selling or territory disputes. The mayhem has intensified the presence of police.

Cain admits that he sells drugs on treacherous street corners. He says that he has become so familiar with gunshots that he could sometimes recognize a weapon type by its sound. He has seen friends die on the streets and mourned many others who have been killed.

Jamal Cain was assigned a Strategic Subject List (SSL) score high enough for the Chicago police's Violence Reduction program to visit him at his home. (Photo courtesy of Jamal Cain)

But that's not uncommon for people living at the intersection of poverty and violence.

As the city tries to wrestle down the gun violence that claimed 681 lives last year, according to homicide data, it is worth considering what life is like for a young Black man in one of the city's most impoverished and besieged communities.

"I've been shot six times," says Cain. "Every Black man has to have a gun here."

Person-based predictive policing is one approach the CPD has been using to reduce gun violence. Who gets shot? Who’s going to be involved in a crime? The algorithm’s predictions are sometimes tragically accurate.

"You know, I guess [the police] were right," Cain admits. "I ended up going to jail after that, for gun possession."

On April 17, 2017, just two weeks after receiving a custom notification visit, Cain was charged with a felony for gun possession, to which he pleaded guilty and served a month in prison, according to the Cook County Circuit Clerk's Office.

Other cities have tried similar person-based predictive policing, including: New Orleans to identify the 1% of violent crime drivers in the city; Rochester, New York to determine juveniles who might be repeatedly involved in criminal activity; and Los Angeles to decide who might be repeat offenders.


Behind the Numbers

It's easy to forget that behind the data and algorithms are people — individuals living their lives. Some of these people engage in crime, some do not. Some live in deep poverty, some do not. But all are now within big data’s reach. And in many cities, data-driven crime suppression falls hardest on communities of color.

“I’m inherently uncomfortable with predictive policing strategies,” says Ana Muniz, a professor of law at the University of California, Irvine. "I think there's this way in which the use of technology legitimizes a type of policing and that people accept it as an egalitarian, bias-free way of policing," adds Muniz.

Some residents of Santa Cruz are skeptical, arguing that the supposed neutrality of data and algorithms ignores context.

“There’s been an undercurrent of racism that’s been here for, like, a really long time,” says Simba Kenyatta, a representative of the Santa Cruz chapter of the National Association for the Advancement of Colored People. “I’m from Indiana, so I know what racism looks like. And Santa Cruz is really bad.”

“One of the bad things about it is that we can’t talk about race,” Kenyatta explains. “‘Cause Santa Cruz is supposedly a liberal, progressive town. So we don’t have that problem — according to white people.”

"If you put police in one area, and they do a lot of arrests, that becomes a high crime area because the crime rate is who gets caught," says Kenyatta. "So if you put as many police in the suburbs as they do in the city, then the crime rate would go up in the suburbs."

"What I'm sayin' is that the police shape the data, and the data shapes the police," he says.

Back in Redlands, Bueermann ruminates about the trajectory of big data policing. "So this is the problem with using strategies that have not been evaluated," says Bueermann. He stops to think, takes a sip of his Panera Bread Plum Ginger Hibiscus Tea and continues, "But it happens all the time, because the police don't always have the luxury of waiting for a scientific evaluation — which can take years — to find out if it works or not."

"There's almost no evidence that shows that predictive policing works," Bueermann says. "So that doesn't mean it doesn't. It's just that there's no rigorous scientific evaluation that shows that it likely works."

He thinks that humans are terrible at predicting the future, tending to overestimate what’s doable and possible in the short term.

"Policing needs its version of what's called the Hippocratic oath right in medicine," Bueermann concedes. “There's this thing that purports to tell physicians above all why you're treating the patient and trying to cure them. Don't do any harm to the patients."

The LAPD Central Division station near Skid Row and downtown LA. (Photo by Kristine de Leon)

The Future of Policing

The last decade has been marked by the increasing use of big data and surveillance. From automated license plate readers, biometrics, cell tower simulators, social media monitoring, to police observation devices that can track and identify vehicles and persons, police now have more information about people than ever before.

But some police officers see no reason to worry about privacy because, as they put it, "everything is out there already."

"The media makes a big stink about all this surveillance stuff, but everyone's information is already open to the public," explains Hoepner. "You should have no reasonable expectation of privacy-- it just doesn't exist."

Other officers, such as Howard-Gibbon and DeBrabander, have expressed similar views about the erosion of privacy. But they reject the idea that it amounts to an Orwellian Big Brother conspiracy.

"If you haven't done anything wrong, you have nothing to worry about," says Hoepner. He reasoned that criminals, however, need to be monitored and their surveillance is justified.

"As a society, we don’t get tough with people anymore. We want to put the blame on a lot of other different reasons," says Hoepner. "We have a hard time accepting that sometimes people are just criminals. I don’t know if they’re born that way or what it is, but it’s who they are."

A surveillance camera operated by the Chicago Police Department. In the past, surveillance of public spaces was limited by resources such as the availability of police officers; new technology now lets police monitor more people than ever before.