Computers v. Crime | Full Documentary | NOVA | PBS
7ySOSrIe7fY • 2022-10-26
we live in this era where we leave
digital traces throughout the course of
our everyday lives
what is this data how is it collected
how is it being used one way it's being
used is to make predictions about who
might commit a crime give me all your
money man and who should get bail
count one you're charged with felony
intimidation the idea is that if you
look at past crimes you might be able to
predict the future
we want safer communities we want
societies that are less incarcerated but
is that what we're getting are the
predictions reliable I think algorithms
can in many cases be better than people
but of course algorithms don't have
Consciousness the algorithm only knows
what it's been fed because it's
technology we don't question them as
much as we might a racist judge or a
racist officer they're behind this
veneer of neutrality
we need to know who's accountable when
systems harm the communities that
they're designed to serve
can we trust the justice of predictive
algorithms and should we
computers versus crime
right now on Nova
[Music]
we live in a world of Big Data
where computers look for patterns in
vast collections of information in order
to predict the future
and we depend on their accuracy is it a
good morning for jogging
will this become cancer
what movie should I choose
the best way to beat traffic your
computer can tell you
similar computer programs called
predictive algorithms are mining big
data to make predictions about Crime and
Punishment Reinventing how our criminal
legal system works
policing agencies have used these
computer algorithms in an effort to
predict where the next crime will occur
and even who the perpetrator will be
states are recommending judges use them
to determine who should get bail and who
shouldn't if you fail to appear next
time you get no bond
it may sound like the police of the
future in the movie Minority Report
placing you under arrest for the future
murder of Sarah Marks but fiction it's
not
[Music]
how do these predictions actually work
can computer algorithms make our
criminal legal system more equitable
yeah
are these algorithms truly fair and free
of human bias
I grew up in Chicago in the 1980s and
early 1990s
[Music]
my dad was an immigrant from Greece
we worked in my family's restaurant
called kmars
[Music]
Andrew Papachristos was a 16 year old
kid in the north side of Chicago in the
1990s I spent a lot of my formative
years busting tables serving people with
hamburgers and gyros
and kind of was a whole family affair
young Papachristos was aware the
streets could be dangerous but never
imagined the violence would touch him or
his family
two more gang-related murders Monday
night and of course you know the 80s 90s
in Chicago was historically one of
the most violent periods in Chicago
Street Corner Drug markets Street
organizations
and then like a lot of other businesses
on our on our block and in our
neighborhood local gangs try to extort
my family and the business and my dad
had been running kmars for 30 years and
kind of just said no
[Music]
then one night the family restaurant
burned to the ground
police suspected arson
it was quite a shock to our family
because everybody in the neighborhood
worked in the restaurant at one point in
their life
and my parents lost 30 years of their
lives
that was really one of the events that
made me want to understand violence like
how could this happen
about a decade later
Papachristos was a graduate student
searching for answers
in graduate school I was working on a
violence prevention program that brought
together community members including
Street Outreach workers
and we were sitting at a table
and one of these Outreach workers asked
me the University student who's next
who's going to get shot next
and where that led was me sitting down
with stacks of shooting and homicide
files
with a red pen and a legal pad by hand
creating these Network images of this
person shot this person and this person
was involved with this group and this
event and creating a web of these
relationships
and then I learned that there was a whole
science about networks I didn't have to
invent anything
social network analysis was already
influencing popular culture
Six Degrees of Separation was a play on
Broadway
and then there were Six Degrees of Kevin
Bacon
the idea was you would play this game
and whoever got the shortest distance to
Kevin Bacon would win
so Robert De Niro was in a movie with so
and so who was in a movie with Kevin
Bacon it was creating essentially a
series of ties among movies and actors
and in fact there's a mathematics behind
that principle
it's actually old mathematical graph
Theory right that goes back to 1900s
mathematics and lots of scientists
started seeing that there were
mathematical principles and
computational resources computers data
were at a point that you could test
those things so it was in a very
exciting time
we looked at arrest records and police
stops and we looked at victimization
Records who was the victim of a homicide
or a non-fatal shooting
the statistical model starts by creating
the social networks of say everybody who
may have been arrested in a particular
neighborhood so person A and person B
were in a robbery together they have a
tie and then person B and person C were
stopped by the police in another
instance and it creates networks of
thousands of people
understanding that events are connected
places are connected
that there are old things like disputes
between Crews which actually drive
behavior for Generations
what we saw was striking and you can see
it immediately and you can see it a mile
away which was gunshot victims clumped
together you very rarely see one victim
you see two three four sometimes they
string across time and space
and then the model predicts what's the
probability that this is going to lead
to a shooting on the same pathway in the
future
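The network model described above can be sketched in a few lines. Below is a minimal, hypothetical illustration, not Papachristos's actual model: people who were arrested or stopped together share a tie, and each person's exposure is measured as the number of handshakes to the nearest prior gunshot victim. The names, ties, victim set, and the choice of the networkx library are all assumptions for illustration.

```python
# Minimal sketch of the co-arrest network idea (illustrative only):
# ties connect people who were arrested or stopped together, and exposure
# is the number of "handshakes" to the nearest prior gunshot victim.
import networkx as nx

ties = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"), ("E", "F")]  # hypothetical
victims = {"F"}                                    # hypothetical prior victims

G = nx.Graph()
G.add_edges_from(ties)

for person in G.nodes:
    dists = [nx.shortest_path_length(G, person, v)
             for v in victims if nx.has_path(G, person, v)]
    hops = min(dists) if dists else None
    print(person, "is", hops, "handshakes from a gunshot victim")
```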
another young man lies dead
in Boston Papachristos found that 85
percent of all gunshot injuries occurred
within a single social network
individuals in this network were less
than five handshakes away from the
victim of a gun homicide or non-fatal
shooting
the closer a person was connected to a
gunshot victim he found the greater the
probability that that person would be
shot
around 2011 when Papachristos was
presenting his groundbreaking work on
social networks and gang violence the
Chicago Police Department wanted to know
more we were at a conference the then
superintendent of the police department
he was asking me a bunch of questions he
had clearly read the paper
the Chicago Police Department was
working on its own predictive policing
program to fight crime
they were convinced that Papachristos's
model could make their new policing
model even more effective
[Music]
predictive policing involves looking to
historical crime data to predict future
events either where police believe crime
may occur or who might be involved in
certain crimes
so it's the use of historical data to
forecast a future event
at the core of these programs is
software which like all computer
programs is built around an algorithm
so think of an algorithm like a recipe
you have inputs which are your
ingredients you have the algorithm which
is the steps
and then there's the output which is
hopefully the delicious cake you're
making
happy birthday so one way to think about
algorithms is to think about the hiring
process in fact recruiters have been
studied for 100 years and it turns out
many human recruiters have a standard
algorithm when they're looking at a
resume
so they start with your name
and then they look to see where you went
to school
and then finally they look at what your
last job was if they don't see the
pattern they're looking for that's all
the time you get
and in a sense that's exactly what
artificial intelligence is doing as well
in a very basic level it's recognizing
sets of patterns and using that to
decide what the next step in its
decision process would be
what is commonly referred to as
artificial intelligence or AI is a
process called machine learning where a
computer algorithm will adjust on its
own without human instructions in
response to the patterns it finds in the
data
these powerful processes can analyze
more data than any person can and find
patterns never recognized before
the principles for machine learning were
invented in the 1950s but began
proliferating only after about 2010.
what we consider machine learning today
came about because hard drives became
very cheap so it was really easy to get
a lot of data on everyone in every
aspect of life and the question is what
can we do with all that data those New
Uses are things like predictive policing
there are things like deciding whether
or not a person is going to get a job or
not or be invited for a job interview
so how does such a powerful tool like
machine learning work
take the case of a hiring algorithm
first a computer needs to understand the
objective here the objective is
identifying the best candidate for the
job
the algorithm looks at resumes of former
job candidates and searches for keywords
in resumes of successful hires the
resumes are what's called training data
the algorithm assigns values to each
keyword words that appear more
frequently in the resumes of successful
candidates are given more value
the system learns from past resumes the
patterns of qualities that are
associated with successful hires then it
makes its predictions by identifying
these same patterns from the resumes of
potential candidates
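A minimal sketch of the keyword-weighting idea just described, with made-up resumes: keywords that show up more often among past successful hires get higher weights, and a new resume is scored by summing those weights. Real hiring models are far more elaborate; this only illustrates the training-data pattern matching.

```python
# Toy keyword-weighting model: learn word weights from past successful hires,
# then score a new resume by those weights (all data invented).
from collections import Counter

hired_resumes = [
    "python machine learning statistics leadership",
    "python data analysis leadership communication",
]

weights = Counter()
for resume in hired_resumes:
    weights.update(resume.split())       # more frequent among hires = more weight

def score(resume: str) -> int:
    """Sum the learned keyword weights for a candidate's resume."""
    return sum(weights[word] for word in resume.split())

print(score("python statistics communication"))   # higher score = closer match
```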
in a similar way the Chicago Police
wanted to find patterns in crime reports
and arrest records to predict who would
be connected to violence in the future
they thought Papachristos's model could
help
obviously we wanted to and tried to
frame it and wrote all the caveats and
made our recommendations to say this
research should be in this public health
space but once the math is out there
once the statistics are out there people
can also take it and do what they want
with it
while Papachristos saw the model as a
tool to identify future victims of gun
violence
CPD saw the chance to identify not only
future victims but future criminals
first it took me you know by surprise
and then it got me worried what is it
going to do who's it going to harm
what the police wanted to predict was
who was at risk for being involved in
future violence give me all your money
man
training on hundreds of thousands of
arrest records the computer algorithm
looks for patterns or factors associated
with violent crime to calculate the risk
that an individual will be connected to
Future violence
[Music]
using social network analysis
arrest records of Associates are also
included in that calculation
the program was called the Strategic
Subject List or SSL it would be one of
the most controversial in Chicago
policing history the idea behind the
Strategic Subject List or the SSL was
to try to identify the people
who would be most likely to become
involved as what they called a party to
violence either as a shooter or a victim
Chicago police would use Papachristos's
research to evaluate what was called an
individual's co-arrest Network
and the way that the Chicago Police
Department calculated an individual's
network was through kind of two degrees
of removal anybody that I've been
arrested with and anybody that they
had been arrested with counted as
people who were within my network so my
risk score would be based on my
individual history of arrest and
victimization as well as the histories
of arrest and victimization of people
within that two degree network of mine
it was colloquially known as the heat list if
you were hot you were on it and they
gave you literally a risk score at one
time it was zero to 500 plus if you're
500 plus you are a high risk person
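The actual SSL formula and weights were not public; the toy sketch below only illustrates the "two degrees of removal" idea described above, with invented names and counts: a person's score combines their own arrest and victimization history with the histories of everyone within two hops of them in the co-arrest network.

```python
# Toy two-degrees-of-removal score (the SSL's real inputs and weights were
# not public; every name and number here is invented).
import networkx as nx

co_arrests = [("me", "A"), ("A", "B"), ("B", "C")]   # hypothetical ties
history = {"me": 1, "A": 3, "B": 2, "C": 5}          # arrests + victimizations

G = nx.Graph()
G.add_edges_from(co_arrests)

# everyone within two hops counts toward "my" network
reach = nx.single_source_shortest_path_length(G, "me", cutoff=2)
network = {person for person, hops in reach.items() if person != "me"}

risk = history["me"] + sum(history[p] for p in network)
print(network, risk)    # {'A', 'B'} and 1 + 3 + 2 = 6
```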
[Music]
and if you made this heat list you might
find a detective knocking on your front
door
[Music]
trying to predict future criminal
activity is not a new idea
Scotland Yard in London began using
this approach by mapping crime events in
the 1930s
[Music]
but in the 1990s
it was New York City Police Commissioner
William Bratton who took crime mapping
to another level
I run the New York City Police
Department my competition is the
criminal element Bratton convinced
policing agencies across the country
that data-driven policing was the key to
successful policing strategies part of
this is to prevent crime in the first
place
[Music]
Bratton was inspired by the work of his
own New York City Transit Police as you
see all those dots on the map that's our
opponents it was called charts of the
future and credited with cutting subway
felonies by 27 percent and robberies by a third
Bratton saw its potential
he ordered all New York City precincts
to systematically map crime collect data
find patterns report back
the new approach was called CompStat
you know using Data Tracking
year-to-date's identifying places where
law enforcement interventions could be
effective Etc really laid the groundwork
for predictive policing
by the early 2000s as computational
power increased criminologists were
convinced this new data Trove could be
used in machine learning to create
models that predict when and where crime
would happen in the future
L.A. police now say the gunman opened
fire with a semi-automatic weapon
I was chief of the Los Angeles Police
Department
Bratton joined with academics at UCLA to
help launch a predictive policing system
called PredPol powered by a machine
learning algorithm
PredPol started as a spin-off of a set of
like government contracts that were
related to military work they were
developing a form of an algorithm that
was used to predict IEDs
and it was a technique that was used to
also detect aftershocks and
seismographic activity
and after those contracts ended the
company decided they wanted to apply
this in the domain of policing
domestically in the United States
the PredPol model relies on three types
of historical data
type of crime crime location and time of
crime
the algorithm is looking for
patterns to identify locations where
crime is most likely to occur
as new crime incidents are reported they
get folded into the calculation
the predictions are displayed on a map
as 500 by 500 foot areas that officers
are then directed to Patrol
and then from there the algorithm says
okay based on what we know about the
kind of very recent history where it's
likely that we'll see crime in the next
day or the next hour
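PredPol's published method was adapted from earthquake-aftershock models, as noted above; the sketch below is not that model, just a toy version of the simpler idea described here: recent incidents in a grid cell raise that cell's score, with older incidents counting less, and the highest-scoring cells get flagged for patrol. The cell coordinates, decay constant, and incidents are invented.

```python
# Toy hotspot scoring: recent incidents raise a grid cell's score, older
# incidents decay away (not PredPol's actual model; numbers invented).
from collections import defaultdict
import math

incidents = [((3, 7), 1), ((3, 7), 2), ((3, 7), 10), ((5, 2), 30)]  # (cell, days ago)

scores = defaultdict(float)
for cell, days_ago in incidents:
    scores[cell] += math.exp(-days_ago / 7.0)     # 7-day decay scale, assumed

for cell, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(cell, round(s, 2))                      # top cells would be patrolled
```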
one of the key reasons that police start
using these tools is the efficient and
even to a certain extent in their
logic more fair
and justifiable allocation of
their police resources
by 2013 in addition to PredPol
predictive policing systems developed by
companies like HunchLab IBM and
Palantir were in use across the country
and computer algorithms were also being
adopted in courtrooms
21cf 3810 state of Wisconsin versus
these tools are used in pre-trial
determinations they're used in
sentencing determinations and they're
used in housing determinations they're
also used importantly in the plea
bargaining phase they're used really
throughout the entire process to try to
do what judges have been doing which is
the very very difficult task of trying
to understand and predict what will a
human being do tomorrow or the next day
or next month or three years from now
bail forfeited he failed to appear 12
13 21 didn't even make it to preliminary
hearing the software tools are an
attempt to try to predict it better than
humans can on count one you're charged
with felony intimidation of a victim
so in the United States you're innocent
until you've been proven guilty but
you've been arrested
now that you've been arrested a judge
has to decide whether or not you get out
on bail or how high or low that bail
should be you're charged with driving on a
suspended license I've set that Bond at
one thousand no insurance I've set that
Bond at one thousand one of the problems
is judges often are relying on money
Bond or financial conditions of release
so I'm going to lower his bond to make
it a bit more reasonable so instead of
250 000 cash surety it's one hundred
thousand it allows people who have
access to money to be released if you
are poor you are often being detained
pre-trial approximately seventy percent
of the people in jail are there on
pre-trial
these are people who are Presumed
Innocent but are detained during the
pre-trial stage of their case
many jurisdictions use pre-trial
assessment algorithms with a goal to
reduce jail populations and decrease the
impact of judicial bias
the use of a tool like this takes
historical data and assesses based on
research associated factors that are
predictive of the two outcomes that the
judge is concerned with that's Community
safety
and whether that person will appear back
in court during the pre-trial period
[Music]
many of these algorithms are based on a
concept called a regression model
the earliest called linear regression
dates back to 19th century mathematics
what linear regression models do
is predict based on the initial
conditions the situation they're seeing
predict what will happen
in the future whether that's like in the
next one minute or the next four years
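As a worked illustration of the regression idea, here is a minimal least-squares fit: a line is fit to historical points and then used to predict a new case. The variables and numbers are invented, and real pre-trial tools typically combine many factors (and often use logistic rather than simple linear regression).

```python
# Minimal linear regression sketch: fit a line to historical data, then
# predict a new case (variables and values are invented).
import numpy as np

priors = np.array([0, 1, 2, 3, 4, 5])                  # e.g. prior arrests
outcome = np.array([0.1, 0.2, 0.25, 0.4, 0.5, 0.65])   # e.g. observed re-arrest rate

slope, intercept = np.polyfit(priors, outcome, 1)       # ordinary least squares

def predict(x):
    return slope * x + intercept

print(round(predict(2), 2))    # predicted value for a new defendant with 2 priors
```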
throughout the United States over 60
jurisdictions use predictive algorithms
as part of the legal process
one of the most widely used is COMPAS
the COMPAS algorithm weighs factors
including a defendant's answers to a
questionnaire to provide a risk
assessment score
these scores are used every day by
judges to guide decisions about
pre-trial detention bail and even
sentencing
but the reliability of the COMPAS
algorithm has been questioned
in 2016 ProPublica published an
investigative report on the COMPAS risk
assessment tool
investigators wanted to see if the
scores were accurate in predicting
whether these individuals would commit a
future crime
they found two things that were
interesting one was that the score was
remarkably unreliable in predicting who
would commit a crime in the future over
this two-year period
but then the other thing that ProPublica
investigators found was that black
people were much more likely to be
deemed high-risk and white people low
risk
this was true even in cases when the
black person was arrested for a minor
offense
and the white person in question was
arrested for a more serious crime
[Music]
this ProPublica study was one of the
first to begin
to burst the bubble of Technology as
somehow objective and neutral
the article created a national
controversy
but at Dartmouth a student convinced her
professor they should both be more than
stunned
as it turns out when my student Julia
Dressel read the same article I'm sorry
this is terrible we should do something
about it the difference between an
awesome idealistic student and a jaded
professor and I thought I think
you're right and as we were sort of
struggling to understand the underlying
roots of the bias and the algorithms we
ask ourselves a really simple question
are the algorithms today are they doing
better than humans because presumably
that's why you have these algorithms is
that they eliminate some of the bias and
the prejudices either implicit or
explicit in the human judgment to
analyze COMPAS's risk assessment
accuracy they used the crowdsourcing
platform Mechanical Turk
their online study included 400
participants who evaluated 1 000
defendants
we asked participants to read a very
short paragraph about an actual
defendant how old they were whether
they were male or female what their
prior juvenile conviction record was and
their prior adult conviction record and
importantly we didn't tell people their
race and then we ask a very simple
question do you think this person will
commit a crime in the next two years yes
no
and again these are non-experts
these are people being paid a couple of
bucks online to answer a survey no
criminal justice experience don't know
anything about the defendants they were
as accurate as the commercial software
being used in the courts today this one
particular piece of software and that was
really surprising
um we would have expected a little bit
of improvement after all the algorithm
has access to huge amounts of training
data
and something else puzzled the
researchers
the MTurk workers' answers to the questions
about who would commit crimes in the
future and who wouldn't
showed a surprising pattern of racial
bias
even though race wasn't indicated in any
of the profiles
they were more likely to say a person of
color will be high risk when they
weren't and they were more likely to say
that a white person would not be high
risk when in fact they were and this
made no sense to us at all you don't
know the race of the person how is it
possible that you're biased against them
in this country if you are a person of
color you are significantly more likely
historically to be arrested be charged
and to be convicted of a crime so in
fact prior convictions is a proxy for
your race not a perfect proxy but it is
correlated because of the historical
inequities in the criminal justice
system and policing in this country
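The proxy effect described above can be shown with a toy simulation, entirely invented: two groups behave identically, but one is arrested more often, so a "race-blind" rule that looks only at prior arrests still flags that group more.

```python
# Toy proxy demonstration: race is never given to the rule, but prior arrests
# carry group information because enforcement is uneven (all data invented).
import random

random.seed(0)
people = []
for _ in range(10000):
    group = random.choice(["A", "B"])
    priors = random.randint(0, 2)
    if group == "B" and random.random() < 0.5:
        priors += 1                      # same behavior, more arrests for group B
    people.append((group, priors))

# "race-blind" rule: flag anyone with 2 or more priors as high risk
for g in ("A", "B"):
    members = [p for grp, p in people if grp == g]
    rate = sum(1 for p in members if p >= 2) / len(members)
    print(g, round(rate, 2))             # group B is flagged more often
```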
what are y'all doing like this is
racial profiling
Research indicates a black person is
five times more likely to be stopped
without cause than a white person
black people are at least twice as
likely as white people to be arrested
for drug offenses even though black and
white people use drugs at the same rate
black people are also about 12 times
more likely to be wrongly convicted of
drug crimes
other populations therefore the tool
predicts that a black man for instance
will be arrested at a rate and
recidivate at a rate that is higher than
a white individual
and so what was happening is you know
the Big Data the big machine learning
folks are saying look we're not giving
it race it can't be racist but that is
spectacularly naive because we know that
other things correlate with the race in
this case number of Prior convictions
and so when you train an algorithm on
historical data well guess what it's
going to reproduce history of course it
will
compounding the problem is the fact that
predictive algorithms can't be put on
the witness stand and interrogated about
their decision-making processes
many defendants have had difficulty
getting access to the underlying
information that tells them
what was the data set that was used to
assess me what were the inputs that were
used how were those inputs weighted so
you've got what can be these days
increasingly a black box a lack of
transparency
some black box algorithms get their name
from a lack of transparency about the
code and data inputs they use
which can be deemed proprietary
but that's not the only kind of black
box
a black box is any system which is so
complicated that you can see what goes
in and you can see what comes out but
it's impossible to understand what's
going on inside it
all of those steps in the algorithm are
hidden inside phenomenally complex math
and processes
and I would argue that when you are
using algorithms and Mission critical
applications like Criminal Justice
System we should not be deploying Black
Box algorithms
PredPol like many predictive platforms
claimed a proven record for crime
reduction
in 2015 PredPol published its
algorithm in a peer-reviewed journal
William Isaac and Kristian Lum research
scientists who investigate predictive
policing platforms analyzed the
algorithm
we just kind of saw the algorithms going
back to the same one or two blocks every
single time
and that's kind of strange because if
you had a truly predictive policing
system you wouldn't necessarily see it
going to the same locations over and
over again
for their experiment Isaac and Lum used
a different data set
Public Health Data to map illicit drug
use in Oakland
a good chunk of the city was kind
of evenly distributed in terms of where
potential illicit drug use might be
but the police predictions were
clustering around areas where police had
you know historically found incidents of
illicit drug use
specifically we saw significant numbers
of neighborhoods that were predominantly
non-white and lower income being
deliberate targets of the
[Music]
even though illicit drug use was a
city-wide problem the algorithm focused
its predictions on low-income
neighborhoods and communities of color
the reason why is actually really
important it's very hard to divorce
these predictions from those histories
and Legacies of over policing
as a result of that they manifest
themselves in the data
in an area where there is more police
presence more crime is uncovered
the crime data indicates through the
algorithm that the heavily policed
neighborhood is where future crime will
be found
even though there may be other
neighborhoods where crimes are being
committed at the same or higher rate
every new prediction that you generate
is going to be increasingly dependent on
the behavior of the algorithm in the
past so you know if you go 10 days 20
days 30 days into the future right after
using an algorithm all of those
predictions have changed the behavior of
the police department and are now being
folded back into the next day's
prediction
[Music]
the result can be a feedback loop that
reinforces historical policing practices
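A toy simulation of the feedback loop just described, with everything invented: two areas have the same true number of incidents, but patrols go wherever past recorded crime is highest and record more of what happens there, so the initially "hotter" area pulls further and further ahead.

```python
# Toy feedback-loop simulation: identical true crime, but patrols follow
# recorded crime and record more where they patrol (all numbers invented).
import random

random.seed(1)
recorded = {"north": 12, "south": 10}    # historical recorded incidents
TRUE_INCIDENTS = 10                      # same true incidents per area per day

for day in range(30):
    patrolled = max(recorded, key=recorded.get)        # send patrols to the "hot" area
    for area in recorded:
        detection = 0.9 if area == patrolled else 0.3  # patrols record more
        recorded[area] += sum(random.random() < detection
                              for _ in range(TRUE_INCIDENTS))

print(recorded)    # the gap between areas keeps growing
```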
all of these different types of machine
learning algorithms are all trying to
help us figure out are there some
patterns in this data
it's up to us to then figure out are
those legitimate patterns do they are
they useful patterns because the
computer has no idea it didn't make a
logical Association it just made it made
a correlation
my favorite Definition of artificial
intelligence
is it's any autonomous system
that can make decisions under
uncertainty you can't make decisions
under uncertainty without bias
in fact it's impossible to escape from
having bias it's a mathematical reality
about any intelligent system even us
and even if the goal is to get rid of
prejudice
bias in the historical data can
undermine that objective
[Applause]
Amazon discovered this when they began a
search for top talent with a hiring
algorithm whose training data depended
on hiring successes from the past
Amazon
somewhat famously within the AI industry
they tried to build a hiring algorithm
they had a massive data set
they had all the right answers because
they knew literally who got hired and to
get the promotion in their first year
the company created multiple models to
review past candidates resumes and
identify some 50 000 key terms
what Amazon actually wanted to achieve
was to diversify their hiring
Amazon just like every other tech
company and a lot of other companies as
well has enormous bias built into its
hiring history it was always biased
strongly biased in favor of men in favor
generally of white or sometimes Asian
men
well they went and built a hiring
algorithm and sure enough this thing was
the most sexist recruiter you could
imagine if you said the word women's in
your resume then it wouldn't hire you if
you went to a women's college it didn't
want to hire you
so they take out all the gender markers
and all the women's colleges all the
things that explicitly says this is a
man and this is a woman or even the ones
that obviously implicitly say it
so they did that
and then they trained up their new
deep neural network to decide who Amazon
would hire and it did something amazing
it did something no human could do it
figured out who was a woman and it
wouldn't hire them
it was able to look through all of the
correlations that existed in that
massive data set
and figure out which ones most strongly
correlated with someone getting a
promotion and the single biggest
correlate of getting a promotion was
being a man
and it figured those patterns out and
didn't hire women
Amazon abandoned its hiring algorithm in
remember the way machine learning works
right it's like a student who doesn't
really understand the material in the
class they got a bunch of questions they
got a bunch of answers and now they're
trying to pattern match for a new
question say oh wait let me find an
answer that looks pretty much like the
questions and answers I saw before the
algorithm only worked because someone
has said oh this person whose data you
have they were a good employee this
other person was a bad employee this
person performed well this person did
not perform well
because algorithms don't just look for
patterns they look for patterns of
success however it's defined
but the definition of success is really
critically important to what that
ends up being and a lot of
opinion is embedded in what does
success look like
in the case of algorithms human choices
play a critical role
the data itself was curated someone
decided what data to collect
somebody decided what data was not
relevant right and they don't exclude it
necessarily intentionally they could be
blind spots
the need to identify such oversights
becomes more urgent as technology takes
on more decision making
[Music]
facial recognition technology used by
law enforcement in cities around the
world for surveillance
in Detroit 2018
law enforcement looked to facial
recognition technology when thirty eight
hundred dollars worth of watches was
stolen from an upscale Boutique
police ran a still frame from the Shop's
surveillance video through their facial
recognition system to find a match
how do I turn a face into something
that equations can act on you turn the
individual pixels in the picture of that
face into values
what it's really looking for are complex
patterns across those pixels
the sequence of taking a pattern of
numbers transforming into little edges
and angles then transforming that into
eyes and cheekbones and mustaches
to find that match the system can be
trained on billions of photographs
facial recognition uses a class of
machine learning called Deep learning
the models built by Deep learning
techniques are called neural networks
a neural network is you know stylized as
you know trying to model how neural
Pathways work in the brain
you can think of a neural network as a
collection of neurons
so you put in some values into a neuron
and if they add up
to some number that crosses some threshold
this one will fire and send off a new
number to the next neuron
at a certain threshold the neuron will
fire to the next neuron
if it's below the threshold the neuron
doesn't fire this process repeats and
repeats across hundreds possibly
thousands of layers Making Connections
like the neurons in our brain
the output is a predictive match
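A single artificial "neuron" of the kind described above can be written in a few lines: weighted inputs are summed and the neuron fires only past a threshold. Real face-recognition networks stack millions of learned weights across many layers; the values below are invented.

```python
# One thresholded "neuron": weighted inputs are summed and it fires only if
# the total crosses a threshold (weights and inputs are invented).
import numpy as np

def neuron(inputs, weights, threshold):
    total = np.dot(inputs, weights)      # weighted sum of the inputs
    return 1 if total >= threshold else 0

pixels = np.array([0.2, 0.8, 0.5])       # hypothetical pixel values
weights = np.array([0.4, 0.9, -0.3])     # hypothetical learned weights
print(neuron(pixels, weights, threshold=0.5))   # 1 = fires, 0 = does not
```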
based on a facial recognition match in
January 2020 the police arrested Robert
Williams for the theft of the watches
the next day he was released
not only did Williams have an alibi it
wasn't his face
to be very blunt about it these
algorithms are probably dramatically
over trained on white faces
[Applause]
so of course algorithms that start out
bad can be improved in general
the Gender Shades project found that
certain facial recognition technology
when they actually tested it on black
women it was 65 percent accurate whereas for
white men it was 99 percent accurate
how did they improve it because they did
they did they built an algorithm that
was trained on more diverse data so I
don't think it's completely a lost cause
to improve algorithms to be better
[Music]
I used to think my job was all about
arrests
there was a commercial a few years ago
that showed the police officer going to
a gas station and then waiting for the
criminal to show up data spot patterns
and figure out where to send patrols
they said well our algorithm will tell
you exactly where the next
crime is going to take place well that's
just silly uh that's not how it works by
stopping it before it happens
let's build a smarter planet
[Music]
understanding what it is about these
places that enable crime problems to
emerge and or persist
at Rutgers University
the researchers who invented the crime
mapping platform called risk terrain
modeling or RTM
bristle at the term predictive policing
we don't want to predict we want to
prevent
I worked as a police officer a long time
ago in the early 2000s
police collected data
for as long as police have existed now
there is a greater recognition that data
can have value but it's not just about
the data it's about how you analyze it
how you use those results there's only
two data sets that risk terrain modeling
uses these data sets are local current
information about crime incidents within
a given area
and information about environmental
features that exist in that landscape
such as bars fast food restaurants
convenience stores schools Parks
alleyways
the algorithm is basically the
relationship between these environmental
features and the outcome data which in
this case is crime the algorithm
provides you with a map of the
distribution of the risk values
this is the highest risk area on this
commercial Corridor on Bloomfield Avenue
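The relationship described above can be sketched as a weighted sum over environmental features in each grid cell; the real RTM software derives those weights statistically from local crime data, so the features, cells, and weights below are purely illustrative.

```python
# Toy risk-terrain sketch: each cell's relative risk is a weighted sum of
# nearby environmental features (weights and features are invented; RTM
# estimates them from local crime data).
features_in_cell = {
    (3, 7): {"bar": 2, "convenience_store": 1},
    (5, 2): {"school": 1},
}
weights = {"bar": 1.8, "convenience_store": 1.3, "school": 0.4}   # hypothetical

risk = {cell: sum(weights[f] * n for f, n in feats.items())
        for cell, feats in features_in_cell.items()}
print(risk)    # higher value = higher relative risk for the modeled crime
```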
but the algorithm isn't intended for use
just by police
criminologist Alejandro Jimenez Santana
leads the Newark Public Safety
collaborative a collection of 40
Community organizations
they use RTM as a diagnostic tool to
understand not just where crime may
happen next
but why
with RTM we identified this commercial Corridor
on Bloomfield Avenue which is where we
are right now as a risky area for auto
theft due to car idling so why is this
space particularly problematic when it
comes to auto theft
one is because we're in a commercial
Corridor where there's high density of
people who go to the beauty salon or to
go to a restaurant Uber delivery and
ubereats delivery people who come to
grab orders that also and leave their
cars running create the conditions for
this crime to be concentrated in this
particular area
what the data showed us was there was a
tremendous rise in Auto vehicle thefts
but we convinced the police department
to take a more Social Service approach
Community organizers convinced police
not to ticket idling cars and let
organizers create an effective public
awareness poster campaign instead and we
put it out to the Newark students to
submit in this flyer campaign and have
their artwork on the actual flyer
as you can see this is the commercial
Corridor on Bloomfield Avenue the site
score shows a six which means that we
are at the highest risk of auto theft in
this particular location and as I move
closer to the end of the commercial
Corridor the site risk scores come
down
this is the first time in Newark the
police data for Crime occurrences have
been shared widely with community
members
the kind of data we share is incident
related data sort of time location that
sort of information we don't discuss any
private arrest information we're trying
to avoid a crime
in 2019 Caplan and Kennedy formed a
startup at Rutgers to meet the rising
demand for their technology
despite the many possible applications
for RTM from tracking public health
issues to understanding vehicle crashes
law enforcement continues to be its
principal application
like any other technology risk terrain
modeling can be used for the public good
when people use it wisely
foreign
we as academics and scientists we
actually need to be critical because it
could be the best model in the world it
can be very good predictions but how you
use those predictions matters in some
ways even more the police department had
revised the SSL numerous times since in
2019 Chicago's Inspector General
contracted the Rand Corporation to
evaluate the Strategic subject list
the predictive policing platform that
incorporated Papa christos's research on
social networks
I never wanted to go down this path of
who was the person that was the
potential suspect and that problem is
not necessarily with a statistical model
it's the fact that someone took a victim
and made him an offender you've
criminalized someone who's at risk that
you should be prioritizing saving their
life
it turned out that some 400 000 people
were included on the SSL of those 77
percent were Black or Hispanic
the inspector General's audit revealed
that SSL scores were unreliable the RAND
Corporation found the program had no
impact on homicide or victimization
rates
the program was shut down
but data collection continues to be
essential to law enforcement
there are things about us that we might
not even be aware of that are sort of
being collected by the data Brokers and
will be held against us for the rest of
our lives held against people forever
digitally
data is produced and collected
is it accurate
and can the data be properly vetted
and that was one of the critiques of not
just the Strategic subjects list but the
gang database in Chicago any data source
that treats data as a stagnant forever
condition is a problem
the gang database has been around for
four years it'll be five in January you
want to get rid of
surveillance and black and brown
communities in places like Chicago and
places like La where I grew up there are
gang databases with tens of thousands of
people listed their names listed in
these databases just by simply having a
certain name and coming from a certain
zip code could land you in these
databases do you all feel safe in
Chicago the cops pulled up out of
nowhere didn't ask any questions just
immediately start beating on us and
basically was saying like what are what
are we doing over here you know like in
this in this gangbang area I was already
labeled as a gang banger from that area
because of where I lived I just happened
to live there
the gang
database is shared with hundreds of law
enforcement agencies even if someone is
wrongly included there is no mechanism
to have their name removed
if you try to apply for an apartment or
if you try to apply for a job or a
college or even on a um a house it will
show that you are in this record of a
gang database
I was arrested for peacefully protesting
and they told me that well you're in a
gang database but I was never in no gang
because you have a gang designation
you're a security threat group right
researchers and activists Have Been
instrumental in dismantling some of
these systems and so we continue to push
back I mean the fight is not going to
finish until we get rid of the database
[Music]
I think what we're seeing now is not a
move away from data it's just a move
away from this term predictive policing
but we're seeing big companies big Tech
enter the policing space we're seeing
the reality that almost all policing now
is data driven you're seeing these same
police departments invest heavily in the
technology including other forms of
surveillance technology including other
forms of databases to sort of manage
policing
more citizens are calling for
regulations to audit algorithms and
guarantee they're accomplishing what
they promise
without harm
ironically there is very little data on
Police use of big data and there is no
systematic data at a national level on
how these tools are used
the deployment of these tools so far
outpaces legal and Regulatory responses
to them what you have happening is
essentially this regulatory Wild West
and we're like well it's an algorithm
let's just throw it into
production without testing whether
it works sufficiently
at all
multiple requests for comment from
police agencies and law enforcement
officials in several cities including
Chicago and New York were either
declined or went unanswered
artificial intelligence must serve
people and therefore artificial
intelligence must always comply with
people's rights the European Union is
preparing to implement legislation to
regulate artificial intelligence
in 2021 bills to regulate data science
algorithms were introduced in 17 States
and enacted in Alabama Colorado Illinois
and Mississippi
if you look carefully on electrical
devices you'll see UL for Underwriters
Laboratories that's a process that came
about so that things when you plug them
in didn't blow up in your hand that's
the same kind of idea that we need in
these algorithms
we can adjust it to make it better than
the past
and we can do it carefully and we could
do it with precision and an ongoing
conversation about what it means to us
so that it's biased in the right
way I don't think you remove bias but
you get into a bias that you can live
with that you think is moral
to be clear like I think we can do better
but often doing better would look like
we don't use this at all
there's nothing fundamentally wrong with
trying to predict the future as long as
you understand how are the algorithms
working how are they being deployed what
is the consequence of getting it right
and most importantly is what is the
consequence of getting it wrong keep
your hands on the steering wheel my
hands haven't moved off the steering
wheel gonna arrest me officer
[Music]
this program is available with PBS
passport and on Amazon Prime video
[Music]
[Music]