Are You Feeding a Powerful Facial Recognition Algorithm?
sxQXARMJcys • 2021-04-23
Without you realizing it, images you've posted online could be feeding a powerful facial recognition algorithm often used by law enforcement. "It's over three billion photos with faces in the database, and it's all from the open-source internet. So any kind of website: cnn.com, or a mugshot website, a news site, social media, you name it."
A company called Clearview AI has the largest known facial recognition database of images in the US, larger than the FBI's. It includes images scraped from social media sites like Facebook, YouTube, even Venmo. The algorithm isn't looking for your face only in the photos you've posted; it can find your face in other people's public photos too. The company says that's how a person accused of sexually abusing a child was identified: "They found him in the background of someone else's Instagram page, in the gym, you know, in the mirror."

Clearview's algorithm uses artificial intelligence, or AI, to identify people by mapping a person's unique facial features, like the nose or the distance between your eyes. It will find the features that stay the same across age, color, lighting, and things like that.
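In modern face recognition systems generally (the transcript does not describe Clearview's internals), a neural network maps each face image to an embedding vector that is meant to stay stable across age, lighting, and pose; two photos are judged to show the same person when their embeddings are close. A minimal sketch of that comparison step, with hypothetical embeddings and an invented threshold:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_same_person(emb_a, emb_b, threshold=0.8):
    """Hypothetical decision rule: embeddings of the same face photographed
    under different lighting, age, or pose should land close together,
    while embeddings of different people should not."""
    return cosine_similarity(emb_a, emb_b) >= threshold
```

The threshold trades false matches against misses, which is one reason the experts quoted later insist a human still has to verify any candidate match.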
Clearview says their app is now only available to law enforcement. Users upload an image with a case number, and the algorithm searches through billions of images for a match. Matches come up in seconds, with links to web pages. "And because we're crawling the web, we're always folding the data that's out there back into retraining the algorithm. You know, the larger the data set we get, the more accurate it is over time as well."

But collecting and
storing biometric data from online photos has raised concern. Canada called Clearview AI's app illegal, a violation of privacy rights, and ordered Canadian faces removed from the photo database. The use of the technology is also being challenged in the US, in Illinois and in California. But Clearview is becoming increasingly mainstream for law enforcement.
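The upload-and-search flow described above, where an investigator submits a probe image and gets back links to the closest faces found on the web, behaves like a nearest-neighbor lookup over a gallery of embeddings. A toy sketch with invented names and an invented threshold; production systems at this scale use approximate nearest-neighbor indexes over billions of vectors rather than a linear scan:

```python
import math

def _cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def search_gallery(probe, gallery, threshold=0.8):
    """Rank gallery faces by similarity to a probe embedding.

    gallery: list of (embedding, source_url) pairs, as a web crawler might
    collect. Returns (url, score) pairs above the threshold, best match first.
    """
    scored = [(url, _cosine(probe, emb)) for emb, url in gallery]
    matches = [(url, score) for url, score in scored if score >= threshold]
    return sorted(matches, key=lambda m: m[1], reverse=True)
```

Returning source URLs rather than a bare yes/no mirrors the workflow in the transcript: the system surfaces candidate web pages, and a human investigator does the confirming legwork.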
And the technology was used to help track down the January 6 Capitol rioters; the day after the riot, the company reported a 26% increase in searches. "So you get all these pictures that have been submitted from the public to the FBI's tip site, and it has to be compared against a database to get potential matches." More than 400 people have been charged with crimes related to the January 6th attack.
Rioters were likely dropping digital breadcrumbs with every move they made. Crowdsourced tips, location data, and surveillance footage have all helped law enforcement understand who was where on January 6th. "So many people were live streaming and taking photos. You know, hey, we can figure out who you are pretty easily with the technology that exists." Clearview says their large dataset is part of what makes the algorithm work so well.
The company also says that facial recognition software should be a tool in an investigation, but not the only evidence for a criminal identification. Still, it's not always clear to the public which law enforcement agencies are using the technology and how they're using it. "The technology is pretty good, but it's still not suitable to say this is positively this person. You still have to have a human look at it and say this is a match, and even then you still have to go out and do some more legwork and further confirm that your match is in fact who you think it is."
Many facial recognition systems have gender, age, and race biases, and often misidentify people of color, and critics are concerned about the technology exacerbating existing inequalities. "Artificial intelligence has the veneer of being objective, has the veneer of being at arm's length from human bias, but it is far from that. There's always a human element in the creation of these methodologies and these automated decision systems, and we have been very concerned about the inputs into these systems that often produce racially discriminatory results."

Clearview's co-founder claims their technology's identifications are more accurate than eyewitnesses: "I think that it can minimize mistakes, minimize misidentifications. The technology has far surpassed the human eye now in terms of accuracy."
And then there's the issue of privacy: a full embrace of the technology could potentially mean the end of anonymity in any location within view of a camera lens. "There are some potentially significant benefits for facial recognition technology, things like finding missing people, or being able to use it for law enforcement purposes to catch, for example, people that are dangerous and need to be found very quickly. But in order to realize those benefits, we have to sacrifice almost everything in terms of privacy; otherwise those tools aren't going to be effective for their stated purposes. We have to sort of relinquish control over name-face databases so that all of our faceprints are stored. We have to agree to being consistently surveilled in public all the time, and once we've done that, that's when the potential for abuse is at its highest for this technology."
There's been fairly widespread support for efforts to track down the Capitol rioters, but civil rights advocates have raised concerns about broader use of this kind of artificial intelligence. "I think anyone who cares about the future of our democracy will understand that we must have absolute and complete accountability at all levels for the attack on the Capitol. That said, what we don't want to see is the January 6 attack being used as a predicate for increased surveillance of Black communities, brown communities, Muslim communities, and other communities that have been subject to this extensive and unwarranted surveillance over time."

"Often we turn to technologies to try to solve hard social problems, hard political problems, because it's almost easier just to ask a technology to solve it for us, right? And instead, I think it's time to really ask the harder political questions: Are the rules appropriate? Can we achieve the same level of protection and serve the values that we want to serve with our existing tools? Is it really just a matter of not wanting to enforce them?"