
About this episode’s guest:

Sarah T. Roberts is an Assistant Professor in the Department of Information Studies, Graduate School of Education & Information Studies, at UCLA. She holds a Ph.D. from the iSchool at the University of Illinois at Urbana-Champaign. Prior to joining UCLA in 2016, she was an Assistant Professor in the Faculty of Information and Media Studies at Western University in London, Ontario for three years. On the internet since 1993, she was previously an information technology professional for 15 years, and, as such, her research interests focus on information work and workers and on the social, economic and political impact of the widespread adoption of the internet in everyday life.

Since 2010, the main focus of her research has been to uncover the ecosystem – made up of people, practices and politics – of content moderation of major social media platforms, news media companies, and corporate brands.

She served as consultant to and is featured in the award-winning documentary The Cleaners, which debuted at Sundance 2018 and aired on PBS in the United States in November 2018.

Roberts is frequently consulted by the press and others on issues related to commercial content moderation and to social media, society and culture, in general. She has been interviewed on these topics in print, on radio and on television worldwide including: The New York Times, Associated Press, NPR, Le Monde, The Atlantic, The Economist, BBC Nightly News, the CBC, The Los Angeles Times, Rolling Stone, Wired, The Washington Post, Australian Broadcasting Corporation, SPIEGEL Online, and CNN, among many others.

She is a 2018 Carnegie Fellow and a 2018 recipient of the EFF Barlow Pioneer Award for her groundbreaking research on content moderation of social media.

She tweets as @ubiquity75.

This episode streamed live on Thursday, October 1, 2020. Here’s an archive of the show on YouTube:

About the show:

The Tech Humanist Show is a multi-media-format program exploring how data and technology shape the human experience. Hosted by Kate O’Neill.

Subscribe to The Tech Humanist Show hosted by Kate O’Neill channel on YouTube for updates.

Transcript

01:43
all right
01:44
hey humans
01:48
how we doing out there come on in start
01:50
gathering around the uh the old digital
01:52
campfire
01:54
let me hear from those of you who are in
01:55
line uh right now tell me
01:57
tell me who’s out there and tell me
01:59
where you’re tuning in from
02:01
i hope you’re starting to get your
02:02
questions and thoughts ready
02:04
for our guest i’m sure many of you have
02:06
already seen who our guest is and i’ll
02:07
be reading her bio here in just a moment
02:09
so start thinking of your questions
02:11
about commercial content moderation and
02:13
what you want to
02:14
know about that and you know all that
02:17
kind of stuff
02:18
uh i hear sarah laughing in the
02:19
background it’s not that these aren’t
02:22
really good valid questions i think i
02:25
was just snorting
02:26
honestly through my uh through my sinus
02:29
trouble
02:30
so uh welcome to those of you who are
02:32
all tuned in welcome to the tech
02:34
humanist show this is a multimedia
02:36
format program
02:37
exploring how data and technology shape
02:39
the human experience
02:41
and i am your host kate o’neill so i hope
02:44
you’ll subscribe and follow wherever
02:45
you’re catching this
02:46
so that you won’t miss any new episodes
02:49
i
02:50
am going to introduce our guest here in
02:51
just a moment uh one one last shout out
02:53
if anybody’s out there wanting to say hi
02:56
feel free
02:56
you are welcome to comment and i see a
02:59
bunch of you
03:00
online so feel free to tune uh
03:03
comment in and tell me who you are and
03:05
where you’re tuning in from
03:07
but just get those you know type in
03:08
fingers warmed up because we’re gonna
03:10
want you to
03:10
to weigh in with some questions and
03:12
comments as the show goes on
03:14
but now i’ll go ahead and introduce our
03:17
esteemed guest so today we have the
03:19
very great privilege of talking with
03:21
sarah t roberts who
03:22
is an assistant professor in the
03:24
department of information studies
03:26
graduate school of education and
03:28
information studies at ucla
03:30
she holds a phd from the ischool at the
03:32
university of illinois urbana-champaign
03:34
my sister’s school i went to university
03:36
of illinois chicago
03:38
prior to joining ucla in 2016 she was an
03:40
assistant professor
03:42
in the faculty of information and media
03:44
studies at western university in london
03:46
ontario for three years
03:47
on the internet since 1993 she was
03:50
previously an information technology
03:52
professional for 15 years and as such
03:54
her research interests focus on
03:56
information work and workers and on the
03:58
social
03:59
economic and political impact of the
04:01
widespread adoption of the internet in
04:02
everyday life right totally
04:06
so since 2010 the main focus of her
04:08
research has been to uncover the
04:10
ecosystem
04:11
made up of people practices and politics
04:14
of content moderation of major social
04:16
media platforms
04:17
news media companies and corporate
04:19
brands
04:20
she served as consultant to and is
04:21
featured in the award-winning
04:22
documentary
04:23
the cleaners which debuted at sundance
04:26
2018
04:27
and aired on pbs in the united states in
04:29
november 2018
04:30
so roberts is frequently consulted
04:33
by the press and others on issues
04:34
related to commercial content moderation
04:36
and to social media society and culture
04:38
in general
04:39
she’s been interviewed on these topics
04:41
in print on radio
04:42
on television worldwide and now on the
04:44
tech humanist show
04:45
uh including the new york times
04:47
associated press npr
04:48
le monde the atlantic i mean this list
04:50
is going to go on and on so
04:52
buckle in folks the economist bbc
04:55
rolling stone wired and i’m picking and
04:57
choosing now it’s a really really
04:59
impressive list of media
05:00
she’s a 2018 carnegie fellow and a 2018
05:04
recipient of the eff barlow
05:06
pioneer award for her groundbreaking
05:08
research on content moderation
05:10
of social media so audience again please
05:12
start getting your questions ready for
05:13
our outstanding guest
05:15
please do note as a live show i well
05:17
i’ll do my best to vet comments and
05:19
questions in real time
05:20
we may not get to all of them but very
05:23
much appreciate
05:24
you being here tuned in and
05:25
participating in the show so with that
05:27
please welcome uh our dear guest
05:31
sarah t roberts and you are live on the
05:34
show
05:34
sarah thank you so much for being here
05:37
thank you uh
05:38
thanks for the invitation and thanks to
05:40
your audience and
05:41
uh all those interested folks who are
05:44
spending time with us today i’m really
05:45
grateful
05:46
for the opportunity we’ve already got uh
05:48
david polgar
05:49
saying excited for today’s talk hey our
05:52
buddy
05:53
dave drp
05:54
[Laughter]
05:56
all right so i wanna talk right away
05:59
about your um
06:01
your book behind the screen i i hadn’t
06:03
had a chance to read it until i was
06:05
preparing for the
06:06
show and it was it was wonderful to get
06:07
a chance to dig into your research
06:09
so tell us a little bit about that came
06:11
out last year is that right
06:13
um yeah it just just a little over a
06:15
year ago uh
06:16
came out on on yale university press
06:19
um you know the academic
06:23
publishing cycle is its own beast it’s
06:25
its own world
06:26
it uh as it relates to
06:29
um kind of like journalism and and
06:31
mainstream press timelines it’s much
06:33
slower
06:34
that said uh i wrote the book in about a
06:37
year which is about a normal
06:39
a normal cycle but it took about eight
06:42
years to put together the research that
06:44
went into the book
06:46
and this is because when i started my
06:48
research in 2010
06:50
which you know we say 2010 it seems like
06:53
yesterday that was a decade ago now
06:55
you know we’re in interminable 2020
06:59
you know which is which is a million
07:01
years long so far but
07:03
back in 2010 when i started looking into
07:05
this topic as a
07:07
as a doctoral researcher at the
07:09
university of illinois
07:10
uh you know there were a lot of things
07:12
stacked against that endeavor
07:14
including the fact that i was a doctoral
07:16
student at the university of illinois i
07:17
had no cachet i had very few
07:20
like material resources um you know to
07:23
finance
07:24
a study that would require uh
07:27
at the end of the day required going
07:29
around the world quite literally
07:32
but maybe the biggest barrier at the
07:34
time was the fact
07:36
that i was still fighting an uphill
07:38
battle trying to tell people
07:40
that major mainstream social media
07:43
platforms
07:44
were engaged in a practice that is now
07:47
weirdly um you know a phrase that you
07:51
might say around the dinner table and
07:52
everyone would get which is content
07:54
moderation
07:55
and that further when i would um raise
07:58
the issue
08:00
and and bring up the fact that firms
08:01
were engaged in this practice which
08:04
you know has to do with the adjudication
08:06
of people’s
08:08
self-expression online and sits
08:10
somewhere between users
08:13
and the platform and then the platform’s
08:15
recirculation of users material
08:18
uh you know people would argue with me
08:20
at that point
08:22
about the fact that that practice would
08:24
even go on
08:25
and then when i would say that uh you
08:27
know kind of offer
08:28
incontrovertible proof that in fact it
08:30
did go on uh
08:32
then we would uh find ourselves in a
08:34
debate about whether or not
08:36
it was a legion of human beings
08:40
who was undertaking this work or uh in
08:43
fact it was computational
08:45
now in 2020 the landscape is
08:48
complicated but in 2010
08:51
the technology and the sort of
08:53
widespread adoption
08:54
of of computational uh
08:58
automated let’s say algorithmic kinds of
09:01
content moderation or machine learning
09:03
informed content moderation was not a
09:05
thing
09:05
it was humans and so i had to start the
09:09
conversation
09:10
so far below baseline
09:14
that it you know it took uh it took
09:17
quite a lot of effort just to get
09:19
everybody on the same page to discuss it
09:22
and you know when i’m talking about
09:24
uh engaging in these conversations i
09:27
mean just like trying to vet this as a
09:29
as an appropriate research topic at the
09:32
graduate school you know what i mean
09:34
like to get faculty members
09:36
many of whom were world experts in in
09:39
various aspects of uh of the internet or
09:42
of
09:42
media or information systems themselves
09:46
um it was new to them too did
09:49
you originally frame it as a
09:51
question of how
09:52
is this done or what was the original
09:54
framework of that question yeah
09:56
so i’ll tell you a little bit about the
09:57
origin of why i got interested
10:00
and it’s something that i write about in
10:01
the book because i think it’s so
10:03
important to acknowledge kind of those
10:06
those antecedents i had read i was
10:08
actually teaching down at the university
10:10
of illinois in the summer
10:12
of 2010 and i was on a break from
10:15
teaching and
10:16
you know probably drinking a latte which
10:18
is what i’m doing right now
10:19
and um and uh uh reading the paper i was
10:23
reading the new york times and there was
10:24
a very small
10:26
uh but compelling article in the new
10:28
york times about a group of workers
10:30
who were there there were a couple of
10:32
sites they mentioned but there was in
10:33
particular a group of workers in rural
10:35
iowa well here i was sitting in rural
10:38
central illinois thinking about this
10:40
group of workers in rural iowa as
10:42
profiled in this piece
10:44
who were in fact engaging in what we now
10:46
know as commercial content moderation
10:48
they were working
10:49
in effectively a call center uh
10:53
adjudicating content for unnamed kind of
10:55
you know
10:56
media sites websites and social media
10:59
properties
11:00
and i kind of circulated that article
11:03
around i shared it with friends i shared
11:05
it with my colleagues and i shared it
11:06
with professors and
11:07
the argument that i made was that it was
11:10
it was multifaceted first of all it
11:12
sounded like a miserable
11:14
job and guess what that has been borne
11:16
out it is a
11:17
very difficult and largely unpleasant
11:20
job
11:21
uh so i was captivated by that fact that
11:24
there were these you know
11:25
unnamed people who a generation or two
11:28
ago would have been on a family farm
11:30
who were now in the quote unquote
11:32
information economy but seemed to be
11:34
doing
11:34
a drag just awful work
11:38
uh but also there was this bigger issue
11:41
of
11:42
uh you know really having this this big
11:44
reveal
11:45
of the of the actual
11:48
ecosystem a heretofore unknown
11:51
portion of the social media ecosystem
11:54
effectively letting us know how the
11:56
sausage was being made right
11:58
and yet if you were to look at any of
12:01
the
12:02
the uh the social media platforms
12:05
themselves or any of the discourse at
12:06
really high levels in
12:08
industry or in regulatory bodies this
12:11
was not
12:12
this was a non-starter but i was was
12:14
arguing at the time
12:16
that how content was being adjudicated
12:18
on the platforms
12:20
under what circumstances under what
12:23
conditions and under what policies was
12:25
in fact
12:27
maybe the only thing that mattered at
12:29
the end of the day
12:30
right now in 2010 that was a little bit
12:32
of a harder case to make
12:34
by 2016 not so much after we saw the uh
12:38
the ascent of donald trump in the united
12:40
states we saw brexit
12:42
we saw uh this the rise of bolsonaro and
12:45
in brazil largely
12:46
uh attributed to um
12:49
social media campaigns there and kind of
12:52
continued sustained
12:54
support through those channels uh and
12:57
here we are in 2020 where uh
13:00
we might argue or we might claim that
13:02
misinformation and disinformation online
13:04
is one of the primary
13:06
concerns of civil society today
13:09
and i would put front and center
13:13
in those all of those discussions
13:16
the fact that social media companies
13:18
have this incredible immense power
13:20
to decide what stays up and what doesn’t
13:24
and how they do it and who they engage
13:27
to do it
13:28
should actually be part of the
13:30
conversation if not
13:31
i would argue that it’s a very
13:33
incomplete conversation so when i talk
13:35
about like the
13:36
scholarly publishing cycle it took a
13:39
year to put the book out right but it
13:40
took eight years to amass the evidence
13:44
to um to do the to the interviews and
13:47
media that you mentioned
13:48
to converse with industry people at the
13:51
top levels eventually but
13:52
you know starting at the bottom with the
13:54
workers themselves to find workers who
13:56
are willing
13:56
to talk to me and break those
13:58
non-disclosure agreements that they were
14:00
under um and to kind of create also
14:04
a a locus of activity for other
14:07
researchers and scholars and activists
14:09
who are also interested in in uncovering
14:12
uh this area and really sort of create
14:14
co-create a field of study so that’s
14:17
what took eight years it took a year to
14:18
get the book out
14:19
um but all that legwork of proving in a
14:22
way
14:23
that this mattered took a lot longer i
14:25
don’t have to make that same case
14:27
anymore
14:27
as i’m sure you you can imagine um
14:30
people people are interested they’re
14:33
concerned
14:34
and um they want to know more they’re
14:36
demanding a lot more
14:38
um from firms as users
14:41
you know as people who are now engaged
14:43
in social media in some aspect
14:45
of their lives every day need i say more
14:48
about zooming
14:49
constantly which is now our you know our
14:52
primary
14:53
medium of connection for so many of us
14:55
in our work lives even
14:57
yeah hey we already have a question from
15:00
our buddy drp david ryan polgar let me
15:04
uh
15:04
put this against the background we can
15:06
actually see it here uh
15:08
he says sarah would love to hear your
15:10
thoughts on section
15:12
230 and how any potential changes would
15:15
impact content moderation
15:16
so we’re going right in right deep yeah
15:19
really
15:20
so um let me try to flesh that out a
15:22
little bit
15:24
for others who aren’t um you know inside
15:26
quite as as deep
15:28
um section 230 is
15:31
a part of the uh communications decency
15:34
act which goes back to 1996 but
15:36
effectively what what anyone needs to
15:38
know about section 230 is that
15:40
it’s the it it’s sort of the legal
15:42
framework
15:43
that informs social media companies
15:48
rights and responsibilities around
15:51
content
15:52
when we think about legacy media um
15:55
so-called uh broadcast television for
15:58
example or other other forms of of media
16:01
that we consume
16:02
you know i always bring up the the
16:04
example of george carlin who
16:06
famously um uh
16:10
you know made a career out of the seven
16:12
dirty words that you couldn’t say
16:13
on radio right so there are all kinds
16:16
of governing uh
16:19
legal and other kinds of norms about
16:22
what is allowed and disallowed in some
16:24
of these legacy media
16:26
when it comes to social media however
16:30
there is a pretty
16:35
drastically contrasted permissiveness
16:38
that is in place uh that
16:41
cedes the power of the decision-making
16:44
around
16:45
what is allowable and what is not
16:46
allowable to the platforms themselves so
16:49
this is a really different kind of
16:50
paradigm right
16:52
and it’s section 230 that allows that
16:54
that’s the
16:55
that’s the precedent that’s the that’s
16:57
the guidance uh
16:58
legally that uh that provides that kind
17:01
of
17:02
uh both responsibility and discretion
17:05
and what it does is it allows the
17:07
companies
17:08
um to make their own decisions
17:12
effectively
17:13
about what policies they will follow
17:15
internally now this doesn’t go for
17:17
every single piece of content you know
17:18
one of the the biggest examples that
17:21
uh that this does not cover is child
17:24
sexual exploitation material which is
17:25
just illegal full stop it doesn’t matter
17:28
if platforms wanted to traffic in that
17:30
material or not it’s illegal
17:32
but beyond that just to certain to a
17:35
certain extent what section 230 allows
17:38
is for platforms to redistribute
17:42
effectively material that other people
17:44
submit
17:45
uh without being held liable for that
17:47
material
17:48
and so if we think about that that’s
17:50
actually the business model of social
17:51
media
17:52
the business model of social media is to
17:54
get other people to create content
17:56
upload it circulate it and engage with
17:59
it download it
18:00
and effectively the platforms have um
18:03
you know argued and claimed that they
18:04
are really
18:05
you know don’t kill the messenger right
18:07
like they’re just like the
18:08
the the apparatus by which this material
18:10
gets shared
18:12
i think that um
18:15
you know at one time that really made
18:16
sense particularly when the
18:18
when this uh when the communications
18:20
decency act was passed and this goes
18:22
back in
18:23
into the mid 90s when what was
18:26
kind of imagined as needing this this
18:29
uh reprieve from liability was an isp an
18:33
internet service provider
18:35
which at that time uh i guess the most
18:38
imaginative version of that you could
18:40
think of would be america online for
18:41
those of you who
18:42
remember that on the program shout out
18:45
to the aol days yeah
18:47
right aol like all the you know the
18:49
discs and cd-roms you got and used as
18:51
coasters
18:52
um but you know back in that time but an
18:55
internet service provider really was a
18:57
pass-through in some cases you know i
18:58
knew a guy who ran an isp locally
19:01
he really just had a room with a with a
19:03
huge internet pipe coming in
19:06
and a wall of modems and you would dial
19:08
up through your modem and connect
19:10
through and then be on the internet to
19:11
some other service
19:12
so that was the model then but the model
19:15
now
19:15
uh is you know multi-billion dollar
19:19
transnational corporations
19:21
uh who have immense power in decision
19:24
making around content
19:26
and yet are are uh
19:29
in the american context at least largely
19:32
not liable for those decisions
19:34
uh legally or or otherwise um
19:38
making incredibly powerful
19:42
decisions about what kind of material we
19:45
all see and engage in
19:47
and what is permissible and what is not
19:49
online and they do that at their
19:50
discretion well if they’re doing that at
19:52
their discretion
19:54
do you think that they’re largely going
19:56
to um
19:58
fall into a mode of altruism and like
20:01
what’s best
20:01
for civil society are they going to look
20:03
at their bottom line
20:05
and their shareholder demands and
20:07
respond to that i mean
20:09
the audience yeah i mean frankly
20:12
publicly traded companies
20:13
have a legal mandate to respond to their
20:15
shareholders and to generate revenue for
20:17
them so
20:18
um when those things are at odds when
20:20
when those things are aligned with
20:22
what’s good for you know
20:23
america is good for uh facebook’s
20:26
internal policies around content
20:28
moderation that works out great
20:29
but if there’s you know if ever those
20:32
two pathways should diverge
20:34
we know which one they’re going to fall
20:35
under and there’s just there’s very
20:37
little
20:37
um legal consequence or legal uh
20:41
expectation for uh reporting out on how
20:46
uh these decisions get made the way that
20:48
that
20:49
we have seen more decisions getting uh
20:52
publicly
20:53
unveiled through things like um
20:56
the publication of of what had been
21:00
previously kind of closely held secret
21:03
policies internally is through public
21:06
pressure
21:06
through the pressure of civil society
21:08
groups and advocacy groups through the
21:10
pressure
21:11
of the public through the pressure and
21:13
the constant threat of
21:15
you know things like reform to section
21:17
230 or other kinds of
21:19
regulation so it’s a very interesting
21:23
moment and it’s interesting to bring up
21:24
section 230 because
21:26
again a couple of years ago i had
21:28
colleagues um
21:30
who are in uh legal studies and who are
21:34
you know law professors essentially tell
21:36
me that 230 would soon be rendered
21:38
moot anyway because it’s just it’s it’s
21:41
you know based on um on
21:45
well it should be solely relevant in the
21:47
united states right in the jurisdiction
21:49
of the united states
21:50
and so because these platforms were
21:52
going worldwide
21:54
uh you know there
21:57
it would be rendered moot well i would
21:59
say it’s actually been the opposite
22:00
that’s right that what is happening is
22:02
that section 230 is getting bundled up
22:04
as the norm
22:06
and is now being promulgated either just
22:09
through uh through the process of these
22:13
platforms going global but kind of
22:14
keeping their americanness and
22:16
keeping their um their response their
22:20
you know business practices largely
22:22
responsible to american laws first and
22:24
foremost
22:25
but also even to the point that uh you
22:28
know it recently
22:29
has become known i think more and more
22:32
to people like me who aren’t legal
22:34
scholars but who have a great interest
22:36
in how this stuff goes down that section
22:39
230 like language
22:41
is being bundled up and put into trade
22:44
agreements
22:45
uh at the nation state level or
22:48
you know region level with the united
22:50
states and trading partners and we know
22:52
that
22:53
you know these these trade agreements
22:56
which have been you know huge hugely
22:57
politically
22:59
uh problematic and were a major issue in
23:03
fact of the 2016 election
23:05
uh you know they’re they’re they’re
23:07
anti-democratic i mean how do you even
23:09
know what’s in a trade agreement they’re
23:10
totally secret
23:12
uh but i i learned while watching a uh
23:15
uh house uh subcommittee
23:19
uh convening about section 230 from
23:22
a highly placed google executive
23:26
that in fact their their lobbyists are
23:28
pushing for this kind of language in
23:31
in these trade agreements so we see that
23:33
instead of 230 becoming less relevant
23:35
because of the globalization
23:37
of american social media platforms it’s
23:39
actually becoming a norm that is now
23:42
being
23:43
first of all it was sort of like softly
23:45
reproduced just because of the spread of
23:47
these american platforms and
23:49
how they were doing business but now
23:50
it’s actually becoming codified
23:52
through other means means like like
23:55
trade agreements that the public has
23:57
really no
23:58
mechanism to intervene upon and i think
24:00
that’s really worrisome
24:02
what about those mechanisms where the
24:04
sorry what were you gonna say
24:06
no okay i was just gonna say that’s one
24:07
of my short and concise professorial
24:09
answers
24:11
let me drink a coffee well david
24:14
uh thanks you for that uh great
24:17
historical overview and i’m sure
24:18
the rest of our viewers and listeners do
24:20
too i i wonder about the ones
24:22
the the examples that don’t have that
24:25
kind of
24:26
uh consumer involvement so i’m wondering
24:28
about for example
24:29
you know youtube and its kids content
24:32
and
24:33
and so there have been a lot of changes
24:35
it seems like
24:36
with regard to that that platform and
24:38
that subject over the
24:40
over the last few years so can you maybe
24:42
give us an overview of
24:43
how that has gone down um
24:46
well i think that you know youtube is
24:49
such an interesting example
24:51
to talk about for for many reasons uh
24:53
for its reach and pervasiveness you know
24:56
it’s a
24:56
market leader for sure its globality i
24:59
would also say that youtube is
25:01
particularly interesting because when we
25:04
think about
25:05
uh social media content as being
25:10
monetized there is no greater
25:13
and more direct example than youtube
25:15
where it actually pays people who are
25:17
really highly successful on the platform
25:19
for content right
25:20
so like when there’s no kind of like a
25:23
metaphor there about monetization it is
25:25
literally monetized right
25:27
um and this you know just to kind of tie
25:30
this back to the section 230
25:31
conversation
25:32
when we imagined isps as just path
25:35
pass-throughs you know that was one
25:37
thing but here we have
25:39
these huge companies like youtube and
25:40
others involved actively
25:43
in production so that kind of like
25:46
firewall between just being an
25:48
intermediary and actually being actively
25:50
engaged in producing media
25:51
has gone but the there’s like a legacy
25:54
legal environment that still
25:56
informs it so youtube you know they pay
25:58
producers they have these like
26:01
uh pretty extraordinary studios in
26:05
in major uh in major
26:08
cities around the world including la
26:10
where i live
26:12
uh they you know they are kind of the
26:15
go-to outlet and people
26:18
want to participate in youtube for all
26:20
sorts of reasons but there’s certainly
26:21
you know a dollar sign reason that
26:24
people get involved
26:25
and you bring up this issue of kids
26:27
content
26:28
um again here’s where we see sort of
26:31
like the softening and the eroding of
26:33
regulation too it
26:35
started it’s it’s not just youtube i
26:36
have to confess it’s not just
26:38
social media companies that have eroded
26:40
uh you know child protections around
26:42
um media that that goes back to the you
26:45
know 40 years ago in the reagan
26:47
administration when there used to be
26:48
very stringent rules around
26:50
uh saturday morning cartoons for example
26:52
and advertising to children that could
26:54
go on
26:55
during that time uh shout out to my
26:58
colleague molly neeson who has worked
27:00
extensively on that
27:01
on that particular topic and that
27:02
erosion so
27:05
i see uh on on youtube again
27:08
a lot of the pressure to kind of reform
27:11
and
27:11
i think when you’re talking about kids
27:13
content you’re talking about
27:15
some of like some like really disturbing
27:17
and weird content that was showing up
27:20
um you know kind of like cheaply made
27:22
unknown
27:23
weird creepy sometimes not really
27:25
clearly
27:27
necessarily uh
27:30
benevolently made like you know
27:33
sometimes creepy sexual undertones
27:36
uh other kinds of stuff going on you
27:38
know really and really no way to know
27:40
that’s part of the problem no way to
27:42
know right um
27:43
and then uh the massive problem of
27:46
trying to
27:48
moderate that material right um you know
27:51
i think of it
27:52
as like the the classic story of the
27:55
hole
27:56
springing through the dike holding
27:58
the water back you know
27:59
you plug one hole another one springs
28:02
open
28:02
or a little bit falls down then the
28:05
whole wall
28:06
and then you’re inundated that’s right
28:07
that’s right and so
28:09
you know that is a good metaphor to
28:10
think about the problem of these like
28:12
kind of isolated
28:14
uh hot spots that explode on platforms
28:17
as a new social issue or maybe a new
28:21
uh a geopolitical conflict erupts
28:25
somewhere in the world it’s you know
28:26
gets meted out and replicated on social
28:28
media and attention gets drawn to it
28:31
and so i think this issue of child
28:34
content and its kind of exploitive
28:35
nature and
28:36
strange nature in some cases was
28:38
something that advocacy groups and
28:40
others brought attention to
28:41
and the platform had to reconfigure and
28:44
focus on it
28:45
now i mentioned earlier that you know
28:47
back in 2010 it really was humans who
28:49
were doing this work almost exclusively
28:50
but by 2020
28:52
we are using computational tools
28:55
to try to deal with content as well
28:57
although i
28:58
i’ll repeat the quote that i once heard
29:00
from a reporter
29:02
who who heard it from a an engineer at a
29:05
company that shall not be named but it
29:06
might sound like
29:08
um you know boo-boob let’s say might
29:10
rhyme with that
29:11
uh and the quote was uh whatever the
29:14
algorithm is doing it’s
29:15
not watching the video so you know
29:17
they’re using these computational
29:19
mechanisms to do all kinds of other
29:21
stuff but it’s not like
29:22
an algorithm can watch and sense make
29:25
out of a video it has to look at other
29:26
stuff
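A minimal sketch, in Python, of the point being made here: an automated moderation model can score an upload using only the text around the video (title, description, and similar metadata), never the footage itself. All field names and training examples below are invented for illustration; this is not any platform’s actual system.

```python
# Illustrative only: a moderation classifier that never "watches the video."
# It learns from metadata text (hypothetical examples), not from video frames.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled metadata: (title + description, 1 = removed by reviewers)
examples = [
    ("fun nursery rhymes for toddlers", 0),
    ("learn colors with finger paint", 0),
    ("kids surprise prank gone wrong at hospital", 1),
    ("creepy cartoon injection scene for children", 1),
]
texts, labels = zip(*examples)

# The only signal is TF-IDF over the surrounding text.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Score a new upload purely from its metadata.
print(model.predict_proba(["surprise cartoon scene for toddlers"])[0][1])
```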
29:28
so that’s an interesting point though
29:30
too and i want to follow up on that with
29:31
a question about
29:32
you know do you do you personally
29:34
advocate for more
29:35
ai in the mix of con of content
29:38
moderation such as you know facebook
29:39
recently made an announcement that they
29:40
were using
29:41
ai to simulate bad actors so that they
29:44
could train their moderation
29:45
systems automated moderation systems to
29:47
more effectively recognize it do you
29:49
think that that ultimately
29:50
will work and will benefit the humans
29:52
who are part of this ecosystem or
29:54
is it likely to produce unintended ill
29:56
effects so i mean that’s a really great
29:59
question because that’s sort of like the
30:01
$64,000 question about my work if
30:04
you know one would one would think if my
30:05
concern is the welfare of workers
30:08
which has always kind of been my cut in
30:10
on this topic and where i start and
30:11
where i come back to in the end
30:13
um then hey wouldn’t it be great if
30:15
tomorrow we could just flip that switch
30:16
and go
30:17
to those uh purely computational means i
30:20
think that
30:21
in theory right in theory but i think
30:24
there are a lot of red flags there
30:26
you know one red flag is that if it’s
30:29
been this difficult
30:30
as and i kind of laid the groundwork for
30:32
that at the at the front end of the show
30:34
to unpack and uncover uh
30:37
the ecosystem involving humans and i
30:39
have to say
30:40
the majority of my work has been
30:43
reliant upon the willingness of human
30:46
beings involved in the system
30:48
to leak essentially to break
30:51
their non-disclosure agreements and to
30:54
you know essentially snitch on what they
30:56
felt was
30:58
problematic also sometimes what they
31:00
felt was good about the work they did
31:02
how do you get uh an algorithm or a
31:04
machine learning based tool
31:06
to call a journalist or
31:09
uh you know do an interview with a
31:11
researcher
31:13
i don’t know how to do that you know the
31:14
closest thing we could come to is
31:16
getting access to it and looking
31:18
at code but that’s not easy to do and
31:20
it’s much harder to do
31:22
than finding uh and i cannot stress the
31:25
difficulty of what it was like
31:27
in the early days to find people willing
31:29
to talk to me so
31:30
you know you can’t do that with ai how
31:32
do we how do we audit those tools how do
31:34
we
31:35
how do we you know what’s the check on
31:37
power that the firms have with those
31:39
tools
31:40
in terms of how they’re set up and what
31:42
they keep in and what they keep
31:43
out it also sounds like a potentially
31:46
even greater violation
31:47
of uh that non-disclosure if someone
31:50
leaks a bit of code
31:51
rather than just tell their own personal
31:53
story oh for sure i mean and and
31:56
you know the the other thing too that
31:58
that comes to mind for me is
32:00
the nature of how these tools work
32:03
and you know a great worry and i think a
32:05
legitimate worry of many people in the
32:07
space
32:07
is that uh they
32:11
the tendency to use those tools would be
32:13
to
32:14
uh calibrate them
32:17
to be even uh less permissive let’s say
32:21
or to you know because of their nature
32:23
they would have less of an
32:24
ability to look at a given piece of
32:27
content
32:28
and you know see that it violates abc
32:31
policy but understand it in the context
32:34
of you know again
32:35
a cultural expression or um
32:38
you know an advocacy piece around a
32:41
conflict zone
32:42
and then make an exception so what we
32:44
would see
32:45
is uh more conservative and greater
32:49
false positives around material that
32:52
quote unquote is disallowed right
32:55
again all of this adjudicating to the
32:58
logic that the firms themselves create
33:00
which for um for many years itself was
33:03
opaque
33:05
uh so this is you know it’s not as easy
33:08
to say unfortunately if we could just
33:10
get those darn algorithms right if we
33:11
could just get
33:12
you know machine learning to get
33:13
sophisticated enough we could
33:16
take out the human element and and
33:18
basically
33:19
you know save people from having to do
33:21
this work
33:23
unfortunately i think it’s more
33:24
complicated than that and i would say
33:26
that
33:26
you know bringing up the idea of
33:29
training machine learning tools as you
33:30
did
33:31
one of the gross ironies of this whole
33:33
thing that i’ve been
33:34
monitoring is that uh
33:38
content moderation commercial content
33:40
moderation for these major platforms
33:42
is its own kind of self-fulfilling uh
33:46
industry that begets uh sub industries
33:49
in and of itself
33:49
so that when machine learning tools have
33:52
come on what needs to happen
33:54
is that people need to sort data sets to
33:56
create data sets for the machine
33:58
learning tools to train on
33:59
and they need to be themselves trainers
34:02
and classifiers for the machine learning
34:04
tools so now we have a whole new stratum
34:06
of people
34:07
working to train machine learning
34:09
algorithms which has them essentially
34:11
doing a certain kind of content
34:12
moderation
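A brief sketch of that sub-industry in code, assuming hypothetical names throughout: each human reviewer’s decision becomes a labeled training row for the machine-learning tool that follows.

```python
# Illustrative only: human moderators' adjudications become the training set
# for a downstream classifier. Names here are hypothetical, not a real API.
from dataclasses import dataclass

@dataclass
class ReviewDecision:
    content_id: str
    text: str
    removed: bool  # the human moderator's call

def to_training_rows(decisions):
    """Turn each human adjudication into one (features, label) training row."""
    return [(d.text, 1 if d.removed else 0) for d in decisions]

queue = [
    ReviewDecision("a1", "buy cheap followers now!!!", removed=True),
    ReviewDecision("a2", "photos from my hiking trip", removed=False),
]
training_set = to_training_rows(queue)
# training_set can now feed a classifier like the metadata model sketched
# earlier -- human judgment converted into machine-learning labels.
```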
34:13
it’s a lot easier than that cottage industry
34:14
of evil ai
34:16
spawn it’s like
34:19
how are we gonna make the ai bad enough
34:21
to train our ai
34:23
uh automation systems to recognize that
34:25
so that we can keep a good environment
34:27
but then you’ve got this whole cottage
34:29
industry around the bad
34:30
ai seems like a very awkward way of
34:32
going
34:33
so you know as someone who monitors like
34:36
like hiring trends and things like that
34:37
too
34:38
i was i was watching companies looking
34:41
for people to to come be
34:42
classifiers on data sets which is just
34:44
moderation before the fact right
34:46
yeah you know you talked about that in
34:48
the book too you have
34:50
you presented a taxonomy of sorts of
34:52
labor arrangements from
34:53
in-house moderators to what you call
34:56
micro labor you know looking at
34:58
mechanical turk and things like that can
34:59
you walk us through that a little bit so
35:01
that we can become familiar with what
35:02
the
35:02
the human issues relative to each level
35:06
yeah one of the one of the early
35:07
insights i had when i was trying to
35:09
figure out the contours of this industry
35:11
from
35:11
you know the outside and it reminds me
35:13
of that parable of you know
35:15
people feeling different parts of the
35:16
elephant without really being
35:18
being able to see it and they don’t
35:19
really they don’t really get the big
35:21
picture
35:22
um was that you know what i was
35:24
considering as being kind of like a
35:26
monolithic
35:27
practice really wasn’t it was happening
35:28
in all kinds of different places and in
35:30
different guises
35:32
including using different names like
35:33
there was no kind of cohesive name to
35:35
call
35:36
this this work practice so i started out
35:38
kind of knowing about these workers
35:40
in in iowa that i reference in the book
35:42
and i referenced today
35:44
who were working in a call center and it
35:46
turned out that call centers were really
35:48
a prevalent way
35:50
that this work was going that it was um
35:53
you know kind of at somewhat of a remove
35:55
geographically and organizationally so
35:57
it’d be kind of like a third party
35:59
contracted out group of workers
36:00
somewhere in the world
36:02
when i started out i knew about the
36:03
workers in places like iowa florida etc
36:06
but i soon came to know about workers in
36:08
places like india
36:09
or in malaysia or of course key to the
36:12
book in the philippines
36:13
so that um that that call center
36:16
environment for content moderation work
36:18
is really prevalent
36:20
and it’s global but there are also
36:23
workers who
36:24
uh prior to covid were going every day
36:26
for example in the bay area down from
36:28
san francisco on the
36:30
company buses um and going on site to
36:33
companies
36:34
that i describe in the book one that has
36:36
the you know
36:37
pseudonym of megatech and is a stand-in
36:40
for
36:40
any number of companies in fact i’ll
36:42
just tell you a little anecdote that
36:44
i’ve met a lot of people from industry
36:46
who like over cocktails after meetings
36:48
will come up to me
36:49
all from different companies and say
36:52
we’re mega tech aren’t we and it’s like
36:54
you know like at least six different
36:56
corporations think they’re megatech
36:57
and the answer’s
36:58
yes yes sounds right yeah that tells you
37:01
something
37:02
so um you know these people were on site
37:05
workers they were
37:06
in you know the belly of the beast
37:07
essentially they were working
37:09
in places where there was also uh
37:11
engineering product development
37:13
marketing
37:14
uh communications you know soup to nuts
37:16
uh
37:17
although interestingly enough they were
37:20
also contractors in the case of the
37:21
books so
37:22
they still had this differential and
37:24
lesser status even though they were
37:26
going on site
37:27
to the corporate hq you know it still
37:31
wasn’t quite the right badge color as
37:33
they described it to me although they
37:35
thought about the people who were
37:36
working as contractors and call centers
37:38
as another kind of worker
37:40
even though they were essentially very
37:43
very similar
37:44
then we had people that i encountered
37:47
who were
37:48
you know very entrepreneurial and
37:50
especially in in sort of the early days
37:52
were
37:52
developing a model that looks almost
37:56
like an ad agency they were
37:58
independent companies that were starting
38:00
to specialize in providing content
38:02
moderation services
38:03
to other companies and it was a boutique
38:05
kind of service
38:06
a specialty service and they would often
38:09
offer
38:10
social media management across the board
38:13
so not only were they offering
38:14
the removal of content in some cases but
38:16
they would even
38:18
offer again in that advertising model
38:20
the generation of content
38:22
because believe it or not sometimes you
38:24
know your auto parts company’s facebook
38:26
page just doesn’t
38:27
generate a lot of organic interest and
38:29
so you hire a company to come post about
38:31
how awesome your auto parts company is
38:34
um likewise if there’s a you know as
38:37
somebody once
38:38
told me and it’s in the book too if you
38:40
open a hole on the internet it gets
38:41
filled with
38:43
bleep with uh you know if you have
38:46
a web page or you have a facebook page
38:48
and there’s no activity
38:49
that’s like organic or really about what
38:51
it’s supposed to be about i guarantee
38:52
you that somebody will be posting
38:54
invective racist comments and so on
38:56
these boutique firms said
38:58
to usually to smaller companies hey
39:00
we’ll manage the whole thing we’ll
39:01
delete that stuff
39:02
we’ll generate new stuff for you it’ll
39:04
look organic nobody will really know
39:06
that that’s what we’re doing
39:07
and they were having great success when
39:09
i talked to them was that generally
39:11
filed under this sort of banner of user
39:12
generated content
39:14
or was it called other things generally
39:16
um
39:17
you know it was kind of like a social
39:19
media management is how they would
39:21
couch that and how they would pitch it
39:25
and uh you know it was like uh hey
39:28
company x you your business has nothing
39:31
really to do with social media that’s
39:33
not
39:33
you know your primary business let us
39:35
handle it for you
39:36
and a lot of companies jumped at the
39:38
chance to kind of outsource that and not
39:40
deal with it
39:41
an interesting thing in that kind of
39:43
bucket of
39:44
of the taxonomy that you mentioned is
39:46
that those companies
39:48
uh in some cases got bought up by
39:52
ad firms or ad firms have started doing
39:54
this service as well
39:56
or they become really really big and
39:58
successful so there’s like a few that
40:00
kind of
40:01
uh uh rose to the top and have survived
40:05
and then you already mentioned this
40:07
really interesting and and kind of
40:09
worrisome arena where this work goes on
40:12
which is in the micro labor realm
40:14
the amazon mechanical turk model
40:17
uh which is effectively you know digital
40:19
piece work it’s people
40:21
adjudicating a bit of content here
40:23
they’re often
40:25
paid per view or per decision
40:28
uh and then they try to aggregate enough
40:30
to make that make sense for them
40:31
financially
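To see why a piece-rate worker has to “aggregate enough” decisions, a bit of invented arithmetic helps; the rate and timing below are assumptions for illustration, not reported figures.

```python
# Hypothetical piece-work arithmetic; both numbers are assumed, not sourced.
rate_per_decision = 0.02     # dollars paid per adjudicated item (assumed)
seconds_per_decision = 10    # time to view and decide (assumed)

decisions_per_hour = 3600 / seconds_per_decision
hourly_earnings = decisions_per_hour * rate_per_decision
print(f"{decisions_per_hour:.0f} decisions/hour -> ${hourly_earnings:.2f}/hour")
# -> 360 decisions/hour -> $7.20/hour, before unpaid time spent finding tasks.
```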
40:33
and it it turns out although that’s
40:36
supposed to be an anonymous relationship
40:38
you know savvy mechanical turkers they
40:40
can figure out who they’re working for
40:42
because a lot of times
40:43
you know they’d receive a set of of
40:46
images or other content to adjudicate
40:48
and like you know the interface was
40:50
obvious
41:00
[Music]
41:02
before and you get those guidelines
41:04
again then you know yeah
41:06
that’s right so you know i i came to
41:09
know some folks who were
41:10
uh you know who themselves sort of began
41:13
to specialize within
41:14
mechanical turk and other platforms on
41:17
this kind of thing and they would seek
41:18
out this work because they got good at
41:20
it like you said
41:21
and they got good at knowing the
41:22
internal policies and juggling them for
41:24
all these different firms and
41:26
began to specialize in this work on that
41:28
platform
41:29
i was wondering you know when thinking
41:31
about this as you mentioned earlier
41:33
about the
41:34
the consequences of misinformation
41:36
especially as we
41:37
are deep in the process of the us
41:40
presidential election cycle and
41:42
i say the u.s because i want to be
41:43
sensitive to the fact that there are
41:44
global viewers but i feel like everyone
41:46
in the world is kind of
41:48
you know hooked into the u.s
41:49
presidential election right now
41:51
and we’re all like yeah aren’t they
41:53
right and we’re all being subject to
41:55
you know all of this uh well the the
41:58
dumpster fire of it all but also the
42:00
misinformation that accompanies it
42:02
and so i wonder how should people think
42:04
and understand the difference between
42:07
content on social media and content in
42:09
news media
42:10
and what are some of the differences in
42:12
approaches to moderating
42:14
harmful content and you know kind of
42:16
just thinking about
42:18
the access to you know free access to
42:21
information you know this is kind of a
42:23
big
42:24
muddy question i’m not sure i’m
42:26
articulating very well but
42:27
hopefully you see the direction of of
42:29
the um
42:30
the question that i’m asking here yeah i
42:34
i’ll i’ll do my best to respond and we
42:36
can
42:36
you know we can you can offer guidance
42:40
yeah as i go i mean i i think your
42:43
question in essence is what the hell
42:45
right yeah
42:48
information misinformation
42:50
disinformation the election
42:52
what the hell and so i think you speak
42:54
for a global audience when you pose that
42:56
question and
42:58
you’re right about the u.s election i
43:00
know uh friends and colleagues who were
43:02
up early in australia watching it and
43:04
you know as mortified as we were by the
43:06
the behavior on display
43:08
and the other night yes the debate and
43:11
the kind of the nadir
43:12
of uh you know american politics in my
43:15
lifetime is how i described it
43:17
um you know i i often
43:20
bring up the the rise of social media
43:24
as a force in again in american civic
43:27
life
43:29
that it’s important to not think about
43:31
it having happened in a vacuum or having
43:33
happened
43:34
uh without without
43:37
um other forces at play and in the other
43:40
part of my life i
43:42
am a professor in a program that trains
43:44
and prepares
43:45
people for careers in information
43:47
professions primarily in librarianship
43:50
and so i know something about the way
43:53
in which we’ve seen a gross
43:57
erosion of the american
44:00
public sphere and opportunities for
44:03
people to become informed
44:06
in places that traditionally have been
44:10
more transparent more committed to the
44:13
public good
44:13
not-for-profit i’m thinking about
44:16
institutions like public schools
44:18
and institutions like public libraries
44:21
so if we were to take
44:24
you know uh funding a funding graph or
44:28
something like that and put them
44:29
together about expenditures or
44:31
where where money goes in our society we
44:34
would see
44:35
you know that off the cliff kind of
44:37
defunding
44:38
of of these uh institutions that i just
44:41
mentioned
44:42
while we see a rise in social media
44:46
and what i think that suggests at least
44:49
to me is that
44:50
it’s not that the american public
44:51
doesn’t have a desire to be informed
44:54
or to have information sources and i
44:56
would add to that by the way
44:57
it’s not necessarily in the public
44:59
sphere in the same way
45:00
but we have seen total erosion in
45:04
regional and local journalism too right
45:06
during the same time right
45:08
into mega media that’s right mega media
45:11
which
45:12
you know came about by the shuttering of
45:14
local news
45:15
and it there was a time when you know
45:17
cities like mine i come from madison
45:19
wisconsin 250,000
45:21
people yeah they yeah they might
45:24
have had a a
45:25
a reporter in dc you know what i mean
45:28
for our local paper the capitol times
45:30
which went the way of the dodo some
45:33
some years ago and that that local paper
45:35
no longer exists in a print form
45:38
so there’s a whole i mean we could do a
45:40
whole show on this and you probably
45:42
shouldn’t have me on for the show so
45:44
apologies to to the users that this
45:46
isn’t my total area of expertise but i’m
45:48
just trying to connect some dots here
45:50
for people to make sense of it right
45:52
right and you know when we think about
45:53
the differences between social media
45:55
information circulation and something
45:58
like journalism
46:00
agree or disagree with what you read in
46:02
in in the newspaper or you hear on the
46:05
news
46:06
of your choice but there are things
46:09
there that are not present
46:10
in the same way in the social media
46:12
ecosystem uh
46:13
you know an author name a set of
46:16
principles by which
46:18
uh the journalists
46:21
at least pay lip service to but most of
46:24
them
46:25
live by you know that they have been
46:27
educated
46:28
to uh to serve and then do so
46:31
in their work there’s editorial control
46:34
that before stories go to print they
46:37
have to go through a number of eyes
46:38
there’s fact checking if you’ve ever you
46:41
know i’ve been on the
46:42
the the side of having been interviewed
46:44
for journalistic pieces and i get phone
46:46
calls from fact checkers to make sure
46:48
that the journalists got
46:49
right what i think yeah right
46:52
you think that did you really say xyz
46:55
yes i did that doesn’t exist and you
46:57
know
46:58
your your your racist uncle
47:00
recirculating
47:01
um god knows what from whatever outlet
47:04
that is just gone those those
47:08
what we might think of as barriers to entry
47:10
but we also might think of as safeguards
47:11
are just gone
47:13
and with all of the other institutions
47:16
eroded that i mentioned
47:17
you know public schooling library public
47:20
libraries and so on the mechanisms that
47:22
people might use to
47:24
vet material to understand what it means
47:27
when they look at a paper of record
47:29
versus
47:32
a dubious outlet let’s say a dubious
47:34
internet based outlet
47:36
and how those uh sources differ those
47:39
mechanisms to to learn about those
47:41
things have been eroded as well
47:43
um is there even a civics class anymore
47:45
in public school
47:46
i see that uh at least it looks like
47:49
donald trump missed it when
47:51
when he was coming up based on what i
47:54
saw in the debate the other night
47:55
i had one growing up i think that when
47:57
it might have already been spotty by the
47:59
time
47:59
i was in yeah you know i mean yeah i i
48:02
have that and
48:03
i i won’t mention my age but it’s it’s
48:06
probably a writer it’s not 29.
48:11
so you know i i’m trying to i guess what
48:13
i’m trying to do
48:14
uh in a roundabout way here is draw some
48:17
connections
48:18
around phenomena that seem often um
48:23
like they have come from nowhere to say
48:25
that actually
48:26
it would behoove us to to connect those
48:29
dots both in this moment but also draw
48:32
back
48:33
a little bit on history uh at least the
48:36
the last 40 years of sort of like
48:37
neoliberal
48:39
uh policies that have eroded the public
48:41
sphere
48:42
in favor of private industry and it what
48:45
it didn’t do was erode the public’s
48:47
desire to
48:48
know but what has popped up and cropped
48:50
up in that
48:51
vacuum left are these uh
48:54
you know really questionable uh
48:57
information sources that
48:59
really don’t respond to any greater
49:02
norms other than partisanship
49:06
uh advertising dollars etc
49:09
and and that’s on a good day i mean
49:11
those are the ones that aren’t totally
49:13
nefarious like state and extra state
49:15
right uh you know psyops generated stuff
49:19
right
49:20
it’s so interesting because when you
49:21
were talking about youtube you mentioned
49:22
about how
49:23
the quote about how the ai’s not
49:25
watching the video
49:26
and and the comment you made was about
49:28
you know the the idea of sense making
49:30
from the video and
49:32
what i’m hearing and what you’re
49:33
describing there is you know one of the
49:35
kind of underlying themes of my work is
49:37
that humans crave meaning
49:39
and that a lot of one of the most sort
49:41
of human qualities that we have or two
49:44
of the most human qualities are
49:45
meaning making and meaning uh finding so
49:48
it means
49:48
yeah so it strikes me that what you’re
49:50
describing is this kind of
49:52
systemic change you know many systemic
49:56
changes happening at once
49:57
in what information is available what
49:59
information is provided and yet
50:01
our impulse to connect dots and create
50:05
meaning is still there but we’re giving
50:07
we’re given increasingly unreliable
50:10
potentially
50:10
a sources of information and we’re
50:14
we’re still trying to connect those dots
50:16
but the the dots that connect
50:17
inevitably become simpler more
50:20
rudimentary more
50:21
uh fit more conspiracy theory kind of
50:24
narrative in many cases is that a fair
50:26
characterization do you think i mean
50:28
i that’s absolutely the conclusion i
50:30
come to and i actually have a
50:32
a doctoral candidate whom i supervise
50:35
right now
50:35
who works on this notion of conspiracy
50:38
theory
50:39
and we have talked her name’s yvonne
50:41
eadon and i want to shout it out just
50:42
because i want folks to look out for her
50:44
coming up
50:45
um but i i have talked with her and
50:49
others about this topic
50:51
significantly because i have quite
50:54
this may surprise people but i have
50:56
quite a sympathetic
50:58
uh orientation towards people who
51:01
who fall prey to what we might consider
51:04
broadly speaking conspiracy theories
51:06
and it’s the reason you just laid out so
51:08
eloquently that
51:09
that human impulse as you described it
51:11
right and that’s the that’s really at
51:12
the root of the show anyway right yeah
51:14
we’re
51:15
all tech is human and it’s it’s about
51:17
humanity it’s about digital humanity
51:19
um and i think when i think about people
51:22
who fall victim
51:23
to to conspiracy theories
51:26
what i see underlying that is an is a
51:30
human impulse to want to make sense of a
51:33
world that increasingly doesn’t
51:34
and they’re doing it in the absence of
51:37
information
51:38
that is way more complex and hard to
51:40
parse out and actually might
51:42
um point criticism at places that are
51:44
very uncomfortable
51:45
right so you know a flat earther or a q
51:47
anon
51:48
uh adherent etc uh whatever else
51:52
you know lizard people who live under
51:55
the crust of the earth i mean i’ve heard
51:56
all of these things
51:58
um you know it’s like
52:01
it’s like this process of finding black
52:04
holes
52:05
that uh astrophysicists engage in
52:08
it’s not that they’re seeing the black
52:10
hole it’s that they’re seeing
52:12
the way the energy behaves around it
52:14
right um
52:15
for example that you know things might
52:17
be congregating towards a point
52:19
in space or uh you know there’s
52:21
disruptions
52:23
and i think about that metaphor a lot
52:25
when i think about people who fall
52:26
victim to
52:27
essentially bad information that they
52:29
sense
52:30
you know a disruption in the force right
52:33
they sense a disruption they sense
52:36
a wrongness about the world but they
52:38
don’t have
52:39
the right information presented to
52:42
them
52:42
or access to it or even the ability to
52:45
parse it because
52:46
we’ve destroyed public schools and now
52:48
we might suggest in my own bout of
52:50
conspiracy making
52:52
that this kind of again debasement and
52:55
destruction of these public institutions
52:57
that help people
52:58
identify good information and be good
53:01
citizens
53:02
and understand the world around them
53:05
in a way
53:05
that you know lasts longer than
53:08
a blip on the screen is political
53:12
and that it leaves them chasing their
53:14
own tail through conspiracy theories
53:15
instead of unpacking things like
53:17
you know the consequences of
53:20
western imperialism
53:22
or understanding human migration as
53:25
economic and environmental injustice
53:27
issues or
53:28
the destruction of political systems
53:30
that have long lasting consequences in
53:33
their countries of origin
53:34
that the u.s may have been involved in
53:36
you know like all of these kinds of
53:37
things that you learn from studying
53:39
history
53:40
or social sciences or having a
53:42
humanistic approach
53:43
to various topics that you know all
53:45
of these spaces that are getting eroded
53:47
from
53:47
kindergarten all the way through
53:49
higher education
53:51
have consequences and then the
53:52
auxiliary institutions that help people
53:55
such as libraries are eroding too and that is
53:58
happening not just in the public
54:00
library sphere but there’s been gross
54:01
consolidation in academic libraries over
54:03
the years
54:04
the price of information access to
54:06
journals has skyrocketed for
54:08
virtually no reason etc you know you
54:10
combine all that and
54:12
we have created essentially an
54:13
information
54:15
access problem and an
54:19
ability to parse information
54:22
problem
54:23
for people what do they do they reach
54:25
for the pablum of social media which is
54:27
instantaneous
54:28
always on constantly circulating speedy
54:31
easy to digest and
54:34
worth about as much as you know those
54:37
things might be worth
54:38
yeah well before we run out of time i
54:40
want to make sure we touch on
54:42
speaking of intersecting dimensional
54:45
systems
54:45
the work that you are doing with
54:48
the
54:49
ucla center for critical internet
54:51
inquiry which of course
54:52
you do with our previous delightful
54:53
guest safiya noble
54:55
yeah we all love safiya noble the
54:58
website
54:59
describes it as an interdisciplinary
55:01
research center committed to holding
55:02
those who create
55:03
unjust technologies and systems
55:05
accountable for the erosion of equity
55:07
trust and participation i won’t read the
55:08
whole statement but that immediately to
55:10
me
55:11
ties back into some of what you were
55:12
just saying so can you tell us a little
55:14
bit about
55:15
what your initiatives and programs are
55:17
or are planned to be
55:19
well absolutely first of all
55:22
yeah this is sort of
55:26
the
55:26
long-lasting goal that safiya
55:31
and i have had and
55:32
just so your viewers and
55:34
listeners have a little context safiya
55:36
and i
55:36
met essentially on our first day of
55:38
our phd program and so that's
55:41
how far we go back
55:42
and we were both really involved in
55:44
the origin stories of each other's
55:46
research and
55:47
if you look at the work that i do on
55:49
humans in content moderation you look at
55:51
the work that she does on algorithmic
55:53
bias you can see you know
55:56
the dna is swirling around each other
55:58
born of so many conversations that she
56:00
and i have had over the years
56:02
and so one of our long-term goals was to
56:05
create a center
56:06
well first of all our
56:08
long-term goal was to get to the same
56:10
academic institution which those of
56:11
you in academia
56:13
out there know is not an easy task
56:16
but we managed
56:17
over the years to do that through
56:19
circuitous means and we ended up at ucla
56:21
together
56:22
then the second task that we had in mind
56:25
was to create a center that would
56:27
take the sum of the work that we do
56:30
put it together and allow it to be
56:32
bigger than it would be
56:34
on its own because we would be able to
56:36
invite others in under the umbrella and
56:38
under the
56:39
you know the big tent approach of having
56:40
a center
56:42
and so it’s really uh taking
56:46
uh this opportunity of having some
56:48
funding sources bringing on other
56:50
researchers
56:51
holding convenings hopefully
56:54
sponsoring research studies
56:56
etc that will allow us to amplify
57:01
the strands of the research that i
57:02
talked about today and that
57:04
previous viewers and listeners will
57:06
know about through safiya's work
57:08
and then amplifying that and making it
57:10
bigger than we could on our own
57:12
so of course in this particular moment
57:15
we’re absolutely interested in in
57:18
focusing on things like
57:19
the political situation in the us and
57:22
in the election
57:23
uh we’re absolutely committed to and
57:26
following the black lives matter
57:27
movement and how
57:28
that is playing out
57:31
and issues of racial and social
57:34
injustice and inequity
57:36
that not only i would argue are
57:38
perpetuated in social media but are
57:40
exacerbated
57:41
in social media and you know thinking
57:44
about ways in which
57:45
interventions should take place so that
57:48
that doesn’t happen
57:49
and i’ll leave just one anecdote you
57:50
know we were in a room once
57:52
at a megatech one of the many
57:55
megatechs
57:55
with an engineer who was working on a
57:58
tool
58:00
his particular platform
58:02
had
58:03
a mechanism for people to post
58:06
job advertisements and they were
58:08
finding that you know people were doing
58:10
a lot of
58:10
racist and other kinds of you
58:13
know gender discriminatory things in
58:14
their
58:15
job ads go figure and he was
58:18
you know he was working more on the
58:20
experimental side thinking about how to
58:22
make algorithms understand what might be
58:24
fair and you know he started ruminating
58:26
really what is
58:27
fairness you know and safiya and i
58:29
looked at each other we’re just about
58:30
dying we’re sitting there together we
58:32
say to the guy you know
58:34
you could sit there and reinvent the
58:36
many thousands of years of philosophical
58:38
rumination on fairness
58:40
or you could look to guidance from the
58:41
federal government
58:43
that has laws about what constitutes
58:46
fairness in hiring practices
58:48
and in fact we would argue that’s what
58:49
you should do and that
58:51
that’s what you’re compelled to do so so
58:53
you see what i mean
58:54
they’re always doing that i’m telling
58:57
you megatech man
58:58
they’re on my nerves so you know this is
59:00
the kind of this is the kind of
59:03
the thing that that we want to do in a
59:05
bigger way
59:06
through the center which is to inform
59:09
make these connections like we’ve talked
59:12
about on the
59:13
program today talk to
59:16
policymakers
59:17
many of whom desperately want to be
59:20
better informed themselves right
59:22
regulators politicians they're going
59:24
to be the first to say
59:26
it’s a series of tubes may have been
59:28
more correct than not in retrospect
59:30
right but they
59:30
they need help too to unpack and
59:32
understand
59:34
and many of the firms themselves want to
59:36
get better as well and we're here
59:38
to do all of those things and more
59:40
oh i want to give you a chance to
59:42
point people to
59:43
where they can find your work i know
59:45
that on the promo for this
59:47
show we had ubiquity75 as your
59:50
twitter handle are there other places
59:51
people can find you online
59:53
well i think a great thing for folks to
59:56
do would be to follow the ucla
59:59
c2i2
60:00
twitter account we also have a mailing
60:03
list
60:04
that will become more active as we go
60:07
forward you can visit our website again
60:11
c2i2.ucla.edu
60:12
and you know follow the initiatives
60:15
there i have to give people
60:17
a content warning that the
60:19
tweets from the last few days have a lot
60:21
of naughty words in them because i was
60:22
watching
60:23
the presidential debate and just losing
60:25
my mind
60:26
um but i’m not the only one no you’re
60:29
not the only one you know
60:32
caveat lector on my twitter
60:35
account just you know i’m a human too
60:37
folks i’m a technological human too
60:42
that’s fair enough we i think that’s a
60:44
great way to finish off this thought
60:46
thank you so much thank you to our
60:48
audience for tuning in
60:49
and to david ryan polgar for our only
60:52
question that was asked live today but i
60:54
know there must have been other ones
60:55
so
60:55
feel free folks to follow up with sarah
60:58
on
60:59
twitter or on her channels and engage
61:02
with her work
61:03
thank you sarah for the work you’re
61:04
doing it’s so appreciated and it’s so
61:06
important
61:08
thank you very much i guess i should
61:10
say the book is called
61:11
behind the screen content moderation in
61:13
the shadows of social media i didn’t
61:15
mention that
61:16
that’s the cover uh right there it’s on
61:18
yale university press
61:20
it’s coming out in a french edition on
61:22
october 8th
61:23
with la découverte press and so for
61:26
any
61:27
french speakers or french readers who
61:30
may be watching or if you have friends
61:32
they can now read it in french
61:39
thank you very much all right thanks
61:41
everyone bye bye now
61:43
all right thank you
