Discriminating Data: Wendy Chun in Conversation with Lisa Nakamura *transcript cleaned by Kaitlyn Gastineau Introduction 0:00 [silence] 0:45 welcome everyone my name is uh Irina Aristarhova and on behalf of the Digital 0:51 Studies Institute and DISCO Network I'm really happy to present our 0:58 public event today the book talk Discriminating Data: 1:05 Wendy Chun in Conversation with Lisa Nakamura and Alex Barnett 1:11 um I am a professor and um 1:17 Acting Director of the Digital Studies Institute that is hosting this talk 1:22 and our co-sponsors for today's event are the following units at the University of 1:29 Michigan - Ann Arbor Asian Pacific Islander American Studies, English Language and Literature, American 1:37 Culture, Communications and Media, and the Stamps School of Art and 1:42 Design. Digital Studies Institute at the 1:47 University of Michigan provides space for scholars of the humanities and qualitative social sciences 1:54 who are experts in digital studies and supports their research and teaching 1:59 we're the nation's first one-stop place for all things digital including digital media studies, digital 2:07 humanities, digital pedagogy, digital art, aesthetic practice and design, and critical 2:14 thinking about the digital future as you can see the Digital Studies is a 2:20 big tent for us that has always included practitioners, critics, theorists, and 2:26 artists we are also a national leader in the study of digital media and race, 2:32 gender, and identity the next slide please 2:40 um before I go into the DISCO Network I would if Veronica if you could put the 2:46 previous slide in I forgot to mention here that we have a postdoctoral 2:51 fellowship applications open uh they are due on January 7 2:58 2022 and they are advertised currently at the chronicle of 3:04 higher education and I'm putting in the link right here so please for those 3:11 of you who are interested um and just gotten your doctorate in an area that's 3:17 related to the digital studies uh check it out the next slide please 3:23 the Digital Studies Institute is a proud uh home of the DISCO Network which is an 3:29 amazing Digital Inquiry Speculation Collaboration Optimism Network 3:35 led by Lisa Nakamura and you will hear more about it in a second 3:40 next slide please upcoming DISCO events include a Digital 3:45 Methodology Workshop with André Brock, Mad Frictions Workshop Series with 3:53 Remi Yergeau and from the Digital Studies Institute 3:59 we have an event on December the 9th Welcome to the Indigenous Future 4:06 and our Digital IDEAS summer institute will be devoted to critical access 4:12 technology and disability justice in summer of 2022 I welcome you also to wait for 4:19 applications and apply um and we will be happy to see you here in 4:25 Ann Arbor in the summer 2022 when the weather will be fabulous 4:31 uh the next slide please and here it's my pleasure to introduce 4:36 um a bios of our speakers today 4:42 Wendy Hui Kyong Chun is Simon Fraser University's Canada 150 4:49 Research Chair in New Media and leads the Digital Democracies Institute 4:55 she has been Professor and Chair of the Department of Modern Culture and Media at Brown University where she worked for 5:01 almost two decades she has also been a Visiting Scholar at the Annenberg School at the University 5:08 of Pennsylvania Member of the Institute for Advanced Study at Princeton and she 5:13 has held fellowships from the Guggenheim ACLS American Academy of Berlin 5:19 Radcliffe Institute for Advanced Study at 
Harvard Lisa Nakamura 5:25 is the Gwendolyn Calvert Baker Collegiate Professor of American Culture and Digital Studies at the University of 5:32 Michigan she is the author of several books on race gender and the Internet 5:38 she is the founding Director of the Digital Studies Institute at the University of Michigan and the Lead P.I. 5:45 for the DISCO Network next slide please 5:50 Alex Barnett is Group Leader for Numerical Analysis at the Center for Computational 5:56 Mathematics at the Flatiron Institute a division of the uh Simons Foundation in 6:03 New York City his Ph.D. was in physics at Harvard University followed by postdoctoral work 6:09 in radiology at Massachusetts General Hospital and in applied mathematics at 6:15 the Courant Institute at NYU he then taught at the Department of Mathematics at Dartmouth College for 12 6:22 years becoming a full professor in 2017 his research interests include 6:28 computational partial differential equations, waves, fast algorithms, integral 6:35 equations, neuroscience, imaging, statistics, and inverse problems 6:40 he has published more than 50 papers received several grants from the National Science Foundation 6:46 won the XXI International Physics Olympiad and received Dartmouth's Karen E. 6:52 Wetterhahn Memorial Award for Distinguished Creative or Scholarly Achievement I personally look forward to 6:59 hearing today how Alex's work contributed to Wendy's book 7:05 the next slide please and here I'm leaving you with hearing 7:11 more about the DISCO Network Digital Inquiry Speculation Collaboration 7:16 Optimism Network thank you and have a wonderful day Discriminating Data 7:22 I can talk a little bit about the DISCO Network it stands for Digital Inquiry Collaboration Speculation and 7:28 Optimism and there are um six of us who are in this network and started it it's me 7:34 André Brock Catherine Knight Steele Rayvon Fouché 7:39 Stephanie Dinkins and Remi Yergeau and we have a three-year period during which we are hoping to 7:45 write teach and do some cool research on the intersection between disability race and 7:51 technology so I'm just going to dive right in I am so excited to be talking to Wendy and 7:58 later her co-author Alex about this new book Discriminating Data 8:03 from MIT Press and I'm hoping to 8:08 get at what this book is contributing to the field which is a lot so 8:14 in recent years we've seen quite a few critiques of the tech industry Wendy and I as we were saying before 8:20 both remember when there was a lot of utopian optimism and it seems that there's a kind of backlash to that 8:26 but I think what the field and what the conversation is missing is both nuance and history 8:32 around this question you know so this book Discriminating Data: Correlation, Neighborhoods, and the New 8:38 Politics of Recognition provokes us to think about data in a new 8:44 way and it's beautifully written beautifully researched it has some fascinating 8:50 exercises that are very interactive which I've never seen in a critical book like this before which we'll talk about 8:55 a little more but I wanted to ask Wendy some things about the title 9:00 so Discriminating Data is a title that can go in lots of ways and that's one thing I love about it so 9:07 if you read it as being about data as the subject data playing a role in 9:13 creating new kinds of social discrimination against and systemic denial of resources 9:18 to marginalized groups which I feel now we know is true right that's not even that controversial but it was quite
9:24 controversial a while ago the data itself is an agent in creating discrimination against 9:30 people specifically people of color women and so on the data is an agent that makes 9:36 discrimination happen and that we know discrimination is wrong so that's one way to read it and I think 9:43 if we look at another way we can see discriminating data is something that we as humans can do 9:50 that data isn't all the same if we learn through critical 9:56 theory critical thinking maybe a little bit of math I'm hoping not too much math because that part I need some help with 10:02 um but I know we're going to have to get some help with um that if we see us as possibly being able to discriminate 10:09 between not only data which discriminates and data that maybe doesn't discriminate as much 10:15 but if we can learn to see somehow the inner workings and really important distinctions between different kinds of 10:20 data right personal data data about our face data about our name 10:26 data about our race and gender and understand what kinds of things people are doing when 10:33 they're making data can we see through data and what is being done with it can we have some 10:40 agency to know and I think that's what the tutorials and the book are about 10:45 to empower ourselves to know how data is discriminating not just against us possibly but creating new kinds of 10:52 discrimination maybe even new ways of being discriminatory that used to be invisible to us and something I love 10:59 about this book is that in our field we do a lot of gesturing towards 11:04 discrimination but only as an effect right that I know I was denied credit I know that I didn't get to see what my 11:10 friend saw because I'm a woman because I'm Black right or because I'm Asian 11:16 but though how is not always that easy for us to get at so in some ways we're 11:22 all grabbing different parts of the elephant you know we know that data is discriminatory but we don't know how we 11:28 can discriminate ourselves about how data is discriminatory and what it's doing not just to me but to 11:35 people I care about and as Wendy says people we care about are 11:40 in some ways the problem right that people we care about are people who are like us oftentimes 11:47 um they're our neighbors sometimes we care about our neighbors sometimes we don't sometimes we 11:52 recognize people we care about as being like us in a more ideal scenario we could recognize people who 11:59 aren't like us but are machines able to recognize that and what are the implications of that so I think in the 12:07 first case we could think of data as something which is doing things to us 12:12 and in the second we can hopefully see ourselves as people who can discriminate when we look at data if we get to look 12:19 at it and not just feel like objects so I'm going to invite Wendy to talk about method 12:26 because I was so compelled by the way this book draws its conclusions 12:33 what this book is doing to me is to create new origin stories for how we got 12:38 here so I think it's really easy to blame companies for being bad and that is not 12:46 even necessarily true but it doesn't tell us anything about the how so you know we're kind of swimming in 12:51 algorithmically curated news facial recognition systems social media um there's a lot of dissatisfaction with 12:58 those things but Wendy I think cuts right to the heart when she writes big data is the bastard child of 13:04 psychoanalysis and eugenics so I wanted to invite her to talk about this 
alternative genealogy 13:12 that in some ways pushes a lot against this focus on famous men and what they did wrong because that's 13:19 not really getting us anywhere so we used to think you know Norbert Wiener, Vannevar Bush, all those 13:25 people we teach in intro classes were good and then we thought they were bad right and this is not a history which is 13:32 very fruitful so in this book Wendy you spent a lot of time looking in unfamiliar places 13:39 so not at MIT not at Silicon Valley but rather at housing 13:44 developments in the 50s you know different kinds of 13:51 racialized engendered places where we don't necessarily look so can you 13:57 talk more about how you found and selected um the kind of new origin story in 14:03 eugenics and physiognomy to understand say network science and facial recognition technology why it 14:09 matters where we get our histories from and what new light this sheds on people 14:15 doing work on say race but not necessarily technology or looking at say gender but not necessarily 14:22 the history of data science so that's a long question but I just wanted to invite you to talk about how you did 14:28 this and what you were doing when you did it um so this is Wendy thank you so much The Problem 14:34 Lisa for your um summary and your incredible question I'm so thrilled to 14:39 be here in conversation with you and um the DISCO Network 14:45 so this is a huge question um and so I'm gonna answer it in several 14:51 different ways um the first is um one thing that was important for me in 14:56 terms of methodology and stories that we tell is to move away from the idea that 15:03 if we only added humanities and social sciences to technology then we would have ethical tech 15:09 as if the problem is that we don't have social science or humanities interests in these, that these are purely 15:16 technical, and that's what's wrong because as I started reading more and more 15:22 about how these concepts became axiomatic notions like similarity breeds 15:27 connection or homophily which is what you mentioned earlier Lisa what became so key is that these were borrowings 15:34 from social sciences and the humanities so the issue isn't that the social 15:40 sciences and humanities aren't there but that concepts have been borrowed poorly or poor concepts have been borrowed um 15:48 and so part of what's needed is a richer more interdisciplinary 15:54 engagement where we're not just taking each other's methods poorly from each other but trying to understand the 16:01 weight of what it means to work with these concepts and so what I always do and this comes I 16:09 think from my background both in engineering and in English literature is always to start with a basic concept um 16:16 and to ask myself how did this come to be and also to try to think through the 16:22 very different and often conflicting meanings there are for any one concept such as control, freedom, etc. 16:30 um and what became so important for me when I was looking at these um axiomatic concepts and the first one I start off 16:37 with is correlation right so the beginning of the 21st century you heard about correlation all the time 16:43 right this was what was different about big data this was what was different about knowledge it was you know let us 16:50 take over the future not the past um what I found really odd about that is 16:57 that the rhetoric around correlation was almost the same rhetoric that happened a 17:02 century earlier when
eugenicists came up with the concept of correlation 17:09 um and why the rhetoric was so eerie is that both eugenicists and people talking 17:15 about big data were talking about correlation not as a way to understand or think statistically about the past 17:21 but rather to understand how to determine the future once you've determined things in the past and the 17:27 present that do not change um and so that for me was absolutely key 17:33 and trying to think through why and how correlation as this future predictor 17:38 this way to close the future which is what eugenics sought to do um is coming back but now in the name of disruption 17:46 so if what we're undergoing is disruption it's not because something radically new is 17:52 happening and the future can be anything but rather because the future has been closed through this logic and so for me 17:59 going back to the eugenicists and trying to understand what's different though about how they used correlation and 18:06 thought about correlation and society and us um was also key 18:12 because the eugenicists um believed that their population was the nation as a 18:17 whole right um and so they had very disturbing notions of uplift 18:22 whereas now with the new correlation um it's being used to divide communities 18:27 and what's being touted are certain forms of exit rather than a communal exit rather than 18:36 this national uplift and that has everything to do with questions of race 18:41 and how race and difference is being thought through and so the ways in which homophily which 18:48 is the idea that similarity breeds connection has become this default concept and so the second chapter looks 18:54 at homophily and as you noted it comes from studies of housing projects 19:00 biracial housing projects and so in a bizarre way 19:05 by using the notion of homophily what we're doing is spreading segregation 19:11 through these networking concepts right so segregation isn't an 19:16 accident it's the default um but what was also very key to me was 19:22 to go back to these studies and to see like even back then homophily wasn't the only choice 19:30 and that if you go back to the ways in which these concepts themselves were formulated you can think through other 19:36 possibilities and understand how the amplification of certain types of relations are destroying others so just 19:44 very briefly with homophily what was so key is that um this came from a 19:49 housing study in the 1940s it was when there was incredible income 19:54 compression and also a real housing crisis and the question was you know what 20:00 should we do should we have more public housing and back then public housing was mainly 20:06 white or should we have private housing should we have mortgages 20:12 and we know what happened and we know the incredible wealth disparity that has 20:18 resulted from the decision that was made but back then they you know said 20:24 okay so let's study these housing projects one housing project was an all-white co-op 20:30 um and the other one was a biracial housing project and they asked every 20:35 member of the families of these groups um questions like these were four-hour 20:40 interviews and so these were um blatantly forms of social engineering 20:46 and of um sociological surveillance they talked about this as a laboratory this is why housing projects were so 20:52 interesting because they were forms of social engineering so that idea of an enclosure in engineering which has now 20:59 become of course the way people describe social networks was there in the very concepts that 21:04 were drawn from and thinking through these enclosed communities therefore opens up these 21:11 historical narratives
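[Illustration, not from the book: a minimal sketch of the homophily point above. If a network's tie recommendations prefer "people like us" even slightly, cross-group connections thin out, which is the sense in which segregation becomes the default rather than an accident. The two groups, the same-group weight h, and the simulation are invented for this example.]

```python
import random

random.seed(0)

# Hypothetical population: two equal-sized groups.
nodes = list(range(100))
group = {n: ("A" if n < 50 else "B") for n in nodes}

def recommend(node, h):
    """Recommend a tie for node, preferring its own group with probability h (0.5 = no preference)."""
    same = [m for m in nodes if m != node and group[m] == group[node]]
    other = [m for m in nodes if group[m] != group[node]]
    pool = same if random.random() < h else other
    return random.choice(pool)

def cross_group_share(h, ties_per_node=20):
    """Fraction of recommended ties that cross group lines."""
    edges = [(n, recommend(n, h)) for n in nodes for _ in range(ties_per_node)]
    return sum(1 for a, b in edges if group[a] != group[b]) / len(edges)

for h in (0.5, 0.7, 0.9):
    print(f"same-group preference {h}: {cross_group_share(h):.0%} of ties cross groups")
```

[Even a modest preference for similarity in the recommendation step yields a heavily segregated network, without anyone ever choosing segregation as such.]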
yeah something I love about the way you write this book is that it's very Running Away 21:18 precise but there's also some very accessible kind of memorable writing in here and one of them is how do you show 21:24 a love of the same by running away when others show up so um I think as you say right the idea 21:31 add social science add humanities and stir doesn't get at the root issue here 21:36 which is the way these things including social science and the humanities are put together to create forms of 21:42 discrimination so how do we know when we run away how do we know when we are allowed to stay 21:49 right and I'm interested in how that maps onto say ethnic studies or women's studies 21:55 which are in some ways as you say structured around what you call the productive conflicts 22:01 that in some ways are antithetical to data science right because data science wants to fix something as having a 22:07 nature and then perform a bunch of operations based on that nature 22:12 which can be mistaken you know people think if we just clean the data maybe it won't be so racist but I think what 22:19 you're saying with this origin story is it doesn't really matter how much you clean this data 22:25 right it's instead about the kind of premises of what data is 22:31 um and what conflict is and how conflicts should be dealt with in a network right so are we going to run away when 22:37 others show up and if not what are we going to do so this is a big question but I'm just 22:44 going to ask it since I think people would love to know from a kind of data science and also critical perspective 22:52 what you think given this is a kind of poisoned well to begin with in your history but certainly you know in my 22:59 research when I think about what I've been trying to do too is to say how can something 23:05 based on say the labor of Indigenous women making our stuff not produce 23:10 discrimination you know it's invisible we don't know where that stuff came from but that is itself part of the problem 23:17 that the origins of things are mistaken you know the wrong kind of p- not the 23:22 wrong kind of people but certain bodies are getting a lot of credit good and bad for having created 23:29 the digital and others are totally totally excluded in many ways 23:34 so I'm going to transition to a question about dirty data 23:39 um because you write machine learning programs are not only trained on selected discriminatory and often dirty 23:44 data they are verified as true only if they reproduce these data 23:49 right in other words as you say they predict the past not the future 23:55 which makes sense given that you know eugenics is at the heart of these things so your Lombroso pictures 24:01 right of facial recognition I've been waiting for someone to do that for a long time and say you know this is 24:08 completely old this was dirty data too right these drawings of Jewish noses or 24:14 of Asian eyes are completely dirty data but we just didn't call them that back then 24:20 um so this is a theme throughout your book as you write social networks create and spawn the reality they imagine they 24:26 become self-fulfilling prophecies so how do you interrupt this repetition of 24:33 the same um big question but I had to ask you Deep Fakes 24:39 thank you Lisa um
so Wendy here again I think that what's 24:45 absolutely key in the question that you asked is there's two parts to it right 24:50 one is uh the question of dirty data or what I would call deep fakes right so 24:56 what I argue is that deep fakes aren't just the product of these programs they're actually the ground 25:03 truth right these are either corrupted police files 25:08 or Hollywood celebrities as the basis for facial recognition um training etc um what we 25:15 have is the problem that the deep fake has become ground truth so what's replicated as true is this deep fake 25:22 right but the other argument is even if we fix the deep fake you'll still have 25:29 reproduction of a certain ground truth as 25:34 what is true and part of this is because you know when these programs are trained 25:41 what they're verified against to see if they're predicting things correctly isn't future values but rather 25:47 certain values that have been held back during the training process and they're usually from the same data set or 25:53 they're out-of-sample data and so one thing that we fundamentally 25:59 need to think through is this definition of truth as repetition because if truth is repetition 26:06 and society is racist and the ground truth that it's based on is this 26:12 deep fake racist past then these things will only be verified as correct if they make racist 26:18 predictions so one of the largest things that I put out there is that we need other forms of verification validation 26:26 and truth and this is why we need to have an interdisciplinary dialogue because definitions of truth within the 26:33 humanities and philosophy for instance don't rely simply on repetition I mean 26:38 Hannah Arendt famously said if truth becomes um repetition then it becomes 26:43 nothing right and people always you know experience 26:48 things like films or novels that feel profoundly true even though they are fictional 26:55 so I think that these larger questions of truth are absolutely key but they also 27:00 therefore don't mean that you know truth is something absolutely unreproducible but it gets us to things even like first 27:06 principles and the relationship between first principles and truth so for example if you have this mindset that oh what is 27:13 true is what's reproducible if you think about global climate change you're absolutely screwed right because you 27:19 have to wait for global climate change to happen before you can verify it as 27:25 true um and by then you've lost the battle and I think global climate change models make absolutely clear the difference between 27:32 what's true and what's correct yeah I think that's awesome because it really encourages other scholars to try 27:39 and find data that doesn't fit into this received story right as you say in another section of 27:47 this book um the internet has become nothing but social experimentation to normalize the 27:52 non-normative so I love this kind of invitation to normalize 27:59 or to um to think about how the non-normative can be disruptive here so and that's the 28:05 kind of project of this alternative historiography I think that you and other people as well have been doing 28:12 um and I'm super fascinated by what you say is democracy equals offense 28:17 because I'm very offended all the time by the internet I feel like that's my job 28:23 and you know I hear what you're saying as well that data is doing a great job discriminating 28:29 against us but we're not doing a super great job discriminating data and this book is part of that project
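[Illustration, not from the book: a minimal, hypothetical sketch of the verification point Wendy makes above. A model trained on biased historical decisions and then checked against values held back from that same history scores as "correct" precisely by repeating the bias; the groups, rates, and data below are invented.]

```python
import random

random.seed(1)

# Invented "historical" decisions: group A approved 80% of the time, group B only 20%.
history = [("A", 1 if random.random() < 0.8 else 0) for _ in range(500)] + \
          [("B", 1 if random.random() < 0.2 else 0) for _ in range(500)]
random.shuffle(history)

# Hold back 20% of the same data set for "verification", as in a standard train/test split.
split = int(0.8 * len(history))
train, test = history[:split], history[split:]

def fit(rows):
    """'Model': for each group, predict whatever the training history did most often."""
    model = {}
    for g in ("A", "B"):
        outcomes = [y for grp, y in rows if grp == g]
        model[g] = 1 if sum(outcomes) / len(outcomes) >= 0.5 else 0
    return model

model = fit(train)
accuracy = sum(1 for g, y in test if model[g] == y) / len(test)
print("learned rule:", model)                # e.g. {'A': 1, 'B': 0}
print(f"held-out accuracy: {accuracy:.0%}")  # high, because the test set repeats the same past
```

[The held-out accuracy looks good only because the held-back data repeats the same discriminatory past; nothing in this loop can register that the past itself is the problem, which is the sense in which truth here is repetition.]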
so 28:35 I wanted to invite Alex who is a collaborator of yours to speak to how his contribution is going to help 28:42 us do that I mean I'm a humanist I totally don't know anything about 28:48 math so I'm hopeful that you can speak more to what you were thinking to help people who are like me 28:55 learn to do a better job hi Lisa 29:00 thank you it's Alex here um that's a tough question uh 29:06 so I think the goal of our collaboration with Wendy was to 29:12 put together tools to understand what's going on in a way that hasn't been done before like where you actually 29:19 see a basic sort of scientific understanding of the methods in 29:25 equations and graphs and pictures alongside well I mean I'm just adding 29:31 that onto this sort of in-depth history of where these ideas came from and it's 29:36 invariably fascinating like how tied up the math is with 29:42 some of this sort of scary social science history and that that's not taught 29:49 uh if you go through a masters in data science and then you go out 29:55 into the real world and get hired by a million companies to sell you know 30:01 to control their marketing algorithms or whatever I mean this is the story of 30:06 tens of thousands of graduates all trying to get those jobs right now they are not taught 30:12 for instance the history of Linear Discriminant Analysis which I 30:17 was really shocked by when I dug into that and learned it and we sort of put this 30:23 in the book so I'll hold up the section just to cue those of you who have the book but um 30:30 this method was literally made in order to separate and segregate 30:35 the different castes as the British Empire saw them in India and that led 30:42 to this method that then is like now a totally standard method in statistics and uh 30:48 data analysis and it's the first thing you would learn if you want to classify pictures of cats and dogs and 30:54 all of the things that big data lets you do so part of it is just to 31:00 put the two together I mean normally you read a history book you'd read a 31:05 statistics book and they would rarely talk to each other um 31:11 so does that help solve the problems I think 31:16 you have to be informed on both sides about literally what the algorithms are doing 31:21 the sort of operations of formulae and then uh what they can lead to and 31:28 where they came from uh and you have other 31:34 team members that can really help with this like journalists and then uh 31:40 computer scientists and whole teams of people that are questioning the ethics 31:46 of these algorithms and what they end up doing to people
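[Illustration, not from the book: a minimal sketch of the method Alex names above, Linear Discriminant Analysis, run on invented two-cluster data. This is just the textbook technique as it appears in a standard library, not the historical caste data or the book's own worked example; it shows why LDA is often the first classifier a data science student meets.]

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Invented measurements for two labeled groups with different means.
group_0 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(200, 2))
group_1 = rng.normal(loc=[2.0, 1.5], scale=1.0, size=(200, 2))
X = np.vstack([group_0, group_1])
y = np.array([0] * 200 + [1] * 200)

# LDA fits a linear rule that maximizes between-group separation
# relative to within-group spread, then classifies new points with it.
lda = LinearDiscriminantAnalysis().fit(X, y)

print("separating direction:", lda.coef_[0])
print("training accuracy:", lda.score(X, y))
print("prediction for a new point:", lda.predict([[1.0, 0.5]]))
```

[The same few lines that sort invented clusters here are, unchanged, the standard machinery whose history Alex describes; pairing the formula with that history is what he means by putting the two together.]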
yeah something um that I really Machine Unlearning 31:53 resonated with reading this book is the idea of the machine the AI apocalypse because it was 31:59 longed for and then it was feared and people still talk about it I feel like it happened a while ago 32:04 and we're more or less just getting our head around that or starting to 32:09 recognize that it's already happened and now what are we gonna do so in the conclusion you urge the reader to 32:16 prevent this or rather not to prevent it but like try to engage it by um 32:22 doing machine unlearning so as you were saying Alex it might be learning more about the colonial logics 32:28 of these machines as a way to unlearn what the results might mean for us 32:34 in other words to create a different future not to repeat all the time so what are some specific things that we 32:41 might consider doing in our everyday life to enact machine unlearning Ways to enact Machine Unlearning 32:52 that's a great question uh Lisa and there's so many different ways to think about it depending on who you are what 32:58 your everyday life looks like I think that the term unlearning of 33:03 course comes from Ariella Azoulay's work on potential history and what she 33:09 encourages us to do in terms of unlearning the archive is to move away from treating people as sources of 33:15 information but rather instead treat them as companions and so one thing that I argue in the 33:22 book is if you think through these communities that have been at the heart of this kind of analysis so 33:28 residents of biracial housing projects and Lisa as you and I have been looking at together residents of 33:35 Japanese American internment camps who are central to the emergence of sentiment analysis right 33:42 what does it mean to treat these folks as companions rather than sources of information and what that looks like is 33:49 a serious dealing with the legacies of this and to think through questions of reparations and of living 33:57 with these histories and you also mentioned the apocalypse so what's so important 34:02 for me is the argument that the fact that we have this fear of discriminatory 34:08 algorithms and the fear of this um AI apocalypse coming up at the same time 34:14 isn't an accident and we need to actually think these two together because one reason why we're 34:20 so afraid of the uh AI apocalypse is because machines 34:26 are framed for us as servants um as slaves they're profoundly raced and 34:32 gendered right so (inaudible) has written about this of course you've talked about the labor practices as well so to take 34:38 on these issues we need to take on these questions of labor the framing of these machines and how we want to relate to 34:44 them and here this is why I think Indigenous Studies is so important and they've written texts like 34:51 Making Kin with the Machines you know that's Jason Lewis et al.'s really important contribution as well as the 34:58 Indigenous Protocol and AI paper and what's so key about that in terms of thinking about apocalypses is that I mean one 35:05 thing that's really clear within Indigenous Studies is that the apocalypse has already happened 35:10 and trying to engage with what remains and what sort of relationality is fundamental so I think 35:16 that's one thing to start thinking through these entire relations that are part of 35:21 your engagement and what your relationship to that is as a fundamental relationship I think that the other 35:27 thing that um us as citizens and as people need to think about is not simply you 35:34 know is this technology good or bad but what is it amplifying how is it amplifying and also what are the root 35:41 issues that are being um brought up by these so to ask ourselves 35:46 okay instead of saying okay this risk recidivism um 35:51 program discriminates against racial minorities we need to fix it right we 35:57 have to realize actually that this thing is operating exactly as it should and what it's revealing is that the criminal 36:03 justice system is discriminatory um so one thing that we need to do is to take these results and then say like 36:10 global climate change models show us what's wrong with the world or what the future will look like if we keep doing x
36:15 y and z to take these results as cautionary tales and say okay then if we don't want this future then we've got to 36:22 change the way the criminal justice system operates we've got we've got to think through these questions 36:27 of injustice and just to be very engaged in what these these are doing 36:33 but Alex 36:38 um I all I would add is that yes the the apocalypse I think began for me The Discount Card 36:46 sort of when you think about some uh a very simple corporate uh construction 36:54 namely the uh the discount card right in the old days right I think people 36:59 understood capitalism if you go to a store and you buy something well the store wants to sell 37:04 you things and and and they want you to buy as much as possible I mean that can be hyper 37:10 capitalistic but at least the everyday person understands what the 37:15 goal is of the company is to sell you some stuff and okay so then we get good at looking at ads and understand sort of 37:23 the media literacy at that level is important but when you get a 37:28 discount card that's sort of the first level where you where the consumer the consumer 37:34 uh starts to become a source of data and that even happened you know in the 80s before the internet 37:41 before we started hooking all the computers together and things went terribly wrong right so 37:47 now it's like that on steroids we have um the interaction you think you're having 37:52 with with a company is actually nothing like what is going on like you think with 37:57 Facebook you think you're getting a free service that it's kind of fun and you can share 38:02 information you know be social okay but what the company thinks is going on is and what is actually going on is that 38:09 you're generating data uh immense amount of personal information that is then sold to 38:14 advertisers and that kind of media literacy we need to sort of teach this to 38:20 everybody and in a sense then you have to understand the algorithms too as well as what they're doing and I don't know i 38:26 guess that's that's where I think that's why I think we have an apocalypse it's it's not capitalism it's that 38:32 people haven't that the companies have been very good at actually telling us a different story about what's happening 38:38 from what's really happening I sound paranoid right now but no that's such a good example you think you're Free Bread 38:44 getting a free loaf of bread but you're really not you're instead you're giving something 38:49 I'm thinking of Greg Elmer's book Profiling Machines that came out in 2003 that's about discount cards in Canada 38:55 I remember reading this and thinking wow people should really be more concerned about free beer because [Alex: yes] there really 39:04 isn't such a thing 39:09 um okay I think we have a few more minutes before we want to open it up to questions from 39:16 people here did you want to expand on any any things that came up or 39:23 answer a question I didn't ask yet 39:34 well I hope we will bounce ideas around with what can be done concretely 39:39 for new either social platforms or um 39:44 laws you know on the legal standpoint uh things like Section 230 and then 39:50 making companies into admit that they are media companies not platforms this kind of stuff I don't 39:56 know it'd be interesting to see what maybe people will ask about that those kind of steps yeah can I ask what you're doing 40:05 me right no yeah well no no not this second but your practice is generally informed by what you know around how 40:12 
data is being deployed well I mean I'm mostly a mathematics 40:19 researcher and working on algorithms that help solve science problems rather than social problems I would say so I 40:26 mean this is a this is a new venture for me uh but by 40:31 you know being inspired by Wendy's work and talking to her over the years and and 40:36 so I mean I've got I don't have like a personal project on this I just try and follow what's going on to discuss uh 40:44 and I mean I have been involved for instance I mean one one idea that that counters 40:51 the hegemony that the homophily concept that's so central to to this 40:57 book to Wendy's ideas uh is that you shouldn't group people together by what they like or who they 41:05 are or you know that you should randomize so the opposite of uh of segregation is randomization and I have 41:12 been involved in student housing where similar to this housing project work 41:18 uh for instance the fraternity system uh place I used to teach 41:24 is all about hierarchy and segregation and we really tried to fight to remove that 41:29 system and then you would have one based on people being placed with people who are different from them and then they 41:35 have to talk to each other I think that's one of the main ways progress can occur is okay 41:41 sharing ideas I should have framed my question differently I was thinking whether you thought we should quit facebook like that kind of thing because Should I quit Facebook 41:48 people when they get a data scientist want to know should I quit Twitter should I quit Facebook should I use my 41:54 credit card like it becomes very kind of embodied of is it I always have 41:59 this feeling I'm doing something wrong when I post an Instagram picture when I tag somebody 42:05 um so I was wondering if you had any opinions about that I mean maybe as this book kind of sketches out that the it's 42:12 already such a systemic issue that they that may not matter very much in the big picture 42:21 i all I'll say is I think we have to educate in a sort of new form of media Media literacy 42:26 literacy that teaches the kind of things Wendy discusses so that if you do post a picture you 42:33 know what's happening and why the company wants you to post a picture right uh that maybe maybe that 42:39 and you know I happen to deplatform from Facebook and and really enjoyed that extra 42:46 you know 20 minutes a day that I got but that's not I mean you know there's 42:52 going to be other things maybe we should have platforms that you actually pay a little money for a little like the telephone 42:58 you know in the old days I mean ATT was a big nasty monopoly but at least you were the customer 43:04 and you were paying for a service that was in order to make a call to somebody else now who is the customer well it's not 43:10 you anymore it's it's the advertising it's the businesses who advertise it's totally flipped around and and if we 43:17 could just if people could be aware of that when they use the platform I think that would help a lot right yeah I think looking at Deepfakes 43:24 the um the kind of worksheet you know tutorial parts really helped me 43:30 understand why deep fakes is the kind of new stereotype right or the 43:36 new model that when things look like each other they're just much easier for machines to deal with them they 43:43 become preferred right so some of the pictures in here of faces that all look the same 43:49 and then the math kind of explaining why that's preferred really helped me understand why some 
faces aren't seen as 43:55 faces and some people aren't seen as people Opening the chat 44:02 okay so I think it's time to open it up so let's see if there's any questions 44:08 from um people who are here I'm sure there are some awesome questions I look at the chat I'm like oh this is an amazing 44:15 group of people um so uh yeah I will 44:20 look at the chat and see what's going on here ah okay so we already have a really 44:28 interesting couple questions um what is a meaningful collaboration between computer scientists and 44:34 humanities scholars look like in practice uniquely qualified people to answer and 44:41 what does an ethical AI look like for me Wendy and Alex What is a meaningful collaboration 44:49 so why don't I start with the first uh question this is Wendy um so I think that uh what a meaningful 44:56 collaboration looks like I'll start with an example that draws from my own experience um 45:02 and so at one point at Brown when the new president came in um she said we 45:08 want faculty-led initiatives um but of course part of this was to put the faculty 45:13 together who were going to then initiate projects and so I was put together with a head of computer scientist science and 45:20 somebody in biostats who worked on AIDS and we looked at each other and we said 45:26 well we're not going to work together unless there's a problem that we know we can't solve on our own so in other words 45:32 why do we all need to be at this table what is the question we can't solve and the question that we came up with 45:38 which was initiated from the person in biostats was I can show almost any correlation to be real I can show almost 45:46 anything to exist but I don't know what's true I don't know what matters and then the other question that was 45:51 initiated from some of my work on global climate change models is just because something is true doesn't mean that it 45:57 will inspire action so what is that relationship between truth and action and so we called it DIAL the data 46:04 inference action loop um it didn't it wasn't one of the ones that were chosen but it became a model for me of 46:10 interdisciplinary work that happens and it starts when we all have a question 46:16 that we that we're interested in that we want to solve and we know we can't do it on our own 46:22 and then we work with others because they're different which means respecting their methodology 46:27 but it also means questioning their first principles and having our first principles 46:33 questioned in turn and so that's what the digital democracy institute is all about is to take 46:40 the hard problems that face us from disinformation to abusive language to 46:45 network segregation bring together humanists, data scientists, network scientists, um people um in the arts, and 46:52 the social sciences and to say okay um this is what this is how we've approached this problem but it remains 46:59 why does it remain and so how can we use some of these roadblocks as opportunities for intervention and to be 47:06 as concrete as possible our project on misinformation and disinformation for example starts from 47:12 the fact that fact checking is absolutely important but it's not enough like over and over again people pointed 47:19 out that fact checking can actually spread stories to different audiences 47:24 um in fact some people spread stories they find compelling um whether or not they're 47:30 correct um and if you think through the 199- sorry the 2016 election what's so 47:36 key is that not only was 
it called the fake news election it was called the authenticity election and the person who 47:43 won scored high on both and so rather than saying okay this is a post-truth era and putting our hands up 47:50 we say well actually what's happening here um so instead of saying okay why is 47:55 NDP like how is this correct or incorrect ask ourselves why and how does anything come to feel true 48:02 regardless of its facticity and what is that relationship between feeling true or authentic and facticity um and 48:10 here you know we've been drawing from theater performance studies and literary studies which have done you know a ton of 48:15 work around questions of authenticity um and trying to understand that relationship between authenticity and 48:21 authority truth and trust fact and things that were done 48:26 and so starting from what seems to be an absolute roadblock but then using every discipline and where each 48:34 discipline is broken drawing from another discipline to push the question further What is an ethical AI 48:46 did you want to add anything Alex um uh I'll just give an example of another I 48:53 mean other than DISCO and Wendy's uh wonderful institute there is amazing stuff happening at 48:59 Tufts University from a mathematician called Moon Duchin who has the redistricting project there 49:05 and she brings together trained computer scientists lawyers social scientists and 49:11 uh you know political scientists who attack redistricting and really try and 49:17 create expert witnesses that then can testify and so this really connects to the legal 49:23 system and I think so regulation is part of uh what an ethical AI 49:28 has to be the algorithms have to be available for examination 49:34 uh and um you know even if you have to sue it out of I mean the government has 49:41 to squeeze it out of corporations it's going to be very difficult but they may 49:47 have to pass tests and maybe we could devise such tests uh 49:52 so I don't know I'll probably just leave it at that awesome well here's a great question Antiethical AI 49:59 about um truth before we do this Lisa they also asked you what you thought ethical AI 50:04 looked like [Lisa: oh gosh] I know that you've been thinking through the limitations of 50:09 this yeah well just to take a little piece of it um I realize I quoted you incorrectly 50:16 a little bit ago around democracy and offense because in this book you write how toxicity and 50:21 racism are baked into these social media platforms as proof of concept that the network is diverse so if people are 50:28 basically being really racist to each other and you know horrible and nothing is happening in return that's kind of 50:36 proof of concept that democracy is working so you know there's some statistics 50:42 which I believe that only three percent of hate speech is ever called out in any way 50:48 right much less regulated meaning people actually losing access to accounts or even being 50:53 permanently banned three percent which seems high to me I forget where I got this from a paper or something 50:59 so you know it's an incentive to make us this big racism machine 51:05 as evidence that democracy is working because democracy is about people 51:10 getting to do what they want and the evidence of doing what you want is doing something someone else doesn't want 51:15 so that to me is anti-ethical AI you know it's the basic principle you're writing about here that is only going to 51:22 make more of the same as people get
more concerned about the apocalypse like democracy America we have to preserve it 51:28 we must have more racism as evidence that there is we're preserving it even more vigorously than before which 51:34 obviously I'm against um but having read this book I can see how that gets defended 51:40 you know and how as there was an article in the Post not long ago that Tarleton Gillespie cited 51:48 that shows that Facebook was actually regulating anti-white racism much more rigorously than anti-Black 51:55 racism right because they had decided that was made more sense and was probably you 52:02 know optimal I mean who knows um which is the opposite of ethical right 52:08 it's anti-ethical AI so seeing how these ways of 52:14 kind of I would say reproducing racism are 52:19 are obviously incentivized by companies like that has to happen that has to stop and I'm not sure exactly how it has to 52:26 stop but reading this book has made me see why it's happening that way right um and why there's so much disincentive 52:33 for companies to do anything about it um so there's a question here about 52:39 truth which I feel like I have to ask because it speaks to this question of authenticity which is core to this book 52:45 right so the the question of why we have to see people's personal really unpleasant details as you write 52:52 is evidence of something different from truth which is identity authenticity people being real often means 52:59 embarrassing themselves or others or suffering a lot of shame because of 53:04 personal details coming out right so um there's a request here for you to 53:10 speak about the differences between true correct 53:15 authentic and real and how these subtle terminologies are 53:20 part of a new model for media literacy Authenticity 53:28 thanks that's a great question and I would also point that there's a lot of people working on this and so I think 53:34 that Sarah Banet-Weiser's work on authenticity has been fantastic I think that um the work that uh you know Alex 53:41 Juhasz and Ganaele Langlois and Nishant Shah have done around really fake is important and the beyond verification 53:48 project is a large international collaboration um so in terms of the relationship between what's authentic 53:55 true correct and what was the third term 54:01 um I think it was real real as in real ID 54:07 that's a real ID yeah those are so important and I think that one way to think through those and the differences 54:13 between those and the resonances they make is to go exactly with the example you started with Lisa which is how this 54:20 call for us to be authentic where the idea is we become transparent our insides and outsides coincide not 54:27 because we're being sincere not because we're being good people but rather because we're exposing something nasty 54:33 about ourselves or other people and therefore showing that we have nothing to hide right um and that this 54:39 concept of authenticity is so valuable because this open secret about ourselves 54:45 this nothing to hide about ourselves that we decide to parade as a way to say we're authentic is is often those 54:51 divisive likes that are really effective in terms of clustering people within network 54:57 neighborhoods right and so this is being weaponized and used against us and what's so key is that 55:03 authenticity shares the same root as author and as authoritarian 55:08 so it's always linked to this question of verification and certain notions of authority 55:15 against this or in relationship would be the 
question of what is true 55:22 where truth comes from the same word as trust and here I think that the work 55:28 that's been done in history of science by Steven Shapin and others has been actually crucial to 55:34 showing how it was only in the 18th century that this notion of truth emerged um 55:42 that equaled what is quote unquote correct right and what he points out is it's actually a bunch of dudes in the 55:47 18th century within science trying to claim certain modes of truth right um saying that these experiments 55:54 could be reproduced but they couldn't be right but the question was how does one learn to trust certain 55:59 infrastructures so that what is being produced is considered to be valid right 56:05 and so I think that thinking through the relationship between truth and trust not in a cynical way but to actually 56:11 understand that truth and trust are intimately intertwined is crucial to thinking through why it 56:18 is you trust something that this thing isn't like um so again what I think is so important is not to 56:24 say the issue is that people have become cynical or distrustful because democracy 56:30 is based on people questioning things the problem isn't that people question CNN it's that questioning of CNN that 56:37 leads to trust of Epoch Times so it's not the fact that people question it's where they land 56:44 in this like trust of even more dubious sources that we need to think through and I think that the idea of constantly 56:51 questioning this literacy of constantly questioning so you don't land somewhere is absolutely key and trying to think 56:57 through structures of trust um in terms of what's real real of course 57:03 is race it's a thing right it's uh it gets us to all sorts of questions about how these things emerge and 57:11 how we consider them to relate to others and I think that starting from those 57:18 differences is really key because it moves us away from saying again this is post-truth this is post-modernism blah 57:24 blah blah as if this was new but rather getting us to think through these moments of trust and authority and 57:31 why it is that we expose these things when we do and how we do them 57:39 Alex did you want to add anything to it well I'm just reminded of the old Gary 57:45 Larson cartoon that shows the dog typing on the computer saying nobody knows you're a 57:50 dog on the internet uh that sort of the es- you know they knew that 57:57 (inaudible) there's a cat and mouse game not to mix too many animal metaphors going on 58:04 actually I sort of had fun illustrating this with the Bayesian spam filters which is a sort of 58:10 well-known uh even web 1.0 or even 58:15 pre-web right email hacking uh competition that's sort of going on can you get people to click 58:22 on um emails that sell you something and uh or to give away your money or just get 58:27 something in front of your eyeballs that you didn't particularly think you wanted 58:33 to see so that's authenticity
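[Illustration, not from the book: a toy version of the Bayesian spam filter Alex mentions, with an invented four-message corpus. The filter just counts how often each word appears in spam versus legitimate mail and adds up those log-odds for a new message.]

```python
import math
from collections import Counter

# Invented training corpus: (message, is_spam)
corpus = [
    ("win money now", True),
    ("free money click now", True),
    ("meeting notes attached", False),
    ("lunch tomorrow with the team", False),
]

spam_words = Counter(w for text, is_spam in corpus if is_spam for w in text.split())
ham_words = Counter(w for text, is_spam in corpus if not is_spam for w in text.split())
spam_total, ham_total = sum(spam_words.values()), sum(ham_words.values())
vocab = set(spam_words) | set(ham_words)

def spam_log_odds(message):
    """Naive Bayes log-odds that the message is spam, with add-one smoothing and equal priors."""
    score = 0.0
    for w in message.split():
        p_spam = (spam_words[w] + 1) / (spam_total + len(vocab))
        p_ham = (ham_words[w] + 1) / (ham_total + len(vocab))
        score += math.log(p_spam / p_ham)
    return score

print(spam_log_odds("free money"))          # positive: looks like spam
print(spam_log_odds("team meeting notes"))  # negative: looks legitimate
```

[The cat-and-mouse part is that once senders know which words the filter counts, they change the words, which is why Alex describes it as a competition rather than a solved problem.]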
and there is just this huge financial 58:40 incentive now we have this tool of instant duplication you know you can 58:46 send billions of emails with very little effort um 58:52 that's what leads to sort of the danger it's the scale that this is on you know in the old days you'd have to print 58:57 pamphlets and stick them up you know around town or something now the scale at which you can uh fake 59:04 authenticity is huge and the amount of effort therefore that computer scientists are going to spend 59:10 meaning you know whatever we call them hackers right but maybe they're just people working for a company are going to spend 59:17 trying to fake something to get inside your 59:23 your soul in some way is huge and that's you know just not happened before and 59:29 that's why democracy is at such stake as well truth is 59:34 very slippery to define even in academia right with these studies the recent 59:39 uh replications of like 100 classic social science papers and maybe a third of them actually hold up at the level that they 59:46 were originally claimed for businesses it's I mean 59:52 well it's kind of interesting they're constantly doing A/B testing as Wendy explains like the multi-armed bandit they're trying things 59:58 out on us so the algorithms are actually learning truths about human behavior 1:00:04 but then it's not for public uh discussion it's buried in an algorithm that you'll never find that 1:00:10 just means that this business is better at advertising than this one and nobody will ever really know why 1:00:17 yeah that's a very scary situation yeah as you were talking I thought about the corporate strategy of doubling down 1:00:24 which seems to be very mathematically oriented that something that's obviously a lie can become true if you simply 1:00:31 multiply it more yes you know um and that 1:00:37 is why I think these questions around authenticity truth and reality have become so loaded 1:00:42 that in some ways they come to mean their opposite you know the more strenuously someone asserts something is 1:00:48 true in some ways the more you ought to be concerned about it 1:00:54 so here's a great question um for people who are teaching and even for people who are just 1:00:59 not or taking classes which is what recommendations you have for teaching current and future undergrad students 1:01:05 who've grown up in the environment of pervasive algorithms and have no other comparison which is our students um 1:01:12 this just moved I'm going to find it again uh oh sorry this chat is it it's moved 1:01:20 to answered oh it has okay thank you um so how do we uh 1:01:25 work with students who don't have any basis for comparison so it's not as if there was a before and after for them 1:01:31 around before all my data was gathered and you know I was misrecognized in these painful ways 1:01:38 versus the old ways how do they think skeptically about the environment especially when it feels 1:01:44 useful to have personalized ads or personalized recommendations so I'm thinking especially about Spotify Wrapped or 1:01:50 things that are genuinely pretty joyful for people you know how do you put that into 1:01:56 context so even the things that produce pleasure and don't seem to be so bad for people 1:02:02 who have no other moment to remember but there's moments 1:02:10 I think what's key is you start with those moments I mean there's this view somehow that these undergrads are 1:02:15 completely naive that you know they need to be taught something a before and after 1:02:21 um as a way to get them to think critically about what they're doing but I think we need to respect the fact that 1:02:27 they do think critically and to move that and start with that in 1:02:33 different ways so I used to teach a very large digital media class at Brown 1:02:38 and I would start off it drew students from all disciplines it was really a fun course to
teach um and 1:02:46 I would always start with uh the ways in which they're already using these 1:02:51 technologies critically right and so back then Snap was huge and you know I asked them you know 1:02:57 they were always on Snap how many of them actually think that these images disappear and they all put up their hands and said they didn't think it 1:03:03 disappeared and I said how many of you regardless though would not send me something on Snap and 1:03:11 they're all like yeah so it was trying to get them to think through their practices already I mean if you're talking about 1:03:17 Instagram I mean clearly people who are gramming understand that they're characters within this drama called big 1:03:24 data that what's being captured and manipulated and thought through isn't simply this sort of behaviorist 1:03:32 knee-jerk reaction but rather their engagement with it as character so I always say like let's think of ourselves 1:03:39 as characters rather than marionettes which is how you know the social drama 1:03:44 presents us because the social drama has only a behaviorist understanding of humans right 1:03:50 um to go to that rich depth and gaps that we know are part of the process and say use that as the 1:03:56 beginning and if we think of ourselves as characters which students completely understand 1:04:01 immediately you also understand that if you're a character you have a script you have agency but there are ways in 1:04:08 which things are controlled also the ways in which your performance is limited there's also the 1:04:15 question of finance and the whole performance etc etc um but then I say as well what's so key about it is that um 1:04:23 you're a character but your actions and your script are determined by everybody else 1:04:29 um so I get everyone then to open up their phones or think through their technology and I say well the way this thing works 1:04:35 is um and I have them open a terminal and do a TCP/IP dump so that 1:04:42 they then see the ways in which their network card is actually sucking in everybody 1:04:47 else's traffic and then only actively deleting what's not addressed to them so the ways in which their computer's 1:04:54 default mode is actually what's called promiscuous mode right so part of what I do is get them to open up their 1:05:00 technology and to think through their own actions and what they understand about this 1:05:05 um and the limitations and possibilities of it when we start thinking about the ways in 1:05:11 which um correlation for instance means that we're always correlated to others so how could that be actually a place 1:05:17 from which to intervene rather than think of ourselves as only completely manipulated 1:05:31 should I? I haven't actually taught such a class I have been a professor of mathematics for many years and try to 1:05:37 bring real-world examples in and the closest I've gotten is statistics 1:05:43 and you just get to see how the different presentation of statistical facts affects 1:05:48 your feelings about them and that's shocking to people in a math class perhaps you know uh 1:05:55 there's some wonderful books like Cathy O'Neil's book Weapons of Math Destruction that 1:06:01 should be mentioned that was certainly an eye-opener for me it's not a math book though it's a popular book so I 1:06:08 would love to know what textbooks there are that sort of bridge the gap 1:06:13 for the undergrads but I don't think this is fundamentally new I
but I don't think this is fundamentally new I mean it's a bit like 1:06:18 uh you know why don't we teach the undergraduates about the meat industry or some other highly uh 1:06:25 processed and intricate uh capitalist system that's 1:06:30 set up for our benefit you know supposedly for our benefit right so you can buy cheap meat at the 1:06:36 supermarket but no one ever questions it they're like meat comes from the supermarket that's the same level of engagement that many people 1:06:43 have you know with Facebook let's say if you want to like understand how the meat is 1:06:48 grown in the animals and the whole packaging and the distribution and the 1:06:54 environmental effects all of these things you need to do the equivalent of that for the social media I think 1:07:02 but also Lisa I know that you've been teaching and thinking through these issues 1:07:10 oh wow I don't know I feel like um as you say 1:07:16 just because people don't remember a before doesn't mean they're super critical about now 1:07:23 you know I think if anything students are far more critical about now and the issue for me in teaching is to kind of 1:07:29 hold back the incredible anger and kind of rage and frustration around 1:07:34 how this happened and try to strategize so work 1:07:40 backwards and work forwards so remember where these things came from in a way that isn't just about hating or loving 1:07:47 who we see as in charge right because that's a tendency that you see in a lot of histories which I don't think is 1:07:53 great because people are rarely in control of what they do so it doesn't make sense to 1:07:59 find victims and heroes and all that um but also thinking forward in terms of yes this is horrible but let's get 1:08:06 really creative or really off the map around what could be the difference because at this point there's nothing to lose by being 1:08:12 totally unrealistic right like I think the moment for trying to be measured and thoughtful and what 1:08:18 can we actually achieve which I've been asking you is a kind of red herring you know I think as you say Indigenous 1:08:24 philosophy Afrofuturism assumes that the apocalypse has already happened so 1:08:29 if that's the case kind of incremental or conservative fixes to things 1:08:36 you know like me not doing Facebook or trying to get cleaner data or let's just optimize you can't optimize your way out 1:08:42 of this because it's a kind of root and branch problem right so I think that social media 1:08:47 companies are like we're gonna do a better job with moderation and hire more moderators it does not matter how many 1:08:53 moderators you hire at this point like if every human in the US became a moderator it would kind of 1:08:59 be like Borges' map of the world that's the same size as the world you know like in some ways everyone is 1:09:06 already doing the work of moderation but they're just not doing a great job 1:09:11 they're just not agreeing about what that work ought to be so I think this idea of common sense which I've been 1:09:17 very interested in for a long time the moderation policy was common sense 1:09:23 which is to say use your personal ideology and we will back you up um because you know section 230 we 1:09:29 are not responsible you are responsible um so I think you know really 1:09:36 unreal you know unfeasible solutions are very interesting 1:09:42 in a new way to me for that reason yeah yeah Lisa Wendy here I think that what is so key about what you brought up 1:09:48 too is that we
need to refuse the terms that we're given right so moderation makes it seem as if suddenly we're going 1:09:55 into content when in fact I mean and this is Tarleton Gillespie this is Sarah T. 1:10:01 Roberts etc they've shown that it's always been moderated it's always been shaped 1:10:07 so the question of moderation itself is so beside what we're trying to take on 1:10:15 right and having AI do more of it is totally the opposite way 1:10:22 for many of the reasons you say in your book right like then you're really tripling down quadrupling down 1:10:28 right multiplying the problem and just making it more pervasive 1:10:33 so I want to get some more questions because these are such good questions so there's a 1:10:39 question here around where a person ought to start in this book if they want to learn more around 1:10:45 authenticity the real and truth and how it relates to data and everyday lives 1:10:52 so the question uh sorry this is for Wendy uh is where to start yeah what's a good moment or chapter that you would 1:10:59 identify as a place for a person just starting the book to focus on first 1:11:05 if they were interested in questions of authenticity then I would direct them to 1:11:10 the chapter on authenticity which is the third chapter um which brings together studies of reality TV and 1:11:18 training in that sense right reality TV is all about training 1:11:23 and tries to think through algorithmic authenticity and the relationship between the two so that's the third 1:11:29 chapter I mean the book is rather strange just to talk about structure a little we have these sort of meaty 1:11:35 chapters and then each starts with a little theoretical interlude that goes into some of the questions that were 1:11:41 raised as well as Alex's incredible mathematical illustrations so I think 1:11:47 that also um having those interludes in conversation with that chapter around authenticity 1:11:54 would be a great place to begin 1:12:01 yeah I'm going to throw in a question about randomization that's inspired by a chat question which has to do with the 1:12:08 price of diversity in a way so you were talking Alex about how you know 1:12:15 part of what data analytics does is to reduce randomization so everything gets matched 1:12:21 to something else which is legible to that thing which tends to reproduce different kinds of problems um this 1:12:27 question is about the price for people say who are a marginalized group and who are 1:12:34 mixed in with people who are unlike them so you know what's the human price I guess of 1:12:41 heterophily or mixing when we've already created a situation that's so toxic and painful for people 1:12:48 who are increasingly marginalized and discriminated against in everything they're doing right like 1:12:54 going to school and trying to live in a neighborhood and trying to buy stuff right or get a mortgage 1:13:02 how do you compensate or help the people who are then 1:13:08 having to be the random in the randomization that's a great question I think what it 1:13:14 brings out is part of the problem with homophily isn't simply that it talks about similarity breeding connection it's 1:13:20 that it makes users and people responsible for segregation right so homophily becomes a 1:13:28 synonym for discrimination right but it erases questions of institutional discrimination it becomes a synonym 1:13:36 for community that erases the real work that needs to be in place for communities to emerge so one thing 1:13:42 that's absolutely
wrong about homophily is this notion that it's simply voluntary and produced 1:13:48 and so it erases questions of institutional racism etc because those 1:13:54 questions of institutional racism are also there in the question of these quote-unquote solutions which is throw 1:14:00 in a diverse person and then suddenly it's the work of the diverse person to make everything better right um and 1:14:07 completely ignoring those questions of institutions and the ways that that 1:14:12 relies on them and what I think is really important is if you look at these historical cases and ask yourself what 1:14:19 could have emerged um so again the notion of homophily actually comes from the preferences of white residents 1:14:25 living in a biracial housing project for friends who were like them who were liberal like them or illiberal like 1:14:32 them where liberal meant that you thought that um housing projects should be biracial and you thought the 1:14:39 races got along illiberals thought the opposite but there was this other category called ambivalent which thought 1:14:45 the races shouldn't be mixed but people got along and then there was also the fact that 1:14:50 they interviewed all the black residents so to come up with the term homophily they got rid of most of their data they 1:14:56 got rid of like all the black residents' responses and they got rid of the white ambivalents which was also the largest 1:15:02 group right because they wanted the sort of intense um decision-making 1:15:07 but if you go to some of the responses of the black residents they said you know we're here because we're 1:15:14 finally getting equal housing for equal money um we're here because finally you 1:15:21 know what's being um done is what's fair 1:15:27 um and they said you know we would welcome everyone here but we're not out there to say oh we need a white 1:15:32 person here to make it diverse right so I think that for me what's key is indifference 1:15:39 which is the ways in which you start to build structures of care outside of this logic of love or hate um 1:15:46 but rather through this notion that we will have to live in difference and we do live in difference um 1:15:53 in ways that can be caring and so that for me is absolutely key and I know that you and I Lisa at least have been 1:15:59 thinking through the limitations as well of this notion of heterophily that somehow if you simply just love the 1:16:05 opposite this will solve everything um and it's key that you know 1:16:10 this idea of simply loving what's different is also what drives corporate models 1:16:16 yeah or what drives certain ways to quote-unquote expand um recommender systems yeah it's so easy 1:16:23 to fetishize the different too because it's pleasurable to have difference it's stimulating and fun and 1:16:30 everybody likes it Alex you wrote something can you read it or speak to it yeah go ahead I did yeah sorry I kind of hit 1:16:36 the same question in text that you were alluding to but 1:16:42 I mean I would love to know more of the social science literature I'm not an expert on that but 1:16:48 on the student housing I have read quite a bit and Elizabeth Armstrong and her colleagues in 2006 1:16:55 embedded themselves in a very homogeneous uh sort of 1:17:01 uh basically white midwestern sorority and studied mechanisms where competition 1:17:08 leads to more dangerous behavior in student life
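A toy sketch may help picture the contrast the conversation keeps circling, between homophily as a wiring rule and the randomization Alex turns to next; the twenty users and two group labels below are entirely hypothetical.

```python
# Toy illustration of homophily as a network wiring rule: if similarity
# alone is allowed to breed connection, cross-group ties vanish, whereas
# wiring the same number of ties at random keeps them. All users and
# group labels here are hypothetical.
import random
from itertools import combinations

random.seed(0)
users = [(f"u{i:02d}", "circle" if i < 10 else "square") for i in range(20)]

def cross_group_share(edges):
    # fraction of ties that connect users from different groups
    return sum(1 for (_, a), (_, b) in edges if a != b) / len(edges)

# homophily rule: connect only users whose group labels match
homophily_edges = [(u, v) for u, v in combinations(users, 2) if u[1] == v[1]]

# random rule: connect the same number of pairs chosen uniformly at random
random_edges = random.sample(list(combinations(users, 2)), len(homophily_edges))

print("cross-group ties under homophily:", cross_group_share(homophily_edges))
print("cross-group ties under random wiring:", round(cross_group_share(random_edges), 2))
```

When similarity alone breeds connection, cross-group ties disappear by construction, which is the segregating work that, as Wendy argues, then gets attributed to individual preference.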
and so I don't off the top of my head 1:17:15 have a paper that shows sort of how randomization works but you need small communities here you 1:17:21 can't really say we're going to take the whole internet and randomize everybody because then you will get the noisiest 1:17:27 most racist you know hate speech will still like come to the top because of the way our algorithms 1:17:34 you know work you have to build I think small communities that have accountability uh 1:17:41 online just like they work in real life right uh you know human beings have been 1:17:48 fighting the urge to form gangs that kill each other over the 1:17:53 years you know which happens right but you've got 1:17:58 to try and form small groups uh where there isn't a power imbalance right and that's 1:18:04 the other thing you can't randomize people to be uh balanced in power 1:18:11 so I don't know how you do that online you know 1:18:16 your friend group should be chosen you know the people that it suggests should be 1:18:21 complementary uh rather than you know just more of the same 1:18:27 but also um this is where to bring up like ethnic studies and you asked me about ethnic studies and Asian American 1:18:33 Studies has done a lot of work precisely around this you know what does it mean that the wound is actually also the 1:18:39 formation of community like how can we think through the psychic legacies of 1:18:45 um abuse and as well the ways in which care emerges in these communities um and think 1:18:52 rigorously about them in ways that don't devalue this and I think that thinking through those modes of 1:18:58 care and power as Alex put it is absolutely key as well as the 1:19:04 deeper sort of psychical legacies well thank you so much for writing this 1:19:09 book I think it's wonderful for teaching um it demystifies a lot of what 1:19:14 computers are doing which we can always use more of and it's just beautifully written so thank you both so much and 1:19:21 the book is available everywhere and right Wendy did you want to talk more 1:19:27 about how people can get it yeah um so it's available everywhere there is a discount code as well 1:19:34 nice yes but it's only for 10% off and it only works in the U.S.
1:19:40 um and so I'm trying to find the discount code 1:19:46 well um while you're looking I think we can also push it out through our social 1:19:51 media in DISCO and Digital Democracies and if you love this talk you should 1:19:59 become involved in DISCO which is a growing network and we want to include you know of course junior scholars 1:20:05 Jessica's going to say something about it great we just want to share a screen right now 1:20:11 [Music] hi everyone this is Jessica Hill Riggs 1:20:18 speaking and I'm the program manager of the DISCO Network which is a co-sponsor of this event 1:20:24 and we wanted to end this book talk by sharing more information on how to get involved in the DISCO Network 1:20:30 DISCO stands for Digital Inquiry Speculation Collaboration and Optimism 1:20:37 Network and we're a new research initiative funded by the Mellon Foundation that 1:20:42 explores the intersection of identity and technology to envision a more just equitable 1:20:49 anti-racist and anti-ableist digital future we're a consortium of five universities 1:20:55 and research labs at the University of Michigan, Purdue University, Stony Brook University, the University of 1:21:01 Maryland-College Park, and Georgia Tech and as a network we offer fellowship 1:21:06 opportunities scholarly training mentoring and publishing opportunities and public programming such as lecture 1:21:14 series like this one and conferences on cutting-edge digital topics regarding identity and 1:21:20 technology and we'd love for you to get involved please visit our website 1:21:27 disconetwork.org to join our listserv to learn more about our network's upcoming events 1:21:32 and opportunities this spring and summer 2022 we're also active on Twitter at @disconetwork_ 1:21:40 so please tweet at us or feel free to reach out to us via email 1:21:46 at disconetwork@umich.edu and thank you for joining us at this event 1:21:52 have a great rest of your day and I also just wanted to say thank you to Alan our ASL interpreter and to Katie for 1:21:59 providing live captioning services and thank you 1:22:05 yeah thank you and I have to say DISCO is one of the most exciting things that is happening right now so 1:22:10 everyone should join and participate and you all are going to change the world 1:22:17 well thank you Wendy and Alex this was amazing thank you thanks everybody thanks for the great questions thanks