*Transcript cleaned by Kerri Graham*

Okay, welcome everybody to the Futures of Race, Gender, Disability, and Technology Superpanel from DISCO, which is the Digital Inequality Speculation Collaboration Optimism Network. And I'm Lisa Nakamura. I'm a professor at the University of Michigan, and I'm a Co-PI in this project, which lasts for three years. So I'm going to talk through a little bit about who we are and who has helped us to get here, above all, the Mellon Foundation, who funded our project, but in addition, some of the sponsors I list here, DEFCON, Critical Digital Humanities, our own Michigan Library, and so on. We are a network that integrates critical humanistic, social science, and artistic approaches to digital studies and foregrounds questions about the cultural implications of technology to envision a new anti-racist, anti-ableist digital future, hence the optimism. We are a lab made up of labs with a hub, sited at the University of Michigan. We have the five labs you see here on this map, if you're looking at them. If not, I will just read them. We have Remi Yergeau's Digital Accessible Futures Lab, Digital AF, also sited at Michigan; Stephanie Dinkins's Future Histories Studio at Stony Brook University; Rayvon Fouché's Humanities and Technoscience Lab, the HAT Lab, at Purdue; Catherine Knight Steele's Black Communication and Technology Lab, BCaT; and André Brock's Preach Lab at Georgia Tech. If you want to get involved with what we're doing, which I very much hope you will, you are welcome to connect with us. There are two ways to do so, in addition to following our Twitter and seeing what we're doing with the many people we're going to be bringing on to work with us and events we're going to be having. If you'd like to be a member of the network, you can click the link below to receive the newsletter, get invitations to events, and learn about opportunities to do collective research, work in groups, and projects.
If you'd like to be more deeply involved in the DISCO Network as a researcher or as an institution, you can apply for affiliation as a research affiliate. So here's the order of things I'm going to present. We're all giving these short talks. Rayvon Fouché, who is Professor of American Studies in the School of Interdisciplinary Studies at Purdue; his work explores the multiple intersections between cultural representation, racial identification, and technological design. He is the author of many important books, most recently Game Changer: The Technoscientific Revolution in Sports, from Johns Hopkins. He is currently working at the NSF and was the inaugural Arthur Molella Distinguished Fellow at the Smithsonian's Lemelson Center for the Study of Invention and Innovation. Next, we have Stephanie Dinkins, who is a transmedia artist who creates platforms for dialogue about race, gender, aging, and our future histories. Dinkins' art practice employs emerging technologies, documentary practices, and social collaboration toward equity and community sovereignty. She's a recipient of many, many awards and holds the Kusama Endowed Professorship in Art at Stony Brook. Next, we will have André Brock, who is an Associate Professor of Media Studies at Georgia Tech. His first book, Distributed Blackness: African American Cybercultures, which, as you have no doubt seen and heard, came out from NYU Press in 2020, theorizes Black everyday lives mediated by networked technologies, and is the recipient of several book awards. And Remi Yergeau, my colleague here at Michigan, Associate Professor of Digital Studies and English, and Associate Director of the Digital Studies Institute. Their book, Authoring Autism: On Rhetoric and Neurological Queerness, was awarded the 2017 MLA First Book Prize, the CCCC Lavender Rhetoric Book Award, and the 2019 Rhetoric Society of America Book Award.
Currently, they are working on a second book about disability, digital rhetoric, and surveillance, tentatively titled Crip Data. Last, we have Catherine Knight Steele, an Assistant Professor of Communication at the University of Maryland, College Park, and Founding Director of the African American Digital Humanities Institute. She will soon be directing the BCaT Lab, and she has a PhD from the University of Illinois at Chicago. You probably know her, however, as the author of Digital Black Feminism, out last year from NYU Press, which examines the relationship between Black women and technology as a centuries-long gendered and raced project in the U.S. Using the virtual beauty shop as a metaphor, Digital Black Feminism walks the reader through the technical skill, communicative expertise, and entrepreneurial acumen of Black women's labor, born of survival strategies and economic necessity, both on and offline. We have a little time for questions, so please tweet out your questions if you have them, or you can put them in the chat. So I'm going to stop sharing my screen, and I will give my talk, which is our first talk. And I don't have any slides, so there are no visual assets here, just me talking. So this talk is an excerpt from the book manuscript I'm working on right now. It's called Digital Racial Capitalism, and this section of it is called Digital Racial Capitalism and The Metaverse as a Digital Diversity Workspace. The past, present, and future of digital racial capitalism, which I define as the founding of capital or wealth accumulation upon the unfree labor of slaves and other non-compensated non-white labor, has roots in racial capitalism, Cedric Robinson's term. Jonathan Beller coins the useful term computational racial capital to account for the ways that, quote, money, race, and information become real abstractions, end quote, that subtend this new form of commodification enabled by the computer.
He argues that information, computation, and money must be occupied and decolonized at the level of social relations. I agree entirely. I take his claim and his invitation to make abstractions less real by focusing on the value that women of color as such, in other words, women of color who claim this label or claim this identity, produce through digital labor, in the same way that capital was produced by converting sugar cane into sugar and cotton plants into cotton, and in this case, the data that then and now allowed European and white owners and investors to rule from a distance. This is the data that feeds what Jonathan Beller calls the world computer. It's difficult to write about the contributions of women of color to an industry whose history actively excludes them. However, excluding these contributions as, quote, real work is necessary to uphold the claim that diversity, equity, and inclusion, DEI, a term that started trending in the early 90s and became ubiquitous during the techno-racial period, in other words, the post-COVID, post-Floyd moment, actually matters. The diversity industrial complex, defined by Kimberly Springer as, quote, organizations and individuals invested in framing discrimination as an apolitical tolerance for difference through linguistically downplaying bigotry, social norms, and business practices while avoiding historical context of power and oppression, end quote, became itself an industry rather than a handmaiden to industry, the creator of a new sentimentalism based on calibrating white, quote, sensitivities. Legal scholar Joan Williams describes the diversity industrial complex as, quote, the standard approach for making token hires, offering sensitivity training, setting up mentoring networks, and introducing other incremental changes that focus on altering women's behavior to, say, make them better negotiators.
So this section is focused on the increased need, kind of the exhortation, for all industry, but especially the technology industry, to be more diverse, to create new positions in DEI and make those C-suite positions, in other words, reporting to a CEO as opposed to kind of being off to the side, while at the same time using technology itself to make people more responsible for the kinds of racial and misogynistic problems that they have and seeking to solve those problems through technologies that are ultimately apolitical and exploit the labor of women of color. And I mean the labor of women of color both as actual women working in companies, but also as avatars of women of color. Sara Ahmed defines diversity work as public relations work that creates what she calls a, quote, diversity culture within an organization in order to protect it from the injurious charge of racism; because, quote, the diversity world is shaped by the failure of diversity to bring into existence the world it names, it contains a double bind within itself. In its current form, it domesticates critique and renders it liberal again. As Barbara Christian writes elsewhere, well before the internet, women of color serve a surrogate function at the university. What they do is quickly made into not-labor, in other words, if you're a creative writer, that's not seen as legitimate scholarly labor but rather as art. So as soon as you are brought into the institution, what you're doing is not only made not-labor, but it's interpreted as an institutional threat. In other words, the thing that you were brought in to address now becomes the thing that you are seen to produce. Similarly, digital technologies have continually positioned women of color as threats. So women of color are threats to non-unionized and cheap labor, because they are that labor.
Threats to the grow-a-thicker-skin culture of online gaming, which has governed that industry for a long time; threats to free speech rights that supposedly conflict with hate speech laws. Yet at the same time, women of color have used the digital to build their identities, again, as such, claiming that term specifically. In 2011, Google Trends saw a massive uptick in searches for that particular term, which doesn't index anything but interest in that term on that particular platform. But it's still really interesting that that was the moment, right around the moment that Facebook became a mobile technology. In other words, when social media becomes actually mobile, it goes out into the world. You see this shift, too. Yet at the same time, women of color have used the digital, as I said, to build their identities as such, an enterprise that produces great value for the tech industry specifically, which is plagued by diversity problems to an extent that other industries are not, and as an intrinsic condition of computational power's evolution. And I've written about this before; this is a chapter in this book as well, how Navajo women's labor was foundational, as was the labor of lots of other women of color in Malaysia and other places where women are cheap and set to the side of what industry thinks it's doing. So the aesthetic and technological innovations of marginalized people are and have always been a wellspring of critique in the midst of impossible circumstances, an invitation to think otherwise. And I credit Grace Hong's book, Death Beyond Disavowal: The Impossible Politics of Difference, for this. Reading digital history through women of color's labor allows us to learn about, to remember, in other words, to put people in their stories, to kind of add limbs to this body, to make material what was made immaterial, to populate stories with specific images, narratives, scripts.
While digital solutionism attempts to solve social and economic crises by incrementally adjusting or changing features like algorithms, gathering different data sets, or even by making the way they work more transparent, giving more information about how processes do what they do, in the end, these approaches all reproduce the problems of computation as governance. And as I mentioned, I see that Louise Amoore is working on this. I propose instead that we engage in acts of critical and creative memory and recovery of precisely those people who have always been excluded from the category of digital innovator and holder of capital: women of color. Examining their specific contributions to infrastructures, platforms, and content enables us to speculate a different trajectory for the internet. In a chapter from my manuscript, Racial Digital Capital: Women of Color, Digital Labor, and the Internet, I extend the concept of identity tourism, which I developed in the 90s to describe the use of racialized digital avatars as prostheses for white users to enjoy the benefits of cross-racial passing and sympathy. The post-2020 pivot towards the metaverse and the rebranding of Facebook as Meta draws upon previous global village images from the 90s and 2000s to describe a new cosmopolitan and multiracial post-racism. In this chapter, I argue that the advertising campaign for Meta-Oculus' Open Your Eyes, which aired in movie theaters and has been viewed many times on YouTube, takes you into an immersive experience where you stand with Black women in a candlelight vigil mourning the death of, probably, a Black person at the hands of the police. It shows you beating a drum with an Indigenous man and chanting. It extends the opportunity for embodiment and creates a diversity-surround immersion with Black and Indigenous people. What I do in this chapter is critique the idea that, quote, feeling and care, end quote, as ends in themselves do anything more than limit the points of possible action.
In a wonderful article that came out not too long ago about the ruse of repair, Patricia Stuelke talks about this as inextricable from US imperialism and neoliberal racial capitalism. These projects model how immersive and augmented reality applications position digital experiences of racial pain and conflict as edifying and diversity enhancing. So there's more, but I'll leave it there. Thank you. And our next speaker is Ray Fouché. All right, excellent. I hope everyone can hear me. I will start sharing my screen. I think I'm going to go maybe in a different direction from Lisa a bit and really think about answering the question of the future of race, disability, and the digital. And I'm going to talk from my position where I sit right now. And I think for the folks who are on this Zoom space and are interested in questions about race, difference, diversity, inequality, and equity, and various forms of information technology and digital space, I would say that the future is here. And I'll try to get to that question; I'll think about it in a more broad way. Well, I shouldn't say a broad way, but a different way. So, the meaning of "the future is here": 2022 has been an amazing year, right? April 7th, we get the confirmation of Ketanji Brown Jackson to the Supreme Court. And right, that's an important marker in thinking about race and difference and equality, having a Black woman on the Supreme Court. But stepping down from that level, there are also things that have happened that have changed the landscape of how we potentially will think about race and difference. May 10th, a few weeks ago, Lisa Cook became the first Black woman appointed to the Federal Reserve Board. It's interesting that Lisa Cook's work is on Black innovation and invention in economic context. But what's also happening, moving down from Lisa Cook, is that you have Dr.
Carlotta Arthur, in January, named the Executive Director of the Division of Behavioral and Social Sciences and Education at the National Academies. And similarly, you have Alondra Nelson, on February 17th, appointed the Director of the Office of Science and Technology Policy. And as well as myself: on February 28th, I became the Director of the Division of Social and Economic Sciences at the National Science Foundation. So what this all means is that this is a unique moment, a deeply unique moment, for people of color in government service, and specifically in setting government-level policy and questions and conversations that not only can insert commentary about diversity, equity, inclusion, and technological spaces into existing policy, but can actually, literally shape it and have the funds to make it happen. So this is a fundamentally unique moment, and timing is everything. And the first question I have, now that we have this future and it's here, is: what do we do? And I'm excited to be part of the DISCO Network, because in many ways, Lisa bringing this group together has put us out ahead of the curve. So DISCO is at the forefront of this conversation about how we think about the digital future and the ways in which people of difference and color can fit into this conversation. But as we move forward in this kind of collective memory of our effort, we have to remember that rhetoric, language, and terminology are key. And so I will speak a bit from my seat at the National Science Foundation, and the terms that I hear floating around regularly, from information to data to equity (as I say, minus the D and the I), to the language of evidence and co-production. And so for me, what does it mean to get scholars who are interested in the relationship between techno-scientific infrastructure and diversity, equity, and inequality to produce research and scholarship that can make its way into policy-level conversations?
And at this moment, we have a president and multiple people in government service who are willing to listen, and they have their ears open. However, we have to think about how we can be very proactive in shaping that language in the way that we want to, to make people understand what we're really thinking about. So I'm intrigued with the language of equity being deployed regularly without diversity and inclusion. What does it mean to strip out diversity and inclusion when we're talking about equity? And I think people understand what that move is. So how can we remember to bring that conversation back in? And for someone like me, I see the word evidence a lot. How do we produce evidence? And I'm interested in using the flexibility in the term evidence to think not just about quantifiable data; evidence can come in multiple different forms. And how can social scientists, humanists, other scholars, and artists bring and open up the language and understanding of what evidence can be and how evidence can shape our contemporary moment and moving future? So I think it's important to think about that. And more recently, I'm intrigued with the language of co-production, because co-production, from my background in science and technology studies, is a very fascinating way of thinking about the production of scientific and technical knowledge. And there have been some fascinating things that have happened just very recently. Literally a couple of weeks ago, Alondra Nelson presented a talk at the National Academies titled Co-producing Knowledge with Communities: Equity in Federal Research Programs. And what's fascinating is that I think about co-production in one way. However, the climate scientists who were funding and organizing this event really are thinking about co-production in a potentially very narrow way, of adding people of color and difference to the studies that they do.
However, how do we get researchers and scholars, people, say, in the DISCO Network, to produce research that will continue to open up that conversation, that the co-production of knowledge with communities is a continually iterative process? It's not adding one piece to the puzzle. So as I put it, oftentimes the language is that the communities are just sprinkles on the cupcakes of knowledge production. And what I'm interested in having the conversation about is how we, in a sense, cook diversity, equity, and inclusion into the batter of the cupcake so it no longer can be flipped off. It's interstitial to the actual working of scientific and productive knowledge. And I think there's a moment in time, specifically this moment in time, when we can make that argument and leverage this conversation and open up that discussion in broader ways. And some of those conversations are having, I would say, real fundamental impacts. For instance, this landed in my email this morning from my NSF account, highlighting Asian American, Native Hawaiian, and Pacific Islander Heritage Month. And I'm sure we've all seen these messages sent out to us at different moments in time. But what's probably hard to see is that if you go down to the third paragraph, it says, NSF is taking a comprehensive approach to mitigating such barriers by investing in research to understand, address, and end bias, discrimination, and xenophobia. Right. That's a powerful statement for, I would say, a conflict-averse government agency to put out in front of the discussion. Right. And it notes that NSF is currently investing in more than 100 grants across the country to help reduce discrimination and violence experienced by historically underrepresented groups, down to the level of bringing together Roli Varma and others to have this conversation.
So part of the idea for me is: how do we leverage what the DISCO Network has begun and actually provide research, scholarship, questions, and ideas to these government agencies, and not just, again, be the sprinkles on the cupcakes, but be built into policy and strategies and ideas about how this can happen? So my last comment is that time is of the essence. The midterm elections are coming soon. And this is a moment where there is a cohort of people who are invested and interested in having these conversations and hopefully creating long-standing, fundamental change in the ways in which government agencies fund research and support people of color and difference. So I'll end it there. Oh, please. I'm trying to give you the clap emoji. I'm sorry. Okay. Well, thank you all. Stephanie. Yeah. Hey, everyone. I believe that I am up next. So I'm just going to get my share up and then we will do a little talk. It's really great to be a participant in this discussion. I did not know exactly what I was going to talk about, but listening to Lisa and Ray has left me in a place that I think will be good. So Ray was talking about how the future is here, and what it immediately made me think is: yes, the future is here, and the future's future is upon us in a minute as well. And for me, that's a question of what do we do with it? How do we start to sculpt it? How do we craft it? And so I'm going to give you a really quick, kind of weird insight into some experiments I've been doing lately and some thoughts I've had. So what you're looking at here is a self-portrait of me as a UV map, which means the image data that would make me a 3D character or a virtual character. I realized lately that I've been almost referring to myself as stephaniedinkins@gmail.com. That feels to be my name. And so it feels as if I am being slowly pulled into this realm where I am digital, like I'm being assimilated into it, sometimes very consciously, sometimes unconsciously, just through use.
And the question for me is, well, what is it doing to us, and what can I do to it? And so the first experience, or experiment, I'll share with you is this image of a Black woman crying. The image was made with a text-to-image GAN, so it's an algorithm, a generative adversarial network, where you put in a text prompt and it produces an image. And this is what "a Black woman crying" looked like in 2017 using a system online called Runway ML and the data that they had available, right? So that's the illustration that it came up with. This was really an intriguing moment for me that started me thinking about, well, if this is what a Black woman crying looks like to the system, what do we look like in general? And how do we start to teach the system to see us a bit better? Same here, "an African-American woman smiling." I will tell you that I switched from Black to African-American trying to give the system a little space to function. Like, I'm trying to help the system come up with better images. And clearly, about a year and a half later, it does a better job. And this is from Google Colab, and I tell you where they're from because anyone can go online and play with these. This is just a slide of the system trying to put together these images and mold the data that it has into an image, this one of "an African-American woman smiling." And there it goes, playing, and it just goes through thousands of iterations of this. This is "a dark-skinned African-American woman smiling." And you can see how the more specific I get, the less specific the imagery gets. Like, I get these abstract images that don't quite have any basis in what the representation would be. And in fact, when I ran a few others of these, it seemed as if the images were coming really from porn sites, right? So we're being constructed out of porn. Again, you can try this on your own and see what happens. This one you can do on Google Colab. But it made me think a lot about data sovereignty, right?
And if we're thinking of data sovereignty, that's a group's or individual's right to control and maintain their own data however they see fit, which includes the collection, storage, and interpretation of their data. I've done projects that kind of work towards this, but at the same time, I don't believe it's possible, right? You come to a point where you're like, well, my data is going out there. More and more things are being based upon whatever the data is. What on earth do we do to start to make it something that protects the future's future, or makes the future's future what I think about as caring, right? And so the last project, or maybe last two projects, I'll share with you is Binary Calculations Are Inadequate to Assess Us. This is me trying to make a system that allows us, all of us, to start to tell or create data sets that are full of information that understands us: the nuances of who we are and what our cultural accoutrements are, what's important, how we're defining ideas around what our communities are, what our collectives are. And Ray, I'm going to be thinking about the idea of community and collective a lot going forward. And this is the app itself. You guys can download it if you want. It's a participatory app that asks you questions like, how do you define care? If we're going to think about ideas of care, we need to have a definition that we kind of agree on. The question becomes, how do we get to it? And for me, it becomes a question of kind of digital democracy, where we put a tool out there and allow people to answer in any way they see fit. In this app, we've tried to make it pretty open, where people can talk to it, they can type into it, they can take pictures and submit them. And we ask questions like these: How do you care for Black women? What would be the best way to make sure all of your neighbors are cared for?
And then, again, we will use algorithmic means, because I think those are going to be the ones available, to start to coalesce that information and spit out some, I'm going to say, quote unquote, answers to those questions so that we kind of have an idea what we're thinking. And then we're trying to make a data set, right? This data commons that has images and text, like a data set that researchers can use within their projects, that has images that are well explained, and explained with nuance, and text that is well explained, and explained with nuance. And the reason I think this is important is because a lot of times when people are doing research, they're using the same old data that they pull off the shelf, or they're putting together really small data sets that are not big enough to do the work. And I really think that our data set will be wonky, because we're collecting and letting people express themselves how they do, not the way we want them to. It's going to be hard to mold that into something usable, but we're going to do our best to mold it into something usable, to model that it's possible, right? And I always think that I do that through my art practice, making weird models of things. And if you guys want to download the app, you can use those QR codes on screen to download and submit. But we're also trying to append to those data sets that people are already using, right? So if you need data sets that are huge to do things, and if they are inadequate, can we amend those data sets with other data that fleshes them out to be broader, right? And to me, this is a question of, can we create data sets of care and generosity? And what does it take from us, sadly, right, to do that? Like, what does it take in terms of labeling? How do we put things together? What do we do, right? And the what do we do?
How do we do it becomes the really important question for me, along with the experimentation, and I'd really prod y'all to do some of your own play in the space to see what actually starts to come out. And I'm going to read you this quote that I love from Amiri Baraka in 1970, and this speaks both to what we can do and the loops that we get stuck in, right, and thinking about how we come out of those loops. So Amiri Baraka said: machines, as Norbert Wiener said, are an extension of their inventor-creators. That is not simple. Once you think machines, the entire technology of the West is just that, the technology of the West. Nothing has to look or function the way it does. The West man's freedom, unscientifically got at the expense of the rest of the world's people, has allowed him to expand his mind and spread his sensibility wherever it could go, and so shape the world and its powerful artifact engines. The question becomes, how do we shape our own powerful artifact engines? And had I time, I would share with you Say It Aloud, which is a piece that has a little rant in it about who we are. But really, it's thinking about how we rediscover ourselves anew. Can we create a headstrong, self-determined, and changing memory? Can it help us transcend entrenched supremacist systems to create something supportive and entirely new? And the things that we want, at the bottom of our hearts, versus the things we're fighting against, is really where I think this becomes most important, in my estimation. Thank you. And I will stop sharing. Thank you, Stephanie. Up next is André Brock. Please grant browser access to screen recording. One second, please. My bad, y'all. Y'all just got to live without slides, because I just don't have the patience. All right. So good afternoon, everyone. My apologies for the technical difficulties. I am having Zoom troubles, as those who were with us this morning may have realized.
I am going to struggle to keep up with my previous three co-PIs, who dropped some bombs on y'all. And I just want to go ahead and give you a few thoughts on what I'm planning on doing, given my participation in this project, and with the fantastic graduate students and other people we'll be working with over the next couple of years. My vision for the future is a project I'm calling the Black Mundane. And I want to talk about the intimacy and the interiority in the Black archive. What do I mean by the Black Mundane? My vision for the future is a publicly accessible archive of the libidinal economy of Black mundane Twitter. Not the celebrities, not the politicians, and definitely not the influencers, right? But the people whose everyday contributions accrete individual spaces to build out a living archive of Black discourses and heterogeneity. And I was particularly taken by a question that Ray Fouché was called out about in this morning's session, one that Ray and I have had discussions about as well. And his question, paraphrased, is: what makes the Black digital Black? And it's something that has been of interest to me since I started doing this work. And I think it's something that needs to continually be paid attention to as we move forward with this internet research that we all do and love. I wish you could see this slide, because on the back of this slide, I actually have a picture of the Great Value Juneteenth celebration ice cream that Walmart is issuing to celebrate the upcoming holiday. It is a combination of, I believe, swirled red velvet and cheesecake ice cream, and I want to say watermelon, but that would just be too funny, right? And this is Walmart's intention on how to commodify and commercialize what Blackness is and means for this country.
And the reason why I found this image captivating is the responses that my timeline had to this fuckery that was about to be foisted upon us as we approached the combination of Juneteenth and Father's Day, right? And so my argument for the Black digital draws on something that I wrote in my book, right; again, at Ray's prompting, I wanted to be sure that I was explicit about what Black is, and so how does the Black digital become Black? And in the book, this is Distributed Blackness, which some of you may have, and if not, I'm giving it away for free, holla at me, I ask: an ethnic group is not one because of the degree of measurable or observable difference from other groups. It is an ethnic group, on the contrary, because the people in it and the people out of it know that it is one, because both the ins and the outs talk, feel, and act as if it were a separate group. And for me, this definition maps precisely onto the ways in which online identity is constructed, contested, and deconstructed through online discourses, mainly but not limited to text, multimedia, televisual, or other user-generated content, such as Twitch streams, right, or Discord servers. More important, this dialogic formulation of the discursive, affective, and performative aspects of ethnic identity is also a powerful conceptualization of racial identity, and it is powerful precisely because we are able to identify and operationalize the pervasiveness, for Black folk, of racial ideology's effect on both ingroup and outgroup members. Thus, this definition accounts for beliefs that are evoked in everyday life in ways that are occasionally outrageous, sometimes ratchet, and always problematic for both ingroup and outgroup members, but also in ways that are mundane.
Think of the repetition of certain topics in your social media spaces, topics that serve to remind us how people are often coming new to the context in which we build, discuss, and enact our performances of self, right, and how we have to rope them in, to help them understand what it means to be Black in these spaces, or whatever your particular ethnic group is. Finally, this discursive definition, this dialogic, dialectical definition of racial and ethnic identity, or how both in- and outgroup members talk, feel, and act, complements the philosophy of technology concept that I use throughout my work to understand that technology is an artifact of practice and belief, right? And so the artifact, for me, symbolizes the space in which people can talk; the practices that are enabled by that artifact symbolize how people can demonstrate interiority, or how they feel; and through both of them, people act to construct, deconstruct, or maintain their identities. So the last piece of this: for those of you who may be familiar with my work, you know I often talk about libidinal economy and Black technoculture. So arguing for the Black Mundane, for me, draws from my libidinal economic framework, where I focus on moments of joy and repetition as opposed to trauma, resistance, liberation, or innovation. From a libidinal economic perspective, joy offers energies that help to redefine labor, whether emotional, capital, or otherwise, as effort, right? To understand play as a necessary resource that affords us the navigation of the desiccation of modernity, right? We can't, I mean, we may try to resist it, but we're still always going to be enmeshed within it. And finally, to avoid the capture of our humanity as a standing reserve for the digital as content. So Black technoculture and cyberculture, I feel, offer us a window into Black interiority that was previously only available through our understandings of the arts.
Afrofuturism is a notable space here, but I would also add the blues and rap music as ways to understand Black interiority that become immensely popular because of their rhythm, repetition, and visceral impact, right? That is, the manner in which we conduct ourselves in these digital spaces, the ways we enact and archive and retrieve (for any of those who have been on the other side of an "is this you?" comment), or even renegotiate the context in which we live in order to build and perform acts of selfhood, are made available through Black technoculture in ways that afford me the possibility of thinking through this as an archive. So I see this project, the Black Mundane, as leading to multiple curated archival spaces, each centering on specific technologies. And I was inspired by some of the talks we had this morning, where our different graduate students were talking about TikTok, talking about Instagram, talking about building out their own apps, talking about online communities that happen in the comment sections of the aforementioned, or Twitter or Facebook and the like. And so, centering on specific technologies where articulations of the Black Mundane are highlighted through one-off pronouncements of agency, viral tweets, and hashtag activity, right? Think of the Vine archive. I want to avoid, 20 years from now, a recollection of tweets such as Meet Me in Temecula, or Thanksgiving with Black Families, or bullying white women into acting like Karens, right? I want to avoid a recollection of those signal tweets only through surviving embeds and BuzzFeed articles, or through video captures. I want to build a living, accessible archive of Blackness, visible both for future generations and for the people who have just found their way online, the new digerati, in order to preserve a compelling vision of Black life and futurity. Thank you very much. Thank you. Our next presenter is Remi Yergeau. Hi, this is Remi Yergeau.
And just, if I could get a quick thumbs up that my slides are indeed showing. Awesome. Thank you so much. I'm excited to join this conversation and looking forward to learning more from folks here this afternoon. So I'll begin by narrating the image that's on my slide. Also, I don't know, Jessica, if you're able to paste into the chat, or Eric, I could do this as well, a link to the full text script, in case that's helpful. So the image on my slide hails from Joel Shaul's Autism Teaching Strategies website, and it represents one of many different kinds of so-called self-control meters. This particular self-control meter is a volume chart, which showcases a range that runs from too quiet, hard to hear, to the all-caps, giantly fonted, too loud, exclamation point. In between are grayer shades of loudness: almost loud enough, just right, and getting too loud, which is also in all caps. This chart is intended not only for use in special ed classrooms, but it's also used with disabled adults in supported employment and social integration programs. Generally, a teacher or health provider will use a paper arrow, whee, I just made the arrow spin, to make the volume chart appear more dynamic, updating the volume scale in real time as their disabled clients speak. As a young adult, I was referred to a social skills program at a local autism clinic to, quote, unquote, practice my social interaction. And as part of the program, I attended a book club that met weekly at a Barnes and Noble and was moderated by several social workers and undergraduate students. And here, then, was where this self-control meter and its demonic paper arrow made my acquaintance. For much of my life, the bane of my existence has been commentary on my voice. I'm too soft-spoken. I'm talking too quickly. I'm using a monotone. I'm being obsessive, on and on, blah, blah, blah. And my experiences in this particular social skills program were no different.
So we might think about this particular meter, along with other disability management tech, as a form of social access, which is emblazoned on my slide in blue block letters. Social access is the idea that social inclusion of disabled people requires techno solutions that mimic and extend the data-driven surveillance practices of psychiatry and other medical institutions. And drawing from work in disability studies and critical access studies, I'd argue that social access is a so-called version of access, to cite Casey Boyle and Nathaniel Rivers, in which neurodivergent and mad people are positioned as lacking social capability, or as needing intricate social support in order to participate and remain legible as citizens. So social access, then, is a version of accessibility that understands the so-called solutions to social impairment in mechanistic, monocultural, rigidly gendered, and technologically deterministic terms. Take, for example, researchers who back in 2005 posited a so-called emotional hearing aid, pictured here in cartoon form: a drawing showing a wearable device, mounted on a user's head, that's recording and interpreting others' facial expressions. What's also notable in the cartoon drawing: the characters appear to be light-skinned and masculine of center. So this emotional hearing aid, they're positioning it as a technologically driven solution for enabling neurodivergent children to interact with neurotypicals and, quote, automate the process of empathizing, end quote. So, some questions here. First, who does such a design actually serve? How is the labor of empathy both feminized and culturally enacted in this framework? Which data sets, so faces, bodies, spaces, inform the device's construct, or supposed construct, of what facial expressions might mean in a given situation? On whom is the burden of access placed in these encounters? How does this version of access understand, and ultimately create, social bodies?
So social access, in other words, to go back to the previous slide with the block letters, typically unfurls as desires toward automated rehabilitation in motion, desires that are heralded by numerous designers, educators, clinicians, and sometimes even disabled people themselves. And so too are these desires deeply nested in cultural attunements toward race, gender, and sexuality, in which being socially appropriate mirrors interactional models that value whiteness and cisheteronormativity. Take as another example this screenshot from the App Store page for the Shine Center's Social Skills for Autism app. The image showcases a conversation between a human and an alien named Kluge, and the alien's back is facing the human. The following breakaway quote falls at the bottom of the image: always face the person you are talking to. So, first, alien tropes frequently follow autism. And as one moves through the app narrative, various alien characters demonstrate their supposed social skills learning by making claims about how to form friendships, demonstrate enjoyment, and refrain from interrupting others. The image on the slide is a classic example of how autism treatment enterprises, under the banner of social inclusion, forward Western and whitely logics around eye contact and social engagement while simultaneously decontextualizing the fluid nature of social interaction. The snarky part of me just wants to say things like, so what happens when you're standing in back of someone in line? You're not allowed to say anything to them? Right? There's a total flattening, an always mandate, in these sorts of images. Numerous bloggers and activists have also commented on social skills and the white cishet normativity of these lessons, including a very dominant genre in the autism world called social stories. And I'm especially indebted here to the thinking of folks like Kassiane Asasumasu, Ruti Regan, Kerima Çevik, and Iash Kanazi, among others.
So there's much to say about social skills tech, not to mention the ideology behind self-control and clinicians' supposed ability to judge and measure it. In drawing our attention to social access, I'm offering less of an argument, I think, and meditating more on a series of stories about possible digital futures, and on what I hope are ways of thinking through, together, what it means when compliance is positioned not only as a communication value, but as the pinnacle of recovery and wellness. If we want to think about other examples of control metering in motion, we might call to mind disability accommodation forms and the accompanying accommodation platforms that are often used in university and workplace settings. The workbook on supporting students in higher ed with Long COVID that Lise Lao and Sarah Hughes analyzed this morning waves toward such examples, and it's likely that the forms and systems used at your institutions do as well. On this slide are two images from Microsoft's disability hiring initiative website. One is a description about disability hiring at Microsoft, and I'll just draw our attention to the last sentence there: we host Ability Hiring events, that's Ability Hiring with a capital A and a capital H, and inclusive interviews, train and educate our teams on disability etiquette, and provide interview accommodations to increase the diversity of our teams and positively impact the culture of our workforce. I'm always particularly struck by how disability etiquette is always framed in relation to the accommodation process, taking up again these appeals to social access and stock forms of inclusionism. And then there's the form itself, which is sort of this nameless Google Form type thing. I'm sure it's not a Google Form, because it's Microsoft, but you're left to wonder where on earth your information goes once you throw it down the wormhole. And there's a phrase that says: rest assured, your request has been entered. Rest assured.
So, following Jay Dolmage and Aimi Hamraie, we might think about accommodations and social access as additive retrofits rather than reimaginative frameworks. And we might also ask the following questions. What, and whom, do we lose in compliance-based theories of communication and relationality? If mad and neurodivergent selfhood is entirely based on compliance and accommodation systems, down to the intricacies of controlling our voices and our prosody, then what are we to make of how neurotypical people understand us? And if I were to narrate some ultimate goals for the Digital Accessible Futures Lab, they would include highlighting how much of what is construed as accessibility for mad and neurodivergent people is not the radical, coalitional, or liberatory politics espoused by disability justice frameworks, even if social access often tries to borrow the language of those frameworks. Social access tech, if we were to so call it, harbors the residues of all manner of psy systems and institutional control. In the realm of possibility, I hope we can meditate on a more optimistic set of goals, following the work of scholar-activists such as Leah Lakshmi Piepzna-Samarasinha or Julia Rodas, lurching toward a perverse rhetoric of the asocial frenemy, where neuroqueer affect, bluntness, abruptness, disinterest, pedantry, and awkward silences might pose affronts to normatively human and metallic companions alike. So I will pause here, and thank you. Thank you. And our last presenter is Catherine Knight Steele. Okay, it says I can't start my video, so I'm not certain if someone who's hosting needs to undo that. Otherwise, I'll just talk from a blank screen, which also works. We're trying to figure it out on the back end, Catherine. All right, here we go. Thank you all. And I recognize that we are late and on time, and so if folks have to drop off because life calls, please know that my feelings are not easily hurt these days.
And I will charge it to the fact that we all are living very busy lives. I want to start by thanking, actually, Dr. Gareth Williams for the inspiration for this talk. There is really nothing better than sitting in on a student's dissertation defense and truly learning new things, which I had the privilege of doing in the last few weeks. The newly minted Dr. Williams is a public relations scholar in my department who recently defended his dissertation on the topic of wicked problems. My colleagues in public relations, I'm in a communication department, focus their energies on identifying and finding resolution to communication challenges for various publics. So what that might look like in real time could be something like: how do we best engage with citizens about impending disasters? How do companies create trust? How do governments who have important health information to get out to their citizens actually do that? How do corporations who mess up on things like their public relations campaigns find ways to talk about it? And how do celebrities like Rihanna manage discussions about the birth of their son on social media? Dre, I told you I would manage to get a picture of Rihanna into my talk. So wicked problems, as Dr. Williams explained to me and to the rest of his committee, are the intractable challenges in society for which there aren't real solutions. His work focused on what exactly it is that communication managers are supposed to do when they're faced with those kinds of problems, things like COVID-19 or cybersecurity, things that the communication managers themselves within a corporation can't resolve, but are left to deal with all of the problems that arise out of them. So today I thought maybe we would talk a little bit about our discussions of race and gender and disability online in terms of this idea of wicked problems. I feel fortunate to go last today, because I get to listen to my colleagues and learn.
I got to learn more about digital capitalism, and about creating more fulsome data sets through projects like Stephanie's Binary Calculations, and about the possibilities of archiving digital Blackness. Dre, I have thoughts. We should talk. And about the very relevant kinds of techno solutions that Remi just walked us through, all of the real and practical harm mitigations that we hope our research attends to. Race and disability are not themselves the wicked problems. Rather, it is the racism and the discrimination that function as the wicked problems we wish to address. But what does that look like? How do we as scholars of digital studies approach this notion of wicked problems? Now, one version of that would be to look at digital culture as inextricably bound to wicked problems, and that leads us to a rather pessimistic vision. Wicked problems like white supremacy and techno-ableism are intractable. They can't be solved. They will always exist. But then we have this thing called the DISCO Network, the Digital Inquiry, Speculation, Collaboration, and Optimism Network, and the scholars that make up this network, I think, are rather unique in our vision of a network based on collaboration and grounded in optimism, not naive about the existence of said wicked problems, but focused on how we might use tools of both history and culture to create a pathway forward. Let me show you a little example of something that came to mind as I was thinking about that history and pathway forward this week. There's a thread that I saw on Twitter from AJ Christian, who was talking about something that happened at Netflix just a week or so ago, where we saw the heads of diversity initiatives within Netflix, like Strong Black Lead, all lose their jobs at once. All of these people who had been working to promote diversity within this streaming corporation were no longer needed, or so it seemed, by this corporation.
And what Christian says here is that the post-Netflix-purge moment we've reached is the next phase in Hollywood's diversity cycle. Now that a new technology, streaming, is no longer growing, people of color and queer people are becoming less fashionable. He goes on in this thread to explain how this actually isn't something new; it points us toward a relatively recent past phenomenon in television studies that could have made us more aware that this was on its way. As he pointed out back in 2010, this cycle began to happen in the early 2000s, when networks like the WB and Fox systematically cancelled all of their Black-led shows, right, and it moved to cable networks, who then went through a similar phase-out of Black hosts and Black-led shows. And for me, a thread like this might seem a little disconcerting. Oh, this just happens over and over again. We find ourselves in fashion and out of fashion. We find our things used and appropriated, and we get paid for them, and then they get taken away. But it's wildly optimistic to me, because it's further proof of how collaboration can actually change our path forward, by pointing to the past, and specifically pointing to the work of race and television scholars. It turns out we always had a blueprint for what would happen with streaming, because we accept that there is a wicked problem at the core of this, which is racism and hyper-capitalism, that will indeed lead us repeatedly down the same path. We actually know what is coming, and perhaps have a better sense of how to navigate through it. So it's history, attention to documenting it as it happens, and learning the key lessons from our communicative past, that reminds us our present challenges sit within a legacy rather than popping up all at once, unexpectedly.
We don't have to view every new challenge as something brand new based on brand new technologies, because ultimately, we create our communication technologies to solve for the same challenges we continue to face as a society, and ultimately we keep reproducing many of the same intractable wicked problems within our new inventions. So I think by focusing on wicked problems, we learn several key lessons. The first is how important it is to actually identify something as either the root problem or a symptom of a larger wicked, intractable problem, because so much of what we do ends up focusing on the symptoms. And that's actually just fine, so long as we name them as such and don't claim that these temporary solutions for individual technologies are actually solving the larger wicked problems within our culture, problems that would require much more than one study or one set of researchers to deal with. We can reduce harm from things like surveillance and online hate speech, but often, when we're focusing on these symptoms of larger problems, hyper-capitalism, racism, patriarchy, we're not naming them. We're only looking at what we're seeing, when we should rather be looking at what we're doing as a small note in our long history: how previous societies have dealt with this, or how our society has dealt with these intractable problems before. How have we found success? Where have we met pitfalls? How might we not waste our time replicating the challenges of the past, and instead look toward more inventive solutions? And what places might we actually look to for those solutions, as a guidepost toward our digital future?
Now, since I began this talk by shouting out the social scientists in my department, who study things like real problems and real solutions and how they might marry those things together in their research, I'd be remiss not to end by reminding you of my humanistic colleagues as well, whose efforts I think best direct us toward the long arc of the human condition, the cultures and the histories, particularly of the groups that are most harmed by wicked problems, their resilience, their creativity, their ingenuity. And I think that can and should give rise to a collaborative optimism about our digital future. Thanks so much. Thank you, Catherine. These are amazing talks. I say that having given one; I know it sounds kind of immodest. But if you are interested in keeping up with the kind of research that this network is doing, please watch this space. We have two monographs we're going to write as a large group; one of them is going to be finished this summer, and the next, next summer, as well as each lab producing its own work, with its own fellows and its PIs, that will be coming out, too. So thank you very much for attending. This was magnificent, and have a good rest of the day.