And the International Communication Association. All right, it is 5:00 on the dot, and I want to give our speakers every single second possible. It gives me great pleasure to introduce our final keynote. At the end of the day, it seems we've saved the best for last; we've had a day full of bests, and here is our final set of bests, with the keynote on artificial intelligence and the New Jim Code: professors Ruha Benjamin and Meredith Broussard. And with that, I will turn the floor over to both of you.

Thank you. Okay. Thank you, Charlton, and thank you everyone for coming today. This has been a very intellectually stimulating day. It is my great pleasure to be here with Dr. Ruha Benjamin, who is an associate professor of African American studies at Princeton University. She is the author of People's Science: Bodies and Rights on the Stem Cell Frontier and a new book called Race After Technology: Abolitionist Tools for the New Jim Code, which you should get, because it is amazing; it is available for preorder now and comes out in early June. She is also editor of a new book called Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life, which comes out this week. So it is my great pleasure to be on stage with Ruha.

I get to introduce Professor Broussard, but I have to say, as I introduce her, that unfortunately her book is already dated. I would love to do a plug, but unfortunately she's now an associate professor of journalism at NYU, so we need a Naomi Wolf moment and a recall of the book. Professor Broussard is at the Arthur L. Carter Journalism Institute at NYU and the author of Artificial Unintelligence: How Computers Misunderstand the World. Her research focuses on A.I. and investigative reporting, with a particular interest in using data analysis for social good. A former features editor at the Philadelphia Inquirer, she has also worked as a software developer at the MIT Media Lab. And with that, I'll get us started with the first question.

One of the things that's really striking about Artificial Unintelligence is the balance between tech enthusiasm and tech skepticism. There is a line that I love in the book that captures this balance. You say: as a car, the Tesla is amazing; as an autonomous vehicle, I am skeptical. So tell us a bit about how you strike that balance, as someone who writes code and is engaged with the entire tech industry, and how it works together in your analysis.

I actually open the book with a story related to this, about a time when I was little and had a robot-building kit. That was my introduction to the fact that we can build things with technology, and that what we imagine the technology is going to do is not necessarily the same as what the technology actually does. There is a big gap between our imagination and reality. I took this knowledge into my work as a developer, because often I would write code and imagine the code could do something, but once I implemented it, it couldn't live up to my expectations. This is something we really need to confront as people who make technology: we need to not get out over our own skis. I explore this in the case of autonomous vehicles, which I write a lot about in the book, because it's an important technological issue as well as a social justice issue. The makers of autonomous vehicles are focused on the fantasy of autonomous cars, and they're not giving enough attention to the often disappointing realities.
So it is absolutely possible to keep building technology in the face of constant disappointment. I mean, this is actually how you build code. You have to be okay with constant failure, basically, because nothing ever works the first time you build it; you have to bang your head against the wall, and it feels amazing once you stop. But this gap between imagination and reality is something we need to be more honest with ourselves about when we're thinking about technology and society.

So Ruha, I want to ask about your work. Your work focuses on science, medicine and technology. How did you come to this interest, and how do you feel it intersects?

The short answer is that when someone tells me not to do something, it makes me just want to do it. When it comes to all of the things that sociologists study, I found when I was an undergrad that there was a black-box exceptionalism around science and technology that didn't necessarily pertain to other fields. If someone is studying economics or politics, people don't stop them and ask, were you a politician before this, were you an economist? The assumption is that you can have some kind of access without being trained in that arena, which we don't grant to science and technology. I was interested in that exceptionalism and in breaking through it, sort of piercing that bubble. As I moved further into the social studies of science and technology, I found there were lots of people doing it, and oftentimes the way it was framed was looking at the social impacts of science and technology, how it was impacting different communities. I became interested instead in the social inputs, the assumptions, norms and biases that go into technology. Rather than taking those for granted and only studying how technology impacts society, I wanted to look upstream, at what we're taught is inevitable.

This idea of inevitability I think is really interesting. One of the really exciting things about reading your new book was realizing that we were talking about so many of the same things. I wish we'd had this conversation about three years ago.

I know. I will say something about the book: I do see both of our books as provocations, as conversation starters, not as trying to close things up or tie a nice bow around them. So in some ways I'm glad that it's sparking a conversation after the fact.

Very much so. Very much so. And this feeling of inevitability I think is really interesting, because my sense is that people feel disempowered in the face of technology. There's been this narrative that tech is something we have to sit back and let happen to us, that we don't have agency in deciding what technologies get developed or what the interventions or invasions in our lives are.

I think there are two sides to that techno-determinism. One is a kind of fatalism, that technology will devour us and destroy us; the other side is equally deterministic about whether it will save us. Both are deterministic ways of thinking that rob us of our agency. And so, just thinking about the crafting of your own book, one of the things that I love is the way that you bring together critical analysis with storytelling.
There was a panel or two ago where a lot of the conversation was about the role of storytelling, both in reinforcing certain kinds of hierarchies and inevitability and as a tool for subverting them. I want you to take us behind the scenes of how you balance story crafting and analysis.

I get to talk about literary craft? This is very exciting. I come from a literary school called immersion journalism, which derives from the New Journalism of the 1960s and is heavily influenced by participant observation. As an immersion journalist, you immerse yourself in a world in order to show your readers what that world is like. I do a kind of immersion journalism for technology: as a data and computational journalist, I write code to commit acts of investigative journalism, and often I'll build code in order to demonstrate something about how technology works or how particular technological interventions work. So in a couple of episodes in the book, I build campaign finance software to try and fix the campaign finance system, which, P.S., is really broken, and it didn't work. I also build artificial intelligence software to try and fix a particular problem in Philadelphia's public schools. The public schools did not have enough books for students to learn the material on the state-mandated standardized tests, and nobody had ever asked the question before: do the schools have enough books? I tried to answer that question, but I couldn't, because the software to answer it didn't exist, and you can't actually do the calculation in your head, because it's just too hideously complicated. The process of building that technology was really illuminating in understanding the role of data in that system, and also in understanding why it is so hard for kids to succeed inside resource-starved school districts. It's not really a data problem; it's a people problem. So building the technology, and talking about how I built the technology, was a way of accessing larger social issues around what we are doing to kids in public schools.

One of the stories that you tell, which you hinted at a few minutes ago, took place in 2007: this harrowing story of you riding in a driverless car. One of the stories that we tell about tech is, just give it time, right? We'll fix the bugs. It's been twelve-ish years. Where are we with that? Would you say that we're at a place where it's reliable and safe enough for wide use?

Not even vaguely. In 2007 I rode in a driverless car, and it almost killed me, and that was in an empty parking lot with no traffic. The technology has come a long way since then, but it has not come as far as the marketers would like you to believe. The next time you hear people saying driverless cars are about five years away, think about how many times you've heard that and how long people have been saying it; they've been saying it since at least the early '90s. So they're actually not five years away. They don't work as well as you imagine, and, in fact, one of the things I've become increasingly worried about with driverless cars comes from something you allude to in your book, which is the inability of these systems to see darker skin. Image recognition systems, facial recognition systems, object detection systems: first of all, they don't see the way a human being does, and the other thing is that they don't detect people with darker skin as well as they detect people with lighter skin.
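To make concrete what "doesn't detect as well" means, here is a minimal sketch of disaggregated evaluation, the kind of per-group accounting that surfaces these gaps. This is not code from either book; the detector, the test images and the group labels are hypothetical stand-ins.

```python
# A minimal sketch: instead of one aggregate accuracy number, compute the
# detection rate separately for each skin-tone group. All names and data
# here are illustrative stand-ins, not any real system's API.
from collections import defaultdict

def detection_rate_by_group(examples, detector):
    """examples: iterable of (image, group) pairs; detector: image -> bool."""
    hits, totals = defaultdict(int), defaultdict(int)
    for image, group in examples:
        totals[group] += 1
        if detector(image):  # did the system register a person at all?
            hits[group] += 1
    return {g: hits[g] / totals[g] for g in totals}

# Stand-in data: a detector that misses darker-skinned pedestrians more often.
fake_test_set = (
    [("person", "lighter")] * 98 + [("missed", "lighter")] * 2 +
    [("person", "darker")] * 89 + [("missed", "darker")] * 11
)
print(detection_rate_by_group(fake_test_set, detector=lambda img: img == "person"))
# {'lighter': 0.98, 'darker': 0.89}
```

An overall accuracy of about 93 percent would look fine on a spec sheet; the per-group numbers are what tell you whom the system fails to see.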
The systems embedded in these two-ton killing machines are not detecting people with darker skin, and I would pose the question: who exactly do we think is going to be killed by self-driving cars out on the street?

And you go into this when you talk about the racist soap dispenser. Are you all familiar with the viral online video of the racist soap dispenser? There is a man with light skin who puts his hand under the soap dispenser and it works, and a man with darker skin puts his hand under it and it doesn't work. So can you tell us a little bit about how this artifact functions?

Sure. One of the things that I try to do in that section is bring together something that seems kind of trivial. In fact, the people displaying this on YouTube are kind of giggling through the clip, if you recall. So it seems kind of funny, it's entertaining, and it doesn't seem that consequential when you just read it as a glitch of one particular technology. But we also have to think about how we got to that point. What are the larger structural aspects of the research and development process that would make it such that that would happen? Those same contexts are also operating behind much more consequential kinds of technologies that are making life-and-death decisions about people: whether it's the risk scores that we heard about earlier, in terms of whether to parole someone or not, or decisions in healthcare and education, or whether someone should get a loan that they really need. The same types of questions, about how algorithms, or in this case automated systems, are being developed and whom they're seeing, are a set of questions we can apply beyond any particular type of technology. At the same time, I think making distinctions is important, and I'm juggling the attempt to make important distinctions. Rather than just talk about racist robots, which is a great headline, and which is how I became interested in this, the headline about, oh, our automated systems are racist, I want to think about how exactly they are racist. What do we mean by racism? We have a spectrum. On one end are technologies that are explicitly attempting to reinforce hierarchies; they're not hiding it, right? We have to make some distinction on that end of the spectrum, technologies that are designed to and are reinforcing longstanding racial, caste and gender hierarchies. Then, in relation to those, and this is what interests me most, there are technologies being developed to bypass bias, that are sold to us as fixes for human biases. I think about those as a kind of technological benevolence, right? Tech fixes that say: we humans are messed up, the judges are racist, the prosecutors, the police and the teachers, and here's an app for that, an algorithm for that. We need more attention on those things that are purportedly here to do good, to address inequity through technology. We have to keep one eye on the obviously racist robots, right? But I think we need more attention on the promised do-gooding of so much technology, and the way we turn to it in lieu of more transformative processes that we should be investing imagination and economic resources in.

And so this system, the way that racism is embedded in everyday technologies, is I think what you're referring to when you call it the New Jim Code, which, of course, is taken from Michelle Alexander's The New Jim Crow.
So can you unpack for us what you mean by the New Jim Code and what's going on in these systems?

Absolutely. This is my attempt to remind us of the histories embedded in technology. I'm invoking a term that was first a folk term, developed to name a broader and broader range of practices around white supremacy, and that got resuscitated through the book The New Jim Crow to describe how mass incarceration acts as a license to legally discriminate and enforce caste hierarchies, extending Jim Crow into the present. For me, the current moment is not simply the next step. It's not that we're over old-school segregation and mass incarceration; it's thinking about how technoscientific fixes continue that containment, and how that's important to understanding the impact of these technologies. So it's a combination of coded inequity, often more subtle than past regimes of racial hierarchy. But the key to the New Jim Code, really the distinction, is that it's purportedly more objective and neutral than those prior regimes. That's where the power and the problematic of this regime lie: we put our guard down because we are promised that it's more neutral and objective, and precisely when we put our guard down is when our antennas should go up, to thinking critically about this. This came up a number of times in the previous panels, so I think there is a heightening consciousness around being really aware of anything that is promised to be more objective and neutral. What we see here is a whole set of practices under that umbrella.

The notion of objectivity and neutrality, and the idea that a machine could be free of bias, is something I really grappled with a lot as I was writing my book, because as a computer scientist, ultimately, when you're doing computer science you're doing math. Computers literally compute, which we kind of forget. Really, it's just a machine and it's doing math; all of the fancy stuff we're doing with computers comes down to math. When you're doing a math equation, it's just an equation. But one of the things that I think is so interesting, and that has happened among the math and computer science community, is that they got carried away with the idea that math was superior.

Yeah.

This is the basis for an idea that I call technochauvinism: the idea that technical solutions are superior to human solutions. It derives from the idea that math is superior to the humanities and social sciences or any other form of intellectual inquiry. And when we look at the people who for centuries have said, oh, math is so superior, and we as mathematicians don't have to worry about pesky social issues, and our work is so highfalutin and important, those people are homogeneous-looking, and they policed the boundaries of their profession for centuries so that women and people of color, for example, were excluded. So it's not necessarily that computer scientists are out there saying, I want to make racist technology, and I don't know computer scientists who are doing that, thank god. It's the unconscious biases of a homogeneous group of people that end up manifesting this way, in thinking, oh, it's just math, it's just computing, and it's going to be superior to social approaches. And it's not true.
One of the things that I get from wrestling with technochauvinism is that it isn't solving all of the problems out there; even when this kind of conversation about equity and injustice is brought up, it looks for a mathematical way to define fairness and equity. And I've seen that computer scientists and data scientists are now being forced to grapple with questions of equity and fairness. How are they doing that, and what do you think about how they're sort of domesticating the critiques that we're raising here?

That's a really good question. I am so excited that there is a conversation happening around fairness and transparency and ethics in math and computer science. I think it's thrilling that the conversation is happening, and the FAT* conference is doing some really interesting work. One of the fundamental problems, though, is that not everything in the world can be explained mathematically. That's really hard for a lot of people to hear, because they're really invested in the idea that math is superior and that mathematical solutions will just, like, make all of the problems disappear. When I think about fairness, I think about the social and the mathematical definitions of fairness. So let's say you have a cookie. When I was a kid and I had to share a cookie with my little brother, if there was only one cookie left, obviously the fair thing to do was to break the cookie in half, and each kid gets 50 percent of the cookie. If the computer were making a decision about the cookie, it would get split down the middle, each child would get 50 percent, and the problem would be solved. But when you actually break a cookie, there is a big half and a little half, and so you have to negotiate the social issue of who gets the bigger half. So if I told my little brother, I want the big half of the little cookie, and if you take the little half then you can pick the show we watch on TV after dinner tonight, he'd go, like, that's fair. We have a negotiated social definition of fairness, which works also; it's a totally reasonable transaction. But the computational definitions don't always sync with the social definitions of fairness. If we're trying to create purely mathematical definitions of fairness in every context, we're simply going to fail, because that's not how society works.
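The cookie example translates almost directly into code, which makes the gap between the two definitions easy to see. A minimal sketch, with the amounts and the side deal as hypothetical stand-ins: the mathematical split is one line, while the socially fair outcome depends on terms that never enter the arithmetic.

```python
def mathematical_split(cookie_grams: float) -> tuple[float, float]:
    # The computational definition of fairness: 50/50, problem "solved."
    return cookie_grams / 2, cookie_grams / 2

def negotiated_split(big_half: float, little_half: float) -> dict:
    # Real cookies break unevenly; fairness is reached through a side deal
    # (picking the TV show) that lives entirely outside the equation.
    return {
        "me": {"cookie_grams": big_half},
        "little_brother": {"cookie_grams": little_half, "picks_tv_show": True},
    }

print(mathematical_split(30.0))      # (15.0, 15.0)
print(negotiated_split(18.0, 12.0))  # unequal grams, agreed to be fair
```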
And this relates to one of the more striking critiques that comes up again and again in the book: you completely reject the idea that machines can learn. I love that there are moments where Meredith articulates a common-sense understanding of technology and then says: wrong, that's not it. I love it. It's just wrong, and the clarity is so refreshing. But this particular idea, that machines can't learn: tell us how come.

This is a really interesting feature of language. I'm really interested in building code, I'm interested in the nuances of language as an artist and as a writer, and I'm interested in cognition. When we say machine learning, it sounds like there's a little brain in the computer, because learning is something we attribute to sentient life forms. The term was literally chosen because the people creating the mathematical discipline we call machine learning were really invested in science fiction, and really invested in making science fiction real. And I love science fiction. I think it's really fun, but we have to draw a clear line between reality and fantasy, and the habit people have gotten into, of naming things after their favorite imaginary objects, really muddies the issue. Artificial intelligence makes it sound like there's a little brain inside the computer. One time I was giving a demo of artificial intelligence software at a science fair for grown-ups, and this undergraduate came over and asked, you made an A.I. system? And I said yes. And he asked, is it real A.I.? And I said yes, and he started looking under the table, as if there were some little person there. I realized he thought I meant that the A.I. was something organic, that we had finally brought science fiction to life, because this is something a lot of people are really invested in doing. But I would encourage people to just give up on the idea of general A.I. I mean, we've tried it for a really long time and it hasn't worked, and there are much more practical ways of making something in your own image, right? When people are invested in this idea of making an artifact that thinks, that does what you want, this thing that will live on after you; I feel like if you want to make something in your own image, like, have a baby. It works.

Yeah. We're okay on time, so let me flip over to something about babies. You write about DNA technologies, and you connect this to eugenics. Can you tell us a little bit about the New Jim Code and eugenics and how this is manifesting in today's technology?

Sure. My previous work was around the life sciences and regenerative medicine, and I have an ongoing interest in the social dimensions of genetics and stem-cell research. It doesn't take up much of the book, but I'm trying to make a connection between genetic technologies and the questions that scholars and analysts and activists have been asking for a long time, around the role of genetics in reinforcing ideas of race and racial hierarchies, and to take that into the realm of the data sciences. One of the things that I love about the set of panels today, and our ongoing conversation, is people marking the fact that what we're talking about isn't all that new. There are underlying themes, concerns and politics taking a slightly different iteration in the context of algorithms and computer science, but oftentimes the people who are just starting to think about it fail to reference or draw upon this deeper body of knowledge, of people thinking about this from other areas. In one particular section, I look at the way genetic technologies are incorporated into the carceral system, in criminal justice, and at the kinds of objectivity placed on the shoulders of genetics as this arbiter of truth: whether it's the arbiter of truth about someone's identity and racial group, or the way genetic fingerprinting is used to create profiles of people based on a whole host of assumptions that actually don't hold up under scrutiny. It's a similar kind of phenomenon, in which we imbue technology with a whole host of assumptions around objectivity and neutrality, but we don't stop there. We enroll it in already inequitable systems to make very important decisions that then go unquestioned, because we say, well, the science says; well, the test says; without thinking about who designed it, with what assumptions, with what biases, and with what training data.
So it is part of the larger suite of technologies, and we are outsourcing decisions to these technologies because we think they're more objective, when we should be really concerned, in part because of this longer history and relationship between the life sciences and eugenics.

I want to talk a little bit more about this notion of objectivity and impartiality, and I want to read all of you a really terrific passage from the book. Ruha writes: the danger of New Jim Code impartiality is the neglect of ongoing inequity perpetuated by colorblind designs. (I find myself wanting to keep doing air quotes.) In this context, algorithms may not simply be a veneer that covers historical fault lines; they also seem to be streamlining discrimination, making it easier to sift, sort and justify why tomorrow's workforce continues to be socially stratified. Algorithmic neutrality produces algorithmically sustained discrimination.

I like that you said you have the urge to put air quotes, or scare quotes, because especially with respect to discussions of race, we kind of reserve our scare quotes for racial discussions, when we don't do that with so many other areas that are just as socially and politically constructed, right? It's this kind of urge to create a distance from it. We don't say "that money" that you owe me. [ laughter ] "Socially constructed," right? That "political decision"; we don't use scare quotes for all these other arenas, and with race we do. I liked that early on, in one of the panels, we were talking about racial realism, the idea that although race is socially constructed, it is powerful in its impacts, which means we don't need scare quotes. Its reality is in the impact it has on people's lives; people are dying because of this reality, and we don't need to set it apart as uniquely socially constructed when it's related to all these other areas. That is just a basic thing for us to begin questioning: instead of creating distance, this is part of the lived reality we have to wrestle with.

How about we do scare quotes around algorithmic "reality"?

I would be on board with that.

Another thing that I think is really fantastic that you talk about in the book is the idea of defaults in these systems, and how these systems default to discrimination. Can you tell us more about that?

That's my way of trying to think about business as usual. If we just inherit a certain process or technology or practice, and we don't do anything differently, we just follow the instructions and clock in and out, the default settings of our society are white supremacy, among other things: business as usual. You don't have to have animus, and you reproduce the default, right? So this is an urge for us to actually exercise that agency we started with, to exercise that kind of latent power we're taught we don't have, and begin to question the default settings; rather than trying to fit in, to rewrite the underlying norms and values and discriminatory practices that we inherit. When we think about defaults, it's not just technology that has default settings; our social order has default settings too, and this is a way for us to stop setting technology apart as somehow removed from society, right? It's with us, it's part of us, and so the things we take for granted in our everyday social interactions are also places where we can begin to exercise different kinds of practices that engender equity.
We don't necessarily need a tech fix for that. I was thinking about the conversation about voter registration, right? And the question the moderator asked about whether we can harness technology to engender greater voter registration. There are old-school ways we can do that, too, that don't involve technology. Like, why are we voting on a weekday? There are so many ways we reproduce the default. We could change things tomorrow that would spike the number of voters, and that doesn't involve new apps, no anything, right? It's a political decision not to do that. So again, when we're offered "I have a new thing for that," it's worth asking whether we can do it through other means that don't require a fancy, shiny new gadget.

Which gets to my last question for you, about the marching orders in this book, where you say we need to reorient how we think of new technology. Why did you end up with that orientation for the reader?

I really care about empowering people around technology. So often, especially in artificial intelligence, people talk about A.I. but don't truly understand what A.I. is and isn't, and therefore we make ourselves susceptible to people telling us things that lead us to poor decisions. I really want to help people understand how A.I. and machine learning work and what artificial intelligence is. One of the things I do in the book is actually demonstrate what it looks like when you do machine learning, because most people have never seen that, right? You have probably heard the term machine learning, but unless you've actually sat down and looked at what it looks like when someone does machine learning, it still kind of feels like magic, and it's confusing. So I want to demystify things.

I also want to go back to what you said about defaults, because that was really interesting, and I think about defaults a lot as a programmer. When you're writing code, you need to figure out what your base case is. You need to figure out: if there is no value for a variable, what do I put in it as a default? And the easy thing to do when you're writing code about social issues is to say, okay, well, the default is just going to be what it's always been. So then we have to think about who the people writing code are and what their experience of the world is, because a lot of the white men I know who write code look at the world and say, it's pretty great, and so they put in as the default how everything has always operated. As a Black woman, I look at the world and I see racism, I see sexism; I'm aware that those are the defaults in the world. So when I write code, I try to do it with an awareness of the defaults of the world. And some of those defaults, as a programmatic issue, haven't been solved. We haven't solved racism in the real world, so there's no fix for racism in the code, because the real-world fixes have to come before the computational fixes. This is not an equation, right? It's not the Pythagorean theorem. It functions differently.
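What she says about base cases is visible in even a few lines of code. A hedged sketch, with every name and number hypothetical rather than drawn from any real system: the point is only that when a value is missing, the program still asserts something, and a programmer chose what.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    income: float
    # Defaults are decisions: when data is missing, whose "normal" fills the gap?
    neighborhood_risk: float = 0.1  # implicitly the majority neighborhood's rate
    prior_stops: int = 0            # implicitly assumes no police contact

def score(a: Applicant) -> float:
    # A toy scoring rule: anyone scored with missing fields is scored on defaults.
    return a.income / 1_000 - 5 * a.neighborhood_risk - 2 * a.prior_stops

# An applicant with missing fields is silently treated as the default person.
print(score(Applicant(income=45_000.0)))  # 44.5, resting on assumptions, not data
```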
I felt that in order to have the intense, nuanced conversations we need to have, we all need to be on the same page first, and we need to understand what we're talking about. We need to stop having conversations where one person is talking about the robot apocalypse and another person is talking about risk assessment scores. When we talk about artificial intelligence, we need to be talking about the same thing, so that we can have a productive conversation.

And it is a really beautiful, sort of pedagogical book, with Professor Broussard as teacher, drawing people in to understand and demystify; that will make it a wonderful addition to any classroom.

Thank you. Thank you. You know, do people often ask you, what can we do? Because I get this question a lot, as if there is one thing an audience can take away, one thing we can all go out and do, and if we did that one thing it would solve the problems. Sometimes, I don't know, I give an answer: this is the one thing you do. Other times I push back a little and say, okay, the notion that there's one thing we can do is actually part of the problem, because the notion that there's one thing comes from that culture where you boil a problem down to a pinpoint, you articulate the pinpoint, and then it's fixed. Do people often ask you what we can do, and what do you say?

You know, my thinking around this is that we've gotten to this point through many different avenues, top down, bottom up and horizontal; every tentacle of social life reproduces inequity and injustice, which to me means that what justice and equity look like is a challenging of every single tentacle. That is, no matter what we do, we have an opportunity to try to redress and engage with the problem, so it's definitely not a single fix. An umbrella term that I use in the book to articulate a move against the New Jim Code is to imagine what abolitionist tools look like, as techniques and practices; it runs throughout the last chapter, from community-based organizations engaging critically with technology to the role of artists creating new imaginaries around the harm and promise of technology. Things like the white-collar crime early warning system, which takes the idea of predictive policing and turns it on its head and heat-maps financial crimes all over the country, so that we end up with hot red on Wall Street; we're probably in a hot red zone right now. They've created an app that will ping you, warn you, when you've entered a white-collar crime zone, and they've generated the likely face of a criminal using the LinkedIn profiles of 6,000 executives. It takes that idea Derrick Bell articulated in critical race theory, that sometimes the best way to know what we're dealing with is to reverse reality, where you see the absurdity of a particular reality through its opposite, right? We chuckle and laugh, and yet those same techniques are being weaponized against Black and Latino communities. Then you have a whole set of abolitionist techniques and tools that we don't need to lay out publicly, right? That people are using to survive the digital dragnet. Because by writing about them and exposing them, you're actually giving that information to people who can then do harm.
So there are a whole host of abolitionist tools that we need to know exist and that are being developed. For example, there is a Digital Defense Playbook by the Our Data Bodies collective. And here, one of the things we have to think about is that what we need to survive and to thrive doesn't necessarily need to be blasted on C-SPAN, right? The overground railroad, as Douglass pointed out. There are many things happening in local places and through individuals as abolitionist tools, and for me it was important to begin articulating this and to be in conversation with people who are seeking something more transformative than another tech fix, another solutionism, a reform that makes something more marketable, or more ethical in a superficial sense, without challenging the underlying inequality and injustice that is killing people. For me, this is a concept that can be in conversation with cousin concepts, with the ideas of people working toward the larger change that needs to happen before we see it in the code.

I feel like one of the abolitionist tools that you use in the book is using your own story as a jumping-off point for larger social issues. Is that something that's deliberate?

That's part of the process of undisciplining myself, right? Again, if we think about this famed objectivity, the god trick, what they call the view from nowhere, in which the data tells us we're sitting above it all making judgments, then the opposite of that is to situate yourself, to show from where you look and why you care about things. In terms of our knowledge production, if what we're critiquing is the god trick, the view from nowhere, then you need to tell us where you're looking from. Throughout the book, it doesn't overpower the text, but you get to know me, because otherwise how can you evaluate my analysis if you don't know what's motivating it, right? So this is a way in which even how we write and how we talk is part of the transformation that our disciplines oftentimes do not condone.

My favorite way that you employ this technique is in the introduction, where you write about how your personal experience with police surveillance in Los Angeles influences your understanding of how technology perpetuates racism. I feel this is really important, because in immersion journalism we talk about whether or not you put yourself into the work as a character, and the rule of thumb is that you don't put yourself in unless you absolutely have to be there: if you can't tell the story another way, or if your subjective experience is really meaningful to the way you tell it. And I felt like your subjective experience was crucial to how this story is told, to how the issues you explore are set up in the world.

And to know what she means, you'll have to get the book. Just kidding; it's just the preface, and it's terrific. You should definitely look for this section. It's an experience shared by many people, growing up in an overpoliced, heavily policed neighborhood, right? Where you grow up seeing your friends lined up against the fence on your way to school, getting patted down, with helicopters rumbling over the roof. This is Los Angeles, and it's growing up with this as the normal way of thinking about what police are. And thinking about how, now, you don't need the audible rumble of helicopters to surveil communities; what is that transition from the explicit and obvious, which you can point out, right?
To the much more embedded. This is where the Stop LAPD Spying Coalition comes in: they're tracking the more invisible forms of surveillance that nevertheless harm communities, and we need to highlight that more. That's what the text is trying to do.

I thought we could finish by talking about some organizations doing good work in this vein, because I'd like to uplift some of the organizations and individuals out there doing really amazing work at dismantling the New Jim Code. The first one I would offer is Data for Black Lives, which is a policy group; they have a really great conference every year, Yeshimabeit Milner is a powerhouse, and they're doing interesting work on nutritional labels for algorithms.

I have a whole list on the resources page of my website, but one of the heartening developments over the last couple of years is the growing movement from within the tech industry, the #TechWontBuildIt movement and More Than Code, so that we begin to break down these barriers between outsiders and insiders. One of the things I attribute that movement to: those people must have taken a good sociology class when they were in college.

Absolutely.

There is a growing awareness among those producing technology of their responsibility to the greater good. But we can't rely only on that, because one of the questions I have, as that movement has grown up around opposing ICE and opposing various military contracts, is the lack of movement when it comes to the vast carceral net around Black and Latino communities and the use of surveillance technologies going way back. So we can't just wait for tech insiders to grow a conscience, but I'm happy that that movement is developing. For more, there are a lot of organizations and movements that I list on the resources page for you.

Very important. A very important and very comprehensive list; I thought Ruha really nailed it.

And I missed a lot.

And your website is?

ruhabenjamin.com/resources. Thank you all.

We have time for a few questions; we have about ten minutes left. It looks like we've got a question up there in the back.

Can you hear me? I'm working with an inner-ear thing, so bear with me. This has been a great day, chock full of information, and I want to start with the idea of exceptionalism. Ruha, you talked about how you went into the field because science and technology were lumped together with this idea of exceptionalism, and in working on issues of diversity and race in academia, there is always this feeling that you have to explain yourself, or that the work is kind of marginalized, because you're talking about yourself and it's seen as subjective. So I'm getting to a point about that idea of reversal, of role reversal. I have a colleague, and I have to tell the story because I have to situate myself to get the point through. I was the chair of a search committee, and the bottom line is that the Black candidate was just as qualified as the white candidate and the Asian American candidate, and I found that my colleagues on the committee did not see the way that race was operating in the room, and yet we're in the field of library and information science studies. One of my colleagues, with whom I have a personal relationship, came to me and said, can I be honest with you?
I don't feel comfortable talking about race. And so, you know, that's one of the reasons. Anyway, thinking about this idea of exceptionalism: if you use reverse logic, how is it that white men are able to come into this tech sphere and be given all this prestige when they haven't done the work of unpacking their own biases? They're just given this carte blanche prestige, that they're doing this neutral, objective work. And tied to what you said about how we tend to default to math: we had set the criteria for the search and gone through the strengths and weaknesses, and my colleague came back later, after we'd done the work, and said, I came up with this chart, I put the candidates on the chart, and it helped me make my decision. His decision had already been made, but the logic was that this chart would help us all make the decision, when in reality the biases were already baked in, against the candidate that everyone else wanted. I'm saying that to say: if we used reverse logic, we would be able to say to a white person, you have to do your own homework about who you are. We would never say that in order to do tech you have to question your white privilege and see how you got to this place: Mark Zuckerberg, to create Facebook, question your own privilege.

It's a good example, because it's an example of an algorithm that has biases and values embedded in it. So maybe a Derrick Bell approach would be for you to come in and say, well, I did my own chart, and this is what I came up with, and in that way show how your colleague's prior assumptions about who the good candidate was shaped what he valued and weighted. That would be a way to reflect it back. I'm not saying you should have done that, but rather that in this scenario, instead of saying "your thing is biased," you could perform it, by showing: I can create a whole different algorithm so that my candidate comes out on top, because of the characteristics I value and how I weighted them, and this is the thought process. And searches are fraught, so we can talk about that after.

I thought maybe it would be pretty fun if we provoked people by saying you need to examine your white privilege before you build technology. One thing I would also recommend, in terms of vocabulary for these kinds of conversations, is a book called Damned Lies and Statistics, and there's a sequel called More Damned Lies and Statistics. A lot of the rhetoric around these supposedly neutral statistical frameworks is the language of calculation, which is a kind of distancing, as you said. If you learn the lingo, then you can offer the same lingo as a counter, right? If you're familiar with, say, the availability heuristic, you can throw that out in these conversations. It's a useful way to counter the distancing.

And I would say, on that: I'm all for empowering people with the language and the tools of the terrain on which you're fighting, but there is often a call for people to gain this technological literacy without the corollary, for people to gain a basic level of social and racial literacy. The challenge is not to become better at playing the game as it's been set up, but to question: why is it that I have to develop this language and these tools, while you get to be ignorant on x, y and z and still call yourself an expert, right?
I think there needs to be a related demand for people to develop a basic sense of racial and social literacy before they start getting their hands dirty developing things that are going to create all kinds of havoc in people's lives.

Absolutely. Do you feel like computer science education has enough sociological literacy embedded in it?

No. I think some programs are better than others. There's a growing demand for it, for infusing ethics, but that goes back to a point you made earlier about the way ethics is operationalized. I am ambivalent, based on my understanding of how bioethics as a field has grown up: as a kind of handmaiden to biomedicine rather than a challenge to it, a field that often clears the way and makes consent possible without empowering communities toward informed refusal. Thinking about the way other ethics fields have developed in connection with a STEM field, we have reason to be wary of robo-ethics or techno-ethics being developed solely to enable technology, without allowing communities and people a critical say in refusing certain types of technologies, such as facial recognition. Alondra Nelson says we need to open up a space to say no to technology, which I think is really important.

To your point about ethics: when we think about the historical tradition of ethics, we have to think about which field it comes from. It comes from philosophy. Philosophers own the domain of ethics, so we have to look at philosophy and think about who is in that field. It's more than the STEM fields.

Exactly. I think we need to open up the conversation to the social sciences and humanities as well. It needs to be a truly interdisciplinary conversation, and we need to do less policing of boundaries.

I think we have time for one more question. Right over there.

Thank you both so much for what you brought to us today. I really appreciated it, and it has given me a lot to think about going forward. Thinking about the future of the intersection of science and race, I think about two things: Toni Morrison's statement that the very serious function of racism is distraction, and Dorothy Roberts's consideration of biosocial plasticity, that every time we take a step forward at these intersections, it opens new excuses for biological and scientific determinism. So my question is: how do we confront these issues, and anticipate them, without reducing ourselves to always being in and around these problems? How do we not reduce our livelihood and research to being in and around these problems?

That's a great million-dollar question. Do you have an immediate thought about it?

I have some thoughts around it, a set of responses to the Toni Morrison part and the Dorothy Roberts part. One of the things I've come to over the last two years, prompted by some senior scholars: I'll give you a very concrete example on the first, Toni Morrison point. I had finished the edited volume and was trying to think about the title. The working title was about technologies of oppression, coming out hard, you know, with the critical analysis. One of the senior contributors to that volume said: you're ceding all of the intellectual terrain, the framing, to the problem, and you're not giving any space to your own imagination of what the good life is, what the alternative is, what world you want to live in.
To me, that exchange was about how, in the simple naming of something, we can give everything over to the response, however important, without giving intellectual space to the world we want: naming it, claiming it, having a word for it. That's partly why I went back and retitled it, thinking about liberatory imagination. If we don't reclaim that intellectual space, we cede it. I do agree that the imagination of an elite few gets infused into so much of this; it shows us the power of imagination, and the kind of perversion of imagination as an oppressive force. But imagination is something we have to reclaim as ours: how do we want to materialize our own imagination of what the good life, what a just society, is? I think that, in part, is a response to Toni Morrison. I've heard her talk over the last few years, even in her writing, about writing more about what joy is, what happiness is, having stories about that in addition to the stories of pain and violence and hardship. So it's finding that balance.

I have some other thoughts around the plasticity, the Dorothy Roberts point, which I think is important. It pushes us to really understand the limitations of the facts or the data around something, even around our project as teachers. What is data good for, and what isn't it good for? I'll leave you with this last reflection. It has to do with colleagues out at Stanford who wrote a great set of papers about how just presenting people with data about racial disparities in the criminal justice system actually doesn't lead them to want to support reforming the system, specifically white Americans. They did a study in New York, showing the white American respondents data about disparities and asking, now are you willing to support reforms of the three-strikes law and of stop-and-frisk? The more they were exposed to the knowledge about disparities, the less likely they were to support the reforms, which tells us something is happening between their eyes and the data, and that is all of the stories, the narratives, the interpretive frames: oh, if there are more Black people behind bars, they must deserve to be there; they must be more criminal. It confirms all the racial narratives. That tells us we have to be as rigorous about the narratives and the stories, about challenging those as a terrain of struggle, as we are about producing better data, because the data is not going to save us. Again, it's thinking about where we invest our energies. It's also a challenge to this idea that more facts and data about inequality will move people; it has a kind of perverse quality. Who are we trying to convince? If they are already not invested in our humanity, why are we expending our intellectual energy on more stuff that's not making a dent? Think about where social change actually comes from, and pour more of our energies into those life-affirming projects rather than simply responding to what is.

Well put. [ applause ]

And I think that is a wonderful note to end on, so we can all feel inspired and leave the day with your voice ringing in our ears. Thank you so much. Thank you, Meredith Broussard. Thank you all. [ applause ]

Thank you both. Please buy both of their books; buy two copies if you need to, or more. I want to finish out the day with a poll of sorts, because I was told someone in the room posted something on Twitter that I want to confirm by your voting. How many of you have been to a technology-related conference where all the speakers were people of color? Before today. Before today.
And the same, where all of those, I should say the majority of those, speakers were women? Okay. All right. Well, this is, of course, part of the premise of the Center for Critical Race and Digital Studies, which is to say: what if the conversations we had today were at the center of our...
