
We have Dr. Woolley with us here tonight celebrating a very, very important new book. It's called The Reality Game: How the Next Wave of Technology Will Break the Truth, and it's from our friends at PublicAffairs. Dr. Woolley is a writer and researcher specializing in the study of AI, emergent technology, politics, persuasion and social media. He is an assistant professor in the School of Journalism and program director for computational propaganda research at the Center for Media Engagement at the University of Texas at Austin. Professor Woolley founded and directed the Digital Intelligence Lab at the Institute for the Future, a 50-year-old think tank based in the heart of Silicon Valley. He also directed the research team at the Computational Propaganda Project at the University of Oxford. He has written on the political manipulation of technology for a variety of publications, including Wired, The Atlantic Monthly, Vice, TechCrunch, The Guardian and many, many others. His research has been featured in publications such as The New York Times, The Washington Post and The Wall Street Journal. He's also made appearances on the Today show, 60 Minutes and Frontline. His work has been presented to members of NATO, the U.S. Congress and the U.K. Parliament. It is such a great honor to have him here with us tonight doing this incredible and very important research. Please give him a warm welcome. [applause]

Hi, everyone. It's great to be here. Every time I hear my bio, it sounds kind of fake. It's been crazy. This is the last talk on my tour, and I'm really happy to be ending it here in San Francisco, specifically at City Lights. Thank you to everyone here at the store for having me. I couldn't think of a better place to end this tour and to talk about what is really, at the end of the day, a book on democracy, and the ways in which we reimagine and rebuild democracy in the technological age.

People assume when they talk to me about my work that I am a computer scientist, and that's actually not true. Nothing could be further from the truth. For a long time I kind of thought that maybe I should, you know, try to play the game and be like, yeah, I took a few classes, I know a little bit about HTML. But at the end of the day, I'm the kind of person that studies what I study by talking to people. I spend time in places, I spend time with people, and I go deep. I go deep on subjects. And so for the better part of the last decade, I've been going deep on the subject of what I call computational propaganda. It's a fancy term for the ways in which automation and computer code, algorithms and things like that, get used to manipulate public opinion.

What we've seen in the last four or five years, in 2016 during the U.S. election, during the Brexit referendum, in India with recent problems arguably caused by WhatsApp that have led to offline violence, is social media being used as a tool for manipulation, as a tool for disinformation. A lot has changed, right? In the early 2000s we had a perspective that social media was going to be the savior of democracy in many ways. That was best shown through Google's guiding phrase of "don't be evil," and it was also showcased by a lot of the work on digital utopia and sort of cyber-libertarianism. That's not where we are now. But we're not lost. Everything's not lost yet. This book is not just a book about how screwed up everything is and how scary the world is. It's actually a book about solutions.
Every single chapter in this book ends with a solution. The conclusion is a solutions-oriented chapter, and I realize that there are a lot of things that we can do. And I'm going to end the talk today on those things.

So first let's talk about storytelling and what it means to be someone who studies technology by talking to the people who make and build technology. Tonight I'm going to introduce you to four people, four places and four ideas that I've learned from over the last several years. These four people, places and ideas have been instrumental in how I wrote this book and how I've been thinking about technology.

The first person: his name is Phil, and Phil was my adviser. He's who the book is dedicated to. He's now the director of the Oxford Internet Institute at the University of Oxford, and Phil really took me under his wing when I came to do my Ph.D. at the University of Washington in Seattle. Phil, at the time, had been studying the Arab Spring. He had been in Tunisia, in Tunis, studying the people who were using technology in attempts to communicate about democracy, to organize protests, to do all of these sorts of things. He had written a book with Oxford University Press called The Digital Origins of Dictatorship and Democracy, and the discussion in that book was all about the ways in which the internet had played a role, from the moment the internet went public in different countries, in helping people to realize freedom but also in helping people to realize control. So Phil, obviously, was thinking about these kinds of things very early on.

I had just come from being a fellow on the Obama campaign in 2012, and while I was working on the campaign I had become enthralled with the way they were making use of data. I was, like, blown away by how sophisticated the campaign's data operation was. There was a lot of excitement about the community organizing aspects of the campaign and the personal storytelling aspects, but really none of that would have been anything without the connection to the amount of data the Obama campaign had on independent or undecided voters. So what they did was they married the data, the massive amounts of data and the massive amounts of work, with personal stories, humanizing the data in a way that was able to really reach people. And with a resounding [inaudible].

When I met Phil, he taught me something very important, and it was something I'd kind of known but that really bears saying to all of you: technology and politics today are inherently connected. You really can't have one without the other. And to some extent, if you think of technologies simply as tools, if we think about media and the ways that media gets used to communicate with people, or to people about information on behalf of others, then this has always been the case. But in today's world, technology and politics are very much intertwined. The campaigns that tend to do the best are the campaigns that have the most technological savvy, because the reality is, not to overuse the phrase on the front cover, but the reality is that if you have a lot of data and you can marry it to a sophisticated AI system, then you can do very, very hyper-specific, individualized ad targeting and speak to people in the way that they'd like to be spoken to. That's the thing we realize now. So Phil, at the University of Washington in Seattle, taught me that people and technology are intertwined.
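As an aside, here is what that marriage of data and modeling can look like in miniature: a toy Python sketch that trains a simple model on past persuasion outcomes and then picks the most persuadable slice of a voter pool for tailored messaging. Every field name and number here is hypothetical; this is an illustration of the general technique, not a description of any real campaign's system.

```python
# Toy illustration of data-driven ad targeting: score voters for
# "persuadability" with a simple model, then target the top slice.
# All features, labels and thresholds here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical features per voter: [age_scaled, past_turnout, issue_engagement]
X = rng.random((1000, 3))
# Hypothetical outcomes from a past persuasion experiment: 1 = opinion moved
y = (0.6 * X[:, 2] + 0.2 * X[:, 1] + rng.normal(0, 0.2, 1000) > 0.5).astype(int)

model = LogisticRegression().fit(X, y)

# Score a fresh pool of undecided voters and keep the most persuadable 5%.
pool = rng.random((10000, 3))
scores = model.predict_proba(pool)[:, 1]
targets = np.argsort(scores)[-500:]
print(f"{len(targets)} of {len(pool)} voters selected for the tailored ad")
```

The mechanics are deliberately mundane: the power comes from the data behind the features, not from anything exotic in the model.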
The next person I want to introduce you to is a person named Andrew. I met Andrew in England when I had taken a job at the University of Oxford. What ended up happening is we'd gotten grant money to study computational propaganda in about 2013 from the National Science Foundation in the U.S., and then the European Research Council followed suit; they wanted to know the ways in which Russia and other countries were using social media to try to influence public opinion in democracies. So Phil said, hey, do you want to come to Oxford with me? I'm like, yeah, twist my arm. Of course I want to come to Oxford with you.

One day I was at a conference, actually at the LSE, and I was standing around, didn't know anyone, kind of scared about it (I still am), and this guy approaches me. When you study propaganda, a lot of times conspiracy theorists want to talk to you. When random people know who I am after I've given a talk, I'm always like, are we going to start talking about aliens or anti-vaccine stuff? Am I going to have to carry on with you? The fact of the matter is I don't really know how to talk with you about that stuff. But Andrew said, hey, I actually make and build automated profiles on social media for the Labour Party in England. I was like, what? Yeah, you know, I control several hundred or a thousand accounts on Twitter. I do it on behalf of the Labour Party. They don't pay me to do it, but I'm doing it because I'm a member and I believe in it and stuff. And I said, wow, this is pretty crazy. Yeah, let's talk. And we got to talking and struck up an unlikely friendship, and he really taught me a lot about the ways in which people can use technology to amplify their voices online.

A lot of what I talk about in this book is about social media bots: profiles built to look like people on social media but that aren't people, they're automated profiles. One person can manage many, many, many accounts online, and you can use those accounts to drive up likes, to retweet messages; you can also use them now, with the evolution of machine learning and AI, to talk to people in a more sophisticated fashion. And Andrew taught me that there's always, always a person behind the technology. The technology doesn't exist on its own. Social media firms, I think, today would have you believe that the algorithms are apolitical, that they don't have values, that they make decisions in a way that no one could have figured out or decided upon. But if you follow the work of people like Safiya Noble, or the work of Microsoft Research, which actually has something called the Social Media Collective that does some surprisingly open work, you'll know that algorithms and software and technology always have human values in them. The people that build these things encode them with their own beliefs. For instance, if you train a machine learning tool, what you have to do is go through a process of tagging the data, and you have people do that. If all the people that tag the data are white men, then the algorithm may end up being racist, especially if it's prioritizing where bus lines should go in a certain neighborhood, hypothetically. That's something we've actually seen. Suddenly the poorer neighborhoods or neighborhoods of color stop getting bus lines. They end up not getting as many buses coming through, and that is an algorithm, a tool, encoded with human values. Bots are the same thing.
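To make the tagging point concrete, here is a toy Python sketch of how annotator bias flows straight into a trained model. The data, the labels and the "neighborhood" variable are all synthetic and hypothetical; the point is the mechanism, not any real transit system.

```python
# Toy demonstration that a model inherits its annotators' bias.
# All data is synthetic; the mechanism, not the numbers, is the point.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000

ridership = rng.random(n)        # true demand signal, 0..1
group = rng.integers(0, 2, n)    # 0 = neighborhood A, 1 = neighborhood B

# Biased annotators: they label "deserves a bus line" mostly by ridership,
# but systematically discount neighborhood B by a fixed amount.
label = ((ridership - 0.3 * group) > 0.5).astype(int)

X = np.column_stack([ridership, group])
model = LogisticRegression().fit(X, label)

# Identical demand, different neighborhood -> different prediction.
same_demand = np.array([[0.6, 0], [0.6, 1]])
print(model.predict_proba(same_demand)[:, 1])  # B scores lower than A
```

Notice that nothing in the code mentions race or class; the bias rides in entirely on the human-made labels.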
When you use social media, when companies build algorithms that prioritize certain types of information, there's politics there. There are decisions that go into that process. If I had a dollar for every time I heard a representative of a social media company say "we're not the arbiters of truth," I'd have $10,000, you know? Because they don't want you to think that they arbitrate truth. But I'm here to tell you today that's not the case. Trending algorithms that prioritize certain kinds of information for people curate information. They prioritize the things that you see. For the longest time, and even today, organizations like Google, Facebook and Twitter have made decisions about how to prioritize news to people. Think about that. That matters. And this book is about that.

And Andrew taught me that you always need to look at the person behind the tool. It's not enough just to download a ton of data and say we're going to do big-data analysis. We need to know why they're building it and who they're doing it for. You might think that it's, you know, savvy political campaigns a lot of the time that are doing this work, but it turns out when you dig down deep, you start finding shadowy PR firms that actually say, I can build your social media profile and add 10,000 accounts in the next few weeks; and surprise, surprise, what they're using is fake profiles, fake information. It's a whole weird world out there.

The third person I want to introduce you to is a woman named Marina. Marina was my boss at the Institute for the Future, and I met her a day or two after Trump won the 2016 election. It was the first time I'd ever been to the Institute for the Future, in Palo Alto, and they flew me out with a bunch of people from the State Department who were concerned with the weaponization of AI. At the time, AI hadn't been weaponized in the way a lot of people thought it had. There weren't bots talking to people, getting them to change their minds about politics. It was more that the AI behind the algorithms had been more subtly manipulating opinion. She listened to the experts and, very astutely in her way, said at the end of the talk: this is a continuation of KGB tactics and Russian stuff. She grew up in Ukraine, in the former Soviet Union. And she said maybe what we need to realize is that this propaganda isn't new, because the tactics aren't new. The things we're seeing are a continuation of things we've seen for a very long time. It's the technology, and the way technology is being leveraged, that's making it so much more potent. We see automation, anonymity, the problems of scale, quantum computing; all of these issues are things we need to be considering. But while we consider them, we also need to consider the way the people who are behind them are using them.

So Marina taught me we need to look to history. We have to think about how propaganda has been used in the past and have a deep understanding of the terminology that we use to discuss this stuff. Because what we say matters. Right now there's a sort of epidemic in this country of using the term fake news. And the term fake news has been weaponized by the people who spread fake news. I challenge you from here on out, if you want to ask me what you can do as an individual: don't use the term fake news. Use misinformation, which means simply accidentally spread false information.
Or you could say malinformation, which just means bad, stupid stuff, junk news. And so, you know, the terminology matters, history matters. And it's no surprise to me, or to researchers who have studied this stuff, people like Caroline Jack who wrote a great piece called Lexicon of Lies, that the people who spread these lies are taking on the terminology and making the same exact arguments: when you point the finger at them, they say, no, you're doing this. That's exactly the playbook. The playbook isn't necessarily to change people's minds, it's to create confusion, to generate apathy, to make people mad and polarized. And that's something we miss a lot of the time. We think there's sophistication in the sense that these bots are coming and talking to us and making us suddenly become interested in owning a gun. No, the bots are there to make you not want to vote, to not engage in democracy, to think the system is broken.

So we've got to look at history, and Marina is absolutely right that this stuff comes out of the Soviet playbook. One thing that you all should know is that the Russians aren't the only people that do computational propaganda. And, in fact, I think the Russians benefit a lot from us thinking they are the only people that do computational propaganda, from saying this is a thing that only comes from them. Computational propaganda and information operations happen in nearly every country around the world. There's a great report from my old team at Oxford that suggests that during elections in over 70 or 80 countries, I'm not sure of the specific number, this stuff has been weaponized by governments and by campaigns. So it also happens domestically. Domestic actors do this. In some sense there's been a democratization of computational propaganda. And, in fact, my next book is going to be coming out with Yale University Press, it'll probably be even more boring [laughter], but my next book is called Manufacturing Consensus, and the idea behind the book is that we use these technologies to create the illusion of popularity: the more you make something look popular, the more you make it seem like a viable idea.

Okay. One more person, one last place and one last idea. The last person is Kathleen. She's my boss now at the University of Texas at Austin. She was formerly at The New York Times, where she was the dining editor, and before that she was a sports reporter. I don't know how you make that transition; you'll have to ask Kathleen. But, yeah, she's fantastic. When I went to UT, I had kind of lost a little bit of hope, because of everything that's going on, and I'd been writing this damn book and spending time thinking about the ways in which the information system is broken. But I work at the School of Journalism at UT, and Kathleen is the director. And Kathleen has taught me that we need to place faith in the institutions that we already have. We don't need to create brand-new things. We have the Federal Communications Commission; we don't need a federal disinformation commission, we don't need one more commission, we need to invest in the ones we have. But more specifically, we need to invest in journalism. Journalism in this country has done amazing things. There are so many people who work for great publications around the United States who want to do good work and want to protect democracy, but they're still having to learn on the fly.
And, in fact, in the book I talk a lot about the ways in which journalism has been challenged by the digital era, and not just challenged: it's not like they're feckless individuals or organizations that can't handle the digital era. It's that organizations like Google, Google News, YouTube, Facebook and Twitter massively benefit off the work of journalists without giving any remuneration, any money, to these folks. The same can be said for organizations like Wikipedia. When YouTube faced the crisis of disinformation, what did it do? It started linking to Wikipedia articles. Wikipedia is a nonprofit, and YouTube was using it as the resource that YouTube sent people to. The same thing goes for journalists. Google News, for the longest time, gave snippets of articles, and when people started researching it, the research showed that no one actually read the full article. No one actually clicked through. They just read the little piece. So the journalists put all their work into doing the investigation and writing the article, Google posts a snippet, and then no one actually reads it. And we wonder why the news industry is failing, why it's having a hard time. I mean, maybe not failing, maybe that's the wrong word.

What I think is that we can reinvigorate journalism, and I'm working on an op-ed right now; I'm going to tell you exactly what my argument is. The argument is that the technology firms around the country should have to put, I don't know, $10 billion or $20 billion into a public trust in the United States and let it be overseen by civil society groups, people that have a stake in making sure the money is spent wisely and well. Google News Lab has committed $350 million or so to the Google News Initiative. Google gives out that money. They make partnerships with organizations. They make decisions about who gets it and who doesn't. And for a long time, what Google has done when they've experienced backlash about their policies, or about their algorithms not prioritizing full articles, is deprioritize the news sites that complained. So that's not good enough. The technology companies have helped create this problem, and they have admitted to it. There's been a big mea culpa moment. We all saw Mark Zuckerberg sitting before Congress saying, shit, I know that Cambridge Analytica did some bad stuff, we had a hand in that, that's kind of our fault. But they haven't really given back. They haven't really systematized the response to this problem. They've done some things, and they've been working hard in many ways. But it's not enough. It's really important that we remember these are multibillion-dollar companies, some of the richest companies in the world. They get treated more like nation-states these days than like regular companies. So Kathleen taught me to reinvest in journalism, to be skeptical of what we see today, and not to think, like I said earlier, that journalism has failed.

With all these things in mind, all these things taken together, we have a really interesting picture, and we have this book. And this book is actually a book about the future. I've spent a lot of time talking about what we've been through, but this book looks at the next wave of technology. It's about deepfake video, about AI, about virtual reality, about automated voice systems that sound just like a person, like Google Assistant. And it thinks a lot about the ways in which this next wave of technology will make for more potent artificial disinformation.
And the subtitle is provocative for a reason. It's supposed to scare people. But in the best-case scenario, you'll prove me wrong. You will not let technology break the truth. This is supposed to be a warning and a provocation. I'm going to do a little reading for a few minutes, and then I'm going to end with solutions and we'll do Q&A.

Conclusion: designing with human rights in mind. Finding solutions to the problems posed by online disinformation and political manipulation is a daunting task. The disinformation landscape is vast, and it extends beyond our current ability to track it or contain it effectively. Moreover, the internet grows larger every day. According to a 2017 report on the state of the net from the software firm Domo, we create 2.5 quintillion bytes of data every day. 2.5 quintillion. I don't even know what that number means. Moreover, the number of internet users grew by one billion, to a total of 3.7 billion active users, in the five years previous to that report. So from 2012 to 2017, the internet grew by a billion users. A Forbes article asserted that 90 percent of the online data was generated in the previous two years. Let that sink in: 90 percent of the online data available in the world was generated in the previous two years. This means that people working to game public opinion or exert political oppression have almost unimaginable amounts of data available on potential targets, with new information beaming out to them every millisecond. They also have access to a lot of potential targets, and they leverage online anonymity, automation and the sheer scale of the internet to make themselves nearly untrackable. Important ethical and legal considerations, along with the difficulty of actually finding a skillful operative, make prosecution a poor strategy for stamping out computational propaganda. Instead, we've got to fix the ecosystem. It's time to build, design and redesign the next wave of technology with human rights at the forefront of our minds.

Thinking about responses to the rising tide of computational propaganda, I find it helpful to break them down into the short term, the medium term and the long term. Many of the current efforts are band-aid approaches focused on triaging the most egregious issues and oversights associated with the infrastructure of Web 2.0. Such amendments include little tweaks to social media news algorithms or software patches to existing tools. They include several new applications for identifying junk news or for tracking and cataloging political advertisements. These efforts are useful, but online manipulation tactics are constantly evolving. What works today may not be useful a year from now. In fact, many of the applications that have been built quickly become defunct, owing to code-level changes made by the social media firms, a lack of funding or upkeep, or propaganda agents finding a simple way to game them. There are useful products of this kind, though, like BotCheck and SurfSafe from RoBhat Labs, used to detect computational propaganda on Twitter and to check for fake news sites from one's browser. But these programs need to be constantly updated. They present a promising start for tools that alert users to disinformation threats, but they must be combined with others in order to be truly effective. Another example of a propaganda tracker is [inaudible], a project from the Alliance for Securing Democracy and the German Marshall Fund (GMF) that was built to track Russian Twitter accounts.
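As a rough illustration of how this family of detection tools works, here is a minimal bot-scoring heuristic in Python. The signal types (posting rate, retweet share, account age, default avatar) are the kinds of features bot-detection research actually draws on, but the thresholds and weights are invented for illustration; real tools combine far more signals and, as noted above, need constant updating.

```python
# Toy bot-scoring heuristic of the kind account trackers rely on:
# combine a few behavioral signals into a single suspicion score.
# Thresholds and weights are invented and would need constant tuning.
from dataclasses import dataclass

@dataclass
class Account:
    tweets_per_day: float
    pct_retweets: float        # share of activity that is pure retweets
    account_age_days: int
    default_profile_image: bool

def bot_score(a: Account) -> float:
    score = 0.0
    if a.tweets_per_day > 50:        # far heavier than typical human use
        score += 0.4
    if a.pct_retweets > 0.9:         # pure amplification behavior
        score += 0.3
    if a.account_age_days < 30:      # freshly created account
        score += 0.2
    if a.default_profile_image:      # no effort to look human
        score += 0.1
    return score

suspect = Account(tweets_per_day=400, pct_retweets=0.97,
                  account_age_days=12, default_profile_image=True)
print(bot_score(suspect))  # 1.0 -> flag for human review, don't auto-ban
```

The weakness is obvious: as soon as operators learn the thresholds, they post at 49 tweets a day, which is exactly why these heuristics decay so fast.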
Although it's important to identify nefarious traffic, and equally important to notify users that they may be encountering false news reports, these efforts are too passive and too focused on user-based fixes. It's also important to remember that a good deal of research shows that post hoc fact checks do not work, and the social media firms are engaged in a constant battle to catch the latest cyborg- and human-based influence operations. More than anything, I want to communicate that everything's not lost. People are fighting to stem the tide of digital propaganda. Employees at Facebook have managed to dismantle disinformative advertisements on topics from payday loans to which candidate should get a vote. Googlers have stood firm against military drone research and manufacturing. It's also clear to me that today's large tech firms have to get real with themselves. They are now media companies: purveyors of news, curators of information and, yes, arbiters of the truth. They owe a debt to both democracy and the free market, and their allegiance to the latter doesn't mean they can ignore the former.

So one of the things that you might not know, and you probably don't know, because why would you, is that the English version of the book is subtitled How the Next Wave of Technology Will Break the Truth and What We Can Do About It. For some reason the "what we can do about it" didn't make the American version, due to an editorial decision, but I really like the "what we can do about it." So I'm going to tell you a few things. I talked about the short term, the medium term and the long term.

One of my closest friends at the University of Washington, who is brilliant, got a message saying they (I won't out them) had shared a known piece of Russian disinformation on Tumblr from the Internet Research Agency. This is a person who studies this stuff and knows it really, really well. If they can be fooled by it, if I can be fooled by it, then we can all be fooled by it. So we have to read the whole article, and we have to think very carefully before we share what we share. What we are seeing right now is the proliferation of cheap fakes. It's not sophisticated AI video that makes Donald Trump look like he's saying something that he's not saying; there's a whole chapter on that, and it's coming, we will start to see more of it. What we are seeing is regular people sharing videos that are edited in iMovie. Videos of Joe Biden edited to make him look like he's a racist. Videos of Nancy Pelosi edited to make her look like she's drunk. Videos of high school students edited to make it look like something happened completely differently than what actually happened. Also the video of Jim Acosta that was sped up so that it looked like he was abusing a White House intern; he got his press credentials revoked and then subsequently reinstated through some strange episode of Mr. Magoo. So we have to be careful about what we share.

The other thing that people can do is talk to the people that they love and care about. I just wrote a report for the National Endowment for Democracy in D.C. called The Demand for Misinformation. The main takeaway I got from that report, which I wrote with my friend Katie Joseph, who gets credit for the work, was that the only way people really change their minds, when it comes to these kinds of issues in the polarized environment of the United States and other countries, is by talking to the people they care about most. Because psychologically you don't change your mind based on a conversation you have on Facebook, based upon an argument that you have with someone that you don't know.
It needs to be a conversation with someone who is civil, about a topic that you care about. That needs to be conveyed. So those are short-term things.

In the medium term, we need policy. We need regulation. I am so sick of hearing from people that this is a user issue and self-regulation is going to solve this. We can't just let Google, Facebook, Twitter and the powers that be make decisions internally about what they are going to do. When that's the case, they'll say to researchers like me, your research is flawed, it's not scientific. And I say, why is it flawed, why is it not scientific? Well, you didn't have access to all the data that we have, and you didn't have a representative sample in the quantitative analysis you did. And I say, oh, well, maybe you could share the whole data set with me, and then I'll do the analysis, how about that? And it's like, yeah, we'll figure that out. They have this thing called Social Science One that is supposed to do that, but it's not happening. There's no real regulation that holds them accountable for what goes on on their platforms. There's no oversight. In the early 2000s the government made a decision that it wasn't going to look at any political communication online during elections. They look at TV, they look at radio, they look at magazines, and they make sure campaigns are not doing really illicit messaging in those areas. But online they don't look at all. That's hugely problematic. The government has a huge role in this, and the government needs to do something. Currently, nothing will get done; there is policy that has been created that is waiting to go before a, let's say, less polarized Congress. There's policy I'm very happy with, that I think is well informed, but right now there's not much appetite for it. However, we can look to other countries, other places throughout the world, to figure out the ways in which they are dealing with the problem; some countries in South America, for instance.

And then there's the long-term fix. In the book, the tagline is: you've got to design with democracy in mind, you've got to design with human rights in mind. My belief, and it's not so much a belief as a fact, is that the platforms we use today are designed for engagement. They were designed to get people to stay on them; they were designed to scale, to grow to massive sizes. We don't have to accept that; it's not as though we asked for it to exist. It's infrastructural in society: Facebook has over 2 billion users. They were also designed to make money at the end of the day. What happens when that's the case? I think we get what we have now. We've gotten what they built, and what they built was a system that did not prioritize high-quality information, or civic engagement, or civility. They built systems that prioritize the opposite of those things. I have a belief that it's possible to actually create technology in the interest of democracy and human rights. We see it to some extent now, and we can see more of it. With Jane McGonigal, who really guided me on this book, I co-created the Ethical OS, with funding from the Omidyar Network. You can find it for free. It's a tool I had a hand in that gives technologists a bunch of provocations about how to think about the problems that could happen with the technology they are building, as they are building it. It's kind of a toolkit to think through things. The introduction to computer science course at Stanford has used it, and we've talked with other organizations about how they can leverage it.
That's really exciting for me because it means there are things we can do, questions we can ask. The long term is more of a challenge. At the simplest level, we have to reinvest in critical thinking in public schools. We have to reinvest in media literacy in public schools. It's not enough, and it's not fair, to expect that this is a user fix, a people-based problem, as we have been told, when the educational institutions we have in this country don't teach you critical thinking until you get to college, and a lot of us don't get there.

It's also time for us to create a robust public interest technology system. In the 1950s a bunch of foundations came together in the United States to create public interest law, because if you're poor you cannot get a lawyer. You might say that to some extent that's still true today, and I'd say you're right, but we've come a long way. We need the same thing for technology. We cannot always have the best and brightest going to Google and Facebook to make a ton of money because they think they're going to do something good; they are sold that line. We also need the best and brightest going to nonprofits and universities. We need people who understand technology to be helping build technology and draft legislation; we need people who understand code to help build legislation on propaganda. While the bot accountability bill that Dianne Feinstein, a local hero, put before the Senate a year or two ago was laudable, it was also not feasible at all, because I don't think technologists had really looked at it. The bill was written in a way that did not understand the structure of the web: over half of all internet traffic comes from automated accounts. That probably would not have happened if we had public interest technologists.

When we studied Germany around 2016, during their election then, we thought we would find a lot of people sharing disinformation, but we did not. And we wondered why the heck that was. We knew the far right in Germany had gained a foothold and become very powerful. But we quickly realized Germany had a really robust public media system. Germany had a robust program for critical thinking in public schools. After World War II, given what Germany had experienced, they made it illegal to promote white supremacy and those kinds of things in the public domain. We are very, very afraid in this country to think about hate speech because we are protective of free speech, rightly so in some ways. But we cannot always treat the First Amendment as if it's at odds with the right to safety and the right to security. We've got to figure out a better way.

In the introduction to this book, I begin with a quote from Betty Reid Soskin, who is from the Bay Area; she's in the East Bay, in her late 90s. She is a park ranger at the Rosie the Riveter museum in Richmond. My wife took me; I was kind of reluctant to go, but I found myself in tears at the end of the talk, because she is one of the most inspirational speakers I have ever heard. She was so amazing. I would highly recommend looking her up on YouTube. I'll end with this, and then we'll do questions. She said: every generation I know now has to recreate democracy in its time, because democracy will never be fixed. It was not intended to be. It's a participatory form of governance, and we all have the responsibility to form that more perfect union. Thank you. [applause]

So now I think we can do some Q&A, and I'm happy to manage it. Don't be shy with questions.

I enjoyed the book and your talk, for sure. I have a content question.
First off, I thought it was interesting how you brought in the emergence component. To me, when you brought up the KGB and stuff from the '50s, the point was that we've seen this stuff before. But when I think about emergence, I think about how, as you scale some of these things up, you get more than just the sum of the parts. And we do see emergent effects of some of this stuff at scale that do render it kind of a new beast in a lot of ways. And I'm thinking about the deepfake detection stuff: for a few more years we will be able to do it, for sure; five years maybe, ten years optimistically. But ultimately a lot of these techniques are asymmetric in favor of the attacker, and I think that is something really important that gets lost in the discussion about this. The other question I wanted to ask about was something you brought up in the book, about how a lot of the blame gets put on the people who are building the solutions. I totally agree there's a techno-libertarian thinking when it comes to who's building the platforms and algorithms. But ultimately, to me, those systems are just answering to a CEO who answers to a board that answers to shareholders. So to me all of this is symptomatic of capitalism driving things at the base. And so I am wondering how you see a solution existing in this kind of incentive model?

On your first point: point well taken. On emergence, to get to the heart of that for everyone else: we've seen disinformation and computational propaganda launched at the public, and oftentimes it's disinformation when it begins, purposefully spread false information intended to manipulate, and it becomes misinformation very quickly. When I worked at Google Jigsaw as a fellow for a year, it was interesting; it was an experience that really shook me. What they called it was seeding and fertilizing. You basically plant the seed, then you fertilize it, then you let regular people do the work and spread the propaganda for you. It is difficult to track; there's this problem where you can't figure out where the snake begins and the tail ends, to use a metaphor.

On the second question: so, yes, this is a problem of capitalism and the free market, and there's no way of getting around it. When you prioritize profit, you prioritize systems that take advantage of people a lot of the time. Not to out myself, but my master's is in critical theory, and I've spent a lot of time thinking about this stuff. I believe that serious changes have to be made, but I don't think we have to throw the baby out with the bathwater. What we have right now may not be the best system, but that quote gives me hope, because it's the idea that democracy can look different in different generations. And I don't quite know what the answer is; I can't fully answer your question, I wish I had the answer. Maybe we can rebuild democracy in a way that interacts with capitalism in a different way. And maybe we need to speak the language of the market. Recently, when I've been talking to people at Facebook and Google, I've been saying: maybe it would be good business for you to actually do something beneficial for society. [laughter] Think about that. How do we market this as good business?

Other questions? Anyone? All right, that's great. Short and sweet; that's by far the shortest and sweetest of all of them. Thank you, thank you very much for having me. This was great.
[applause]
