
Thank you for coming out tonight. My name is Rachel Myrow. I had to look at the sheet of paper to make sure. [laughter] I am the senior editor of our Silicon Valley desk here in San Jose. I should say Silicon Valley, because we are in Mountain View. Joining me on stage are the director of internet ethics and the director of journalism and media ethics at the Markkula Center, and Alex Stamos, the director of the Stanford Internet Observatory. This evening is co-presented with the Markkula Center for Applied Ethics. This event is part of KQED's series on common ground, an initiative bringing people together for civil discourse, featuring journalists hosting provocative conversations about politics, policy, art, culture, science and technology, reckoning with the disagreement among us about how to face a future of economic, cultural and environmental uncertainty. This series asks: what are our shared responsibilities to the common good? Next in the series, if you have an open calendar, is this Tuesday at 7:00 p.m. at the San Francisco Exploratorium, where we will look at how to overcome polarization. On to tonight's topic. Democracy is under attack worldwide. Populism is on the rise. Disinformation is tool number one, and social media is the platform of choice. What can we do about it? We can start by talking. Alex, I take it you have something to offer us about the topic du jour: Russia is again attempting to influence the American election for president.

Alex: That is what we read in the Times. That was a briefing given to the House subcommittee on intelligence, but there are no details. We don't know what they mean by that. From my perspective, there were five different kinds of interference in the 2016 election. It is not clear if they are running any of the same playbooks or something totally different.

Rachel: Show us a few examples of what we remember from the 2016 election.
Or perhaps many of us don't remember, because we never saw it on our feeds.

Alex: The KQED audience was not the total target here. [laughter] A lot of the Russian influence was aimed at both the left and the right. If you don't mind putting my slides up: there were two major types of disinformation, or "information operations," the term we use for an attempt to change the information environment. It really had two big directions. The first was meme warfare. That is about driving division by creating political memes that are injected into public discourse. These are three examples from both sides: fake profiles and fake personas created by a private company that belongs to an oligarch in Russia. The one on the left is supposed to be a pro-LGBT group; in this case, an LGBT coloring book for Bernie Sanders, which is the kind of funny little thing you might post with the goal of getting people to join your group and see your content. Most of the content had nothing to do with elections or politics; it was content like this that drew people in, which would allow them to inject messages later. Secure Borders is an example from the right; a big topic of Russian disinformation in 2016 was anti-immigrant sentiment. In the bottom right is a Twitter account that pretended to be the Tennessee Republican Party. It turns out that the entire time, the social media intern lived in St. Petersburg, Russia, not Florida. Here is some more, mostly from Instagram. As you see, it comes from both sides. All of the tone is "get over it." It is from a fake Instagram account called Blackstagram. The number one topic pushed by the Russians in 2016 was Black Lives Matter. A big goal of theirs was to try to build African-American support for these fake personas and then inject messages about Hillary being racist and the election being stolen from Bernie, as well as messages that might be seen by conservatives as being really radical.
I will give you guys a second to look at this before I ask you some questions. Why don't you check out this post. It is from a fake Black Lives Matter group called Blacktivist: the Black Panthers were dismantled by the government because they were black men and women standing up for justice and equality. You probably noticed the strange diction; the English was not perfect. This kind of work by the Russians was being done not by intelligence specialists. The Internet Research Agency employees were effectively millennials with English minors who could not find better jobs in Russia. These people were not professionals, and the language is not perfect. Now that I am a fake professor, I do a little Socratic method. Who thinks that this being posted by somebody in St. Petersburg, Russia is illegal? Raise your hand if you think it is illegal for them to do this. Only one person. You guys are right. It is not illegal for somebody out of the country to have an opinion about the Black Panthers, even if they are lying about who they are. It is a violation of Facebook's terms of service to do this, but Facebook's terms of service do not have the force of law. Is this fake news? Raise your hand if you would call this fake news. It is interesting. The thing here is that they're not making any falsifiable claims. This kind of argument about why the United States government prosecuted the Black Panthers is something you might find in a freshman African-American studies seminar. It is an argument within the Overton window of political discourse that they were trying to amplify.

Rachel: Remind us of the Overton window.

Alex: I'm sorry, does one of the real professors want to talk about the Overton window? I am not a real professor; we are all fake professors. The Overton window is the idea of the range of acceptable discourse: the things that are allowed in any society, in this case American society, within the reasonable bounds of discourse.
That window can shift back and forth based on people being on the extremes. This is a real email. Does anybody want to guess who got this email? Who received it? John Podesta. Yes. This is the email that John Podesta received, telling him that somebody had tried to sign into his account. It was sent by the GRU, the main intelligence directorate of the Kremlin. We are talking about real intel people; these are the people who like to kill people overseas. The joke is that the person supposedly tried to break into the account from Ukraine, which is a little GRU inside joke. These guys are hilarious. This is a redirector that sends him to a site, googleaccounts.net. When he got there, it was a perfect-looking Google login. He asked one of the IT people at the DNC whether this was legit or not. Apparently that guy replied "it looks OK" but meant to say "it does not look OK," which may be the most important typo in the history of the human race. He logs in and gives up the password, and the GRU goes in. They broke into the DNC using more technically sophisticated work than this. Once they had that information, they were not releasing fake information; they cherry-picked the emails that told the story they wanted to tell, which specifically was the story that Bernie Sanders was ripped off in the DNC primary. To do so, they powered that message through real emails where people were saying not-nice things about Bernie Sanders. They did this first through a fake persona, with a history we cannot get into. That failed. They tried it again through an organization called DC Leaks, which they were pretending was a real leak site. DC Leaks reached out to a bunch of journalists and said, here are some documents from John Podesta and the DNC. And the journalists complied. Politico had a live blog of all the most embarrassing things in the John Podesta dump.
Even the New York Times ran, over and over again, with the stories that the GRU wanted them to run. If you go to paragraph nine or 10, it says that this could be part of a Russian information operation. That does not matter when that is your headline and that is what people are reading. Some other examples of disinformation around the world: these are two real WhatsApp messages in India. In India, people use the internet differently. WhatsApp is the most popular communications medium there; something like 400 million people have accounts on WhatsApp. WhatsApp is not like Facebook, where you can post something that a million people see; you can send a message to up to around 200 people. People in India are part of many groups: family groups, school groups, work groups. Messages get passed along by individuals copying and pasting messages that are injected in. The one on the left is attacking the conservative political party in charge of India, the BJP, and possibly supporting the Congress party; they are big enemies. It is basically lying about the price of gasoline in other countries. The one on the right is racist propaganda about a blood donation camp being fake. Disinformation there looks different because, if you look at the earlier example, it is saying "I am from a Black Lives Matter group." In India, when you are seeing this disinformation, it is coming from your uncle, your aunt or your coworker. It's much more personal. It's harder to amplify, but in India you have groups working for political parties. The theory is that one group is about a million people who have signed up to push disinformation, and they don't believe it is disinformation; they just believe it is the right news, on behalf of the political party. They get notifications from the official group, then they copy and paste it and spend all day sending it out to the other 400 million people. We are still seeing this Russian-led activity around the world.
This is the report that our team wrote. It's at io.stanford.edu. What we found was a disinformation network in Africa, this time aligned with the Wagner Group. It is a company he owns that has actual paramilitary mercenaries: people who go into countries to kill people on behalf of autocrats. They are supporting autocrats on the ground with guns and with disinformation. In doing so, it looks like it is not for the foreign policy outcomes of Russia but for his personal financial benefit. In Central Africa he has things like diamond mines and the like. He is backing two of the six people vying for control of Libya, probably to get gas and oil rights in the future. An interesting change from 2016 is that this is no longer people sitting in St. Petersburg; they just reported back to people in St. Petersburg. One of the guys doing it made a mistake and posted a picture of his trip to Moscow on his Instagram account, which is kind of awesome. The people doing this work in Sudan are actually sitting in Sudan, so it is hard to catch them. Their language is better, their cultural knowledge is better. It is multimedia. This is a whole newspaper that seems like a legitimate newspaper; it is mostly not about Russia and not about foreign policy, it is just a newspaper. He also owns a radio station in Madagascar. They are building the entire pipeline: they can manipulate the media, but they also create their own media, and they amplify that media online. Let's start with that.

Rachel: That is a little overwhelming.

Yes, thank you. There are so many different things to parse out in Alex's presentation. One of the first things that occurs to me is the question of WhatsApp. There are so many people around the world who are on encrypted platforms, if you will. Even though you can argue that journalists and regulators are doing a good job with the platforms in the open, they do an even less good job when the information is encrypted.
If you take India's example with WhatsApp, there is a particular way WhatsApp is used in India in restaurants: to pass the menus out to people in the communities. People in that community order stuff from the restaurant and then come and pick it up. Afterwards, the restaurant owner will share some video with you on WhatsApp. There is an interpersonal acceptance of liberal privacy, in the sense that I don't mind you sharing something with me even though I don't really know you other than through the transactional relationship with the service. In the U.S., it is a different kind of sensibility. If I get a WhatsApp message with a video from somebody I don't even know, first of all, I may not have a WhatsApp connection with people in that sense. So there is a huge advantage that disinformation actors have in places like India, where WhatsApp is literally an interpersonal thing as well as a transactional thing at the same time. That is one thing. About the Overton window: one of the things it helps to understand is that it existed because of the centralized paradigms of ownership in the media, from print to radio. As long as media was owned by a few organizations, 10, 20, 30, depending on where you are in the world, there was a cultural sensibility and acceptance of values built in. That all broke with social media. You and I have a microphone; we can amplify our speech. There is no longer such a thing as an accepted window of what is acceptable public speech in a democracy. That is what is broken.

Rachel: Part of the question is, who is responsible for the changes in the window?

It was not that long ago that social media started. When it started, people were not sharing news articles. The whole idea of social media was to connect you with your friends and your family. At some point, that paradigm shifted. Part of the responsibility lies with social media companies that have certain affordances; they guide you to say you should be able to use this for this or that.
In 2013, Mark Zuckerberg came out and said, we want to be everybody's personalized newspaper. Facebook was not something people thought of as a personalized newspaper. Suddenly there is this whole encouragement: you should be sharing news stories. Suddenly, I am your Aunt Phyllis and I endorse this message; that's how it comes across. So some of the responsibility definitely lies with the platforms. Some of it definitely lies with us. One of the early promises of the internet, people saw with the advent of blogging. I remember people saying that we are all journalists now. Turns out we are not. On social media, it turns out we are loudspeakers for other people's messages. That is a different role. We all bought into this role, and we find ourselves doing it. When we talk about responsibility, we have to talk about these different layers. I found an interesting poll that was done last month by NPR and PBS NewsHour. They asked people who should have the main responsibility for addressing disinformation. "Addressing disinformation" is pretty vague; it could mean not producing it, not responding to it, or highlighting it. The numbers are these: 39% said the media have the main responsibility for addressing disinformation. 18% pointed to technology companies, half as many. 15% said the government, and 12% the public. 29% of Democrats assign the main responsibility to the media, and 59% of Republicans do. So we are polarized even on who is responsible for doing something about this.

Rachel: As a journalist, people don't want to accept the information they receive. You say, this is it, I have the answer for the question you had, and the response is, no, that's not what I believe. As if it is a matter of opinion.

Some people are not looking for information; they're looking for confirmation, and they are looking to signal identity and to be part of a certain group.
We are increasingly reading about the fact that people often know they are quite likely sharing misinformation. They are OK with that, because that's not the point. The point is not to inform people but to say, this is what I believe. I think what is really interesting for the rest of us is that we have these common human weaknesses that make us do the same thing even if we don't intentionally mean to. What I definitely learned, and my colleagues will tell you, is that I have to check myself all the time. If I find something that is absolutely the best illustration of what I believe, something that just occurred and totally confirms everything I believe, I just have to sit on it. More often than not, it is a setup. It is designed for people like me to respond that way, to share broadly with others in outrage, and to perpetuate that misinformation. It is often not an outright lie; it is out of context, or it is made to push a certain narrative.

Rachel: Much more like an omission. As recently as two years ago, I was among those who would laugh at politicians and regulators who were so behind the times, unable to tell their left hand from their right, and of course in no position to craft laws that would not be out of date as soon as the ink was dry. Now, I don't know. The whole disinformation situation online is such a dumpster fire, I don't know that there is anybody who is on top of it, even if they did nothing but read facts all day. What can we do from a regulatory standpoint to try to control some of this? Or is that a hopeless task?

In the United States, we are extremely limited by the First Amendment. Yesterday I was in Washington, and I got home at 2:00 a.m. today. So, if it looks like I am asleep with my eyes open, it might be that I am. This has become the big punching bag for people on both sides when they say tech companies need to fix things.
It turns out the vast majority of things people complain about under what is called CDA 230 are really the First Amendment: political speech carries almost no criminal or civil liability in the United States, even if it's false. The Supreme Court has said multiple times that even if you lie intentionally, in most cases it's not a crime and it cannot be punished. And most of the stuff we are talking about is something you would never be able to adjudicate as false anyway, even if we were a different country. On the regulatory side in the United States, there are not a lot of options here. Even in other countries, regulation has mostly been about things that were already illegal speech in the country. The most effective regulation has been a 27-letter German word that I cannot pronounce, but NetzDG is the acronym. That is a law that requires the tech companies to enforce German hate speech law. But that is hate speech law; it's not about true or false. Even that has had some real issues when it starts to apply to things like sarcasm and comedy, and the like. In the United States, the place we can regulate is not the content but the mechanisms by which people can do political advertising. I would like to see the rules figured out; everybody is guessing what the rules are for political advertising online, and the FEC has not ruled on how these '80s and '90s laws apply to 2020 online advertisements. I would like to see restrictions for political ads on what kind of targeting you can do. Basically, a minimum audience size, so you can't target 20, 30, or 40 NRA members. I have seen a couple of tote bags; there are multiple people in here who have given to pledge drives. And if you haven't, Rachel knows your name. She will hit you up afterwards. [laughter] You can limit it so you can't just hit members who live in Santa Clara County. You have to advertise to a much broader set of people and have rules around transparency.
My colleagues at Stanford who study these issues believe you can have these laws as long as they are content-neutral: they apply to the actions and to the platforms, to what kind of tools they provide, and such. Regulating the speech of individuals will effectively never happen in the United States.

I want to take a step back and ask people to think about the slides you saw, the different examples. One thing common to all of them is the visuals in them. It has to do with what they are trying to do to our behavior when we see those visuals. In my mind, I have ended up having to ask myself three types of questions. First, what actors or individuals are in the post we see? We know these actors: good actors, bad actors. A different type of vocabulary is whether this is the supply side or the demand side. Where are you? Are you on the demand side, because you are the reader? Or are you on the supply side, supplying information? It could be anybody; journalists are not the only people posting, and all kinds of organizations are involved. The third question is, what kind of behavior is actually going on? They are expecting you to instinctively believe that thing, and then they are expecting you to actually share it. There is an implication for your fast thinking. There is a very famous book called Thinking, Fast and Slow, by Daniel Kahneman. The easy way to understand all of this is that they are making you think really fast. When we think fast, we act on our biases; we are not going to cognitively process what this is. The idea overall with social media posts or visuals is click, click, click: click your way to really quick activity. That is the implication. Millions of shares can happen on a video before people know if it is true or false. The approach to this is to actually slow people down. In the midst of all the other ideas you have, what can the public do?
The only thing I would ask for is this: when you see something that is too good to be true, or you can feel a feeling creeping up, know that feelings drive behavior, and those feelings will make you make a mistake. The first thing we all have to do is slow down. There is one line that came out of a south Indian film; I don't have to name the film. It is about an anti-corruption actor. It essentially says that for the truth to win, you need evidence, you need authority, you need proof. For a lie to win, you just need to sow confusion. Confusion is fast; the truth is slow. All good things take time to emerge. Look at the Antarctica example. Last week there was a story about Antarctica hitting 69.3 degrees. But if you read the stories in the Post and elsewhere, the scientists are taking great pains to say, we are not certifying this weather station yet. Wow, they hit a record high, nobody had ever heard of Antarctica hitting this high, and they are still at great pains to say, we are not certifying this yet. That is what it takes for science to even say something. It is slow and painstaking, but it is very easy to go out there and say, for example, that they were pushed out of the gallery in the Senate. I don't know if you heard of Ted Cruz's tweet, which was actual disinformation; he deleted it after that. That should be any politician's, any journalist's approach: you should not just delete it, you should post a corrective tweet saying, here's what I said that was wrong, and correct it. So two steps, two behaviors, are implicated. One is to slow down, and the other is to identify who the actors are. If you're on the demand side, we have to regulate our own behavior. If you're on the supply side, there are all kinds of regulative mechanisms and codes of ethics that kick in.

Rachel: I'm going to challenge you there. In the olden days, I would encounter ideas I disagreed with. Now I am in a silo, even if I don't want to be, because the platforms I am browsing on put me in a silo.
They have algorithms making sure nothing encourages me to move my eyeballs off that site. There is filtering happening before I encounter it.

On this point, you have to look at algorithms and personalization. If I'm looking for shoes on Amazon, I don't mind that kind of personalization, based on shoes I've ordered earlier and so on. But journalistic content is a public good. You cannot take a public good, in the form of posts and other people sharing articles, and personalize it with algorithms; you are most certainly going to get it wrong. That is the reason people start using value-based expressions that all accumulate. These algorithms don't know the difference between fact and fiction. They don't know; they can't tell. There has been a lot of intelligence built over the past few years, but it is only coming now, and even now there are technological issues. First of all, they can't tell what is fake and what is not until they are trained. Number two, even if there were no fake news at all, personalizing a public good like news forces everybody to one side. You will see less of a shared value system, and everybody sees only what they want to see. The alternative is asking us to diversify our own Facebook friends, but that is hard to do. For me to go and friend a conservative just because I want to do that is not something that will actually happen. [laughter] I am saying it will happen incidentally: I want to see the news that she or he is seeing because I met a friend on the road and he happened to be of this point of view. The way people need to personalize their news is to take news out of the picture. All of those other things are good; that is what is good to have on social media. Mixing news into that is the big mix-up. It is a design question they haven't even thought about. They are afraid that the minute you start saying what is news, to separate news out, you're going to start questioning how you define a news organization. What is news? What is journalism?
Those kinds of definitional issues were not taken into account in the early stages of engineering the infrastructure. Now, when they come in, you have to define free speech.

Rachel: I don't know how we're going to define what news is and be able to draw those lines correctly.

We are talking about a lot of different things at the same time, it occurs to me. The easy answer about disinformation is that you need media literacy: you need to understand what the right sources are and how to analyze the messages. It turns out that researchers are now pointing out that the language of media literacy and critical thinking itself has been weaponized and used to push disinformation. Think for yourself. Don't believe what the mass media tells you. Go out and find your own sources. The researcher who wrote about this did so a little bit in despair, because she, like many of us, believed in the notion of media literacy being helpful. And I will say, giving my own terms to what she proposes, I think we need two additional things. One is social media literacy: an understanding of how social media works and the impact of what happens when news is shared by friends and family as opposed to some kind of outlet out there. The other is self literacy: talking to students, and I would say all of us, about our own cognitive limitations and blind spots and how we can be manipulated. The only way to respond to and prevent disinformation is to be cognizant of all three of those layers, and they are different layers: how the media works, how social media works, and how we work.

Deepfakes and bots were mentioned. Those were challenges that we were finding new and different in 2016, the bots and how they amplify all this stuff. They are still there, but the social media platforms are getting much better at identifying them. We have all read many stories about how there is an effort out there to caricature the other side. That is one of the things we are seeing with Black Lives Matter.
I hope we learn to distrust that: if you see something that makes the other side look really egregious and extreme, that is probably the Russians. There are some people who are that extreme, but they will find those few people and then try to make it seem like that whole side is that way. It makes it much harder to have those friends on Facebook and have those conversations. Just to mix all the subjects together, some of it is because of the nature of social media. Think about it: it is very rare that you find yourself in a room with your grandparents, people you met in school 20 years ago, someone you met at a party, someone from a funeral. But that is the social media environment.

It sounds like a funeral. [laughter]

Rachel: What we found over time (researchers call this context collapse) is that you're not sure how to talk to all those groups at the same time, and who you are vis-a-vis all of those groups at the same time. With context collapse, when conflict arises among people who don't know each other particularly well and they start to argue, you tend to shut down. There is something called the spiral of silence: in the face of disorganized conflict, you're not sure how to talk to people, so you shut down. Researchers found that people would argue less on social media and then, even at the dinner table, would talk less about some of these very conflicting issues. That actually leads to polarization: because we are not talking to each other, we don't understand each other's perspectives. So something in the design of those platforms, and in the kind of interaction they created, led us to where we are now. One of the things that is really important is that we learn how to have constructive conflict with each other, how to talk and disagree, and that not everything has to end up in a flame war. Not everything has to end up with exaggerations of each other's positions or ad hominem attacks. How we talk to people who come from different perspectives than ours is something we have to figure out.
I think that would help with the disinformation and polarization as well.

Rachel: It is interesting you say that, because I do feel like I'm experiencing this more with in-person conversations, with men and women both: the desire to broadcast. People just want to keep talking. They are not actually interested in what you have to say, even in response to what they are saying. But guys, I'm going to ask each of you to help me stop this despair and resignation. Alex, why don't you kick me off.

Alex: This is not a happy crowd. [laughter]

Rachel: Point out a good actor, someone you think is doing something clever or effective. It could be in the world of government or journalism; it could be platform technologists. Who is doing something we can look at and say, they are going in an interesting direction that we can follow?

Alex: A number of Nordic governments launched social media literacy campaigns. Just like with everything else, it is really rough to say the Danes are doing it, so it will work in the United States, including maybe on health care, but we will find out about that. But their numbers are looking good; there has been good feedback on that. One thing about all those countries is that they have high trust in their government media, like NRK in Norway. There are a number of countries in which government-supported media, like the BBC or Australia's ABC, is independent enough from the government but also constrained by a board of governors to stay as neutral as possible, and they have not chased the clickiness. The truth is, it is very difficult, present company excepted, to think of a media organization in the United States that hasn't become very strident in its political views. It is hard to argue that the New York Times has not become more liberal during the Trump administration. Five or six years ago, MSNBC was in real trouble financially, and they found a really profitable model in being to Trump what Fox was to Obama.
I think part of it is that it would be great to get back to a world in which there are media outlets seen as more neutral carriers. I am not sure that is a genie that can be put back in the bottle. I'm also going to push back a little bit. I feel there is a lot of non-empirical referencing of algorithms. It is an easy pivot to say it is the algorithms that are making us crappy human beings, but it is actually that human beings are the way we are to begin with. People go on social media to be reinforced in their own beliefs. They seek out the information that reinforces who they are. That is true of everybody. You have to be really careful about saying things like "you want to algorithmically expose people to other information," because everybody who says that means the other people; they don't mean themselves. Is this audience going to be OK with Facebook saying, we noticed you read too many Atlantic articles, you need to watch some Alex Jones? The evidence is more mixed than the conventional wisdom. We did an experiment at Facebook: the quality of people's newsfeeds goes down if you turn off the algorithmic ranking, because the ratio of people resharing crappy news posts versus writing something about their kids or their birthday or that kind of stuff is about 10 to 1. The algorithms, even in 2016, overemphasized people posting about interpersonal things rather than resharing. So we have to be careful. The YouTube algorithm: there have been studies showing it can lead to radicalization, but other studies show it exposes people to information they wouldn't have seen. We have to look at ways to slow people down, to slow the speed at which information flows in a viral context on social media. In cases like Instagram, they have done an experiment, less about fake news and more about making people nicer to each other, slowing people down as they respond to somebody in a way that sounds mean, asking, do you really want to respond this way?
And making people take a moment to think before they act. There are a number of experiments like that that might be helpful. I would like to push back on his push back. Rachel Before either of you answer: I noticed that some of you out there have cards. You will have questions that you want to ask our fine panelists. We have some people in the audience collecting those cards. Take this moment to finalize them and make sure they get into the hands of one of our card collectors. In terms of solutions, I feel that one example of a good organization working in the real world and trying to help is First Draft. There is a simulation they do where they ask news organizations and other actors and individuals to act out a scenario: a deepfake has emerged, it looks like it is from the election authority of a country, what will you do? All of the big news organizations get into a tizzy, because breaking-news organizations do not have the time to verify across multiple sources. They want to compete with each other over who is going to go with the story. That is a fundamental ethical issue with how it works. Because First Draft is asking news organizations to do these simulations ahead of time, it is helping them realize what the problem is with the journalistic process. You are relearning the process, and you are not the gatekeepers anymore. The process by which you build stories has not really changed. We are still sourcing, but we have an infinite supply of tweets to source from, and some of those are just lies. In effect, there is a call to reinvent the way a story is built, and that is all for the good. The way we are building stories has to change. This is one thing. As far as algorithms and personalization go, I agree with what you said. It is behavior that is giving signals to the algorithms about what you want to see. But the design of the platforms is to encourage us to engage around these paradigms.
If you ask Facebook: what is your metric for engagement that is not based on clicks alone, not based on shares? What if engagement were based on what people have learned from a post, what people have comprehended about a particular thing? They say, we are not a news organization. But with the distribution, there is a fundamental question about what algorithms and people are doing to each other. It is making our worse angels come out. We also have better angels, when we slow down. That is why we have the emphasis on going slow. On the distribution side, what I would ask of these firms is this: when you see something that fact-checking organizations have said is false, why are the social media platforms urging you to share it? Why are they urging you to like it? This feature called like is an affirmance. This feature called share is an affirmance. They are universally offered on all of the posts, regardless of whether or not the platforms themselves are marking a post as false. There is a need to think about which features we offer universally on all posts. My turn. You asked about good actors and who is doing good work. I want to highlight the work of researchers like Kate Starbird and Jonathan Albright. They have been telling us for years what kinds of things are going on. The kind of disinformation described here was tracked back in 2016 by Jonathan Albright, following the network travels of information and why it is being distributed in certain ways. Kate Starbird, who spoke at Santa Clara years ago, has a lab that found the same people were sharing Black Lives Matter tweets and Blue Lives Matter tweets, again highlighting the sharpest aspects in either direction. These researchers are doing really important work, and if some of the companies had listened to them years ago, we would be in a better position. I do want to put the responsibility on the companies. And you don't want to get me started on the Facebook algorithm, because I'm really big on choosing for myself.
And the idea that someone would know better for me really bugs me. I am big on autonomy. I think they really do encourage certain behavior. We need to talk about what we can do ourselves to build our own practices and our own habits. One of them is self-control. I was saying earlier: be aware when something seems like exactly the story that confirms everything you believe, and don't cherry-pick. I would go broader than that. I will say, if you have any doubts about a particular story, don't share. Your friends are not going to be less informed just because you're not broadcasting that story on some social media. If we learn some of these self-control techniques, which are made harder by the fact that the companies are putting in place various affordances to make sharing as quick as possible, we are going to stop playing a part in this whole misinformation and disinformation ecosystem. Do we have some cards? Let's get started. Alex I think we have to be careful when we talk about solutions and how much power we put in the hands of the trillion-dollar information intermediaries. After 9/11, we asked the intelligence agencies and the military in the U.S. to keep us safe, with no countervailing equities, bringing the CIA and NSA in front of Congress, making them cry, and yelling about 9/11. That is how we got Abu Ghraib and the Iraq War. That is how we got all of the NSA oversteps seen in the Snowden disclosures. We have a tendency in the United States to feel that when something bad has happened, we are going to yell at these very powerful entities to fix it. We put them in charge of fixing it, and then when we might want to take that power back, we never will. We have to be very careful about it. It is emotionally satisfying to say that the intermediary is responsible, but then you are saying they have to control the speech of other people, and again, nobody ever means themselves. They always mean other people. Just be careful what you ask for.
I understand that argument, but my concern is that we have said the government can't do it in the United States, and if the companies don't do it, then who? Do we just accept the status quo, in which entire droves of people are driven out of the conversation completely? Alex When you say they have to do x, it has to be followed by: but the limit is y. And that is where something like turning off the ability to like or reshare a post because it was marked as disputed gets to a point where I start to feel really uncomfortable about a ministry of information existing inside of Facebook. Let's start with this question: I am a highly skeptical Facebook employee. Zuck has been saying that we should not be in the position of being arbiters of truth. What is your take on Facebook's ultimate goal here? Is it responsible civic participation, revenue, or other? This seems pointed at you, Alex. Alex He just changed his mind on this this week. I saw his interview with Ambassador Ischinger, where he said we should be regulated somewhere between the phone company and the newspaper: we are something new. That was not an accidental statement. There was an op-ed, and then there was a white paper released about regulation. Facebook wants regulation because it is a super hard thing. They are getting tired of everybody complaining that you censor too much or too little. Also because they want to wipe out the competition. They would love the standard to be: you must be this high to carry speech by people online. And that standard of how high you have to be is one step below where Facebook is right now. This is where Google is, where Snapchat is, and this is where TikTok is. Something that would be great for their competitive advantage would be legal requirements to do content moderation at a scale that works for them. So I think he changed his mind on that, and I think it is about competitive space.
Facebook already does a lot of what we are saying it should or should not do. It already moderates content in all kinds of ways. It already amplifies some voices and suppresses others. It is not like this would be a new task for them. It would just be more overt, and it may be applied to different speech than it is now. But they are doing it already. As we have seen with something like hate speech, if you drive it off platforms like Facebook, you will drive it onto smaller platforms where they are not doing this kind of control. Alex That is fine. You can make rules around hate speech and be controversial, but those rules can be based around actual risk. True and false on political speech is the most sensitive decision you can make. Yes, they are doing it, but almost all of the arguments you will hear from the left are that they need to do more censorship. We need to be careful about having them make judgments about First Amendment-protected political speech. How many people think Facebook should have taken down that edited Pelosi video? OK. That exact thing was done to Donald Trump over and over again on Jimmy Kimmel's show. Should Jimmy Kimmel be kicked off of Facebook under the disinformation policy for posting these videos? I don't think so. I don't think Facebook should censor Jimmy Kimmel. I think we have to be super careful about core political speech, making fun of politicians, for example. That that should be controlled by a trillion-dollar intermediary that pays a ton of taxes, has a bunch of regulatory exposure, and can come under the control of the incredibly powerful executive branch at any time: it is kind of nuts to say that we want them to do that kind of work. Can I jump in here? One of the core debates that does not happen often enough, and Alex is not new to this, is the difference between amplifying speech and taking it down or keeping it up. There are organizations whose job is to find the truth and actually put it out.
Their job is being made hard by social media organizations, where all kinds of speech will come in, and rightfully so. They will create confusion in our minds, and then we don't trust whether the news is true or not anymore, especially if it comes from a value system that is not ours. If I don't trust Fox because I am left and they are right, I will not trust Fox as a brand anyway. Likewise, on the right, if they don't trust MSNBC, they will not trust an individual expose by that outlet. The real question here is what happens if something is actually a lie and it is being posted. Let's reduce this down to political ads. These are paid political ads. If you ask the question, should lies go through ads on social media platforms: what is happening is that the goal of the advertiser is to target a bunch of people to make them believe that this is actually true. If you run these kinds of ads 1,000 times, 5,000 times, you will make a reality emerge in a bunch of people's minds, and they will not see any other kind of reality. You have a situation where you will have people who believe one thing happened, Ukraine, Biden, and other people in the same city who don't believe that thing happened. That can happen if you start putting lies in ads. The law aside, the question from an ethical standpoint is this: this is about democracy. You can't have democracy without truth. You just can't. These companies, including news companies and media organizations that put lies in a headline, amplify the lie; both are similar in this space. They are born out of a democracy. The entire capitalistic innovation engine led to these firms building these successful products, and a lot of good has actually come from this. If you operate in this kind of democracy, there is more responsibility than just the First Amendment. We have to ask: what is true? If we keep hearing, we are not arbiters of truth, we have to be neutral, that is coming from a different angle.
It is actually coming because one part of this country, the conservative part, finds that if the platforms apply neutrality, then they would have to take down a lot more speech on one side and less on the left. Then there is a question of proportional takedown. There are all kinds of questions here around why social media firms do not want to use truth as a lever, and I think those questions have to be talked about more than they are in the democratic context. That is a journey that America is experiencing right now. One other thing. If we are interested in this conversation, it means we have been thinking about this. We are aware of some of the debates. But there are a lot of people out there for whom what they get is each side accusing the other of fake news and disinformation. One journalist wrote years ago about how this was done in Russia long before it happened to us here. What the technique does is muddy the water to the point where people don't know at all what to believe or what is going on. It creates a kind of paralysis that is not good in a democracy. If we want people to vote, have an opinion about things, and participate, we can't live in that soup of: we don't know what reality is, every source is biased, and a pox on all their houses. So we have to figure out how to have some sort of measured skepticism, but also trust in certain outlets and certain groups, enough to keep functioning as citizens of a democratic society. One of the groups most vulnerable to disinformation is older adults who are less tech savvy. And they vote. How can senior citizens be educated to be less gullible? [laughter] That is an excellent question. Almost always today you hear about young people. You always hear that young people don't know how to read the media and blah blah blah. I actually think we need a reverse stream of education, in which we might need to get to the young people first so they understand what is going on, all these things that we talk about.
Then, seriously, have them go and talk to the older people. I think we should not assume that the older people are not interested in finding out about all of this. Maybe the grandkids have something to teach the grandparents. One of the problems that doesn't get addressed when we speak about this is journalism and its own role. The reason older people get tricked is because there is some other sense of alienation, discontent, uncertainty, all kinds of stresses that people have. Older people are under stress. When people are under stress, they will react fast, and they will not have the usual skepticism toward things they see on their feeds. If you go upstream of that and ask what kinds of things local journalists should be saying into their actual community: the necessary community building that journalism used to do is not happening as often in areas where there are not as many news organizations. Usually, when people are in a community that has a democratic fabric working, these kinds of self-regulating mechanisms kick in. People will talk about this and say, did you see that post? I was about to share it; what happened? If those conversations are not happening anymore, it is because journalism is not playing that role there. Then you will have people just posting away whenever something makes them outraged. There is a deeper problem in the U.S. that you guys may know about, and it can't be handled at the social media level. It is upstream of that, and that has to be included as well. We are talking about the weaponization of cynicism and skepticism. Do we have time for another question? This is about labeling news content. Platforms have been pushed to label, like a YouTube banner on RT that says RT is Russia-funded. It is content neutral but seems to be gaining traction. Is this a good and sustainable solution? Alex? Alex It is reasonable. You have to be careful with the labels.
Experiments have shown that when you tell somebody that something might be false, they are more likely to engage with it. People do not like to be told what to think. So not the disputed tag, not false: something more subtle. You have to be very careful, using real research, about what language to use. I think it is a reasonable thing. There are two different things: content and identity. Certainly, labeling the identity of groups that are known to be disinformation actors or sponsored by a specific government is a totally reasonable thing. The most effective reaction has been the companies building up teams to ferret out that activity, to look at coordination between different groups, and actual collaboration with the intelligence community in the U.S. to understand who the actors are and what their technical specs look like. That has been very effective for covert propaganda. And for overt propaganda, RT, Sputnik, and other groups, that labeling is fine and it should not be that controversial. You have to be careful, but that is a good thing. Rachel Are there software tools to identify bots and disinformation, and are Facebook and Twitter implementing policies like this, flagging possible falsehoods? Alex A bot means an actual computer program doing the posting. Twitter has done a ton of work on this. Their numbers are looking a lot better. It was never a big problem on Facebook. The majority of that stuff we saw being pushed was posted by people sitting there and doing it. You do not need robots, because a lot of these actors are operating in low-cost environments, even domestic actors. Some of the biggest domestic propaganda groups shut down by Facebook were outsourcing to places like Pakistan to get lower-cost talent to create the content and push it. The companies have work around detection: a new account with 10,000 tweets in the last week means either you are on a real meth binge or it seems to be a professionally run account.
They have to keep working on that, and I think that will continue. There are hybrid accounts, a mix of bots and taken-over accounts of real people, which makes them harder to detect. I want to go back to where we started at the beginning, with a sense of urgency. I feel like we are still responding to the things we learned in 2016, and we have an election coming up, and we have all these things acting to shape our behavior right now, so we cannot afford to wait. If labeling will play a part, let's do it. It might not be perfect. We need to do things on all these levels, but we need to do them now. We can't treat this as a theoretical discussion about what will happen in the future, because if democracy is being undermined right now, we may not get another chance to do things in the future. A quick response, since we are speaking about Twitter: it announced this week that it will start labeling lies. Six or eight months ago, they asked everybody to give them input. Now they have come up with their approach. It is labeling. If I lie to my 20 followers, it will get labeled. What do you think? Alex I think we have to see. This is the problem. There is a lot of guessing about what people think will work, and then the empirical evidence actually looks different. I don't know if Twitter has done any real testing. At Facebook we did a lot of testing and found that the exact wording was key to getting people to engage with the alternate information. My hope is that Twitter has done the same testing, but I cannot speak to that. Rachel Last question, because we have to wrap up now. We saw this week also that Democratic presidential contender Michael Bloomberg, who was spending money on everything, decided to pay individual Californians ahead of the primary to post positive things about him and to text their friends with positive statements. Is this taking things into a whole new realm, or is this more of what we have already started to see elsewhere?
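Alex's example of a detection signal, a brand-new account that has somehow produced 10,000 tweets in a week, can be sketched as a simple posting-rate heuristic. This is an illustrative sketch only: the `Account` shape, the thresholds, and the function name are assumptions for the example, not any platform's actual detection logic, which relies on many more signals than volume alone.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Account:
    handle: str
    created_at: datetime
    tweet_count: int

def looks_automated(account: Account, now: datetime,
                    max_daily_rate: float = 1000.0,
                    new_account_days: int = 7) -> bool:
    """Flag accounts that are both very new and posting at a rate no
    human plausibly sustains. Thresholds here are illustrative guesses."""
    # Account age in days; floor avoids division by zero for brand-new accounts.
    age_days = max((now - account.created_at).total_seconds() / 86400.0, 1e-9)
    rate = account.tweet_count / age_days
    return age_days <= new_account_days and rate >= max_daily_rate
```

A real system would combine a signal like this with coordination analysis across accounts, which is the work Alex describes the companies' teams doing.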
And we have also seen Facebook saying they will not take down such posts, that they are accepting this as a new normal in how our campaigns work. Alex That is not totally true. They are allowing it on Instagram, and people are marking it. The problem with what they are doing now is that none of it is marked. I am not sure how Facebook is supposed to know that somebody who likes Michael Bloomberg got $2,500. The problem is there is no guidance on this, so it is not clear whether this is legal or not. I agree with Alex. The Federal Election Commission needs to do much more. Right now, I do not think we have enough commissioners to do anything. Alex It effectively doesn't exist, thanks to the fact that Republicans will not fill the board to get a quorum. We can hope for new rules, but that is not likely to happen. So what do we do in the meantime is the question, and to your point, I don't think we had run into this kind of payment to individuals at this kind of scale before. The Russians had to do fake stuff because they can't afford to pay people $3,000. It is hard to categorize this sort of thing. Is it disinformation? It is bizarre. Part of this comes out of the whole Instagram influencer culture, and sponsored advertising culture. There are a lot of things that led to this. Can we at least do a mass shaming of anybody who would participate in this? I can think of people who would hate that, and I am not advocating it, but I am worried about these new things coming out while we sit on our hands saying, I wonder what this will do to our country. I think there is an opportunity to learn from organic food, the localization of food, the fair trade industry, farm to table, and apply some of that to the content we see. For example, the Bloomberg case is an example of inorganic activity. It is an actor spending money so that me and my 1,000 friends see this kind of content, and that is not organic content. Maybe a label that says: this is not organic.
But then, that means we need a vocabulary. It means that social media has to have a way to trace the supply chain with all the intermediaries, and you have to have a way to say the content arriving in your news feed right now has these labels. Why? When you go to the supermarket and buy pasta, you see the labels: this is fair trade. We had to learn our way through that. That was a transformation. There is a responsibility there, a responsibility of actors and people. We are at the point where this is so deep and fundamental at the UGC level that there has to be a deeper evolution across the supply chain. So labels are important, but just shipping out a label saying lies and so on, I don't know if that would work. I like labels that slow people down. I think those will help. And we owe a debt of gratitude to the media for reporting on this kind of thing. There are journalists investigating. Alex The one good thing about the Bloomberg action is that the only chance of us fixing this problem is a bipartisan belief that we need to change the laws, and the only way we get that is if the Republicans lose this year because of Michael Bloomberg's money. Honestly. The only way we end up in 2021 having any bills around changing the rules, having a quorum at the FEC, changes in campaign finance, fixing our election security, and having rules around online ads is if we have a Democratic Congress and a Democratic president partially elected by a billionaire oligarch, effectively doing it from the inside. Because then we will finally have some bipartisan support that this is not how our politics should work. We have to leave it there. Thank you so much for coming out tonight. Please thank our panelists. [applause] [captions Copyright National Cable Satellite Corp. 2020] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy. Visit ncicap.org]
