
Our panelists this evening: the director of internet ethics at the Markkula Center; Alex Stamos, director of the Stanford Internet Observatory; and the director of journalism and media ethics at the Markkula Center. This evening is co-presented with the Markkula Center for Applied Ethics. This event is part of KQED's series on common ground, an initiative to bring people together for civil discourse, featuring journalists hosting provocative conversations about politics, policy, art, culture, science and technology. Reckoning with the disagreement among us about how to face a future of economic, cultural and environmental uncertainty, this series asks: what are our shared responsibilities to the common good? Next in the series, if you have an open calendar, is this Tuesday at 7:00 p.m. at the San Francisco Exploratorium, where we will look at how to overcome polarization. On to tonight's topic. Democracy is under attack worldwide. Populism is on the rise. Disinformation is tool number one, and social media is the platform of choice. What can we do about it? We can start by talking. Alex, I take it you have some show and tell to offer us about the topic? Russia is again attempting to influence the American election for president.

Alex: That is what we read in the Times. That was a briefing given to the House subcommittee on intelligence, but there are no details; we don't know what they mean by that. From my perspective, there were five different kinds of interference in the 2016 election. It is not clear if they are running any of the same playbooks or something totally different.

Rachel: Show us a few examples of what we remember from the 2016 election. Or perhaps many of us don't remember, because we never saw it in our feeds.

Alex: The KQED audience was not the target here. [laughter] A lot of the Russian influence was aimed at the left and the right. If you don't mind putting my slides up. There were two major themes in the types of disinformation, or "information operations," the term we use for an attempt to change the information environment.
It really had two big directions. The first was meme warfare: driving division by creating memes that are injected into political discourse. These are three examples, from both sides, of fake profiles and fake personas created by a private company that belongs to an oligarch in Russia. The one on the left is supposed to be a pro-LGBT group; in this case, an LGBT coloring book for Bernie Sanders. Which is the kind of funny little thing you might post with the goal of getting people to join your group and see your content, and then most of the content had nothing to do with elections or politics. It was content like this that drew people in. That would allow them to inject messages later. Secure Borders is an example from the right; a big topic for Russia in 2016 was anti-immigrant sentiment. In the bottom right, this is a Twitter account that pretended to be the Tennessee Republican Party. It turns out that the entire time, the "social media intern" lived in St. Petersburg, Russia, not Tennessee. Here is some more, mostly from Instagram. As you see, it comes from both sides. The tone of all of it is "get over it." It is from a fake Instagram account called Blackstagram. The number one topic pushed by the Russians in 2016 was Black Lives Matter. A big goal of theirs was to build African-American support for these fake personas and then inject messages about Hillary being racist and the primary being stolen from Bernie, as well as messages that might be seen by conservatives and read as really radical. I will give you guys a second to look at this before I ask you some questions. Why don't you check out this post. This is from a fake Black Lives Matter group called Blacktivist: the Black Panthers were dismantled by the government because they were black men and women standing up for justice and equality.
This kind of work was being done by the Russians not with intelligence specialists; the Internet Research Agency employees were effectively millennials with English minors who could not find better jobs in Russia. These people were not professionals. The language is not perfect. Now that I am a fake professor, I do a little Socratic method. Who thinks that this being posted by somebody in St. Petersburg, Russia is illegal? Raise your hand if you think it is. One person. You guys are right. It is not illegal for somebody outside the country to have an opinion about the Black Panthers, even if they are lying about who they are. It is a violation of Facebook's terms of service to do this, but Facebook's terms of service do not have the force of law. Is this fake news? Raise your hand if you would call this fake news. The thing here is that they're not making any falsifiable claims. This argument about why the United States government prosecuted the Black Panthers is something you might find in a freshman African-American studies seminar. It is an argument within the Overton window of political discourse, and that is what they were trying to amplify.

Rachel: Remind us of the Overton window.

Alex: I'm sorry. I am not a real professor; we are all fake professors. The Overton window is the idea of the range of acceptable discourse: the things that are allowed in any society, in this case American society, within the reasonable bounds of discourse. That window can shift back and forth based on people being on the extremes. This is a real email. Does anybody want to guess who got this email? Who received it? John Podesta. Yes. This is the email that John Podesta received, telling him that somebody had tried to sign into his account. It was sent by the GRU, the main intelligence directorate of the Kremlin. We are talking about real intel people; they like to kill people overseas.
The joke is that the person tried to break into the account from Ukraine, which is a little GRU inside joke. These guys are hilarious. This is a redirector that sends him to a site, googleaccounts.net. When he got there, it was a perfect-looking Google login. He asked one of the IT people at the DNC whether this was legit or not. Apparently that guy replied "it looks ok" but meant to say "it does not look ok," which is perhaps the most important typo in the history of the human race. He logs in and gives the GRU his password. The GRU also broke into the DNC using more technically sophisticated work than this. When they had that information, they were not releasing fake accounts. They were not releasing fake information; they cherry-picked the emails that told the story they wanted to tell, which specifically was the story that Bernie Sanders was ripped off in the DNC primary. To do so, they powered that message through real emails, where people were saying not nice things about Bernie Sanders. There is a history we cannot get into. That failed. They tried again through an organization called DC Leaks, which they were pretending was a real leak site. DC Leaks reached out to a bunch of journalists and said, here are some documents from John Podesta and the DNC, and the journalists complied. It became a real-time feed of all the most embarrassing things that John Podesta did. Even the New York Times ran, over and over again, with the stories the GRU wanted them to run with. If you go to paragraph nine or 10, it says that this could be part of a Russian information operation. That does not matter when the headline is the headline, and that is what people are reading. Some other examples of disinformation around the world: these are two real WhatsApp messages in India. In India people use the internet differently; WhatsApp is the most popular communications medium there. Something like 400 million people have WhatsApp accounts.
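Stepping back to the Podesta phish for a moment: the attack hinged on a lookalike domain, googleaccounts.net, that contains the brand name but is not Google's registrable domain. As a rough sketch of the signal a mail filter or a careful reader can use, here is a minimal check; the trusted list, the naive two-label domain extraction, and the brand heuristic are all illustrative simplifications, not any vendor's real logic.

```python
from urllib.parse import urlparse

# Hypothetical allowlist for illustration only
TRUSTED = {"google.com", "accounts.google.com"}

def registrable_domain(host):
    # Naive: last two DNS labels. Real tools should consult the
    # Public Suffix List (e.g. "bbc.co.uk" breaks this heuristic).
    return ".".join(host.lower().split(".")[-2:])

def looks_like_phish(url, brand="google"):
    host = (urlparse(url).hostname or "").lower()
    if host in TRUSTED or registrable_domain(host) in TRUSTED:
        return False
    # A brand name inside an untrusted domain is a classic lookalike signal
    return brand in host

print(looks_like_phish("https://accounts.google.com/signin"))  # False
print(looks_like_phish("http://googleaccounts.net/login"))     # True
```

The point is structural: "google" appearing somewhere in a hostname proves nothing, because only the registrable domain is controlled by the brand. That is exactly the confusion the GRU's redirector exploited.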
WhatsApp is not like Facebook, where you can post something that a million people see; you can send a message to a group of up to around 200 people. People in India are part of many groups: family groups, school groups, work groups. Messages get passed along by individuals copying and pasting messages that are injected in. The one on the left is attacking the conservative political party in charge of India, possibly supporting the Congress party, their big enemy, basically lying about what gasoline costs in other countries. The one on the right is racist propaganda about a blood donation camp being fake. Disinformation looks different there because, if you look at this, it is saying "I am from a Black Lives Matter group," whereas when you see this information coming on WhatsApp, it is from your uncle, your aunt or your coworker. Much more personal. It is harder to amplify, but in India you have groups working for political parties. The theory is that there is a group of about a million people who push this disinformation, and they don't believe it is disinformation; they just believe it is the right news, on behalf of the political party. They get notifications from the official group, and then they copy and paste it and spend all day sending it out to the other 400 million people. We are still seeing this Russian-led activity around the world. This is the report that our team wrote; you can find it at stanford.edu. What we found was a disinformation network in Africa, this time aligned with the Wagner Group, a company owned by the same oligarch, with actual paramilitary mercenaries: people who go into countries to kill people on behalf of autocrats. They are supporting autocrats on the ground with guns and with disinformation. In doing so, it looks like it is not for the foreign policy outcomes of Russia but for his personal financial benefit. In Central Africa he has things like diamond mines and the like. He is backing two of the six people vying for control of Libya, probably to get gas and oil rights in the future.
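The WhatsApp dynamics described above, small group caps plus a coordinated pool of people copy-pasting forwards all day, can be roughed out numerically. This is a toy geometric model, not data from any platform; the seed count, fan-out, and re-forwarding rate are all made-up illustrative numbers.

```python
def reach(seeds, forwards_per_person, forward_fraction, hops):
    """Rough geometric model of copy-and-paste forwarding.

    Each hop, a fraction of the previous wave's recipients re-forward
    the message to a fixed number of new contacts.
    """
    total = seeds
    wave = seeds
    for _ in range(hops):
        wave = wave * forward_fraction * forwards_per_person
        total += wave
    return int(total)

# Illustrative: 10,000 coordinated posters, each forward reaching
# 20 contacts, 10% of recipients re-forwarding, over four hops.
print(reach(10_000, 20, 0.10, 4))  # 310000
```

Even with a small re-forwarding rate, a dedicated seed pool multiplies reach hop after hop, which is why the description above of a million people spending all day forwarding is so potent, and why WhatsApp's forwarding caps target exactly this fan-out.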
One of the big changes from 2016 is that this is no longer people sitting in St. Petersburg. They reported back to people in St. Petersburg; one of the guys doing it made a mistake and posted a picture of his trip to Moscow on his Instagram account, which is kind of awesome. But the people doing this work in Sudan are actually sitting in Sudan, so it is hard to catch them. Their language is better, their cultural knowledge is better. It is multimedia. This is a whole newspaper that seems like a legitimate newspaper, mostly not about Russia and not about foreign policy; it is just a newspaper. He also owns a radio station in Madagascar. They are building the entire pipeline: they can manipulate the media, but they also create their own media, and they amplify that media online. Let's start with that.

Rachel: That is a little overwhelming.

Yes, thank you. There are so many different things to parse out in Alex's presentation. One of the first things that occurs to me is the question of WhatsApp. There are so many people around the world who are on encrypted platforms, if you will. Even though you can argue that journalists and regulators are doing a reasonable job with the platforms out in the open, they do an even less good job when the information is encrypted. Think of India's example with WhatsApp. There is a particular way WhatsApp is used in India: restaurants send out their menus to people in the communities, people in that community order stuff from the restaurant and then come and pick it up, and afterwards the restaurant owner will share some video with you on WhatsApp. There is an interpersonal, liberal acceptance around privacy, in the sense that I don't mind you sharing something with me even though I don't really know you other than through a transactional relationship with the service. In the U.S., it is a different kind of sensibility.
If I get a WhatsApp message with a video from somebody I don't even know, first of all, I may not even have a WhatsApp connection with people in that sense. There is a huge advantage that disinformation actors have in places like India, where WhatsApp is literally an interpersonal thing as well as a transactional thing at the same time. That is one thing. One of the things it helps to understand is that the Overton window existed because of the centralized paradigms of ownership in the media, from print to radio. As long as media was owned by a few organizations, 10, 20, 30, depending on where you are in the world, there was a cultural sensibility and acceptance of values built in. That all broke with social media. You and I have a microphone; we can amplify our speech. There is no longer such a thing as an accepted window of what is acceptable public speech in a democracy. That is what is broken.

Rachel: Part of the question is who is responsible for the changes in the window. It was not that long ago that social media started. When it started, people were not sharing news articles. The whole idea of social media was to connect you with your friends and your family. At some point, that paradigm shifted.

Part of the responsibility lies with social media companies, which have certain affordances. They guide you to say you should be able to use this for this or that. In 2014 Mark Zuckerberg came out and said, we want to be everybody's personalized newspaper. Facebook was not something people thought of as a personalized newspaper. Suddenly there is this whole encouragement: you should be sharing news stories. Suddenly, I am your aunt Phyllis and I endorse this message. Now it comes across that way. Some of the responsibility definitely lies with the platforms. Some of it definitely lies with us. One of the early promises of the internet, people thought, came with the advent of blogging. I remember people saying that we are all journalists now. Turns out we are not.
On social media, it turns out we are loudspeakers for other people's messages. That is a different role. We all bought into this role; we find ourselves doing it. When we talk about responsibility, we have to talk about these different layers. I found an interesting poll done last month by NPR and PBS NewsHour. They asked people who should have the main responsibility for addressing disinformation. "Addressing disinformation" is pretty vague, whether it means not producing it, not responding to it, or highlighting it. The numbers are these: 30% said the media have the main responsibility for addressing disinformation. 18% pointed to technology companies, about half as many. 15% to the government, and 12% to the public. 29% of Democrats assign the main responsibility to the media, and 59% of Republicans do. We are polarized even on who is responsible for doing something about this.

Rachel: As a journalist, you find people don't want to accept the information they receive. You say, this is it, I have the answer for the question you had, and the response is, no, that's not what I believe. As if it is a matter of opinion.

Some people are not looking for information; they're looking for confirmation, and they are looking to signal identity and to be part of a certain group. We are increasingly reading about the fact that people often know that they are quite likely sharing misinformation. They are okay with that, because that's not the point. The point is not to inform people but to say, this is what I believe. What is really interesting for the rest of us is that we have these common human weaknesses that make us do the same thing even if we don't intentionally mean to. What I have definitely learned, and my colleagues will tell you, is that I have to check myself all the time. If I find something that is absolutely the best illustration of what I believe, this just occurred and it totally confirms everything I believe, I just have to sit on it. More often than not, it is a setup.
It is designed for people like me to respond that way, to share broadly with others in that outrage, and perpetuate miscommunication and misinformation. It is often not an outright lie. It is out of context, or it is made to push a certain narrative.

Rachel: Much more like an omission. As recently as two years ago, I was among those who would laugh at politicians and regulators who were so behind the times, unable to tell their left hand from their right. Of course they would be in no position to craft laws, which would be out of date as soon as the ink was dry. Now, I don't know; the whole disinformation situation online is such a dumpster fire, I don't know that there is anybody who is on top of it, even if they did nothing but read facts all day. What can we do from a regulatory standpoint to try and control some of this? Or is that a hopeless task?

Alex: In the United States, we are extremely limited by the First Amendment. Yesterday I was in Washington, and I got home at 2:00 a.m. today, so if it looks like my eyes are closed, they might be. Tech companies have become the big punching bag for people on both sides when they say tech companies need to fix things. But for the vast majority of things people complain about, the issue is not what is called CDA 230; it is the First Amendment. Political speech carries almost no criminal or civil liability in the United States, even if it's false. The Supreme Court has said that even if you lie intentionally, in most cases it is not a crime and it cannot be punished. And most of the stuff we are talking about is something you would never be able to adjudicate as false anyway, even if we were a different country. On the regulatory side in the United States, there are not a lot of options here. Even in other countries, this has mostly been about things that were already illegal speech in the country. The most effective regulation has been a 27-letter German word that I cannot pronounce, but NetzDG is the acronym.
That is a law that requires the tech companies to enforce German hate speech law. But that is hate speech law; it is not about true or false. Even that has had some real issues when it starts to apply to things like sarcasm and comedy and the like. In the United States, the place we can regulate is not content but the mechanisms by which people can do political advertising. I would like to see rules that settle what everybody is currently guessing at: what the rules are for political advertising online. The FEC has not ruled on how these '80s and '90s laws apply to 2020 online advertisements. I would like to see restrictions on what kind of targeting you can do for political ads: basically, a minimum audience size, so you can't target just 20, 30, or 40 NRA members, or, I have seen a couple of tote bags, the multiple people in here that have given to pledge drives. And if you haven't, Rachel knows your name. She will hit you up afterwards. [laughter] You can limit it so you can't just hit members who live in Santa Clara County; you have to advertise to a much broader set of people, and you can have rules around transparency. My colleagues at Stanford who study these issues believe you can have these laws as long as they are content neutral: they apply to the actions and to what kinds of tools the platforms provide, and such. Regulating the speech of individuals will effectively never happen in the United States.

I want to take a step back to the slide you saw, with the different examples, and ask people to think. One thing common to all of them is the visuals in them. It has to do with what they are trying to do to our behavior when we see the visuals. I have ended up having to ask myself three types of questions. First, what actors or individuals are behind the post we see? We know these actors: good actors, bad actors. A second type of vocabulary is whether this is the supply side or the demand side. Where are you? Are you on the supply side?
Are they supplying you information? It could be people; it could be anything. Journalists are not the only people posting; there are all the organizations involved. The third thing is what kind of behavior is going on. They are expecting you to instinctively believe that thing, and then they are expecting you to actually share it. There is an implication for your fast thinking. There is a very famous book called Thinking, Fast and Slow, by Daniel Kahneman. The easy way to understand all of this is that they are making you think really fast. When we think fast, we act on our biases. We are not going to cognitively process what this is. The overall idea with social media posts or visuals is click, click: the idea is to click your way to really quick activity. That is the implication. It plays to what people already want to share. The answer to a post like that is to actually slow people down. In the midst of all the other ideas that you have, what can the public do? The only thing I would ask is this: when you see something that is too good to be true, or you can feel a feeling creeping up, know that feelings drive behavior, and those feelings will make you make a mistake. The first thing we all have to do is slow down. There is one line that came out of a south Indian film; I don't have to name the film. It is about an anti-corruption actor. It essentially says that for the truth to win, you need evidence, you need authority, proof. For a lie to win, you just need to sow confusion. Confusion is fast; the truth is slow. All good things take time to emerge. Look at the Antarctica example. Last week there was a story about Antarctica hitting 69.3 degrees. But if you read the stories in the Post and elsewhere, those scientists are taking great pains to say, we are not certifying this weather station yet. They hit a record high; nobody had ever heard of Antarctica hitting this high. They are still at great pains to say, we are not certifying this yet.
This is what it takes for science to even say something. It is slow and painstaking, but it is very easy to go out there and say they were pushed out of the gallery in the Senate. I don't know if you heard of Ted Cruz's tweet, which was actual disinformation; he deleted it afterwards. That should be any politician's approach, any journalist's approach: you should not just delete it, you should post a corrective tweet saying, here is what I said that was wrong, and correct it. So two behaviors are implicated. One is slow down, and the other is identify who the actors are. If you're on the demand side, we have to regulate our own behavior. If you're on the supply side, there are all kinds of regulative mechanisms and codes of ethics that kick in.

Rachel: I'm going to challenge you there. In the olden days, I would encounter ideas I disagreed with. Now I am in a silo even if I don't want to be, because the platforms I am browsing on put me in a silo. They have algorithms making sure nothing encourages me to move my eyeballs off that site. There is filtering happening before I encounter anything.

To your point, you have to look at algorithms and personalization. If I'm looking for shoes on Amazon, I don't mind that kind of personalization, based on shoes I've ordered earlier and so on. But journalistic content is a public good. You cannot take a public good, in the form of posts and other people sharing articles, and try to personalize it with algorithms; you are most certainly going to get it wrong. That is the reason people have started making value-based arguments about this. These algorithms don't know the difference between fact and fiction. They don't know. They can't tell. There has been a lot of work on artificial intelligence over the past few years, but it is only arriving now. And even now, there are technological issues. First of all, the algorithms can't tell what is fake and what is not until they are trained.
Number two, even if there were no fake news at all, personalizing a public good like news forces people to one side. You will see less of the other value system, and everybody sees only what they want to see. The alternative is asking us to diversify our own Facebook friends, but that is hard to do. For me to go friend a conservative just because I want to do that is not something that will actually happen. [laughter] I am saying it will only happen incidentally: I want to see the news that she or he is seeing because I met a friend on the road and he happened to be of this point of view. The way people need to personalize their news is to take it out of the social media picture. All of those other things are what is good to have on social media; mixing news in is the big mixup. It is a design issue they haven't even thought through. They are afraid that the minute you start saying what is news, in order to separate news out, you're going to start questioning how you define a news organization. What is news? What is journalism? Those kinds of definitional issues were not taken into account in the early stages of engineering the infrastructure. Now, when they come in, you have to define free speech.

Rachel: I don't know how we're going to define what news is and be able to draw those lines correctly. We are talking about a lot of different things at the same time, it occurs to me. The easy answer about disinformation is that you need media literacy: you need to understand what the right sources are and how to analyze the messages. It turns out that researchers are now pointing out that the language of media literacy and critical thinking itself has been weaponized and used to push disinformation. Think for yourself. Don't believe what the mass media tells you. Go out and find your own sources. Those are the kinds of things that used to be taught as a critical thinking exercise to question the established media.
Now we are finding that is what sends people off to these outliers and toward increasing radicalization. How do we respond to that? There is a researcher named danah boyd who wrote about this, a little bit in despair, because she, like many of us, believed in the notion of media literacy being helpful. I will say I'm giving my own terms to what she proposes: I think we need two additional things. One is social media literacy, an understanding of how social media works and the impact of what happens when news is shared by friends and family as opposed to some kind of outlet out there. The other is self-literacy: talking to students, and I would say all of us, about our own cognitive limitations and blind spots and how we can be manipulated. The only way to respond to and prevent disinformation is to be cognizant of all three of those layers, and they are different layers: how the media works, how social media works, and how we work. Among the challenges that were new and different in 2016 were the bots and how they amplify all this stuff. They are still there, but the social media platforms are getting much better at identifying them. We have all read many stories about how there is an effort out there to caricature the other side. That is one of the things we are seeing with Black Lives Matter. I hope we learn to distrust it: if you see something that makes the other side look really egregious and extreme, that is probably the Russians. There are some people that extreme, but they will find those few people and then try to make it seem like that whole side is that way. It makes it much harder to have those friends on Facebook and have those conversations. Just to mix all the subjects together, some of it is because of the nature of social media. Think about it: it is very rare that you find yourself in a room that includes people you met in school 20 years ago, your grandparents, your parents. But that is the social media environment.
It sounds like a funeral. [laughter]

Rachel: What we found over time, researchers call this "context collapse": you're not sure how to talk to all those groups at the same time, or who you are vis-à-vis all of those groups at the same time. With context collapse, when people who don't know each other particularly well start to argue, you tend to shut down. There is something called the spiral of silence: in the face of disorganized conflict, you're not sure how to talk, and people tend to shut down. Researchers found that people would argue less on social media, and then even at the dinner table they would talk less about some of these very conflicting issues. That actually leads to polarization, because when we are not talking to each other, we don't understand each other's perspectives. Something in the design of those platforms, and in the kind of user interaction they created, led to where we are now. One of the things that is really important is that we learn how to have constructive conflict with each other, how to talk and disagree, so that not everything ends up in a flame war, not everything ends up with exaggerations of each other's positions or ad hominem attacks. How we talk to people who come from different perspectives than ours is something we have to figure out. I think that would help with the disinformation and the polarization as well.

It is interesting you say that, because I do feel like I'm experiencing this more with in-person conversation, with men and women both: the desire to broadcast. People just want to keep talking. They are not actually interested in what you have to say, even in response to what they are saying. But guys, I'm going to ask each of you to help me stop this despair and resignation. Why don't you kick me off.

Alex: This is not a happy crowd.

Rachel: Point out a good actor, someone you think is doing something clever or effective.
It could be in the world of government or journalism; it could be platform technologists. Who is doing something we can look at and say, they are going in an interesting direction that we can follow?

Alex: A number of Nordic governments have launched efforts here. Just like with everything else, it is really rough to say the Danes are doing it, so it will work in the United States, including maybe on health care, but we will find out about that. Their numbers are looking good; there has been good feedback on that. One thing about all those countries is that they have high trust in their government media, like NRK in Norway. There are a number of countries in which government-supported media, the BBC, or Australia's ABC, is independent enough from the government but also constrained by a board of governors to stay as neutral as possible, and they have not chased the clickiness. The truth is, it is very difficult, present company excepted, to think of a media organization in the United States that hasn't become very strident in its political views. You cannot argue that the New York Times has not become more liberal during the Trump administration. Five or six years ago, MSNBC was in real trouble financially, and they found a really profitable model in being to Trump what Fox was to Obama. Part of it is that it would be great to get back to a world in which there are media outlets seen as more neutral carriers. I am not sure that is a genie that can be put back in the bottle. And I'm going to push back a little bit: I feel there is a lot of non-empirical referencing of algorithms. It is an easy pivot to say it is the algorithms that are making us crappy human beings, but it is actually that human beings are the way we are to begin with. People go on social media to be reinforced in their own beliefs. They search out the information that reinforces who they are. That is true of everybody.
You have to be really careful about saying things like "you want to algorithmically expose people to other information," because everybody who says that means the other people. Would this audience be okay with us saying, we noticed that you read too many Atlantic articles, you need to watch some Alex Jones? The mix is more even than you would think. We did an experiment at Facebook: the quality of people's newsfeeds goes down if you turn off the algorithm, because the ratio of people posting crappy news posts versus writing something about their kids or their birthday or that kind of stuff is about 10 to 1. The algorithms, even in 2016, overemphasized people posting about interpersonal things; they were meant to gauge engagement rather than resharing. So we have to be careful. On the algorithm side, there have been people looking at ways to change the incentives. We have to look at ways to slow people down, to slow the speed at which information flows in a viral context on social media. In a case like Instagram, they have done an experiment, less about fake news and more about making people nicer to each other, slowing people down as they respond to somebody in a way that sounds mean. It says, do you really want to respond this way? It makes people take a moment to think before they act. There are a number of experiments like that that might be helpful.

Rachel: Before either of you answer, I just want to pause.

I would like to push back on his pushback.

Rachel: And I noticed that some of you out there have cards. You will have questions you want to ask our fine panelists, and we have some people in the audience collecting those cards. While Irina and I are talking, take this moment to finalize them and make sure they get into the hands of one of our card collectors.

In terms of solutions, I feel that one example of a good organization working in the real world and trying to help is First Draft.
There is a simulation they do where they ask news organizations and other actors and individuals to act out a scenario: a deepfake has emerged, it looks like it is from a country's election, what will you do? All of the big news organizations get into a tizzy, because breaking-news organizations do not have the time to verify across multiple news sources. They want to compete with each other over who is going to go with the story. That is a fundamental and ethical issue with how it works. Because of the way First Draft is asking news organizations to do these simulations ahead of time, it is helping news organizations realize what the problem is with the journalistic process. You are relearning the process, and you are not the gatekeepers anymore. The process by which you build stories has not really changed. We are still sourcing, but we have a finite supply of tweets to source from, and some of those are just lies. In effect, there is a call to reinvent the way a story is built, and that is all for the good. The way we are building stories has to change. This is one thing. As far as algorithms and personalization go, I agree with what you said. It is behavior that is giving signals to the algorithms about what you want to see. But the design of the platforms is to encourage us to engage around paradigms. Ask Facebook: what is your metric for engagement that is not based on clicks alone, not based on shares? What if it is engagement based on what people have learned from a post, what people have comprehended about a particular thing? The platforms say they are not news organizations, but with distribution there is a fundamental question about what people are doing to each other with the algorithms. It is making our worse angels come out. We also have better angels when we slow down. That is why we have the emphasis on going slow.
On the distribution side of the design, I would ask you to ask of these firms: when you see something that fact-checking organizations have said is false, why are the social media platforms urging you to share it? Why are they urging you to like it? The feature called "like" is an affirmance. The feature called "share" is an affirmance. They are universally offered on all posts, regardless of whether the platforms themselves are marking a post as false. There is a need to think about when we offer these features and what features we offer. My turn. You asked about good actors and who is doing good work. I want to highlight the work of researchers like Kate Starbird and Jonathan Albright. They have been doing good work. Jonathan Albright has been tracking the network travels of information and why it is being distributed in certain ways. Kate Starbird, whom we had speaking at Santa Clara years ago, has a lab that found the same people were sharing Black Lives Matter tweets and Blue Lives Matter tweets, again highlighting the sharpest aspects in either direction. These researchers are doing really important work, and if some of the companies had listened to them years ago, we would be in a better position. As for putting the responsibility on the companies now, you don't want to get me started on the Facebook algorithm, because I am really big on choosing for myself. The idea that someone would know better for me really bugs me. I am big on autonomy. They really do encourage certain behavior. We need to talk about what we can do ourselves to build our own practices and our own habits. One of them is self-control. I was saying earlier, be aware when something seems like exactly the story that confirms everything you believe, and don't cherry-pick. I would go broader than that. I will say, if you have any doubts about a particular story, don't share it.
Your friends are not going to be less informed just because you're not broadcasting that story on some social media. If we learn some of these self-control techniques, which are made harder by the fact that the companies are putting in place various affordances to make us share as quickly as possible, we are going to stop playing a part in this whole misinformation and disinformation ecosystem. Do we have some cards? Let's get started. Alex I think we have to be careful, when we talk about solutions, about how much power we put in the hands of the trillion-dollar information intermediaries. After 9/11, we asked the intelligence agencies in the U.S. to keep us safe with no countervailing equities. That was the whole thing about bringing the CIA heads to Congress and making them cry. That is how we got Abu Ghraib and the Iraq War. That is how we got all of the oversteps seen in the Snowden disclosures. We have a tendency in the United States to feel, when something bad has happened, that we are going to yell at these very powerful entities to fix it. If we put them in charge of that, they will fix it, and then we might want to take that power back and we never will. We have to be very careful about it. It is emotionally satisfying to say that the intermediary is responsible, but then you are saying that they have to control the speech of other people, and again, nobody ever means themselves. They always mean other people. Just be careful what you ask for. I understand that argument, but my concern is that the government can't do it in the United States, and if the companies don't do it, then who? Do we just accept the status quo? Alex When you say they have to do X, it has to be followed by what the limit is, Y. And that is where something like turning off the ability to like starts to get to the point where I start to feel really uncomfortable about a ministry of information existing inside of Facebook. Let's start with this question: I am a highly skeptical Facebook employee.
Zuck has been saying that we... What is your take on Facebook's ultimate goal here? Is it responsible participation, revenue, or other? You, Alex. [The questioner pointed at Alex.] Alex He just changed his mind on this this week. I saw his interview with Ambassador Ischinger, where he said we should be regulated somewhere between the phone company and the newspaper; we are something new. That was not an accidental statement. There was an op-ed, and then there was a white paper released about regulation. Facebook wants regulation because it is a super hard thing, and also because they want to wipe out the competition. They would love the standard to be: you must be this high to carry speech by people online. And that standard of how high you have to be is one step below where Facebook is right now. This is where Google is, Snapchat, this is where TikTok is. I think he actually changed his mind on that. I think it is about competitive space. Facebook already does a lot of what we are saying it should do or should not do. It already moderates content in all kinds of ways. It already amplifies some voices and suppresses others. It is not like this would be a new task for them. It would just be more overt, and it may be applied to different speech than it is now. But they are doing it already. As we have seen with something like hate speech, if you drive it off platforms like Facebook, you will drive it onto smaller platforms where they are not doing this kind of control. Alex That is fine. You can make rules around hate speech that are controversial, but they can be based around risk. Ruling on true and false in political speech is the most sensitive thing you can do. Yes, they are doing it, but almost all of the arguments you will hear from the left are that they need to do more censorship. We need to be careful about having them make judgments about First Amendment protected speech.
How many people think Facebook should have taken down that edited Pelosi video? Well, that is something that was done on the Jimmy Kimmel show to Donald Trump over and over again. Should Jimmy Kimmel be kicked off of Facebook when he posts those videos? I don't think so. I don't think Facebook should censor Jimmy Kimmel. I think we have to be super careful about corporate control of speech where you're making fun of politicians. To say that that should be controlled by a trillion-dollar intermediary, one that pays a ton of taxes, has a bunch of regulatory exposure, and can come under the control of the incredibly powerful executive branch at any time, it is kind of nuts to say that we want them to do that kind of work. Can I jump in here? I think one of the core debates that does not happen often enough, and Alex is not new to this, is the difference between amplification power and keeping content up. Journalists have to find the truth and actually post it. Their job is being made harder by social media organizations. All kinds of speech will come in, and rightfully so. It will create confusion in our minds, and then we don't trust whether the news is true anymore, especially if it comes from a value system that is not ours. If I don't trust Fox because I am left and they are right, I will not trust Fox as a brand anyway. Likewise, on the right, if they don't trust MSNBC, they will not trust an individual exposé from that outlet. The biggest question here is if something is actually a lie and it is being posted. Let's reduce this down to political ads, paid political ads. If you ask the question, should lies be allowed, what is happening is that the goal of the advertiser is to target a bunch of people to make them believe this is actually true. If you keep running these kinds of ads 1,000 times, 5,000 times, you will make a reality emerge in a bunch of people's minds, and they will not see any other kind of reality.
You have a situation where you will have people who don't believe that something happened. That can happen if you start putting lies in ads. The law apart, the question from an ethical standpoint is that this is about democracy. You can't have democracy without truth. You just can't. The same goes if these companies, including news companies and media organizations, put lies in a headline and amplify the lie; both are similar in this space. They are born out of a democracy. The entire capitalistic innovation engine that led to these firms building these successful products, a lot of good has actually come from that. But if you benefit from this kind of government, there is more responsibility than just the First Amendment. We have to ask what the truth is. When the companies keep saying, we are not arbiters of truth, we have to be neutral, that is coming from a different angle. It is actually coming because one part of this country, the conservative part, finds that if the platforms applied neutrality, then they would have to take down a lot more speech on one side and less on the left. Then there is a question of proportional takedown. There are all kinds of questions here around why social media firms do not want to use truth as a lever, and I think those questions have to be talked about more than they are in the democratic context. That is a journey that America is experiencing right now. One other thing. If we are interested in this conversation, it means we have been thinking about this. We are aware of some of the debates. But for a lot of people out there, what they get is each side accusing the other of fake news, misinformation, and disinformation. A journalist wrote years ago about how this was done in Russia long before it happened to us here. The technique is just muddying the water to the point where people don't know what to believe or what is going on. It creates a kind of paralysis that is not good in a democracy.
If we want people to vote and have an opinion about things and participate, we can't live in that soup of not knowing what reality is, where every source is biased and a pox on all their houses. So we have to figure out how to have some sort of measured skepticism, but also trust in certain outlets and certain groups, enough to keep the ability to function as citizens of a democratic society. One of the groups most vulnerable to disinformation is older adults who are less tech savvy. And they vote. How can senior citizens be educated to be less gullible? [laughter] That is an excellent question. Almost always today you hear about young people. You always hear that young people don't know how to read the media, and blah blah blah. I actually think we need a reverse stream of education, in which we might need to get to the young people first so they understand what is going on and who all these actors we talk about are. Then, seriously, have them go and talk to the older people. We should not assume that older people are not interested in finding out about all of this. Maybe the grandkids have something to teach the grandparents. One of the problems that won't get addressed in the way we usually talk about this is journalism and its own role. The reason older people get tricked is that there is some other sense of alienation, discontent, uncertainty, all kinds of stresses that people have. Older people are under stress. When people are under stress, they will react fast and will not have the usual skepticism toward things they see in their feed. If you go upstream of that political thing and ask what kinds of things local journalists should be saying to their actual community, the necessary community building that journalism used to do is not happening as often in areas where there are not as many news organizations. Usually, when people are in a community setting that has a democratic actor working, these kinds of literacy mechanisms kick in.
People will talk about this and say, did you see that post? I was about to share it; what happened? If those conversations are not happening anymore, that is because there is no journalistic presence there. Then you will have people just posting away. It makes them outraged that they have to see something. There is a deeper problem in the U.S. that you may know about, one that can't be handled at the social media level. It is upstream of that, and that has to be included as well. We are talking about the weaponization of cynicism and skepticism. Do we have time for another question? This is about labeling news content. Platforms have been pushed to label. This is not content neutral but seems to be gaining traction. Is this a good and sustainable solution? It is reasonable, but you have to be careful with the labels. When you tell somebody that something is false, they are more likely to engage with it. People do not like to be told what to think. Unless they know it is truly false, a label is not something that registers. You have to be careful about using research on what language to use. There are two different things, content and identity. Certainly, labeling the identity of groups that are known to be state actors or sponsored by a specific government is a totally reasonable thing. The companies have been building up teams to ferret out that activity, to look at coordination between different groups, and there is actual collaboration with the intelligence community in the U.S. to understand who the actors are and what their technical specs look like. That has been very effective for covert propaganda. For overt propaganda, for that variety of different groups, that labeling is fine and it should not be that controversial. You have to be careful, but that is a good thing. Rachel Are Facebook and Twitter implementing policies like this? Flagging possible bots? A bot means an actual program.
Twitter has done a lot of research into this. It was never a problem on Facebook. The majority of all that stuff we saw being pushed, those are people sitting there and doing that. You do not need robots, because a lot of these actors are acting in low-cost environments. Some of the biggest domestic propaganda groups shut down by Facebook were outsourcing to places like Pakistan to get lower-cost talent to create the content and to push it. There is a lot of work around detection of new accounts, whether this is a real person or what seems to be a professionally run account. The actors have to work around that, and I think that will continue. There are hybrid accounts, a mixture of bots and taken-over accounts of real people, to make them harder to detect. I want to go back to where we started at the beginning, with a sense of urgency. I feel like we are still responding to the things we learned in 2016, and we have an election coming up, and we have all these things acting to shape our behavior right now, so we cannot afford to wait. If labeling will play a part, let's do it. It might not be perfect. We need to do things on all these levels, but we need to do them now. We can't treat this as a theoretical discussion about what will happen in the future, because if democracy is being undermined right now, we may not get another chance to do things in the future. Since we are speaking about Twitter, it announced this week that it will start labeling lies. Six or eight months ago, they asked everybody to give them input. Now they have come up with their approach. It is labeling. If I lie to my 20 followers, will it get labeled? What do you think? Alex I think we have to test. This is the problem. There is a lot of guessing, guessing that something will work, and then the empirical evidence actually looks different.
Before, we did a ton of testing and found out the exact wording was key to getting people to engage with the alternate information. My hope is that Twitter has done the same testing, but I cannot speak to that. Rachel We have to wrap up now. We saw this week also that Democratic presidential contender Michael Bloomberg was spending money on paying individual Californians ahead of the primary to post positive things about him and to text their friends with positive statements. Is this taking things into a whole new realm, or is this more of what we have already started to see elsewhere? And we have also seen Facebook saying they will not take down such posts, that they are accepting this as a new normal in how our campaigns work. Alex That is not totally true. They are allowing it on Instagram, and people are marking it. The problem with what they are doing now is that none of it is marked, and I do not know how Facebook is supposed to know. The problem is there is no guidance on this, so it is not clear whether this is legal or not. I agree with Alex. The Federal Election Commission needs to do much more. Right now, I do not think we have enough commissioners to do anything. Alex Thanks to the fact that Republicans will not fill the board to get a quorum. We can hope for new rules, but that is not likely to happen. I do not think we have run into payment of individuals on this kind of scale before. Alex The Russians can afford to pay people $3,000. It is hard to categorize this sort of thing. Is it disinformation? It is bizarre. Part of this comes out of the whole Instagram influencer culture. There are a lot of things that led to this. Can we do a mass shaming of anybody who would participate in this? I can think of people who would hate that, and I am not advocating it, but I am worried about these new things coming out. Will we just sit on our hands? I wonder if this is a new phenomenon for our country. I wonder what it will do to our country.
I think there is an opportunity to learn from the localization of food, farm to table, and apply some of that to the content we see. For example, if you take the Bloomberg case, it is an example of inorganic activity. It is an actor spending money so that my friends and I see this kind of content; that is not organic content. Maybe a label that says this is not organic. But then, that means a vocabulary. It means that social media has to have a way to trace the supply chain through all the intermediaries, and you have to have a way to say the content arriving in your feed right now has these labels, and why. When you go to the supermarket and buy pasta, you see this information. We can learn our way through that. There is a responsibility there, a responsibility of actors and people. We are at the point where this is so deep and fundamental that there has to be a deeper evolution across the supply chain. I agree they should label lies and so on, but I would like labels that slow people down. I think those will help. We owe a debt of gratitude to the media for reporting on this kind of thing. Alex The one good thing about the Bloomberg action is that the only chance of us fixing this problem is a bipartisan belief that we need to change the laws. Honestly, the only way we end up in 2021 having any bills around changing the rules, having a quorum at the FEC, changes in campaign finance, fixing our election security, and having rules around online ads, the only way that happens is if we have a Democratic Congress and a Democratic president partially elected by a billionaire oligarch, effectively doing it from the inside, because then we will finally have some bipartisan support that this is not how our politics should work. Thank you so much for coming out tonight. Please thank our panelists. [applause]
