Here, they provided a historical overview and a look at potential challenges that lie ahead. CNN anchor Kate Bolduan moderates the discussion. Good afternoon. Impressive turnout for an August day. Congratulations to all of you. I always say the Wilson Center has the brightest audiences, and you are bright enough to know that this is going to be a fabulous conversation on disinformation. I am Jane Harman, president and CEO of the Wilson Center, and I am delighted to be here. It is August, which means Congress and the president are on vacation. But today's topic is not on vacation and will not be, at least through the 2020 election. So the Wilson Center is not putting off this conversation. That being said, disinformation is hardly a new problem. In the midst of the Cold War, the U.S. and the Soviet Union both tried to influence narratives in other parts of the globe. For example, a new article published by the Wilson Center's History and Public Policy Program looks at translated Russian documents related to Operation Denver, a campaign in the 1980s to spread the lie that HIV was created as a result of the Pentagon's biological weapons research. So, disinformation was not invented in 2016 and is sadly unlikely to be buried in 2020, but today we delve into the context. And we have the right moderator and speakers to do that. Our first panel, which starts right now, is on historical patterns. Moderating is the very talented Kate Bolduan, the anchor of CNN's show At This Hour, and formerly the coanchor of New Day and The Situation Room with Wolf Blitzer. Kate and her colleagues have done critical work on the topic. Earlier this year, CNN created an interactive website that shows the danger of deepfakes, doctored but realistic videos showing public figures saying and doing things they never did. Since 2016, the Pentagon has spent at least $68 million on technology to detect deepfakes, but the issue is unlikely to go away before 2020.
We are also joined by our very own Nina Jankowicz, a global fellow with the Kennan Institute, who is writing a book on Russian disinformation. It is coming out next summer, called How to Lose the Information War, based on her on-the-ground experience in Ukraine. We welcome Jessica Beyer, a lecturer and research scientist at the University of Washington's Jackson School of International Studies, whose research has focused on international cybersecurity and nonstate actors, as well as online political activism. And finally, our last guest is Ginny, the director of strategic projects for cybersecurity and democracy at Microsoft, drawing on years of experience in political campaigns. She worked on Microsoft's partnership with the Iowa caucuses in 2016 to create a smartphone app for caucus organizers to report the results of the vote more quickly and accurately than before. In conclusion, in today's complex world, it is more important than ever to provide real facts and trustworthy information. Everyone in every place is being tested, including the Wilson Center. We are doing our best, and we really need our friends on this panel, including the fourth estate, the media, to do its part. So I look forward to both panels. Kate, over to you. Kate: Thank you so much. Jane laid out the issues and the topic today really well. Before you look forward, you must look back, and I think that is why we should start with history, the patterns and the trends we are seeing when it comes to disinformation. Maybe it started with Operation Denver, as Jane talked about, but Nina, disinformation is not new. How far back do you look? Where would you put the first real disinformation campaign you've been researching? Nina: I think a lot of people want to go back to the Soviet period and compare what we were seeing then with what we're seeing now. But there is an important difference. While the tactics are the same, the tools have changed.
Social media allows the information to get put out there, and it's not all fake, we have to say. It is usually grounded in a kernel of truth or real feelings, real discontent that is weaponized by bad actors, including Russia and other foreign actors. Social media allows those bad actors to spread those messages much more quickly, to travel at lightning speed, and it also allows those messages to be targeted to the very people who will find them the most appealing, and that is what makes what we're seeing today so much more difficult to counter. Looking at the modern era, at disinformation 2.0 as you might call it, my book starts in 2007 in Estonia with the crisis there, when a lot of false stories about how ethnic Russians in democratic Estonia were being treated in this new era were spread among the population. It led to riots and to some real social discord that the Estonian government had to counter. They did that not by trying to stamp out the voices of the press or counter narratives that were grounded in truth about how the Russian population was feeling, but by bringing those ethnic Russians back into the fold, by focusing on integration and education, and that's something we will come back to later today. Kate: I can promise that, for sure. You know, at this point, especially with everyone in this room, we all know what disinformation means generally and broadly. Define the term for us; this is what I am most interested in, because it is not always what it seems. What is the difference between disinformation and propaganda? Jessica: There is nothing like being asked to define something as an academic. [laughter] Most people talk about disinformation as information that is false and with an intent to deceive attached to it. Academics talk about misinformation as bad information that is spread unwittingly. As for the difference between disinformation and propaganda, there are people who also divide that out.
Propaganda is an organized effort to spread disinformation, or information that isn't necessarily entirely untrue; maybe it is the way it is talked about, and that is another tactic. You focus on disorder rather than a narrative of people protesting for their rights. So there are different ways you can look at different events and stress certain things over others. One of the things that academics are starting to point out is that we have this conversation about disinformation or misinformation, and we think that disinformation is something that a bad actor is doing, while misinformation is my aunt who's sharing a story on Facebook that is not true. But there are people who look at the social context that people are a part of. People share information not just because they think it's true, but also because it resonates with them or signals their place in a group, shows that they are part of some overall group rather than other ones. And also, if you are looking at, say, disinformation in places vulnerable to violence, people maybe are sharing misinformation and they don't know the thing they are sharing, about someone coming to kidnap children, isn't true, but that information is often grounded in existing societal hatred. So the idea that only the purveyors of disinformation are the malicious ones is not exactly true, because even misinformation can be dangerous as well. Kate: What about the tactics, Ginny? How have the tactics changed? I'm interested in that from your perspective. We're talking about how the information spreads, but how have the tactics changed? Ginny: We are talking about how things used to be, how they are now, and what has never changed. Espionage is a thing that has existed, and that is not new, but what is new is the weaponization of the information you get. It's not new that even political campaigns are hacked. That happened in previous cycles in the U.S. and in recent history.
What is new is the tactics the enemy uses when they get hold of that information. One thing we think about is anticipation. It is important to have historical context in order to anticipate and handle it, but it is also important not to spend all our time solving yesterday's problems and to think about what might come next. That's where the conversation around deepfakes comes in. I don't know how much everyone here knows about the concept, though it sounds like some might be more expert on it, but it is the idea of videos that look like someone saying what that individual did not say. There may not be a rise in it yet, but we are anticipating that it is what is coming next. How will the playbook from the enemy change, and what might be new in the next cycle? Kate: You all have talked about something that is the goal of disinformation. Is it always the same, Nina? Is it always to change hearts and minds or force a segment of people to do what you want them to do, or is it sometimes more insidious than that? Is the goal of disinformation, from Estonia to today, the same? Nina: I think what is interesting about what we are seeing from the Kremlin is that it is not to change hearts and minds. That is how I would draw the line between propaganda and disinformation. Propaganda is agitating on behalf of a certain cause, whether it is a government or a political movement, et cetera. Disinformation, particularly Russian disinformation, is aiming to create as much chaos and disorder as possible. That is a theme that goes from Estonia right through to Georgia, Poland, the Czech Republic, Ukraine, especially Ukraine, as a matter of fact. It is trying to inspire a sense of distrust in the democratic system so that people don't go out and invest in that budding new democracy. Certainly since 2016, I think we are seeing more distrust building in our democratic system as well, whether that's in the media or in our election infrastructure, or whether that is in campaigns and political structures themselves.
In that regard, the disinformation campaign we saw in 2016, which continues today, has been successful. Kate: To that point, is there a pattern? You would think that if there were a pattern, people would be wiser to it already. Is there a pattern to disinformation campaigns? Is Russian disinformation distinct from the disinformation campaigns put out by Iran? Or what happened in Ukraine, is that different from what happened essentially in the United States? Is there a common element that you find in these things? Nina: I think the common element, which is what makes it so difficult to counter, is the fissures in our societies, whether economic, or racial, or political, especially political polarization. Those are the things that all these campaigns weaponize to their advantage. It is not as simple as just deleting fake accounts; these are societal issues that take maintenance and investment from our politicians and civil servants and folks like you in the fourth estate. Kate: That is really interesting. Who are the worst offenders, Jessica? Jessica: There has been some research that has found patterns. There are more internal disinformation campaigns than external ones; for example, rather than a nation-state acting against another country, you are more likely to see internal actors, people seeking political power or acting for their own personal gain using disinformation. Kate: Within their own country? Jessica: Yes. And also people doing it to make money. You can make money spreading stories. We saw that in research out of 2016. The Oxford institute has found that you see disinformation inside both democracies and nondemocracies. It gets tricky trying to define what is disinformation when the government controls the media, but there have been some efforts to try to look for patterns across countries. I guess on the question of the worst offenders, the worst offender would have to be a foreign government trying to influence outside its borders. Who would be a worse offender than that? Kate: Russia?
Jessica: Absolutely. Also, any actor attempting to use or create a situation in which violence is likely in order to further their political goals; those are the worst offenders generally. Kate: What is it about Russia? Nina: A lot of times I get the question of why what Russia did in 2016 is any different from the type of things the United States does all around the world, and I am talking about overt operations, not what our intelligence agencies do. The answer that I give is really grounded in my background, having worked at the National Democratic Institute and supported pro-democracy activists around the world. Whenever we do those programs, they are open; anybody could come from any political party and learn how to run a petition campaign or learn how to run a political campaign. And what Russia did in 2016 was a much more surreptitious campaign that didn't get anyone to consent. Right? We didn't know we were being messaged to by the Russian government. That is what I find so objectionable about it; it was also clearly illegal. This is not something anybody bought into. And we are seeing other nation-states mimicking the Russian playbook. You mentioned Iran; we have seen some Saudi accounts, and certainly China is doing this within its own borders. It is that lack of openness, which goes against our very democratic being, that I find makes Russia most objectionable in this case. Kate: So where is there good news? Ginny: The good news is that we know some of the playbook and methods, and now it is all about education. Not to move into the territory of the next panel, but it is solution focused: being able to educate the public about what the operations look like. DHS put out a great infographic a couple of weeks ago about the "war on pineapple," which essentially took an issue that is divisive but not political, which is whether or not pineapples belong on pizza. Kate: Pineapple on pizza? This is going to get very real, because I am very pro-pineapple.
[laughter] Ginny: They took this topic and used this infographic to show, in five easy steps, what the adversaries are doing with a topic that is divisive. We have something we disagree on, so how do they take that thing and drive a wedge? I'm encouraged because I see our government doing things, and civil society organizations are doing things, to educate the public around that. The awareness is through the roof. I am not saying that we have fixed it, but it is encouraging to see that there is progress. Kate: It sounds like there really haven't been effective defenses against disinformation, maybe not many examples historically, but have you seen effective defenses? Ginny: I don't want to be a broken record, but I think the biggest defense is educating people on how to discern what they're reading, understanding the techniques, and being able to filter it better. It's also about, well, we've been talking about disinformation versus misinformation and how you define the different terms. If we can get on a similar page about what is happening... Kate: That might be a difficult thing. Nina: But people are working hard on it. I would agree that education is important, and I call it a citizens-based solution, not just looking at children when we talk about education but at voting-age people, helping them navigate the flow of information online. But something that does give me pause, from what I have seen recently in Ukraine during their presidential election, is that bad actors are now adapting their tactics to the kinds of obstacles that some of the social media platforms have put up. Kate: Post-Russia 2016? Nina: Yeah, and I am talking about this year, 2019. Rather than being stopped by the wall that has been put up, they are burrowing under it or going around it in different sorts of ways. Kate: I'm interested in that. Is there a way to describe how they're burrowing? How do they take from that Russia playbook, and what have they now grown or built upon? Nina: So, there was a lot of discussion about political advertisements in the 2016 election.
Facebook has now put a lot of transparency measures in place. There are searchable records of ads being bought, and there are rules so that outside actors and foreign actors can't purchase these ads. It has been a helpful tool for researchers and journalists like myself. Anticipating this, Russia, according to the Ukrainian Security Service, and based on activity I saw when I was combing through ads and looking at all that information in Ukrainian and Russian, is trying to rent people's Facebook accounts in order to use them as mules. They pay about $100 a month or so, which in Ukrainian terms is one third of an average Ukrainian salary, in order to use those accounts to get around the ad restrictions. The Security Service found this out and alerted Facebook, and I don't think that will necessarily stop Russia. It is against the terms of service, certainly, but this is something that worries me. Those who don't have strong political convictions and want to get extra cash, or someone in a third-world country, are going to be happy to rent out that Facebook account. Kate: It seems like a hacker for hire. Nina: It's a scary thing, and there is a lot of manipulation going on in groups. Group content is moving up in the Facebook hierarchy in terms of what people are seeing. So folks are being told, or manipulated, to share more content that they are seeing in secret and closed Facebook groups, and that is harder for researchers like me to track and notify the correct people about, and then there's also more trust in those groups. Right? So people are in a group about a politician they like, and someone says, here, share this meme. They don't know where it's coming from, but it's their trusted group. Things like that are scary to me, and it's the nonmonetized content, where news is traveling organically based on people's real feelings and worries, that is being amplified in a very authentic way. There are all these policies about inauthentic behavior on the platforms, but this is authentic. How do we educate people to
make sure they know they are being manipulated? It is much more difficult than a five-step process; it has to be much more holistic. But I am not totally dismayed yet. Kate: Just wait until the end of our discussion. [laughter] Jessica, what would you say social media has meant in terms of disinformation? Jessica: We have all pointed out that one of the things that has really changed is the birth of the internet and the rise of social media platforms. It means we take those trusted groups Nina was talking about, we put them into that structure, and then we daisy-chain them together, and it makes this really great structure for information to move. Sometimes good information, but also bad information. I'm sure everybody here has had the experience of someone you trust, like, this is a smart person, they know a lot about this thing, and they share a story on Facebook or another place. You read the headline and you share it because you trust that person. There is research out there that shows that that type of process is like the perfect fertile ground for disinformation to spread. It is also very hard to tackle. Societies need trust to function well, particularly democracies. How do we give people the critical thinking skills to ask that question? In particular, I have been doing research in Facebook groups, parenting groups, where misinformation can spread quite easily. People are very worried and maybe sleep-deprived, asking questions about their children's health. That can take you down a bad path very quickly. Also, with education, we see that being very successful in certain places. There was just a big profile of Finland and its efforts. But in a lot of the world, education systems are already very stretched, and they're trying hard to do a lot of different things. So we add this other thing, and it is a big challenge. Our own education system has some issues.
It is going to be hard to ask other countries to solve that issue as well. Kate: We know information moves; just look at the news cycle, and we can see how fast information moves. But if you had to compare, how much faster does disinformation take off today because of social media platforms versus pre-internet, pre-social media? Was disinformation alive and well then? Jessica: What's the saying? A lie will make it around the globe before the truth can get its pants on. That's amplified by how we are connected, even just the fact that you can send something faster, text somebody. For some people, the biggest social platform is their text messaging; I've heard that's the way some people communicate. Information travels faster between us, but it also means the bad stuff gets through. Kate: I have followed a lot of your writing, and you have been very critical of social media platforms and how they handled, or mishandled, disinformation on their platforms. Do you think Facebook and Twitter, do you think in 2016 they should have known better? Nina: It's a difficult question. I mean, in hindsight, of course, right? But I do think in that political moment, I understand and sympathize with what happened, because even our own government couldn't make the call to say that a foreign government was agitating on behalf of one candidate or another. I wish the platforms had been quicker to respond when it became clear what happened post-election. Solutions have been put in place, but they have been slower, and things get through the cracks. That being said, we're having a much more open and productive conversation about these things now. I think the fact that the platforms are open to a regulation conversation is good. I think it's really important that Congress take the reins there, because there have been some worrying developments out of the White House. There was news last week about an executive order about political bias.
I think we need to make sure we have a very, very well-informed conversation about how this regulation is going to work, and make sure it is working on behalf of freedom of the press and freedom of speech. I know there's going to be a conversation about this in the next panel, but we must make sure we keep all those developments that keep our democracy safe at heart, and that it is not just about this political bias question but about creating a more robust and informed discussion. Kate: Do you remember if there was a moment when you began to think that there could be disinformation at play, even if you had not pinpointed it, in Russia in the 2016 election? Nina: This is very personal, and hindsight is 20/20. I remember I was in Ukraine at the time, working as a strategic communications advisor to the Ukrainian government as part of a Fulbright grant. Russia and Ukraine were very much on my mind. I was posting on a lot of platforms about my experiences, and a lot of men in particular that I went to high school with were suddenly chiming in on all of my profiles and sharing links to RT and Sputnik about how we should reconsider the Russian government's actions. And I said, where is this coming from? Do you know this is a Russian government platform? And it must be said, RT and Sputnik are by far some of the least effective means of Russian disinformation. But that is when my spidey senses started to tingle a little bit, and certainly, it reflected a lot of the stuff we had been seeing in Ukraine, where these links to what are essentially Russian propaganda outlets were being surfaced by local actors who had a sense of discontent for one reason or another. Kate: That is really interesting. Ginny: We had a similar thing happen in the Iowa caucuses, where we were creating an app for the caucuses to provide the results to the headquarters.
So it was not actually about how the votes were being cast; it was how they took the results back to headquarters, to make sure the accurate results were received and released to the press. In the weeks leading up to the caucuses, there started to be some Twitter activity saying that Microsoft was trying to win the election for Marco Rubio. It was really weird. And of course, I was the only one paying attention to this, searching for different terms, trying to understand how all these random Twitter accounts essentially had the same talking points. They were slightly different, but not entirely different, and they were all on the same talking point, which was that Microsoft was trying to win the Iowa caucuses for Marco Rubio. Someone created a meme, like, Microsoft. It didn't really take off. I think it hit Drudge at one point. It never made it into the mainstream, but I remember talking with all of my colleagues at the time and saying, something really weird is happening here. I don't know why these random people on Twitter seem to have this idea. There is no article it is being linked back to, and where is this coming from? That's when my senses started going off that something was happening, without any context of who was originating it. Kate: I can only imagine how many different people had a sense of, this is really weird, something is going on, that kind of conversation. It's not just a public sector problem; it's a private sector one as well. When did you begin to think, this isn't just a public sector problem, it's a private sector problem that we need to be tackling? Ginny: It's a good question. Frankly, it answers the question just to compare the role I was in before the election versus the role I'm in now, because leading up to 2016, I worked on campaign tech for the company. I worked with political campaigns on how they use technology, as sort of an evangelist for tech, and that included security features, but that was not our focus. Now it is cybersecurity and democracy,
election defense, disinformation defense. So that whole team got stood up following not just the 2016 election, but what was going on in Europe and Ukraine and other places where this was clearly becoming an issue. We knew we needed to protect democratic institutions. It was important to us as a company both from a business standpoint, a business case to be made, but also because it was the right thing to do. I like to say we were clued in and were doing something earlier, but it was definitely something we focused much more on following the election. Kate: Jessica, do you see that in 2016 Russia broke new ground? I mean, from your research, did they break new ground, or was it just kind of, I hate to say it, like a perfect storm of how it all went down? Jessica: On this question of when people first saw disinformation campaigns: in 2014, I was working on a project in Myanmar with one of my colleagues. One of the things we were doing there was teaching people in political parties and journalists how to use social media as part of effective strategies. The people there were talking about the disinformation problem, that there were these rumors about Muslims that were spreading. I was there right after there was a big riot in Mandalay over these rumors. At the time, all the Americans were like, well, that's terrible for the people of Myanmar, right? We talk about the Cold War. It's not a new thing. It's just that we've seen the consolidation of people onto major social media platforms. This creates bigger channels for this information to spread. We see a nation-state actor that is very savvy and has thought very hard about how to exploit that landscape and exploit societal divisions, for instance, in the United States. We have low trust in media, low trust in government, a lot of division. This is like the perfect ground for disinformation. It's a longstanding problem, but I think in 2016, we started to see it.
But you can also track, before that, the practicing of these tactics and the refining of them, and then sort of broadcasting them more broadly across Europe and the United States, but also other parts of the world. Kate: We talk about how the intelligence community determined that Russia interfered in the election, but it did not weigh in or render any judgment on whether the outcome would have changed. How then do you measure, and you can take the 2016 example or another one, whether a disinformation campaign is a successful one? Nina: I think it is measured by whether the conversation has changed. If you look at different interventions in 2016, we had the hacking operation that targeted the DNC; we had all these memes and different sorts of social media content being shared. And again, I will make the point that the ads were the least successful part of that. If you look at the engagement on a lot of those posts, it was much higher than what was achieved through purchasing $100,000 of ads. Because of all that stuff, campaigns changed how they were talking about themselves. People changed how they were talking about the election, and the media certainly changed what they were covering. We were covering emails that wouldn't have otherwise been made public. The conversation changed. I am not of the opinion that it is productive to try to make the calculation of whether the election was swung by a foreign actor or not. I think it's disturbing enough that the conversation was altered that much, even if you just take the DNC hack and leak. There was other stuff on top of it as well. Kate: What do you think about that? I am not looking for a report card on the disinformation campaign, but is that the way to judge whether it was successful or not? Jessica: I think nobody has been successful in saying, can we count? How do we know conclusively? Can we say Russia caused this many votes to go this way or that way? I agree that it is not a very fruitful way to think about disinformation.
I think what we can say is that we can see how information is moving, and we can understand the ways in which certain types of platforms are being used to spread that disinformation. We can see people reacting to certain kinds of memes on Twitter. Twitter is a great platform for misinformation because it is a smaller platform, but politicians, journalists, and academics, all the people who amplify information, are there. So it is a great place for "Microsoft is working for Marco Rubio" [laughter] to try to get someone's attention, versus Facebook. Even now, we don't know exactly what is being spread. My sense is, can we qualitatively or quantitatively say what exactly happened? Probably not. But we can say that we are in a different landscape, in which you have organized actors working to spread disinformation. Nation-states have a lot of resources. We also see places like the "politically incorrect" boards on 4chan and 8chan that are putting out disinformation for fun. And we have an infrastructure that facilitates that. So we are in a different landscape. Even Russia does it. It is a democratized tactic. Ginny: A good question is, what was the goal? If the goal was sowing chaos, and we can agree that that was one of the goals, I think it was successful, if that was the metric by which you were measuring. Whether they had other goals in mind, and whether the election changed or not, is a different topic, but on the big goal of sowing chaos and confusion, changing the conversation, I think we can all agree they succeeded on that one. Kate: Jessica, what is more dangerous, a disinformation campaign from a government, like a nation-state, or one from a nonstate actor? And is it always clear who's running the operation? Jessica: No, it's not always clear. Governments, Russia as an example, will use nonstate actors, and what those relationships are is not entirely clear. They could be hiring a marketing firm. They could be putting pressure on someone to do something.
That is one of the attribution issues we have in cybersecurity in general. How do we know when it is a nation-state doing something versus a criminal within their borders? If it is just a criminal, what type of responsibility does the government have? What does the international community do in the face of that? It is a tricky cybersecurity issue that plagues disinformation. Nation-states have resources that most other actors don't have. If a nation-state focuses its will, if we're going to personify it, on doing something like that, it can throw a lot of resources into it. Not just resources and money, but also time, people, talent. And so I would say that in the broad scheme, they are the most dangerous. But inside countries where we are dealing with nonstate political actors using disinformation, that's also quite dangerous as well. Sowing chaos and undermining trust in institutions, that is one probable outcome of a well-executed disinformation campaign. But if you look inside other countries and you see a particular political faction spreading disinformation, that can cause people to die, right? Which we haven't exactly seen in the United States. Nation-states, I always think, just have a lot of resources. Kate: What do you guys think? Nina: I think Jessica brings up a really good point about local and political actors, and that's one of the reasons addressing the disinformation problem in the United States has been so difficult. We have seen so many domestic actors using the same tactics that Russia is using. I think this is where the social media platforms get into a bit of a quagmire. In the United States, we care very much about our freedom of speech, but in Europe, they are very happy to write a rule or some sort of law that takes away those rights from people, and people are mostly happy to give them up to some degree. But here in the United States, it is a hard one that we are not going to give up.
The answer I always have for this is that in the United States, the platforms are not public institutions. They are not public squares. They have terms of service you sign up to. I don't think most people, when they sign up to share pictures of their kids and their dogs, are thinking about that. We need a better model of informed consent, so that people understand what they are signing up for and know they are not allowed to spread hate speech, and that's defined pretty clearly. How we get that out there is something that's up for debate, and I am sure it will be discussed later: whether it is a government body that works with the platforms to decide those things, or whether the platforms get together, as they have when dealing with terrorist content, to set standards and rules. Because the same rules that make Russian or Iranian or Saudi disinformation something we don't want on the platforms can also be applied to Americans. That is something we need to get more comfortable with and set guardrails around.
Kate: Can't we just make an app for that? [laughter]
Nina: I don't know.
Ginny: Can I jump in? I actually think the platforms have been investing a lot in algorithmic detection, using computers to detect violations of terms of service, and I think we have seen a lot of disturbing content slip through the cracks. Think of the Christchurch shooter; getting that content offline was a lot more difficult than I think people anticipated. There were instances in Ukraine where I saw content that definitely should not have been up kept popping up over and over, the same exact picture and the same exact text, from somebody who had been repeatedly posting that content. And it was not removed. In cases like that, I think we need investment in people; we need humans looking at the content. Yes, that is more expensive for the platforms, but these are billion-dollar companies. We need people with cultural and linguistic context.
You were talking about Myanmar; that was one instance where we needed people who understood the local context. Again, I think this comes back to a human-based solution to these problems, because these are people problems at their core.
Kate: What is the biggest challenge in protecting against hacking? That was one element of the attack coming from Russia. What are the challenges you guys are up against most of the time?
Ginny: A lot of people still don't think they are a target, in part because they don't understand the ways attackers will try to get to them.
Kate: Like the difference between phishing emails or something like that?
Ginny: Or something like password spraying. All an attacker needs is your domain, and they can fire away.
Kate: Talk to me like I don't know what you're talking about. [laughter]
Ginny: Essentially, someone just knows that cnn.com is your email domain. There are ways to send a bunch of phishing emails to essentially anyone at that domain, without needing a sophisticated phishing attack against you personally. Those exist, and that kind of social manipulation is pretty scary. But there are a lot of people who are, say, a junior staff person on a congressional campaign, and they think, no one is after me. It's security by obscurity: I am not someone anybody knows, no one is targeting me, as long as I do my job. So we spend time talking to political campaigns around the world about basic cyber hygiene, like what two-factor authentication is.
Kate: I am sure largely they don't even have it.
Ginny: A lot of them don't. They think it is really hard. It is really not. We spend our days sitting down with political campaigns, with their phones, walking them through two-factor.
Kate: Post-2016?
Ginny: Yeah. That happens a lot. They don't do some of the basics, and sometimes it is because no one is teaching them.
Kate: They think they will not be a target? They still think, it's not going to happen to me?
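The domain-level phishing Ginny describes, where an attacker only needs to know an organization's email domain, is often paired with lookalike sender domains. A minimal defensive sketch of flagging such senders is below; the trusted-domain list, addresses, and similarity threshold are illustrative assumptions, and real mail filtering relies on many more signals (SPF, DKIM, reputation data) than string similarity.

```python
from difflib import SequenceMatcher

# Illustrative allow-list; a real deployment would use the organization's own domains.
TRUSTED_DOMAINS = {"cnn.com", "microsoft.com"}

def suspicious_sender(address, threshold=0.8):
    """Flag senders whose domain closely imitates, but is not, a trusted domain."""
    domain = address.rsplit("@", 1)[-1].lower()
    if domain in TRUSTED_DOMAINS:
        return False  # exact match with a trusted domain is fine
    # High string similarity to a trusted domain suggests a typosquat.
    return any(SequenceMatcher(None, domain, d).ratio() >= threshold
               for d in TRUSTED_DOMAINS)

print(suspicious_sender("alert@cmn.com"))       # typosquat of cnn.com -> True
print(suspicious_sender("friend@example.org"))  # unrelated domain -> False
```

The threshold is a tuning knob: too low and unrelated domains get flagged, too high and subtle typosquats slip through.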
Ginny: It is not like that everywhere, but we do run into resistance from folks who just don't think they are a target. Speaking of the Myanmar example, it kind of made me laugh, because we do meetings with countries where they treat us like Myanmar: like, that stinks for you, America, but that's not going to happen here in this country. When it absolutely can and will happen. We are trying to get them to think not just of themselves as the target but of their democracy as the target. That's the biggest challenge, on the small level as well as the broad: just awareness that they should be protecting themselves and doing the fundamentals. If they do the fundamentals, turn on two-factor authentication, don't use the same password for every account they have, use a password manager tool, they would be protected from something like 92% of hacks.
Kate: If you were a betting woman, what are the chances, is it a foregone conclusion in your mind, that a presidential campaign or congressional campaign will be hacked?
Ginny: I think it goes back to, what is the playbook this time around? We've been tracking nation-state attacks against our customers in this political space. We have a program that they can opt into that allows us to give them beneficial information, but also allows us to track what is happening. So far we have not seen a lot of attacks against political campaign customers. We have seen a lot in the think tank space and in academia. Those are the areas where you are seeing it right now.
Kate: Do you have a running assumption of why that is?
Ginny: It is similar to what we saw in previous cycles, trying to get information from policy think tanks.
Nina: It is also important to note that for an attempted hack to be successful, it doesn't actually need to get into the system. If the goal, like Russia's, is to erode trust in the system, right, they can just rattle the door handle; they don't need to open the door all the way.
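The two-factor fundamental Ginny lists first is usually implemented with time-based one-time passwords (TOTP, RFC 6238, built on the HOTP construction of RFC 4226), which is what authenticator apps compute. A minimal standard-library sketch, using the RFC's published test secret so the output is reproducible:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Derive a time-based one-time password (RFC 6238 over RFC 4226 HOTP)."""
    key = base64.b32decode(secret_b32)
    # The moving factor is the number of 30-second windows since the epoch.
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# Base32 of the RFC test secret "12345678901234567890"; a real account has its own.
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(SECRET, t=59))  # → 287082 (RFC 6238 test timestamp)
```

Because the server and the phone share only this secret and the current time, a stolen password alone is not enough to log in, which is why this basic step blocks so many account-takeover attempts.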
That is what I think they did in 2016, with the attempted hacks of our election infrastructure. Now we know that all 50 states were targets of attempted hacks. What does that say to an average voter who is already worried that their vote is not necessarily being counted, or who thinks their vote doesn't matter? That's what's really worrisome to me. We can put up the strongest hermetic seal around our information environment, but once those attempts have been made, that is what can inspire the distrust that is really difficult to overcome.
Something Ginny said, the security-by-obscurity concept, I think can apply to a large part of the problem, beyond even hacks. I think it might get to the core of why these campaigns are effective or not.
Jessica: We don't have good studies, maybe Microsoft does, on how effective these phishing campaigns are. You don't have to target that many people within an organization, fewer than 20, before someone will click. In academia, there are constant messages like, don't click it. And I also think targeting academics and think tanks is another way of getting at information sources that have been seen as credible or trusted. If you're trying to undermine trust in the system, you go after organizations that could be a trusted voice talking about what's really happening. So it is not a surprise. Our chief information security officer, our university, gets hit all the time.
Kate: We just have a couple more minutes before we open it up to questions. We talked about deep fakes early on. Some of my colleagues have done some really interesting work explaining what deep fakes are: hyperrealistic manipulations of video and audio content that are very hard to debunk and detect, especially with the naked eye. It is interesting and terrifying. You had said that you were anticipating it becoming a problem.
It almost seems like the problem of deep fakes is knocking on the door. What could that mean, let's just say in 2020, if someone really started working on it?
Nina: We have already seen cheap fakes, where you aren't necessarily changing anything; you are just slowing down the video to change the perception of what is happening. You can manipulate video in ways that don't require using AI or synthetic media. We are seeing the effect of that simple application; imagine what happens when they take it to the next level. Then there is the question of how you address that from a detection standpoint. To your point about being able to pull images off because they essentially have a DNA to them: we have figured out in many ways how to identify a picture and quickly grab it and pull it down. It keeps getting harder, especially with video being edited. In the Christchurch situation, it was in part because the video kept getting edited in so many different ways, and that made it hard to identify quickly and pull it down. That's one example of what we're going to see coming with this technology. There's not a lot of research and funding going into countering this kind of technology; most of the research has gone into creating it.
Kate: I think DARPA is doing some, and UC Berkeley is also interested in that, but it is not there yet.
I am worried about deep fakes, but I am also worried about the very, very simple content that is still having huge impact. We're still seeing these memes that look like garbage getting shared millions of times, and people are like, yes, this is clearly true; Albert Einstein definitely said this quote. [laughter] That's really scary. That goes back to very simple education, five steps for how to identify whether something is true or false.
Jessica: I was thinking about how, when you study online communities, a quote from a famous person gets spread for years. Imagine if you had a video of them actually saying it.
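The "DNA" of images mentioned above usually refers to a perceptual hash: a compact fingerprint that, unlike an exact checksum, survives small edits, which is why re-edited uploads are the hard case. A toy sketch under simplifying assumptions follows; production systems (PhotoDNA and similar) downsample real images and use far more robust transforms, and the 4x4 pixel grids here are made up for illustration.

```python
def average_hash(pixels):
    """Perceptual fingerprint: bit i is set iff pixel i is brighter than the mean.
    `pixels` is a small grayscale grid (e.g. 8x8) assumed already downsampled."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming(h1, h2):
    """Number of differing bits; a small distance means likely the same image."""
    return bin(h1 ^ h2).count("1")

# Toy 4x4 "images": the second is the first with slight brightness edits,
# the kind of change that defeats an exact checksum but not a perceptual hash.
original = [[10, 10, 200, 200], [10, 10, 200, 200],
            [200, 200, 10, 10], [200, 200, 10, 10]]
edited   = [[12, 11, 190, 205], [ 9, 13, 198, 201],
            [205, 195, 12,  8], [199, 202, 11,  9]]
print(hamming(average_hash(original), average_hash(edited)))  # → 0
```

A platform can keep hashes of known violating images and flag any upload within a small Hamming distance; aggressive re-editing (cropping, overlays, re-encoding at new speeds) pushes the distance up, which is the cat-and-mouse problem described with the Christchurch video.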
I feel like we don't have the tools to deal with that, and it is pretty scary.
Kate: Something to look forward to. [laughter] Let's open it up. There's a lot of questions. Jane?
Jane: I am Jane Harman, again. I should have pointed out we have an all-female panel. [applause] So, Kate, you got to ask all the questions, but I don't think anyone asked you specifically about the responsibility of the media. Nina was talking about cheap fakes, one of which I assume was the slowed-down video of Nancy Pelosi speaking, suggesting she was inebriated. When you saw that at CNN, what was your responsibility in terms of putting it on the air, or not putting it on and describing it?
Kate: It speaks to how fast the news cycle works. I mean, there are moments when things go from, I can see it on my Twitter feed while I am on set, to, they get in my ear and say we need to go with this. It can be a matter of seconds. That is just to say how fast it actually can move, and that comes with a lot of risk and responsibility. With that video specifically, there was a real conversation, and it happened really quickly: what is this? Find the source. Then it was called out almost immediately, called out on the air as what it was. But it speaks to the larger issue, especially in the post-2016 world, of the responsibility of media organizations, just like any citizen, to make sure we know the source of the information, and to take time. I think there is a collective effort of taking a breath and being responsible and thoughtful about what information we are putting out: how much we are amplifying one particular narrative, or one particular story, or one particular soundbite; how much priority and weight we put on one story in a 24-hour news cycle when things are often repeated. There's a real conversation about it. Do we do it perfectly? Absolutely not. No one does.
But there is a real awareness, and there has been a lot of soul-searching and a kind of collective postmortem, not just at CNN but at every media organization, on how to do better going forward, for sure. However you want to do it; whoever you are closest to, just hand them the microphone. [laughter]
Audience member: Thank you very much for being here on this panel. My name is Cindy Garcia and I have two questions. The first: from your research, who do you believe are the populations most vulnerable to disinformation? Have you recognized any patterns? The second: what are some actual, concrete educational tools that we can use now to address these issues? Thank you.
Jessica: Yeah, so disinformation flourishes in low-trust environments: anywhere there is low trust in the media or low trust in the government, anywhere there are no shared information sources that are credible. In any country where people believe there is one information source that, even if they don't always agree with it, they think is pretty much credible, there will be more resilience against disinformation. Then there are places with any type of societal cleavage, which probably describes every place, right? But in some places those cleavages come with a history of violence, a history of injustice. We are also talking about lower-trust environments, places where people have lower literacy rates. Often we talk about digital literacy or media literacy or information literacy, but humans are usually very savvy consumers of information. In the Myanmar case again, prior to the transition starting around 2011 to 2013, people didn't trust the media, but they read it so they knew what to say if they were asked. They weren't stupid or illiterate. They just had different strategies in that context. Different types of strategies can lead to different, and more dangerous, outcomes in some places.
Nina: On tools, I always talk about reverse image searches.
Just a show of hands, how many people here know what that is? A little less than half. That's actually really good. A reverse image search is something you can do on almost any search engine in your browser. It can show you the first known instance of that image on the internet. A lot of times you see images recycled; for instance, the Russian government will use pictures from other conflicts and wars to show really terrible scenes supposedly in Syria or Ukraine that are actually from years and years ago. You can right-click, if you are using Chrome or whatever browser you use, and search for the image, and it will show you the earliest instance of it. That's a really easy way, if your spidey senses are tingling, to ask: is this picture really what I'm seeing? Should I believe my eyes? There are a couple of other methods like that which I think are easy for anybody with a modicum of computer literacy.
Audience member: Good afternoon. I'm with the National American League of Press Association. I mentioned I am a veteran; I worked with units in the United States Army. Disinformation is a tool that is always going to exist. Hacking is a way to collect information. In cyberspace, different intelligence services gather information this way. The United States has all this information; Russia has its own. We have nothing to copy from the Russians. Now, look what happened in the Obama administration: they carried out strong disinformation against the campaign of Trump. Now let's be clear: what is disinformation, and what is misinformation? Misinformation could be a mistake. But disinformation, it is crystal clear, is done on purpose. And you have black information, which means all lies, and gray information, which is a combination of some aspects that are true and some that are a lie. My question is this.
It is very sad that media like the Washington Post, the New York Times, the different networks, CBS, NBC, apply disinformation in the communities against one political party and in favor of another political party. That is the big problem. Because information from the media should be crystal clear; it should inform and educate us properly. But unfortunately, it has become very popular in the 21st century to use disinformation within the media. Why is that?
Kate: I want to make sure I understand your question clearly. You are saying the media is using disinformation. Can you give me an example?
Audience member: The biggest example I would use is what happened with the Mueller report, what happened against the campaign of Trump. The whole media was involved, and not only that, many intelligence agencies and federal agencies of the United States government.
Kate: Working against Donald Trump is what you are saying. I'm going to disagree with you that the intelligence agencies were working against Donald Trump. I think the intelligence agencies have laid out very clearly that they were not working for any side; they were gathering information about disinformation. And I don't know of an example of CNN knowingly amplifying disinformation to benefit any campaign. I can promise you, I don't know if you watch my show or some of my colleagues' shows, but that is the last thing that I wake up every day to do, the last thing that I want to do. Thank you so much. Who else?
Audience member: National Democratic Institute, Myanmar. You mentioned Myanmar a couple of times, specifically the need to counter information that has been used to stoke religious and ethnic violence.
But I didn't hear any mention of the current internet blackout, and I wanted to know whether, on the other side, disinformation could be used as an excuse to write laws that put a crimp on free expression, like you see in Europe, or to move toward censorship, like what's going on in parts of Africa.
Yeah, no, I think this is an excellent point. In Myanmar, and I don't think any of us mentioned it, we also have a case where the government itself is one of the actors putting out disinformation. So I think this is one of the things about this: any issue that has to do with speech or expression is always going to be very messy, right? The questions we ask about what type of speech should and should not be permitted are never going to come to a comfortable resting point. The use of internet blackouts is very common around the world and is becoming increasingly common, not just around times of societal unrest but also, for example, during national exams. Using an off switch like that, which effectively stifles a huge percentage of the expression that is going on, is something we are seeing all over. And then in other democracies, where the equation around freedom of speech is slightly different than here, we see different types of laws appearing. As Nina mentioned, we see that in Germany and Australia and all over. What we see in other countries are different ways in which those rights are being shaped, because, as we know, no right is pure; there are always tradeoffs to be made. But I would absolutely guess that the use of things like internet blackouts is only going to increase, both as a perhaps well-meaning attempt to stop flows of information that people think are leading to violence, and as a social control mechanism as well.
I have an example from Ukraine, if you want to learn more about this.
I wrote a piece for The Atlantic about this discussion and the fine line between trampling on freedom of speech and protecting it. Ukraine has been dealing with this question since 2014, when the Maidan revolution began, and they started talking about blocking a lot of websites. They blocked Russian social media networks. On one hand, they are saying, disinformation is spreading there, people are being recruited to fight on the separatist side, the Russian-backed side of the conflict. But there are a lot of worries when you look at it from a democratic standpoint, as I know someone from NDI would, because what if the government changes, and somebody then has the power of the pen to write a website off the internet, in Ukraine or Myanmar or another country? We don't want to use the tactics that the purveyors of disinformation are using in order to protect ourselves from disinformation.
Audience member: Up here. I am a fellow here at the Wilson Center. I have a very particular question for you. How do you address this problem in terms of remedy? In Romania we had a lot of problems not only with the propaganda but with money laundering; this propaganda is paid for by crooks. I have thought about two possibilities. I have identified the domains and the administrators of the domains, and I am trying to find the beneficial owner in order to prohibit the domain, and to think of a way of bringing these actions at least before the prosecutors. But in countries like Romania or Russia, in Eastern Europe, we usually have corruption. I am thinking the propaganda also affects your financial integrity, because they are paying. So I am wondering if you have some best practices to share with me in terms of remedies. Because, as we have all seen, the scholarship is pretty clear on that side, but in terms of remedies, we have to find a solution. Thank you.
Nina: It's a big question. Certainly the money laundering question is intrinsically connected to disinformation.
In Georgia there were protests in June about Russian influence in the country. Of course, Russia has annexed part of Georgia illegally, but what people were protesting was a creeping Russian influence through finance, which is then replicated and amplified through disinformation. Somebody importing wine into Russia would be told, why don't you put these stories in the news outlet that you own? There are a lot of problems with a situation like that, and I think the answer is more transparency. I'm not a lawyer, but I think transparency around ownership of domain names, of companies, and of financial flows is something we can all aspire to and do better at, and the United States and the UK have a big role to play there that we have not yet stepped up to.
Kate: Anyone else? OK, right here.
Audience member: Thank you. The American Security Project. I am doing a lot of research on disinformation and the U.S. role in countering it, and doing a lot of soul-searching on the ideas as well. I want to come back to a theme that has echoed throughout this panel. Nina, you started out by saying basically that disinformation is targeted at those who find it most appealing. There is research in psychology suggesting that people are in some ways fundamentally receptive to disinformation; even when told, during basic scientific experiments, that what they had been exposed to was disinformation, that it was false information, they still don't change their minds. So how do we deal with this when we are coming up with solutions and realizing people are in some ways fundamentally resistant to facts?
Nina: So this is where I get very cheesy and talk about civics.
The internet has to some degree taken away civility from debate. As an alumna of my high school debate team, I think the more we can inspire not only young people but people of all ages to have conversations face to face, where that emotion is kind of taken away, where you have to measure what you're saying and understand that the person on the other side of the Twitter or Facebook profile picture is a human being, the more we will get back to a really robust and productive conversation; something that is sometimes taken away when we have a computer screen between us. Inspiring those conversations is something that governments and politicians have a role in playing. Getting out and speaking to constituents face to face is important, and so is learning how to talk about this stuff, which I hope is something I am going to be working on in the coming months here at the Wilson Center: giving politicians the tools to do that sort of work, and giving people those tools as well. That's part of media literacy, part of digital literacy: remembering that people are humans and that we can have these conversations, whether through a computer screen or in an environment like this.
Kate: But it does get to the impact question. Even when people are told, you've been fed disinformation, how do you erase that from them? That's one of the problems: how do we solve that, right?
Jessica: I think through, as Nina is saying, a big, systematic, long-term effort, which isn't a very nice, easily packaged policy answer. But as somebody who teaches a lot of students every year, one of the most interesting things that can happen as an instructor is to watch students come into my class with a bunch of preconceived notions, which we all have. We all believe things about the world. We believe things about how the world is ordered.
Instead of me telling them that this is wrong or that is wrong, I have them do exercises and work that allow them to unpack their own beliefs and understand where evidence comes from, how we produce knowledge. You see students have these moments where they change their minds about things they previously believed, and they just get lit up with a sort of secret-superpower feeling about trying to understand the world. So I think the education system itself, building that work in at every level, which a lot of people are doing, is about teaching people how to take the verification of information into their own hands. Certainly you're right, the studies say we don't believe it when we're told something is not true, and we often double down. It tells us that our trusted networks and our immediate community are what make the difference.
Ginny: When I talk about why our programs exist and why we do what we do, I say this is a multipronged approach: government, civil society, academia, tech. Because the reality is, it's messy. There is not a simple solution to that question or to any of the other ones put in front of us. But there are little micro roles we can all play. We can get away from the screen and interact with people different from us. We can learn from our professors and the people around us. One thing we think about is how to make sure that when someone does a search on our platform, they get accurate information. That is just one thing we can do; it's a role we can play. We can make sure, when someone is looking for where they can go to vote, that the information up front on the platform is the location where they should go to vote, and that we've worked with the secretaries of state's offices, or whoever, to make sure we are actually putting that information out there.
There are practical things that can be done, and there are some philosophical solutions, but the reality is it's going to be a combination of groups and individuals acting in all of these ways to try to get there.
Kate: Thank you guys so much. Thank you.
Jane: The show continues. A new panel is coming up, but wait, there's more. New nametags. We're welcoming one male to the panel. It's OK. We can do it. Listening to the last conversation, which was not cheesy, I was thinking about why the Wilson Center matters. Here comes the softball. Because we have panels like this, with global and local points of view, on the panel and from the audience. And we're open-minded, and we try to put out there not one right answer, but a number of policy options that people should think about. And if we get you thinking more about things, we are doing our job, and I think it's a job that is desperately needed in this town, where people seem to crouch in the corners and lob grenades. As I have often said, from my nine terms in the United States Congress, I learned a lot. I was honored to serve there, but when I had the opportunity to come here, I realized that so many of the things I wanted to do in Congress were better done right here. Because what I wanted to do was think seriously about the hard problems, and have people to help me with that. Because we all need help, or certainly I needed help and I still do; and then to come up with some things that would really improve, I hope, the quality of discourse and the quality of life around the world. And that is what we try to do here. It's a little ambitious, but we do pretty well, and I am very proud, just to let you know, that we are rated number one in the world in regional expertise. That's a high honor for the Wilson Center, and it is earned every single day by the very smart people who run our programs.
Speaking of which, this program, in case you missed it, is part of our Science, Technology, and Innovation Program, headed by Meg King, who is hiding out in the corner and is furious that I mentioned her and touted her. But she is another of the very bright, informed, and caring people who are trying to make our world better; "idealistic" would underrepresent what she offers. So back to the current topic. Now we go from the context that exists to the challenges that we face. Our second panel will focus on the evolving threat of disinformation, which, as you know, we have not eliminated, including current challenges to social media platforms. You all now know what a deep fake and a cheap fake are, right? Raise your hand if you don't know what a deep fake and a cheap fake are. Come on, guys. We did this together. Very smart audience. In addition to deep fakes, state and nonstate actors nowadays infiltrate group chats on social media, and they use, get this, fake online identities to seamlessly blend into a user's newsfeed. Got that? What that means is, for example, that in Ukraine, Russians have attempted to rent the Facebook accounts of Ukrainian citizens so they can post political ads and circumvent Facebook's regulations. Joining our second panel is Katie Harbath, a director at Facebook and the former chief digital strategist at the National Republican Senatorial Committee. Welcome, Katie. And we thank Facebook, by the way, for supporting this event and future events that we will hold on this topic. Also joining us is David Greene, civil liberties director of the Electronic Frontier Foundation, a very capable nonprofit dedicated to digital privacy and free speech. He brings significant experience litigating First Amendment issues in state and federal courts. So now we have a few more people to sit down. Drumroll. The second panel begins. Back to you.
Kate: Yes, you guys are still stuck with me, sorry. So we've discussed, and you guys were watching as well, we know what's happened. We know what's happening. Why is it so hard to prevent, to stop, once it is identified? What are the challenges of tackling disinformation today, especially on social media platforms? And the challenges of regulation: should there be regulation, who should be regulated, who should be in charge of making these decisions? These are really big topics, and these are two of the people at the forefront of them. Thank you both for being here. Katie, first to you. Facebook faced criticism after 2016 for being late to the game and slow to respond. What would you say you've learned from 2016 and since? Where does that put the challenges for your platform right now?
Katie: Yeah, no, thank you so much to the Wilson Center for convening this and working with us on these issues. These are definitely ones we cannot tackle alone. We are a completely different company than we were in 2016. I've been at the company now for eight and a half years, and I've not seen such a big shift in the work and the focus of our company on a topic like this since we did the big mobile shift in 2012; I would say this one was even bigger. Those changes include more expertise: expertise in everything from cybersecurity to threat intelligence, to also just local expertise on the ground. Because among the problems that countries face, there are some similarities, but there are definitely unique ones for every single country. Sometimes it's not foreign interference; sometimes it may be domestic interference that we are looking at. It could be across different platforms: Facebook, Instagram, WhatsApp. We want to make sure, for each country's election, that we are doing the right threat analysis to understand what we face and what the right ways of mitigating those threats are.
One of the things we knew going in, and we have definitely seen, is that there is never going to be a finish line. There is never going to be a point in time where we say, we solved it, let's move on. We have gotten much better in terms of cracking down on fake accounts, adding more transparency, et cetera, but as the bad actors are blocked in one area, they move to other areas. We are looking at pages and groups, at what is happening on Instagram, for instance, and at what the right mitigation strategies are for all of those. As we go into the U.S. 2020 election, one of the things we are doing is brainstorming: what are the different ways we think people might try to exploit the platforms, what are the mitigations we can put in place, and how can we be prepared for what is expected? The biggest challenge for us, particularly when you think about this domestically, is how you actually define what is misinformation and what you should actually do about it. At Facebook, agree or disagree, and this is a debate I think we should be able to have, our view is that I should be able to post something false, but I don't have a right to have it amplified. So for much of that information we don't take it down, but we reduce its virality and add related articles so you can see alternative viewpoints. Not everyone agrees with the fact-checkers we work with. Not everyone necessarily agrees that they are neutral, and not everybody agrees with the fact checks. Oftentimes stories are a mixture of truth and falsehood. How do you think about those? What are the penalties we should apply to people? We are trying to learn as we go, and trying to be doing something. That is a big challenge for us, because there is still a lot of disagreement overall in terms of how we should handle that content. Kate: I've got a million questions. I really do.
David, I first want to get your take. You come at this whole conversation with a very healthy skepticism of government regulation and control. That's fair to say. The warnings are all out there about how American voters are going to be facing more foreign disinformation, not less, in the coming election. Someone kindly mentioned the Mueller report, but I will read one important part from the Mueller testimony. He said they're doing it as we sit here, that over the course of his career he has seen a number of challenges to our democracy, that the effort to interfere in the election is among the most serious, and that it deserves the attention of every American. You are one American. What do you see right now? David: I think we have to be really careful before we embrace a role for government that either decides truth or decides who can speak and who cannot. Election interference is one of many legitimate harms we have had to confront in our democracy where we have had to make that judgment, or have had to say that the government has a limited role. I don't think government should do nothing about election interference, but what I don't think we want is government being the ultimate arbiter of truth, which I think is an impossible role to play in these situations. And I don't think we want government deciding which people are able to speak. There may be a role, a role within the limits established by the First Amendment, and there are certainly things that the companies can do and should do, and many of them seem to be dying to do. But I get nervous when we try to put the government in the role of selecting speakers or selecting speech that it deems to be appropriate. Kate: You guys are taking steps, filtering, before the government or any regulation is put in place, and you talk about being open to regulation.
A couple of things you said I found really interesting. How do you determine, you said there is a level of expertise you are bringing to this that you have not before, is that human expertise? Is that technological expertise? Is there a balance between an algorithm deciding what is disinformation and a human being involved? Katie: Correct. It is a combination of the two, and it is important to remember that even with machine learning and algorithms, there is a human at the beginning, creating them. To give you a couple of examples of where we look at that: after 2016, when you look at the bad behavior happening on Facebook, most people are not using real accounts. They create fake ones. And we have a pretty good sense of what a typical new user on Facebook looks like. I usually use my dad as an example. A couple of years ago I got him on Facebook, and at first he only friended my mom and my brother. He did not post a lot right away. He was figuring out how it works, and then he started using emojis, and I regretted getting him on Facebook. [laughter] But that is a typical new user. For somebody trying to create a fake account to spread misinformation, or to spread information quickly, the activity they are doing does not look like a typical new user. So machine learning is a place where we can use that to flag accounts and say, this may not be who they say they are. Then those will go through human review, or we might have enough signals to take the account down right away; in fact, we block many fake accounts at the moment of creation. Bots, we block. For those that might be a little bit harder, we put them through checkpoints: prove this is who you say you are. Another area is our political ad transparency rules.
Ads are an area where we committed to, and built the tools to, require people who want to run political or issue ads in the United States to prove to us that they are who they say they are, to add a disclaimer, and to put the ads into an archive. The challenge we had with that is, if you think about typical campaign finance law, you start with the actor: you are a campaign, a political party. We have flipped that around. We said that if your ad talks about a candidate, the election, or an issue of national importance, then as an actor you must provide this disclaimer. This has expanded the number of actors who have had to go through this process on Facebook who would not consider themselves typical political actors. It also causes problems for international organizations not based in the U.S. who do want to run issue-based ads here but are not based here. I think it is a real question, and a conversation we need to have, about nationalism versus globalism, and where transparency should play a role versus where you should ban things outright. If you think about the ads the Russians ran around the 2016 election, it was issues: immigration and other polarizing ones. It was not necessarily vote for Hillary or vote for Donald Trump. So that is an area where we think regulation could help us better define what issue ads are and what should be required and allowed. We have had to invest both in the technology side of things, to be able to identify those ads, and in humans, to review them and make sure they are actually in scope with how we define them. Kate: I'm going to read something that you wrote. This goes to what Katie is talking about. This was in 2018: "The public momentum for private companies to more actively moderate the content that appears on their sites is worrisome. And it's worrisome even though I share the concern about disinformation, misinformation, extremism, harassment."
You're concerned about companies like Facebook or Twitter moderating what is going on. Why? David: It is a lot of power, to control who speaks. We are concerned about government for legal reasons, and while we don't have the same legal concerns with a platform, I'm still concerned about the influence on democratic structures. The larger platforms like Facebook and Twitter and YouTube hold a lot of power over who can speak, who gets to spread a message, and who can't. I think they have a right to moderate their platforms, but I want them to do so in a way that is consistent with human rights principles: that when they remove content it is a well-informed decision, that it is a transparent process, and that the person who posted the content is given notice and an opportunity to appeal the decision. These are structures I want to see in place. There is so much pressure; we see what the companies experience. The pressure that they are not moderating enough, and the opposite pressure, on whatever topic it is, that they must be removing content, depending on where the pressure is being exerted. There is a risk for companies without the resources of a Facebook, the ones who cannot hire thousands of people who speak the language to be able to moderate content. Removal is the easiest thing to do with content, and it would be very troublesome; we could lose a lot of the benefit of social media if we go down that path. Kate: You talk about working on elections, and not just in the United States. India is slightly bigger than the United States. What do your teams look like for elections, and how do they work together? How does this work at Facebook? Katie: We have over 40 different teams and 500 full-time employees, and that does not include contractors and others we have working on elections across the globe. In an ideal situation we start about 18 months ahead of an election.
We look at it and we spend time understanding the situation on the ground, not only from our own employees but by working with partners, IRI and others, to understand the risks they may see, and understanding what may have happened on the platform the last time the country had an election, in order for us to understand what risks there may be and what partnerships we need to set up with civil society partners. Also, which of our products do we need to build for that particular election? We will pull together that team and they will work on it throughout. I live in fear of snap elections; the UK has been the keep-the-car-running election for me for quite some time. Italy is on the brink of having a snap election, Israel is on round two this year, and we try to keep an eye on the ones that may get called early, to make sure we are monitoring them and staying on top of them. And then the teams keep iterating. We did a threat assessment 18 months before, but that does not mean the threats will be the same close to election day. This is an evolving process, with all the teams working together on a daily basis looking at these things. For the last piece of an election, we will pull together an operations center, where we pull everybody together in one room to better manage all of the escalations and things that we may potentially see happen during the election, and to be able to take action. Kate: You mention the ways that actors are trying to exploit the platform. What are those ways? Katie: One of the main ways we see and combat is voter suppression. Anything from wrong information, giving people the wrong election day, to giving them the wrong rules around how they can vote or the ways they can vote. We take that information down. We saw in some places people spreading the claim that you can vote online, and that is not true.
That is why we work closely with the election commission where appropriate, so we can ask them, if we need to, for clarification, and they can also flag content to us that might be violating local law, so we can take action on it. People might be trying to impersonate a candidate, or create profiles of a candidate saying things that are not true. We need to work quickly to remove those things as well. There is also a lot of work with partners and the actual people participating in the campaign, the candidates and parties, who may be wanting to run ads or may be having issues or questions they haven't gotten answers to. Kate: Some of the disinformation is somewhat clear-cut, right? Vote online, you cannot, that kind of stuff. But in some way it does come down to judgments that have to be made. What is misleading, what is inaccurate, and what do you do about that? When does it hit the bar of being too misleading? From your perspective? David: What is the solution? [laughter] Kate: You come from a very important perspective. What do you do with that? How do you grapple with that when you're talking about how something can be amplified and how fast this information can move? David: There are examples we can play around with. There is information that starts clearly as parody, not meant to be taken as truth, but other people don't take it that way. At what point does it become a concern? We don't want to say there's no parody on Facebook. I don't think anybody would be happy with that as a product, or as a result in a democratic society. If you're not allowed to make fun of something, you're not allowed to laugh. We are even allowed to address issues by confronting them with parody. I don't think we want to lose that. Then there are things that are mistakes, misinformation, where somebody believes something to be true and it spreads. So all these things show the problem of at what point you trigger a response.
I don't have the solution, and I don't think anybody does; recognizing that it is nearly impossible is part of it. This is really difficult, and the mistakes can be really critical. We hear a lot from people that technology can be some magic box. I'm not a technologist, but I work with many of them, and I don't think that is true. There is a role for AI, but AI is not the solution; it will come down to humans, who, as we all know, are valuable and make mistakes as well. And as Katie said, there are humans at the beginning of AI and humans at the end of it. Biases are going to be built into AI. I don't know what the solution is, but we need to acknowledge that it is not easy. There are some situations where something is plainly false. Those are the easy situations, the ones where it is easy for Facebook or Twitter or YouTube to take action and be justified, probably, under their terms of service. But I don't think those are the majority of the situations. Katie: To give an interesting example of one: reports of violence at the polls. Now, that could be something people are spreading to stop people from voting, or it could actually be true. That is a situation where, if we see it, we want to work with trusted partners and fact-checkers on the ground to try to understand: is there actual violence, or is it a voter suppression tactic? We want to understand what we should do, because if we make the decision the wrong way, it can have very bad consequences either way. We don't want to assume that because an election commission tells us it is not true, it is not; they may also have a reason for not wanting people to think they are not handling the election correctly. We want to be sure we are being thoughtful, as quickly as possible, and that is where the real challenges come in with much of this work. David: And the scale is really enormous. The scale problem; these are the really hard things.
Kate: I want both of your perspectives on that. Before an election, something is put out. Is it better to shut it down, even if in shutting it down there are some legitimate users involved, and it is not a Russian bot? It is a balance that Facebook obviously faces, as does anyone in control of a platform. I'm interested in how you deal with that internally. Katie: We definitely struggle with it. People are being influenced all the time; I don't think we can necessarily say 90 days, 60 days, 45. One of the challenges in trying to think about how we can do this better is that, even when working with partners, it takes no time to tell a lie, but it takes a lot of time to prove it is a lie. The fact-checking takes time. How do you reduce some of those things? Do we reduce distribution before it goes to the fact-checker? That means fewer people see it in the newsfeed than normally would. What are the steps we should take? To your point about finding the balance, I don't want to immediately take something off the platform and then potentially put it back up if there is some truth to it. Then we are potentially censoring and altering the democratic process. David: It is a really hard problem. There are reasons to be extra sensitive before an election, where you might want to take things down because there is not enough time to correct something. But there is also the idea that voices might be most powerful before an election, and we want to be sensitive to people's right to speak. So you get the problem both ways. In regulating broadcast, for example, there are laws that take effect some time before an election, so it would not be an unheard-of way to deal with these things. Maybe there is something the platforms can do with how they amplify things: transparency, letting people understand the application. Kate: What is the happy medium?
Between censorship, telling me what is on my newsfeed, and anything goes? David: I think, from a user perspective, a lot of the problem is that people don't know how the platforms work, and that makes it difficult for the platforms to work the way people think they're working and want them to work. I think there are very few people who understand how they get information on Facebook. If you dig into it, it can be explained. Kate: Some people never question it. David: Or they don't care. People use Facebook for a lot of different reasons. How we use Facebook in the U.S., in a democratic society, is really different from Facebook in nondemocratic societies. The role of organizing is so much different; it is so much more of a political tool in nondemocratic societies. For me, Facebook is how I find out that my high school friends have these really crazy political views, and how I keep track of my friends' life cycles and things like that. But in other parts of the world, Facebook is how refugees keep in touch with their families and how political organizing happens. An identity policy, which may seem like an inconvenience here, can be a matter of life and death in a society where, if you reveal your identity, your life might be in danger. Facebook is used in a lot of different ways all over the world. I'm a big fan of user control and the idea that you allow users to customize the platforms. At least then they are using it the way they expect to be using it. That is not a solution for every problem, right? Especially when you talk about misinformation and bad content. Sometimes people see something they don't want to see because it offends them or hurts them in some way. And then there's the neighbor problem, where I know it's fake, I see it, I criticize it, I am media literate, but I do not trust my stupid neighbor, and I am concerned about how they are going to deal with it. User controls are not going to help with that, if that is your concern.
Kate: This is basic but important, because it gets to what the responsibility of the company is to stop the spread of disinformation on the platform. You said you'll never reach a finish line. That stuck with me. Is the finish line for now to stop all the misinformation from being amplified straight from the source? Is that at all a reasonable expectation or goal for Facebook? Katie: I think it is really a combination of things. For the things that are clear-cut, we are trying to stop them as quickly as we possibly can. That is like the wrong election dates and things of that nature. But the large majority will be in the gray area, where we can have different viewpoints on whether it is false or not and on what we should do with it. How do we think about reducing the visibility of it, but also giving people more context? There is still a lot of research to be done on the best way to provide people with information about something in a way that they absorb it, and not in a way that makes them dig in even more in terms of believing what they believe. We are also thinking a lot about the role of transparency: understanding who is actually behind a page. To what David said, in the U.S. people might think, why not? We should know who they are. But in many other countries that could put people in danger, and I would argue that even in the United States it would make me very uncomfortable for junior staffers to have their names out there where they could potentially be targeted, because they did not ask for that. So what is the role of transparency in giving people more information to make informed choices? Those are some of the different levers, if you will, and we are looking at what the right balance is between all of them as we continue to go into 2020. I think it is going to evolve as we see what sort of information is out there and where it is coming from.
We want to bring more transparency to what known actors are doing, and we want to bring unknown actors into the light. Perhaps that light is going to stop them, or at least help give people a better sense of who is trying to impact the overall political discourse. David: Anonymity is a human rights principle, a human rights value. The ability to speak anonymously is something that is protected nationally and under most international law as well. Especially initially, for someone who wants to enter a democratic process, the ability to do that anonymously is really important. The U.S. Supreme Court has recognized the importance of anonymous political speech, at least when you're not talking about advertising. So we have this value of anonymity that is really important and that we want to preserve, at the same time as we support well-intentioned efforts to provide readers with tools to figure out whether or not something is trustworthy information. Kate: On the first panel we talked about the best defense against disinformation and misinformation campaigns, and education came up: educating the public, the user, the voter. The more people know that misinformation is out there, the closer we are to collectively combating it. Do you see that as still the best defense at Facebook as well? Katie: I think, as our colleague from Microsoft said, this is not a problem for any single platform; it is going to be an effort by all of us. In terms of better understanding and helping people think about misinformation, and think more critically about the information they might see on the internet, there is also the role of perception hacking. Bad actors try to get somebody to write a story, or put a post up on social media, saying that there is interference when there really is not, so some people think it is happening even though it is not.
Kate: Just creating chaos by saying there is interference. Katie: Correct. In 2018, we saw something like that trying to happen. Because you had experts, reporters calling us, and they knew who to talk to, and you had platforms that were working together to see whether this was true and actually happening or not, you were able to present to the American public at large a better picture of what was actually happening on the platforms versus what the actors might have been saying they were doing. I think it is going to be really tricky, and we want to make sure we are careful about what is actually happening, and not just about them trying to make us think that is what is happening. David: I would join everybody else who has been up here and say I am a big fan of education as a strategy for combating this. I don't think it solves the problem, but there are a few reasons why I am all in on education. It has certainly the fewest conflicts among human rights values. With education, we are not talking about how to balance somebody else's rights; we don't have to deal with all of those things. Right? It is convenient in a really, really good way. The other thing is that one of our main goals is really about restoring trust: having people feel that they can make good decisions. Education really helps, because you are giving people tools to have trust in their own decision-making process. They may still make mistakes, and we all know that people making good, well-informed decisions can make different decisions, but it is important to have trust in your decision-making process. That is why I am all in for education. It is an incomplete solution and does not get us all the way there. It has to be part of a multifaceted approach.
I liked what Facebook did with the India election. I don't know how many people saw this: the shadowing, where if something had been flagged, it was behind a certain presentation, and you might even click on why. I thought that was interesting. It was at least having people take one more step. Katie: It is part of the continuing evolution of how we present content that has been rated by fact-checkers. It was a prominent overlay, so people would see it and try to get more information before they actually clicked through and saw the content. Kate: Not a warning? Katie: Right now, the evolution of this: when we first started doing the fact-checking, we did a big red triangle with an exclamation point. And then there is the backlash effect: it does not make people feel like they should look into it more. Then we started testing related articles underneath, as well as a green fact-check symbol next to it where you can get more information. This is how we learn from different elections and try to see what is going to work best or not: different ways of displaying that content might be false and how to get people to the fact-checker's story. Those are all things we are looking at, and of course bringing into the United States and continuing to work on for a bunch of different elections. David: It wasn't taking content down. It was still accessible. I thought there was a very low barrier to getting to the content, but it was a pause. Katie: Another thing we did for India, particularly with WhatsApp because of the content there, was a marketing campaign called Share Joy, Not Rumors. It was a campaign on TV and radio and on the Facebook platform itself, encouraging people to think about what they are sharing on WhatsApp before they actually forward content and spread information. Kate: You hit on something that all of this discussion comes back to: trust.
There is historically low trust in the media right now, historically low trust in institutions and government and everything. There was a real hit to trust in social media platforms after 2016. What exactly are we seeing? Can I trust my eyes, can I trust my ears? Chris Hughes, a cofounder of Facebook, publicly came out, in a New York Times op-ed, saying that government regulation is the way he wants to see this go. That he cannot trust Facebook; he was a cofounder, and he no longer believes Facebook can make the change needed, because of the lack of competition, to get it right. His name is on the patent for the newsfeed, and he says he does not trust it. That gets me to the question of trust. How would you, and it is an amorphous thing, how do you get users to trust that Facebook is using information and moderating information in a way that lets them trust your platform? Katie: It takes little time to lose trust, and it would take a long time for anyone to build trust back up. All we can do is work at it; we try to monitor every single election that is happening across the globe. In the last six months we have had the EU, India, Israel, Indonesia; a large amount of the world voted in the last half year. There were things we did right and wrong, and we will keep improving on that. All we can do now is try to deliver and keep getting better on our promises, year over year. The 2020 election will be a big test for us. I don't believe there is a finish line, and we will have to keep this as part of the fiber of the company in terms of what we are doing. Our best route is to make sure we are making this a priority, to be consistent and deliver on what we say we will, and to talk about where we made mistakes and where we need help, so that we can rebuild that trust. David: As you were asking the question, I was asking myself: how important is it that people trust Facebook? It is not generating content; it is delivering content that other people write.
So is it important that you trust Facebook, or do you want to trust the people who are using it? That is a different question. It seems to me it ties into the education question: we need to understand what Facebook is and what we should expect of it. As I thought about sorting this out in my head: I want people to have trust in the institutions that produce content, and I want them to have trust in themselves, to feel like they have the ability to make decisions. It might be important that they have trust in Facebook, but I don't think it is among the more important institutions in that sense. It is always good to have something that is trustworthy, but I don't know how critical it is, and maybe that varies tremendously based on the user and how Facebook is being used in a certain situation. Kate: The challenges are clear, the challenges are great. You guys have been working on this in the past year. There is a lot to be scared of, a lot to be nervous about, to fear, and honestly to feel, when it comes to disinformation. It seems like such a huge problem, and some of the smartest minds say we cannot solve it. Are you hopeful? There might not be a finish line, but are you hopeful? Is this a solvable problem? David: I can answer like a lawyer: it depends on how you define solvable. At no other moment in history have we ever been able to say we are going to eliminate all false information, all harmful information. I don't think we will get to that point. I don't think we have the tools to do that; it involves too many impossible questions along the way. But can we reach a point where we can use these tools, some stable point where we feel like we can make good decisions, and where we can trust not only ourselves but our stupid neighbors and relatives, other people whose decisions we don't like, to make them in a good-faith way? I hope we can get there.
I am hopeful as long as we define our goal as something more realistic than ridding the world of false information. Katie: This goes back to equilibrium. I started my first job in August of 2003, and Facebook did not exist. Back then there was a great sense of how much of an equalizer the internet could potentially be: helping people come together around causes they care about; a way for candidates who would never otherwise be able to win an election to get their message out to people and mobilize them; for neighborhoods to rally around something they want to have happen and be able to do that via the internet. That is something that still exists and is still happening. We did not do a good enough job thinking about how it would be weaponized and how bad actors would exploit it. Right now we are in a time when we are having to play catch-up: to be focused on that, to be thinking about the processes, and to be having the conversations, like the one we are having here, about what the right roles are for us versus others and what should be done with this information. This is the conversation we are supposed to be having as a society. I am hopeful we will get to an equilibrium where we have trust that the bad is being mitigated so the good is able to shine through. That is why I get up every single day and why I keep doing this: because I believe we will be able to get to that point. But we are doing a lot of catch-up still right now. Kate: Let's open it up. You guys have a lot of questions. Audience member: I am a research fellow here and a retired television correspondent, and I want to ask a question that flows from Jane's question earlier to moderator Kate. Panelists Katie and David are welcome to answer. What steps should the networks be taking to prepare to report this information?
Kate: Every network is different. At CNN there are systems in place, from our standards and practices team straight up to our bosses, where information is vetted before it hits air. That was true when I was a correspondent: before information airs, there is a conversation about what it is and what the sources are, whether it is video content, photos, images, or audio content. There are, I would say, adequate programs we use to check digital footprints, the kinds of things we are talking about in this debate. There is a lot of catch-up to do given the direction this is going, but there are ways — the Nancy Pelosi video and how it was modified, how images are adjusted and changed — all of that is checked before it gets to air. Is it foolproof? When it comes to disinformation in the post-2016 world, I think the networks have been doing a good job, because there is a lot of information that comes in. That is a conversation we should have: how much information comes in that we don't actually put out. There are backstops and teams of people looking at material before it makes air, sometimes at a moment's notice in the middle of the night. The systems are teams of people with access to Facebook, to Twitter, to the Department of Homeland Security, to the FBI, so that if there is a question about a certain image, we can source it before we put the information out. It depends on the situation, but there are safeguards in place. The problem is always doing it under the time pressure of getting something online or on TV in a timely fashion. Am I confident? One hundred percent. Absolutely. I have never been more confident in, or more proud of, the work we are doing as journalists, at CNN and beyond. There is a real responsibility now: the more the spotlight is on you, the greater the responsibility to get it right all the time. We are human and we make mistakes — I am infallible. A classic Freudian slip there, Kate. But I am proud of the work we are doing at a time when — I know it became a funny CNN tagline, with the apples — it is facts first.
Kate: I do believe it, and I think we are up to the task. Do you want to weigh in on how amazing I am?
Katie: I will put it on Facebook.
Audience member: The last exchange pointed to the importance of an editorial process. Should that not apply to the part of Facebook that is News Feed — which is to say, does Facebook have to take responsibility for content in that part of the operation? And if I could: listening to the discussion about elections, it seems to me that if I were a patriotic citizen of India or Britain, I would be very unhappy with the role this private American or international company is playing. I would want the Indian Electoral Commission, or the equivalent in Britain, to set the rules for Facebook or Twitter or the other social media covering my election. We know the Chinese have a solution to that, and the Russians. What is the solution in a democratic country?
Katie: We worked incredibly closely with the Indian Electoral Commission during that election. It started with a code of practice that we and some of the other social networks signed, in conjunction with the Electoral Commission, about the things we would be doing. We briefed them on our transparency products and adapted those products for India. In order to run any ads, advertisers had to get approval from the Electoral Commission and had to have a signed form, so we made it possible for them to upload those forms and have them appear in the ad library. During the election, our operations center had a direct line to the Commission. At times they would highlight to us ads running during the blackout period, or other things they were seeing that they felt violated local law. We would then take a look at those and take the appropriate measures as need be. We were public about that partnership, and the Electoral Commission was as well. It is important in these elections — everyone talks about those teams and those people — that it is not just a bunch of people in the U.S. telling those countries, "Here is what we are going to do."
Katie: We have quite a few folks who are on the ground and have that local expertise. Working with civil society and others is important, and the folks on my team come from those countries. When they are the ones doing the work, it helps build up trust. The U.S. is the exception rather than the rule in terms of how elections and the electoral process really work, so it is important to have that local expertise.
David: In terms of the other part of your question, the responsibility part: it might depend on what you consider News Feed to be — the main page where people get their content, or what Facebook used to do, which was to suggest news articles, more like a clipping service. I don't think there should be a legal responsibility to have an editorial layer. We don't impose a legal responsibility on other news entities to have an editorial level of management, and I don't think we should impose one here. In terms of an ethical duty, I think it depends on what they are trying to accomplish and what the expectations of their users are.
Audience member: Thank you. Sharing on Facebook is not only very easy but is actively encouraged, even when you are not logged in. The amplification of information could be slowed or reduced by making it harder to share — for example, by adding a screen asking, do you really want to share this?
Katie: We do that. When a post is marked as false, we send a notification to anyone who has shared it, to give them that information, and if someone tries to share it afterwards, there is a warning screen that says: this has been marked as false — are you sure you want to share it? The reason we still allow them to share it is that some people might be sharing it to denounce it or to give additional context, so we want to make sure they have the ability to do that. That is how the product works.
Audience member: Thank you. I am from the University of Washington. I was in Myanmar, and I have been thinking a lot about the team that you have built up since 2016.
I have been thinking about whether you have encountered cases like Myanmar, where Facebook was the internet — cases where there is no independent media and Facebook is essentially the media force — and to what extent Facebook has had a good heart-to-heart and said: you know what, we need to back out of here, we are too much of what the media is in this place. Or, to push back a bit on the regulation question: your company is built in a state that is richly diverse and has a vibrant media landscape. If Facebook is 50 percent or more of the information in a place, and you don't have a government regulating it, and you don't want to be regulated, and you say you are not media, what kind of special responsibility do you have? How do you self-regulate? Is that even a question? That one is for David too. There are vulnerable populations around the world with poorly developed institutions for regulation. You cannot just fall back on the free market, you cannot count on the goodness of everyone, and you cannot just be good people. It is about institutions.
Katie: We face that in a lot of different ways, shapes, and forms. Even in places where we may be the majority of how people use the internet, the platform can still be diverse, and David mentioned how this may be the only way that opposition voices can be heard in a country. They are not going to have the funds to run ads and do all this stuff, so the narrative runs through state-run media and other channels. One of the things we have done with our ads: we have an exception for news media in democratic places, but if you are in a place where media is strictly state-owned — not in the way of the BBC, but more like Hungary — and those outlets are trying to run ads and drive people to news stories, they have to be transparent, because we want to make it more visible who is behind what they are trying to do, who is seeing their content, and so on. Is this the right conversation to be having? Is it enough? I don't know. But I don't know if pulling out and shutting down is the right answer, either. I think we need to keep trying and thinking about ways to find the right balance.
David: I would not want to see Facebook absent from less democratic societies. There was a dissident publication whose people had been banished and exiled, and whose reporters had to report under threat of death using pseudonyms. The government did a good job of keeping their publication out of Kazakhstan — except on Facebook. The only way they could get their news into the country was by publishing on Facebook. Had the government decided to ban Facebook, it is not clear the populace would have accepted that. So Facebook plays a tremendous role in giving people access to voices that are banned from a country. And certainly — I don't think you are suggesting this — I don't want a regulatory system where the regulator is not trustworthy, as it would be in a repressive regime. I don't want regulations arising from a nondemocratic body. What I want is for Facebook to recognize that it is playing this important role and to take on some ethical responsibilities in doing so. We were highly critical of Facebook for being a little too hands-off — this was before the big transformation; we thought they responded too slowly. We keep mentioning Myanmar, and that is what happened in places like that: Facebook's people did not have the tools to make the decisions. They are put in a position where they get plenty of requests to take down content without having the ability to assess them, though they are trying. I like what Facebook is trying to do now. Recognizing that you are playing that role is important. I don't know if I am agreeing with you or not.
Katie: I agree with her.
Kate: More questions — in the back.
Monica: My name is Monica, and I am with the State Department. I work on countering violent extremism, and I look at how similar the polarizing politics in the U.S. have become to some of that messaging.
My question for you is this: as humans, we are creating echo chambers for ourselves on Facebook and on our different social media platforms. There was talk that Facebook was going to create an algorithm to add more diversity to people's feeds. One of the folks who was talking about the radicalization of extremists said, "I got stuck in my echo chamber; I did not know there was anything else." Between encouraging dialogue, which was talked about in the first panel, and educational tools, where does Facebook see its role? What can we do to add more diversity?
Katie: That is a good point, and we have been learning as we try to bring that to the space. There have been quite a few studies showing that Facebook is not necessarily as much of an echo chamber as we thought. People have neighbors who share all this stuff, and they might be seeing some of that on Facebook. Sometimes people want to turn those voices off, so they might be self-selecting, and it is not just happening online. If you read some of the great books that came out of 2016 — "The Politics of Resentment," by a political science professor, is a favorite of mine — they talk about how people are sorting themselves into like-minded communities. It is not as simple as just showing people additional information to get them to think differently. So we have to think about tackling this in a couple of different ways. How do we think about the news sources or related articles that we are showing people? In terms of civil discourse — in the last panel you were talking about this — one thing we found when we launched Facebook Live: we would get anecdotal remarks from campaigns and others that people are much nicer there. They know it is live, and they are saying it to a person; it is as close to face-to-face as you can get, compared with just commenting in the News Feed. So what controls should we give people who are running pages to help keep things somewhat civil? That is tricky.
How do you do that in a way that does not tip into censorship? That is where it has been hard to draw the right lines with the tools we have. And there is more: we still do our civic education work. There are reminders showing people who will be on their ballots, trying to make sure they have accurate information. We are going to have to keep trying a lot of these different approaches to tackle the problem overall.
Kate: You see that in how journalists approach — how I and other journalists approach — the information put in front of us. There are certain things we are faced with that are not simply [inaudible] another side of the debate that needs to be out there. Post-2016, I think there is more of that recognition than we had in the past, and I am happy with that change. Is it perfect all the time? No. Is it silencing voices? I think my show is not a platform that white supremacist ideology needs to be on, just as an example, and you are seeing that recognition play out in what we do on TV and online and in the papers as well. I think one of the things consumers should be educated about is getting your news from multiple sources. I don't see Facebook as a way of getting news from multiple sources; you need to go outside of Facebook. If you ask for my suggestions, I would say: if you have the means to do so, pay for your news, and also pay for the news that you think might represent the opposite side.
David: A lot of that is how the tools are built. Facebook aspires to be the one-stop shop, to hold its users' information. I don't agree with that model.
Katie: I am not advocating that. I don't want it to be that.
David: Just one thing about filter bubbles: one of the great things about the internet is that it allows people who have trouble finding community to find community around the world.
I want to make sure that in our justified rush to get people out of their bubbles, we do not lose that sense of community — that there is something valuable about people who are experiencing the same things and maybe agree with one another, especially if their views are not popular where they are located. As we develop ways to break people out of bubbles, it is worth remembering that it is valuable to find your bubble sometimes.
Kate: Great to meet you all. Thank you for your questions, and thank you for being here. Stay tuned for a deeper dive in the fall and early 2020, with conversations about possible solutions. If you would like to learn more, there are specific examples on the Wilson Center website. Thanks to our partners, including Facebook, the University of Washington, and the Carnegie Corporation. Thank you for coming.