[applause] Thank you. You heard a little bit earlier this afternoon from Sona talking about the state of local news, and later from Brian talking about the state of national news. Right now I'm going to talk for a few minutes about the state of digital media.

The story of digital media follows the trajectory of a tale as old as time. It's a story of hope followed by disillusionment, and in theory the last stage is redemption. I'm not sure we are at redemption yet.

Let me go back to hope. You may recall, or maybe some of you students are too young to recall, that back in the mid-to-late aughts and the early 2010s, the world was a pretty exciting place when it came to digital media. Google really did bring the world's information to our fingertips. We connected on Facebook with old friends from high school or other parts of our lives. We had access to so many incredible videos on YouTube. When it came to news, it was revolutionary. You may remember the Arab Spring in 2010 and 2011, or the Miracle on the Hudson: the ways we had access to information because people on the ground, regular people with these mobile devices, these powerful computers in our hands, were able to report on what they were seeing and what they were experiencing. It was sort of the promise of citizen journalism. We learned about the raid on Osama bin Laden from a guy a quarter mile away saying, I'm hearing choppers overhead, this is strange, I'm in, I forget the town, what's going on? It was the democratization of information. Or so we thought.

Fast forward to the disillusionment, where everything began to change, slowly and then all at once. What is it that went wrong, exactly? I would put it in three categories.

People. People is what went wrong. You had foreign actors who gamed the system: the Internet Research Agency in St. Petersburg was able to use these social media platforms to manipulate public opinion in the United States in the run-up to the 2016 elections. We had domestic folks who were the targets and who then willfully spread those and other kinds of mis- and disinformation, whether it was about elections, whether it was about COVID, whether it was about the reasons for the crash at the bridge in Baltimore last week. Then you have the profiteers, with no ideology other than making a few quick bucks, who were able to game these platforms to bring money to themselves through various means. That was one category.

The second category was the tech companies and platforms themselves. Whether they did not anticipate what could happen with these platforms, or whether they knew their platforms could be gamed and didn't care, we do not know. We may never know. But we started to see that social media platforms became basically a big game of whack-a-mole, starting in 2014, 2015 and continuing to this day. Then after 2020 a lot of the social media platforms threw up their hands and said, enough. We are not necessarily going to try to moderate content on our platforms. It's too expensive. It's too politically fraught. Too many people hate us no matter what decision we make. Too many subpoenas coming from Congress. They said, we are out, we are going to leave it alone. And you have Elon Musk, who bought Twitter, never perfect but always an incredibly valuable tool, and turned it into the dumpster fire it is today.
The third category of what went wrong is the decline of local news, which we have been talking about so much today; I don't need to repeat it. Into that void has poured so much of the kind of noise that we hear from the platforms. News organizations sometimes did it to themselves. In some cases they were chasing the platforms: those of us in news all laugh at the line about the pivot to video, which was supposed to save journalism. It didn't, because as soon as Facebook changed its strategy, the whole thing went down in flames. The money that supported journalism went to the platforms, a lot of it for good reasons; for advertisers, it was a more efficient way to reach people. And the world has become such a polarized place that many people just do not trust any news. Again, we have been talking about that all afternoon.

And now here we are. This is going to be a bit of a segue into the panel that I'm going to be part of next: the AI world. What is that going to do to this already very fraught digital information ecosystem? My message to you, and I may repeat this on the panel given the right opportunity, is don't be afraid. There has been a lot of coverage about the big spectacular deepfake of Trump doing something or Biden doing something. That may happen. I don't think that's the big worry. Those will be debunked so quickly that they are not going to get much traction. What I'm much more worried about when it comes to content manipulated via artificial intelligence is the things that you can't see. That the media can't see. That the public can't see and debunk. It's coming in on WhatsApp. It's coming in on Facebook Messenger, on Telegram, on all of the peer-to-peer channels, which are not always strictly peer-to-peer; you can broadcast on them too. It's the content that you can't see that can be very damaging, and very targeted. What AI enables is a speed and scale and degree of targeting never before possible. If before you needed the Internet Research Agency, funded by the Kremlin in St. Petersburg, to pull this off, now you don't. We are back to the proverbial guy in his pajamas at his parents' house who can make the same effort at the same scale and cause a lot of trouble.

But if there is one thing that worries me most, it is a phrase coined by two people, a UT researcher named Bobby Chesney and another academic, Danielle Citron: the liar's dividend. The liar's dividend describes what happens when we hear from so many different places that we can't trust information. That AI can manipulate video, which it can. Audio, which it can. Images, which it can. Instead of trying to find out whether something is real or not, instead of going to trusted sources, what do we do? We stop believing anything at all. That is the liar's dividend. It is out of the playbook of autocrats and would-be autocrats going back millennia, now enabled by AI.

For those of you who were here last night and heard Woodward and Bernstein: Carl Bernstein made a comment that public opinion changed when people were finally able to hear Richard Nixon's tapes. Think about what would happen today if those tapes were released. Fake news. This is AI-manipulated audio. That wasn't me. You know what? A lot of people would say, yeah, I don't know that I can believe that. That's the world we live in. Yes, I was supposed to talk about redemption. I don't know that I have the redemption yet.
If there is redemption to be had, it is in the promise and the growth of local news, which so many people here are working on; you heard it in the first panel. There is no silver bullet for all these problems. But if there is any salvation to be had, it is in local news and the growth of local news: people of the community, in the community, communing with the people who are their neighbors, providing them the information they need, listening to them, and building that trust. We know that leads to civic engagement. We know how important it is. We must all support these efforts, whether it is Press Forward, or the American Journalism Project that you heard about from Sarah, or any of the other efforts, or the work that Elizabeth was doing. We all really need to support that. It is the only real way out.

And with that we are going to move into our AI and elections panel. I am happy to introduce my fellow panelists: Secretary of State Cisco Aguilar; Secretary of State Scott Schwab; OpenAI's Becky Waite; and our moderator, Dr. Talia Schwab.

[applause]

Talia: Thank you so much for that. Such a pleasure to chat about digital media and the 2024 election. We have a hot topic here, just a small topic. Just to offer some introductory remarks: we are in a remarkable setting right now in 2024. We have elections in over 60 countries, plus the European Union, representing just under half of the world's population, which is mind-blowing. We have already seen AI used in elections. We have the AI-generated robocall impersonating President Biden that sought to discourage voters in New Hampshire's primary. We have audio clips of a liberal party leader discussing vote rigging. We have a video of an opposition leader in the conservative, Muslim-majority nation of Bangladesh wearing a bikini. There are so many things to talk about and so many possible uses of AI as we look to the election. I'm looking forward to this conversation, and I want to get started.

We are delighted to be joined by Becky, who is the head of Global Response at OpenAI. Becky, I want to dive right in. OpenAI released details about its approach to the 2024 elections earlier this year, and it noted that one rule is that people cannot use OpenAI's tools for campaigning or for influencing voters. Can you talk to me about enforcing rules like that on a global scale? It seems almost unfathomable. Tell us about it.

Becky: Yes. Thank you so much for having me, and to the LBJ School for having this forum and discussion in this milestone election year. It is more important than ever to have these sorts of conversations that bring together folks across government, civil society, and industry. I spent a lot of the last six months speaking with policymakers and civil society around the globe to understand what is top of mind for them going into this election year. While we are very excited about the significant benefits of this technology, we are also clear-eyed about its potential risks. Through those discussions, through that dialogue, we have developed a preparedness framework that focuses on three efforts: first, making sure we have the right policies in place; second, preventing abuse; and third, elevating appropriate information and transparency in our tools. First, as you mentioned, our policy lines: we don't allow political campaigning or discouraging voter participation with our tools.
We wanted to have a set of policies that were a little bit more conservative this go-round, given that we haven't seen generative AI in the elections space before. We wanted to make sure we were taking a conservative approach out of an abundance of caution.

Second is preventing abuse, which gets to the enforcement piece. We think about safety through the entire lifecycle of our tools; it's not a single point. To use, if you'll excuse the bad metaphor, a fishing analogy: if you are going fishing and have a bunch of nets, you don't use just one, you use several so you catch as many fish as possible. We think about safety in the same way. We have interventions across the entire lifecycle of our tools to make sure we are enforcing against different harms. One example of an intervention is something called reinforcement learning with human feedback. I don't know how many people here have used our tools or generative AI in general, but one thing we have in our chatbot tool, our large language model, is this reinforcement learning at the very front end of how we think about safety. It's a fancy way of saying we ask a question of the model, we generate a bunch of responses, and we tell the model which one is best. By doing that we can steer the model, over and over, toward something that is safer, more reliable, and more helpful. That's how we reduce the likelihood that it's going to produce harmful responses.

Finally, the last piece is around transparency. In our model we look to elevate appropriate sources and cite where information is coming from where appropriate. One final thought before handing it over to someone else on the panel: it is ever more important that we have these sorts of conversations and that we are collaborating not just across sectors but also within industry. We are really excited about the work we are doing with social media companies and others where generated content might actually be distributed, making sure we have those close connections going into the election season.

Talia: Speaking of close connections, OpenAI has a partnership with the National Association of Secretaries of State, and what an honor for us to have the president of that organization with us, Secretary Scott Schwab. Thank you so much for being here. Let's get started by thinking about Kansas. When you think about Kansas, what AI or foreign influence threats worry you the most?

Secretary Schwab: This is where we come from: there is a difference between the campaign side and the election side, and often people commingle them. As a secretary, if you get that fake Biden phone call, I'm not concerned about that, because that's the campaign side. We'll let our ethics commission deal with that, our bureau of investigation and whatnot. But on the election side, this is where it can get to be a real concern. I hate to use these examples, because when you use examples you give people ideas. Johnson County is a wealthy county. I have the honor to live there. I love it, but it's a very purple county. So imagine if somebody generated a video that was shared on social media saying that, due to bomb threats, all Johnson County polling places will be closed on election day. Imagine the chaos if someone used my image and likeness. We already know that sometimes news is so quick, we've got to get this out there. Now I'm out there saying that's not me, and now the news is trying to figure out what's real.
And then when it's all sorted out, there was no bomb threat; it was fake. But how many voters did you affect who said, I'm not going to take a chance, I'm not going to vote today? You can't undo that. Those are the concerns. I really like what Minnesota is doing as it relates to AI. If you generate an AI image or video or voice and it doesn't have a disclaimer saying that it's AI, and you are using it to influence a campaign or election, it's a crime. They carved out satire, the Saturday Night Live exemption. Those are the things on the election side that become terrifying. It's not misrepresenting a candidate; a candidate can undo that. There are two great motivators for human decisions: hope and fear. Fear is stronger. It's a lot easier. So if you cause voters to be afraid and not vote, how does that truly influence the election? That's outside the campaign side. As secretaries, that is the conversation we are having.

Talia: A disturbing scenario.

Secretary Schwab: Now somebody on the internet is saying, I have an idea. Those are the concerns.

Talia: We have a student in the audience saying, I have an idea how to deter that.

Secretary Schwab: sos.ks.gov. Come help us.

Talia: You authored an article in Foreign Affairs earlier this year where you said generative AI companies in particular can help by developing and making available tools for identifying AI-generated content, and by ensuring that their capabilities are designed, developed, and deployed with security as the top priority to prevent them from being misused by nefarious actors. How well do you think generative AI companies are doing?

Secretary Schwab: The example she just gave, creating safety nets. You don't know until it happens. When we hear the phrase in the article that safety has to be your top priority: when Boeing has a door come off an airplane, what's the first thing they say? Safety is our top priority. I get it, but it still happened. A lot of times you don't know what the holes are until after it happens. So we don't know how they are doing. We could come through November of this year, and I'm curious about your opinion, because you are more of a swing state than Kansas; there is a pretty good indication which way Kansas will go. But there are concerns that we won't know until after the election what will happen. My bigger fear is not this presidential election. It's going to be the one in four years, because the technology will be more developed. Nobody knows how to truly weaponize AI right now, but in four years I'm pretty sure they will. In 2020 our biggest concern was misinformation, and then we got hit with the pandemic. The great philosopher Yogi Berra once said the problem with trying to predict the future is that it keeps changing. That's what happened in 2020. Now it's 2024; in 2020 we didn't deal with AI, and now it's like a freight train. We'll know after 2024, because we don't know how creative people have become in deploying it.

Talia: Really interesting, and some conversations we had before also touched on this: the things that will develop between now and the 2024 election are things we need to look at. Secretary, I want to bring you into this conversation as well. In January, you said that addressing AI threats to electoral integrity will be a partnership between the federal government, the private sector, and local governments. I'm hoping you can give us a progress report on how much progress has been made in helping state and local governments understand the threats and how to approach them.

Secretary Aguilar: No progress. And it's really, really frustrating.
Especially when a high-level federal official arrives in your state and asks you, what are you doing about AI? You look at them and go: you want my state to step up and put resources behind something that is receiving billions of dollars of investment? You are the federal government. You have access to researchers. You have access to information I can only dream about. I'm trying to figure out how to get 17 counties across our battleground state off the legacy systems that exist and onto a statewide system, and you are asking me to be the leader on AI? It's unfortunate and unfair.

Talia: Not a good progress report there. Hopefully someone it