But also to learn from the amazing speakers that we have going on stage. The diverse group includes impact startups focused on improving lives through innovations in healthcare, sustainable farming and recycling. A third of the companies were founded by women, while around half come from the Middle East. "Say, 15 years ago we didn't really have a strong ecosystem, and right now it is the hub. All of the other countries suddenly decided they want to jump into the Middle East. I think that's very progressive, a major step where the story shifts. Each country has its own character; each space has special talents and special ideas. And it's really diverse. I'm so excited to see where we're going to head over the next three days at this centre." The event will be a hive of innovation and networking: thousands of people with big ideas, here to connect with investors and partners, hoping to create profound change not only in the Middle East but in the rest of the world. Alexandra Byers, Al Jazeera.

Peru has declared a health emergency across most of the country because of an outbreak of dengue fever. The health ministry says the combination of a heat wave and heavy rain has led to a spike in cases; at least 31,300 cases have been confirmed. That's it from this update — you can find much more on our website, aljazeera.com is the address — and we'll be here with the news again shortly.

In the Netherlands, rights organisations have filed a civil suit against the Dutch state, saying that supplying Israel with parts for F-35 fighter jets makes the Netherlands complicit in possible war crimes in Gaza. The Netherlands houses the European distribution centre for F-35 spare parts, from which it also supplies Israel. In 2022 it exported spare parts worth 2.4 million to the country; with the escalation of the war,
this number is now expected to be much higher. For the first time, a court has ordered the country to stop sending weapons to Israel. The verdict is seen as a severe blow to the Dutch government, which had argued that stopping its contribution to the F-35 programme would jeopardise its ties with the US and Israel and have a severe economic impact. The judges argued that political and economic interests cannot be more important than the protection of civilian lives in war.

Artificial intelligence is invisibly running our lives and is expected to bring far more profound changes to humankind. In part one of this discussion, Meredith Whittaker and Camille Francois challenged many of the notions that we have about AI. "These are not technologies of the many. We are the subjects of AI; we're not the users of AI, in most cases." "When you get to the bigger models that are trained on more data, harmful biases and stereotypes get worse." In this final episode, these amazing women discuss big tech and how to navigate the world of AI. How do we make AI less discriminatory? How do we resolve its issues with privacy? And how do we tackle the surveillance business model?

Well, hello. Hello again. There it is — wonderful to be here, Camille. Wonderful to be here with you. Last time we chatted, we talked about risks in AI. Some people worry about AI taking over the world and destroying humanity. What is this thing they call existential risk? Where is this coming from? How do you feel about it?

Oh wow. Well, existential risk is a thrilling story at an emotional level. It's very activating to think of doom and conflict and these sort of great-power scenarios. And it's kind of catnip to a lot of powerful men — the idea that AI, this scraped data, big compute, big models, is going to somehow find the escape velocity to become sentient and superhuman.
And we had better hold on, because we either need to control that powerful, powerful, powerful AI or we're going to be superseded by it. There's no evidence that existential risk is going to happen. I think there are a lot of questions around why it caught on so powerfully now, and I think part of the answer to that question — and I know, Camille, you think about this as well — is that, while there are some true believers for whom this is very meaningful, and I don't want to take that away from them, this is also an extraordinarily good advertisement for these technologies. Because what military, what government, what multinational doesn't want access to this hyper, hyper, hyper-powerful AI? Doesn't want to be the one who's controlling it? Doesn't want to imagine themselves at the helm of the Death Star? And this advertisement also serves to distract from the fact that these systems continue to be discriminatory, and that discriminatory capacity continues to accelerate; the fact that these systems are used by the powerful on those with less power, in ways that often obscure accountability for harmful decisions; the fact that we're talking about a technology that is built on the basis of concentrated surveillance power like the world has never seen. But we can erase all of that by saying: look over there, the Terminator is coming.

And as you said, it's not exactly a new idea, right? Nick Bostrom wrote Superintelligence now ten years ago — a book that focuses on the idea that AI will accelerate to a point where it can no longer be controlled by humans and will pose an existential risk.
The fact that today this concept dominates some of our conversations on safety is meaningful, and the reason is that we're at a pivotal moment where we have governments, for the first time, saying: hey, we would like to organise and discuss what safety means in the context of AI. And so we have governments coming to the table. We saw it with the AI Safety Summit; we saw a series of first declarations, first regulations. There's the White House executive order in the US; there's a stream of processes coming out of the G7. And so there is this urgency to define: what is it that we're worried about, that we want our elected representatives to protect us from and to focus on, when we talk about the safety of AI? So I think you're right. It doesn't mean that everybody should laser-focus on avoiding Terminator scenarios. It also means that we need to focus on the very immediate harms to society: the biases, the discrimination, and the surveillance implications, which we haven't talked about just yet. I see your surveillance eyes.

Oh yes. I was sure you'd get there. Meredith was formed in the furnace of concerns over surveillance and privacy.

Yeah. I think we were around Google, in I think 2014 or so, when we met — but that was the post-Snowden era, right? We came out of the nineties in the US with a regulatory framework that had no guardrails on private surveillance. So a private company could surveil anything, right? And it could surveil it in the name of advertising. And so, after the nineties and this sort of permissionless surveillance, you see a lot of very cosy partnerships between the US and other governments and these private surveillance actors: getting data from them in certain ways, brokering relationships, convincing them to create backdoors in their systems. This is documented in the Snowden archives, which of course surfaced in 2013.
Take a moment to define what a backdoor is.

A backdoor is a generally intentional flaw in a secure system that allows a third party access to contents or communications. Say we are using an encrypted system — you and I are texting and we think it is secure — but in fact the code is allowing a government or a third party to access and surveil our communications. So a backdoor is the colloquial term for a flaw in a system that allows that kind of access.

I think the critical security concept here is that you cannot have a backdoor that's only for the good guys. If there's a hole in your system, there's a hole in your system. That's why we care so much about strong end-to-end encryption, and about making sure that when we say a system is secure, it's secure for everybody and from everybody.

Yeah. It either works for everyone — and that means I can't see it, the UK government can't see it, Putin can't see it, XYZ hackers can't see it — or it's broken, and we can all see it.

So you were saying 2013 was a big moment of reckoning in Silicon Valley over privacy.

Yeah — and over those concepts of surveillance. That was the world I lived in: watching this privatised surveillance apparatus at Google that had been justified with "hey, we have a duty to our customers, and we're just giving people more useful ads and more useful services". But Snowden broke that open. And since then there's been an uneasy situation where encryption has been added to some things, but the pipeline of data collection and data creation continues — because, again, monetising surveillance is the economic engine of the tech industry. And so what happened around 2012 was a recognition that this surveillance data could also be used to train AI.
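The "no backdoor only for the good guys" point can be sketched in code. This is a minimal toy illustration — deliberately NOT real cryptography, and every name in it is hypothetical — of why access follows possession of the key, not the intent of the key-holder: decryption depends only on having the secret, so the system cannot distinguish the intended recipient from a "lawful access" party or an attacker holding a copy.

```python
# Toy cipher (NOT real cryptography): a hash-derived keystream XORed with
# the plaintext. The only thing standing between ciphertext and plaintext
# is the key itself.
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the key (toy stream cipher)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

shared_key = os.urandom(32)            # secret shared by the two texters
ciphertext = encrypt(shared_key, b"meet at noon")

# A "backdoor" is simply another party holding key material. The math
# cannot tell a good guy from a bad one -- any holder decrypts identically.
backdoor_copy = shared_key
assert decrypt(backdoor_copy, ciphertext) == b"meet at noon"
```

Real messengers use vetted primitives (for example, the Signal Protocol's double ratchet) rather than anything like this toy, but the asymmetry is the same: whoever holds the key material — by design or via a backdoor — reads the messages.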
And these AI systems were incredibly good at conducting surveillance — think about facial recognition, think about productivity monitoring. So I think we have to read AI as almost a surveillance derivative. It pulls from the surveillance business model; it requires the surveillance data and the infrastructures that are constructed to process and store this data; and it produces more data, so it heightens the surveillance ecosystem that we all live in.

You know, it's also what I observed working on disinformation and on troll farms in 2017, when I was doing some field work — and before that, around 2015, working with journalists and human-rights activists, including Maria, who were so often targeted by governments. Their phones were being hacked. We were very concerned with making sure that they had secure software, that we could secure their phones and their computers; they were very much under heavy surveillance. I remember they were the first ones to say: hey, there's something a bit off happening on social media. We think it's harmful, we think it's violent, and we think it's related to the hacking. We should take it seriously, and we should try to uncover what's really going on. And we should apply the same rigour and the same tools that we had in our work on cybersecurity: we can analyse this, we can do forensics, we might even be able to attribute it. If we see networks of fake accounts deployed against a journalist or a human-rights activist with the sole purpose of silencing and threatening them, we might be able to hold a few people accountable. We were, of course, slow to do that as an industry, and that created the great reckoning of 2017. What it took for Silicon Valley to care was really the US presidential election of 2016, and the fact that Russia was able to use what we now call troll farms —
— running networks of fake accounts — to run a campaign against those presidential elections. What followed was a full year of technology executives having to go to Congress and justify why they had missed it. That, I think, was also a really pivotal moment, where some new foundations were established: okay, maybe we now live in a world where, as a society, we feel that technology companies have a responsibility to protect democracies too; that technology companies have a responsibility to tackle disinformation, and to think about how their technologies can be abused to manipulate elections. That is also something that's coming up for us in AI in a really interesting way.

But what I am concerned about — in addition to those very real, very pernicious problems that happen when you mass-scale a global information and social platform, again incentivised for clicks and engagement and profits and surveillance and advertising — is that the solution space, in my view, seems not to go far enough. So you have something like the UK's Online Safety Act, this massive omnibus bill that was catalysed by these very real concerns: what do we do about these problems? But such laws rarely look at the business model. They take these mass social platforms as a given, and then the solutions often look a lot like extending surveillance and control to governments — expanding the surveillance apparatus of large tech companies to government-chosen actors, who will then have a hand in determining what is acceptable speech, what is acceptable content. It's not actually asking: how do we attack the surveillance business model that is at the heart of this engine? And so, you know, this is very real for me in the US.
We now have books being banned in certain states. We have reproductive healthcare — or healthcare in general — unavailable to many people in states where reproductive healthcare has been criminalised. So I really worry about these problems with platforms, about the way they exacerbate hate and allow trolling and disinformation. And I also really worry about the solution space, when it means handing a key to governments that would lock up a woman and her daughter for accessing healthcare, that would ban books, and that across the world are trending toward the authoritarian.

Absolutely. And so what we need to think about is also a diversity of these platforms, right? Platforms that are not tied to this surveillance-capitalism business model; platforms that can put security and privacy first, that can operate in the public interest. And I think that's what we're doing with Signal — and I want to hear a little bit about that. Redeeming yourselves from the sins of the nineties?

That's fashionable commentary now — and it's hard work.

So let's talk a little bit about what's happening with Signal. I was very excited to see that you published a piece about how much it takes to run Signal. You said it costs around 50 million a year to actually operate this technology globally. Why did you do that, and how are you using 50 million a year to make Signal work at scale?

Well, we did that in part because we are a non-profit — a rare non-profit, an actual non-profit, not a fake non-profit like OpenAI — operating in a tech space, again, dominated by this business model. So we wanted, one, to be accountable to the people who rely on Signal — the tens and tens of millions of people across the globe
who use this as critical infrastructure, who donate to keep us running. And we wanted to offer a cross-section of just how expensive it is to develop and maintain highly available global communications infrastructure. These are not free products and services, and that sheds light on how profitable this industry is, and how significant the monetisation of surveillance is as a revenue generator. We're a non-profit because the engine of profit is invading privacy, and our focus is creating a truly private communication app where we don't have the data, where the cops or Facebook or anyone else doesn't have the data, because it's only available to you. But then the question is: okay, without the data to create the revenue, how do we cover 50 million a year — and by the way, 50 million is very cheap? How are we going to guarantee privacy while supporting what it takes to actually produce an app that works for everyone? I think that question is way, way bigger than Signal, and it's one we need to be asking of every company out here: where's the money?

And that brings us nicely back to how we started this conversation: making sure that the money goes to tackling those very risks — safety, moderation, privacy — making sure that the investments keep pace with detecting and managing those sociotechnical harms. That is a good segue for us to take a few questions, either on the infrastructure or on inventing new resourcing models.

We spoke a lot about corporations and governments. I think it's almost embarrassing how easily individuals — everybody in this room — hit the consent button when we want to read something on a web page, and we lose all of that data.
How do you go from being of the mind that individuals need to recognise and protect their own data, when giving it up is the price of easy access to information?

I don't think this is a matter of individual choice or individual blame. We can't function in this world without using these services, right? We have to use them to get a job, to function in the workplace, to go to school, to have a robust social life — in a world where so many of our public spaces and ways of communicating with each other have been hollowed out by these platforms. I actually think it can be really dangerous to make this an issue of individual will or intellect or consciousness. What we're talking about is a deeper, collective issue, where our lives are shaped by the necessity to use these systems. And look — Facebook creates ghost profiles for people who are not on Facebook in order to fill in your social graph. Data tells you something about the people who aren't represented in the data, the same way it tells you about the people who are.

I'm encouraged by new frameworks that are emerging that are maybe helping us think a little bit more collectively about our data. For instance, in the United States a lot of people are working on the idea of data trusts: the idea that you have data rights, and that you can work with organisations who may represent those rights and make it easier for people to collectively say, yes, I will entrust a non-profit to make sure that I can exercise my rights. And these entities can, for instance, also bargain collectively, making representation, again, a collective right. So I think we're heading towards new frameworks, new governance mechanisms, new regulations, where we think a little bit more collectively about our data.
Let's take another question.

So far our conversation — or your conversation — has been pretty US-centric, and, you know, rightfully so. But what do you think about the AI arms race between the US and China, what it means for the relationship between the two countries, and the impact on the rest of the world?

There are very valid concerns about the potential for misuse of this technology, and I'm not going to dismiss those. But for me, this is an economic arms race: which pole, the US or China, is going to engage as much of the world as possible as a kind of client state — provide the infrastructure, provide the APIs, provide the affordances — so that they can extract both data and revenue from various countries, and maintain control through these companies. So I think there's a lot more to say about that framework. When we talk about a race, we really need to ask: where are we racing to? Is this a race to the bottom, toward two poles of an economic surveillance state exercising massive social control over the rest of the world? And is that a race we want to win?

When I think about governments rushing to make those investments, and about us talking about those arms races, of course I also think about the fact that we have little agreement on what the legitimate ways are to deploy AI in military contexts, in conflict contexts. How does AI shape the laws of war? I also think about Gaza, where we have investigative reporting that targeting has been done by AI, that there's a massive apparatus, and that we're witnessing significant, unspeakable civilian casualties. In that context, I think these are very important questions.
We are increasingly in a world with multiple wars and conflicts, and seeing governments accelerate into those fields, deploying these types of new technologies in conflict contexts, must give us pause and make us ask: what are the rules of the road for the deployment of these technologies in these contexts? So when we talk about an arms race, this is where my mind goes first.

Sure. So we'll take one last question.

There are information vacuums right now, which are good breeding grounds for disinformation. Given the pushback you describe on the Online Safety Act and surveillance powers, what do we do when we see government and big tech fusing their powers? Is the solution to break up big tech?

I, like you, am very concerned about this metastasis of surveillance and censorship powers in the hands of governments and corporations that don't always reflect community norms, or the social benefit, or the interests of the marginalised, etc. So I don't have the one weird trick that solves it. But I think it's going to require social movements, because, again, you're looking at entrenched power, and at governments willing to weaponise the language of accountability and of reducing big-tech harm in contexts where that expands the big-tech model or the authority of governments. What we haven't seen is bold regulation; there has not been the political will to use the regulatory framework. So I think there needs to be much more demand. And I think of it almost as a kind of dignified stance: we don't want to live in this world, and we should have the imagination — and, I think, the deep optimism — that is willing to recognise a world in trouble, in danger
and in terrible peril, that isn't looking away from it with Pollyanna eyes, and that is demanding change with a clear map of just how bad it is.

That's a very elegant way to say that we should — and are able to — have alternative futures, alternative models, and to say: this is not how we want to live with technology. It's not being a Luddite to say these are not models that should continue. Let's invent alternative futures that are more rights-preserving, that are better for society, that are better for the planet too. There are also huge climate implications in everything you just said around surveillance capitalism that we don't talk nearly enough about.

Yeah. There is a history of computation that actually traces it back to plantation-management techniques used to discipline, control and surveil enslaved African people as part of the transatlantic slave trade, and I've written on this history of computation as taking templates from those labour-control mechanisms at the birth of industrialisation. What paradigms are they reflecting? Knowing that doesn't mean we throw them away; it means we stay mindful and, in a kind of punk-rock spirit, demand more of them.

I love that. I think that is the perfect ending. Let's embrace that punk-rock spirit, let's demand more, let's invent better futures. Thank you so much for this conversation.

Across the episodes of this special series on AI, we've gone beyond the headlines and the hype, spoken to scientists and industry leaders working to align profit motives with safety, and examined the coded bias that is already impacting our world.
When real life is now stranger than fiction, we need to step back and look at the history of AI, how it's impacting economies around the world, how it is affecting wars and warfare, and what steps we can take now to make AI safe and ethical for us all.

Iran is going to the polls. As well as parliament, voters will elect the 80 members of the Assembly of Experts, the body empowered to appoint a future supreme leader of Iran. With the question of succession looming, a regional standoff in place, and one of the highest inflation rates in the world, will the outcome of the election have an impact on the country's future? Iran elections, on Al Jazeera.

This is the first war that we see in real time through the victims themselves. There's a disconnect between what we are witnessing on social media and what we're seeing in the mainstream. There is always an attempt to frame two sides of it, but there are no two sides to this. The Western media does have a Western bias. Understand what they are looking to get out of it. The Listening Post covers how the news is covered.

Hello, this is the news hour on Al Jazeera, live from Doha. Coming up in the next 60 minutes: US President Joe Biden says Israel has agreed to halt its military operations in Gaza during the month of Ramadan. Israeli strikes near a hospital in southern Gaza kill at least eight Palestinians; women and children are among the dead. And Hezbollah launches a barrage of rockets into northern Israel, saying…