The reason why I'm excited about today is that Kashmir has been a pioneering, trailblazing voice in surfacing these issues of the social impacts of technology. What we're always focused on at All Tech Is Human is really this gulf between how fast innovation moves and how slow our ability is to consider its impact on how we live, love, learn, even die; how we get jobs, how we see the world, the future of the human condition. These are big questions, and that's why these are questions we cannot deal with sitting alone in front of a screen. We come together because we need all different backgrounds, all different voices. But going back to Kashmir Hill and the very important role that journalists play: journalists inform us about what is happening, and oftentimes these are things done in a secretive manner, especially in her work that came out in a big New York Times expose a year or so ago around Clearview AI. Oftentimes what one founder might want to do is not in line with the tech future that we want, and so much of what we're trying to do here at All Tech Is Human is co-create a tech future aligned with the public interest, in line with our values. And that's why it behooves us to make sure that a small sliver of society does not dictate the future of our tech, which is the future of our democracy and the future of the human condition. So people like Kashmir Hill are really exposing what's happening. So when we think about a topic like facial recognition, one of the concerns I think a lot of us have surfaced is that it feels like it's being placed upon us without our consent. The technology is released into the wild without us having some modicum of control or consent over how it's designed, developed, and deployed. And this is the part that really needs to change, and this is why, at All Tech Is Human, we try to bring together stakeholders across civil society, government, industry, and academia.
Again, there's the important role of journalists, who inform the general public, who influence policymakers, who influence tech companies. They're all parts of the puzzle. So look around today, and after today's discussion, when we're all mingling, and if you still don't want to leave after we close down here at 8:30, we'll continue the conversation next door at Brass Monkey. It's a good chance to meet other people who deeply care about where this is headed. Because the really important part is that our future is not decided; our future is being determined by what we do or do not do today. So without further ado, I have the pleasure of welcoming Kashmir Hill and our executive director at All Tech Is Human, Rebekah Tweed, to be in conversation. Please give them a warm welcome.

Thank you, David, and thanks everyone for coming tonight. I'm very excited to be here in conversation with Kashmir Hill. Kashmir is a tech reporter at the New York Times and the author of "Your Face Belongs to Us," which you can pick up after this, and Kashmir will be signing copies for you over there. She writes about the unexpected and sometimes ominous ways technology is changing our lives, particularly when it comes to privacy. She joined the Times in 2019 after having worked at Gizmodo Media Group, Fusion, Forbes Magazine, and Above the Law. Her writing has appeared in the New Yorker and the Washington Post. She has degrees from Duke University and New York University, where she studied journalism. So welcome, Kashmir.

Thank you so much. It's great to be here.

Yeah, glad you're here in New York. You've been on a whirlwind book tour, and a lot of the book actually takes place here in New York. So can we set the stage? Can you just give a little bit of an overview? You start working at the Times, it's 2019, and someone approaches you with a tip about a startup. They have scraped billions of photos of faces, and apparently they can determine who a person is based on a photo alone. So can you tell us what this company is?
And what about this company is different, and what's concerning to you?

Yeah. So I find out from a public records researcher that police are kind of secretly using a company called Clearview AI, which at the time had 3 billion photos scraped from the public web. And there was very little information out there about the company. I remember on their website they had an address here in Manhattan, just a few blocks away from the New York Times building when I mapped it on Google Maps. But then when I walked over to it, the building wasn't there; it didn't exist. And that was the first of many kind of weird flags about investigating the company. It wasn't obvious who was working for them, and there just wasn't a lot out there as I was trying to dig into them. It was clear that this company that was exposing so much about us was trying to stay in the shadows, and what they had done was really shocking. We just hadn't seen something like this done before in the US, and it was this company that no one had ever heard of.

Yeah. And can you tell us a little bit about who is behind the company? Can you tell us about Hoan Ton-That?

Yeah. So the main person I think of as being behind Clearview AI is a young guy named Hoan Ton-That. He grew up in Australia, obsessed with computers from an early age. At 19 years old he dropped out of the college where he was studying computer science, because he thought what the professors were teaching was kind of boring, and moved to San Francisco, kind of chasing the dream of Silicon Valley, and was basically just trying to make it there for a while. It was 2007. He made Facebook quizzes, he made iPhone games, kind of throwing stuff at the wall to see what would stick, but nothing did. And, you know, he ended up moving to New York in 2015, falling in with a very conservative kind of crowd and the people that would become his cofounders at the company that became Clearview AI.

Yeah, something I thought was really interesting about his story.
He doesn't have, like, some mission that he's chasing.

Definitely.

He just wants to know what's going to be successful, what's going to stick. So there are these Facebook quizzes, and he does, you know, an iPhone app like Trump Hair, and he's really just seeing what's going to work. Until, it seems like, once he hits facial recognition, something is different; he really kind of goes down this dark rabbit hole. And you outline in the book this kind of long history of people thinking they can determine things about a person based on their features. So it was just really interesting to me that he seemed pretty obsessed with: can you determine who is a shoplifter, or a border crosser, someone who's crossing illegally, or someone who's a terrorist, or, you know, a cheater? So I'm really curious if you can talk a little bit about some of the uses he was thinking of when he stumbles onto the idea for facial recognition technology. How did he get there, and what was he trying to do with it before he discovered the law enforcement use case?

Yeah. I mean, to set the scene: 2015, 2016, this time when Hoan Ton-That comes to New York, is a really big time for artificial intelligence. It's when they're really starting to make advances in computer vision and machine learning, or neural net technology, and this is what powers so many of the advances that surround us now, you know, chatbots, image generation. And it was the same thing with facial recognition. It was a time when computers and the software techniques were just getting so much better at processing all this data and recognizing patterns. And so Hoan Ton-That and the people he originally founded the company with were thinking, you know, can we take a photo of somebody and, just from their face alone, understand more about them? They weren't originally thinking about just recognizing a face, putting a name to a face, but: what can we learn about them from their facial features?
And they were tapping into ideas that we now consider pretty outdated about what you can tell about a person just from their face. And so one thing he talked about doing with his cofounders: this was around the time that Ashley Madison was breached. Does everyone know what Ashley Madison is? It's an extramarital dating site, a place to go if you want to cheat on your partner, and the people who hacked the site exposed all the users along with their names and addresses and email addresses. So he said, oh, we should take this list of people, look them up on Facebook, download all their faces, and then we can train a computer to know what a cheating face looks like. They had the same idea about intelligence: that you could give it a whole bunch of photos of people with high IQs and figure out who's most intelligent. Same thing with criminality. They really had this idea that computers could kind of data-mine who we were and figure that out from our faces. That business did not end up being very successful, and Hoan Ton-That says now he kind of renounces the idea that computers would be capable of that. And they went instead to something that was much easier by comparison, which was just to download a whole bunch of photos from the internet and make this tool where, when you upload a photo of somebody, it shows you all the other places where their face appears, along with a link to the website. And so that's where you can find out, you know, their social media profiles, who they know, maybe photos they don't even know are on the internet.

Yeah. And I think it's really interesting to see how that develops into this use case for law enforcement agencies, and that's a really interesting story. Can you tell us a little bit about how he discovered that that was the target market for his product?

Yeah.
So Clearview AI originally was called Smartcheckr, and when they first started doing this, one of the first places they actually got access to photos was Venmo.com, which at the time, you know, Venmo made everything public by default. And on Venmo.com's website, they had real-time transactions that were happening on the network, like people paying people, and it would have their photos and a link to their profile. And they ended up downloading all of those photos. So originally Smartcheckr was just a way to take a picture of somebody and find their Venmo account. But then they're just trying to figure out: who would pay for this? Who would want this tool for taking a photo of somebody and finding out who they were? And so they pitched it to hotels and to companies like banks: you know, figure out who the high-net-worth people are as they walk in. I mean, they were really just giving it to investors, pitching it to grocery stores, and the first people to use Clearview AI were essentially very wealthy people, many of them here in New York. My favorite story is John Catsimatidis, who is a resident billionaire and owns Gristedes (I always pronounce it incorrectly; is it Gristedes?), the supermarkets. They pitched it to him to have in his grocery stores, to identify shoplifters. He told me he was having a really hard time with Häagen-Dazs thieves; people were stealing ice cream from the stores. But they also just gave the app to him to put on his phone, and I was like, well, how do you use it? And he said, well, one time I was having dinner at Cipriani (Cipriani is the Italian restaurant in SoHo), and my daughter walked in and she had this date on her arm, and I wanted to know who he was. So I had a waiter go over and take a photo of them.
And then I ran the photo through Clearview AI and I figured out who the guy was, and I approved. But really, it was that they were talking to a real estate building about maybe using Clearview AI to check the identities of people coming into the building, and the security chief there was vetting the app. He used to work for the NYPD, and he said, wow, this works very well, and this would be a tool that my former colleagues would love. And so he introduced them to the NYPD. So it's almost happenstance that they ended up being used by police. And then it was very popular with the NYPD.

Yeah. And you can see how people have a tendency to want to misuse a tool like that for their own personal uses, and it seems like that was part of what was potentially happening with police departments as well, where the NYPD wanted to build in some checks to make sure that didn't happen. Can you say anything more on that?

Yeah, so at the NYPD it was first used by the financial crimes unit there. You know, they tend to have a lot of photos of fraudsters, at an ATM or a bank counter, and so they started running these photos through Clearview AI and it worked well, and they told their colleagues. And Clearview AI was just offering these 30-day free trials to any police officer, so they were just handing it out like candy to the NYPD. And it's really striking to me because, when I first started looking into this, I didn't realize police could just use any tool they wanted, just download it from some random guy and actively use it in investigations without checking the accuracy of the algorithm, seeing how well it works. It was really just: try it and see if you like it.
So it starts spreading among the NYPD. They start telling the people they know at other police departments, you know, the Department of Homeland Security; it just kind of started spreading like wildfire through law enforcement, as one investor put it. And the one thing the NYPD was worried about is, you know, if we start giving it to all of our officers, how do we make sure they're only using it for legitimate purposes, and not, say, they're out at a bar and they see a pretty person and run his or her photo and find out who they are? And they were starting to ask Clearview, like, how do we keep track of how many searches are being run? How do we make sure there's a case associated with it? And Clearview AI was kind of building in these ways of monitoring usage in reaction to the police, as opposed to before they started using it.

Yeah, that's so interesting. And one of the other things I find really interesting in the book, kind of bringing it back to Hoan Ton-That: there is a point where you really outline his foray into MAGA world, basically, and how he starts associating with some far-right personalities like Chuck Johnson, and also how the DeploraBall played into the development of the tool, Smartcheckr at the time, that became Clearview. So could you talk a little bit about the impact of his proximity to MAGA world?

Yeah, some of the early use cases of the tool that became Clearview AI were pretty alarming to me. And so basically the first time it was used (it was still called Smartcheckr at the time) was at the DeploraBall in Washington, D.C., which was an event around Trump's inauguration to kind of celebrate all the work that people had done to help get him elected. And they wanted to make sure they didn't have anybody from the far left coming in. They didn't want any antifa people there.
And so apparently they kind of searched the people who bought tickets, and they claimed that they identified two people who were affiliated with the antifascist coalition in D.C. and made sure that they didn't get tickets. And I found out about that because they used that use case when they were pitching the technology to the Hungarian government as a border control technology. They claimed in that presentation that they had fine-tuned their product to identify people affiliated with George Soros and the Open Society Foundations, who were kind of pushing for democratic reforms in Hungary, and they said this would be a great tool for keeping them out. And so, I think it's because they were really affiliated with these kinds of conservative causes, but it shows how you could use a technology like this in a very chilling political way.

Yeah. And one of the things I think is so interesting, when you write in the book about what's different about Clearview: it's not necessarily a technology breakthrough, but an ethical breakthrough of sorts, where they were willing to do what others were not. And you talk about how it scrapes billions of photos of faces using basically whatever was available. And, you know, the Ashley Madison example is classic, and then Venmo, that's so interesting, they basically took this real-time feed and said, we're going to just pull these photos. But then there's Facebook, you know, making profile photos public by default, and then Clearview being able to scrape all of those photos. Facebook assures people that the technology exists to make sure that you can't scrape those photos. So what I was confused about when I was reading it was: how did Clearview come through and end up scraping all of those Facebook photos? They weren't supposed to be able to.

Yeah, so there's this twofold problem for all of us when it comes to privacy.
One, it's hard for us to understand what's going to happen in technology that we need to protect against. I think many of us who have posted photos of ourselves and others and loved ones on the internet over the last two or three decades weren't anticipating that a company like Clearview AI was going to come along and scrape them all and, you know, organize the internet around our faces. So, one, it's hard for us to predict that and protect ourselves against it. And then there's this other question of what technology companies have done or not done to protect us. And so with Facebook, I mean, I think Facebook is such an interesting company because it forced all of us to really grapple with what online privacy means, because of the way that we started putting all this information online and it got used in unexpected ways, from your boss figuring out that you're not sick, you're on vacation, to just having everyone know your relationship status. But, you know, Facebook kind of encouraged all of us to put our photos online, with our names right alongside our faces, and we've done it on Facebook, we've done it on Instagram. And, you know, I know that they try to make sure that people don't come along and scrape it, and they say in their terms of service that you're not supposed to come along and scrape that data. But Clearview AI and other actors have kind of done it again and again. So unfortunately, there's just no one that is really protecting us. Hoan Ton-That, you know, developed scrapers himself. He also told me about hiring people that he met in strange corners of the internet to go out and hunt faces for him. And he said sometimes he wouldn't even know their names. They would just be like, oh, I scraped Couchsurfing.com and AngelList and I've got this collection of faces. Do you want to buy it?
You can pay me in, you know, Ether or some other cryptocurrency. And yeah, it was just kind of this free-for-all on the internet to go hunt faces, and I think it's hard for us to protect ourselves against that, because it's not just the photos that you yourself have posted publicly; it's photos that other people have posted of you.

And there are, of course, companies that have many of our photos, like Google and Facebook. And I think it's so interesting, at the end of part one, you talk about how these companies could do it but chose not to. And you say it wasn't that they couldn't build it; it was that they were afraid. So why was big tech afraid to do something that a random startup like Clearview AI was not?

Yeah. When I first found out about Clearview AI and reported on it, I and the people I talked to and interviewed all thought that Clearview AI had made this technological breakthrough, that they had done something that Silicon Valley and the government hadn't been able to do: gather all these photos from everywhere and create this very powerful algorithm for searching them all. But as I did more reporting for the book, I discovered that Google had developed a technology like this internally as early as 2011. Then-chairman Eric Schmidt said it was the one technology that Google had developed but decided to hold back. I think that is changing with the current generative AI tools; I think Google had tools they didn't release, and then OpenAI kind of changed the game. But Facebook, too. I got to see this video of Facebook engineers in a conference room in Menlo Park with the most absurd vision of the future I've ever seen. One of the engineers is wearing a baseball cap and has a smartphone on the brim of it, held in place by rubber bands. And when he looked around the room and the camera focused on somebody, the phone would speak the person's name. And so actually the technology companies, and Google and Facebook aren't known for being super
privacy-protective, you know; they have changed the norms. Google is the company that sent Street View cars out around the world; they put all of our homes on the internet. But this technology was something that they saw as a step too far, and I think there are a lot of reasons for that. I think it's just very radical and challenging to our privacy, this idea that anyone can put a name to our face at any point in the real world. I think that they were under a lot of scrutiny; you know, they said that they were worried it could be used in dangerous ways. And I think when you have these big technology companies that are subject to the public talking about them, to lawmakers and regulators, and that have had a lot of privacy lawsuits, they decided, yeah, the world's not ready for this yet. But then you get a company like Clearview AI that was able to take advantage of open source technologies, facial recognition algorithms that were very accessible, and all of these photos on the internet, and they just decided to move forward with it and break through this taboo. And the breakthrough that they made was really an ethical one: they were willing to do what other actors weren't.

And do you see a future, once Clearview kind of breaks the seal, where other actors like Google or others would potentially step in and do the same, or take it to a new level? You know, what do you see coming if Clearview AI ends up kind of testing the legal implications of this, taking the legal risk and then determining what the guardrails are and what's socially acceptable?

So there are already copycats. Clearview AI has decided to limit the use of its database (it now has 30 billion faces in its database, probably many of you are in it); they've limited it to police use. But there are other face search engines now that have gone online in the last few years. One is called PimEyes, and
anyone can use it. You can go to PimEyes.com right now and upload your photo, and they have a smaller database than Clearview AI, but it may well show you other photos of you on the internet, maybe some you know about, maybe some that you don't. And it's a subscription; you can pay $30 a month and you're able to do this. I talked to one person who kind of came to me and essentially confessed that he was using PimEyes in a really disturbing way, and he wanted to tell me the story because he didn't think you should be able to use it this way, and he wanted policymakers to know. I call him David in the book; he didn't want to have his real name revealed. But he basically had a pornography addiction and kind of this privacy kink, where he would see women in online pornography and then go use PimEyes to find out their real names, their kind of vanilla lives, like finding their high school photos on Flickr. And then when he kind of got tired of doing that, he went through his list of Facebook friends and started looking for risqué photos of them that might be on the internet. And he found things, you know, on revenge porn sites, nonconsensual uploads of their photos on obscure sites where their names were not attached. But then suddenly, when you have this search engine, you're able to make the connection. So yeah, it's already out there, which is part of why I feel like this book is so necessary right now: to decide whether these tools should be out there, whether they should exist, how easy it is for us to get out of these databases. And then I think for the big tech companies, now that there's been this breakthrough, they will probably be reckoning with whether they do something similar.
And I could totally imagine it. Clearview has been working on these augmented reality glasses that the app works with, and I've tried these, actually, where you look at somebody, a circle appears around their face, you tap it, and it'll pull up all the photos of them online. And Meta has also talked about it; you know, they have augmented reality glasses in development, and their chief technology officer has said they'd love to put facial recognition capabilities in them. It would be this great tool: you know, at an event like this, you would just know people's names, or if you went to a cocktail party and there's somebody that you've met five times and you can't remember their name, it could supply it to you. But, the chief technology officer said, we're not sure this would be legal, and we're not sure society wants this. So they're kind of holding back to see what happens. But I could imagine a kind of consent model for facial recognition technology in the real world, where a company that has your social graph, that knows who you're connected to, would allow you to create privacy settings for your face, where you could say, okay, I'm comfortable being recognized by the people I'm connected to, my friends, or friends of friends, or the public at large. I do think that's one possible future for this technology, depending on what we decide as a society about what we want.

Yeah, that's so interesting. And speaking of what we decide...

Can I ask a quick question? Who in this audience, if you had the option to opt into this and you could say, I'm willing to let my face be recognized, how many of you would actually do it? Is there only one hand up? Wow. Well, I asked the same question in San Francisco and a third of the audience raised their hands.

Yeah, a responsible tech audience is maybe less interested in that.
Throughout the book, you're showing this kind of nonlinear history, and I think it's really super compelling the way you show the contextual framing around different events, where you can see the public's appetite for privacy shifting, and certain events that really pivot things suddenly. For instance, you talk about 9/11 and how the appetite for security suddenly trumps any privacy concerns. You talk about Edward Snowden and how his whistleblowing on the domestic wiretapping by the NSA basically shifted the public's interest: it went from concerns about corporate surveillance to government surveillance. And you talk about COVID complicating our relationship with privacy, and then also even the impact of Russia invading Ukraine. And you can really see where the momentum shifts and how public attention really matters. So this is a really interesting time. I think, even just because of chatbots, because of the accessibility of an app like that, there's a lot of public interest around AI. I think that's putting a lot of pressure on policymakers. We're having a moment around AI and AI policymaking, and of course in the EU, the EU AI Act is probably going to pass by the end of the year. And so I wonder, with AI broadly, and maybe some of these issues around surveillance and how it dovetails with privacy, do you see this being a moment where we really can have a shift, where we can actually change the conversation around privacy?

Yeah, I mean, it has been a moment outside of the United States. After I kind of exposed the existence of the company, privacy regulators in Europe, Canada, and
Australia launched investigations into Clearview AI and determined that, under their privacy laws, what the company had done was illegal; that Clearview AI couldn't just go and collect all these photos of their citizens and create a biometric identifier, you know, a face print, for them and have this searchable database. They said, no, you can't do this, and kicked Clearview AI out of their countries. Clearview AI stopped working with law enforcement in those countries, and all of those regulators recently put out a statement that said, you know, just because people are putting information out on social media sites, putting it out publicly, doesn't revoke their privacy interest in that information. We have not had that same moment in the United States. We just don't have the same kind of privacy protections. We don't have anything at the national level that really addresses what Clearview AI did, or what many of these other kinds of AI companies are doing, where they're collecting a lot of information off the internet, sometimes quite private, and training it to do new things. And so, yeah, I do wonder what's going to happen here. There are some states that have relevant laws. The main one is Illinois, which passed this very prescient law in 2008 called the Biometric Information Privacy Act. I tell the history of it in the book, but it says that if you want to use people's biometrics, like their face print or their fingerprints or their voiceprint, as a company you need to get their consent or be fined up to $5,000. And so for people who are in Illinois, you know, Clearview wasn't supposed to put them in the database, and they can get out of the database. And we see this playing out in the real world. My favorite example is Madison Square Garden, right here in New York City, the very popular arena for basketball and hockey games, and every major musical act plays there.
Madison Square Garden installed facial recognition technology a few years ago, not Clearview AI, a different company, for security reasons, to make sure that they kept out people who might be violent, you know, as a way to protect the big crowds in a stadium that sits on top of Penn Station, a major transit hub. But in the last year, James Dolan, who owns Madison Square Garden, said the other people that we keep out of the arena are people I dislike: lawyers. And so Madison Square Garden went to the websites of about 90 law firms that had existing complaints against his company, scraped the faces of all the lawyers who worked at every firm, and created a banned list. And when those people try to go to Madison Square Garden or the Beacon Theatre or Radio City Music Hall, like a mom who went with a Girl Scout troop to see the Rockettes at Radio City Music Hall at Christmas, they get turned away and told that they're not welcome until the litigation is resolved. And Madison Square Garden also owns a theater in Chicago, but they can't do that there. They can't keep lawyers out with facial recognition technology, because they don't have the consent to use their biometrics. So we're seeing this real difference around the country and around the world in terms of how protected your face is.

Yeah. I have a few questions left. One is from the community. We're not doing a public Q&A, but we have had some questions come in through our registration form, and one is related to that: apart from legislative victories, what accessible, practical victories have you seen against facial recognition?

That's an interesting one. So from civil society, one of the big kind of characters in the book is the ACLU, because they've been fighting facial recognition technology since 2001, when it did not work very well, I mean, but it was deployed on the crowd at the Super Bowl in Tampa, which became known as the Snooper Bowl in the press.
And so the ACLU has really been pushing for a moratorium on the use of facial recognition until we can grapple with the accuracy issues, the bias, and just the civil liberties. And they've gotten bans on police use in several cities, San Francisco being one of the big ones, and Somerville, Massachusetts; for a while, Portland, Oregon put a moratorium on its use. There are some bills that have been passed at the state level around how police should use it, like whether you need a warrant to search someone's face, and what kinds of crimes you should use it for. And then, just at the individual level, there are things that you can do. There are state privacy laws, unfortunately not one here in New York, but in California, Colorado, Connecticut, and Virginia, that give you the right to access information that a company holds on you and delete it. And so people in those places can go to Clearview and say: one, show me what you have on me, and then, delete it. So there have been some things that have happened that are kind of hopeful.

Yeah, that's great. And it's good to know, as a community here, what kinds of things we might be able to do and actions we can take. So at All Tech Is Human, we see our mission as building and strengthening the responsible tech ecosystem, and a lot of what we talk about is how people come to responsible technology, and pretty much everyone has a nonlinear path for how they get there. So I do want to say that your article was really instrumental for me in getting interested in responsible technology, taking issues that I saw in a small, niche way in the music industry, where I had a past life, and seeing how the same issues play out across society in much more impactful spaces, like law enforcement. But I would like to know, and a question from the community here is in the same vein: what was your pathway into this space? How did you get interested in this topic? And you can interpret that
however; I'm not sure if "this topic" is privacy, or facial recognition tech, or tech journalism. But how did you come to this space?

So I've been writing about privacy and technology for ten years, basically since the beginning of my journey as a journalist, because I became a blogger at a time when more and more people were getting on the internet, and it was just so easy to find information about people as a journalist. And then these companies started to create, you know, kind of vast dossiers on us: what we click on, what we're reading, what emails we're sending, what iMessages we're sending. We just leave such vast trails. And that is what I found so compelling about facial recognition technology: that the face becomes this key in the real world to unlocking everything that's knowable about you online. And just seeing the way that that is playing out already, you know, Clearview AI making it possible for police to identify a stranger, Madison Square Garden banning lawyers. The way we're seeing this happen with Madison Square Garden presents other ways that businesses might discriminate against us. Like, if you write a bad Yelp review or Google review of a business, are they going to put you on a list and make sure you're never allowed inside again, or spit in your soup if you are? And then there are kind of more chilling usages of facial recognition technology in places that are further ahead of us, like in Russia and China, where, you know, they're identifying protesters, who then have police appear at their house the next day to ticket them for protesting, or using it to suppress human rights or monitor Uyghur Muslims in China. China is interesting because, you know, they really have facial recognition deployed much more widely, you know, on cameras as kind of real-time recognition of people. And they've used it for security threats, but also to name and publicly shame people who wear pajamas in public.
They will use it to just automatically ticket jaywalkers, which would be horrible if it was deployed in New York; we would all get tickets all the time. Even at the Temple of Heaven in Beijing, in the public restrooms, they were having problems with toilet paper thieves who would come in and just take the whole roll. So they installed facial recognition cameras, and you have to look into the camera to get a certain amount of toilet paper, and then you have to wait, you know, like seven minutes or something until you can get more. So you can just see, once you start thinking about all the different ways this could be deployed, it could just be so chilling. You could just be tracked all the time, so that you would no longer have a zone of privacy in kind of any public space. You know, a future where we just cease to be anonymous is very chilling to me. And so, yeah, that's why I came to the subject and why I wrote the book.

Yeah. And one final question from the audience: what is the biggest challenge in reporting on technology?

I find the biggest challenge is just making sure that you understand how it works and really relating it to people's lives. And so I think facial recognition technology is easy in a way, because you can just imagine all these things about your face and how it could be used. Just think about the photos of you that have been taken over the years and whether any of those are on the internet; maybe they could be found, you know, things that you don't want known about you. I often try to do first-person-style pieces with technology, because I think it helps people relate to it. So back in 2013 I, like, lived on Bitcoin for a week. I got the big tech giants out of my life at one point to demonstrate just how hard that is to do. I turned my apartment into a smart home and monitored all the data going in and out of it. More recently, I stalked my husband with AirTags and Tiles and a GPS tracker, with his consent.
But just to show how easy it is, kind of, it's getting easier to invade people's privacy, and how hard it can be for them to detect it. So I think it's just about making sure that people feel like they understand the technology, understand how it impacts their lives, and then hopefully, you know, we as individuals with tools, and then as a society at the policymaker level, can make better decisions so that we can harness the good of technology and avoid the kind of worst dystopian outcomes.

All right. And one last question: what is next for you? Just going home after this long book tour?

I don't know. I'm always looking for ideas for new stories. A lot of people give me tips. So if there's anything in here that you guys think is worthy of investigation, I'm here. I'm going to be signing. But I don't know exactly right now. More reporting on tech, I'm sure.

Amazing. All right. Thank you so much, Jess. Thank you so much. I'm so glad we have the opportunity to talk about this today. And I really want to start by bringing our viewers back to fall of 2020. What would you