comparemela.com

The facial recognition software we use is not the deciding factor when identifying individuals. If we put in a picture of an individual and there is a result, an investigator takes that information as a lead. This is a human-based decision, not a computer-based decision.

So when you hear this, and you hear the explanation for why it might be necessary, what do you make of it?

Well, for starters, we know from recent reporting by the Georgetown privacy center that many police departments are actually misusing surveillance and facial recognition systems, and that these systems don't make us safer. To build even a straightforward-seeming face surveillance system, you need to get hundreds of thousands of photos of innocent individuals who never consented to being part of the face surveillance database. And right now, as other guests have said, it's inaccurate technology, it's technology that is biased against people of color, and particularly women of color, and many of the claimed public safety benefits are theoretical. What we do know is that it's biased, it's inaccurate, and that departments are not being transparent about how they're using it. There was a report recently in Gizmodo that actually debunked some of Washington County's claims and found that they're not even following Amazon's own guidance, and it's very paltry, weak guidance. They're not even following that weak guidance, and they're using a system that doesn't make anyone safer.

I want to push on just a little bit here and move on to the United Kingdom, because police there have been conducting three trials of facial recognition cameras. Earlier this year the Metropolitan Police in London fined a man 90 British pounds, that's about $114, when he protested having his picture taken. Talk to us about this case, because you saw it with your own eyes.

Yeah, it was really shocking, and I think it really speaks to
this new power imbalance that occurs when police have facial recognition. A man came out of the train station and saw a group of us standing with placards and leaflets, letting people know that the surveillance cameras in the area were actually facial recognition cameras. In a very, very small act of resistance, he pulled the bottom of his jumper up over his chin. We had been told by plain-clothes officers that they would be watching how people responded to us informing them about the facial recognition cameras. So very quickly he was swooped on by a team of police officers. They demanded to know why he had dared to cover his face, they demanded his ID, and they really riled him. He became aggravated, and they gave him a fine. This sent chills across British society; the clip of what happened has been viewed millions of times online, and I think people are now starting to wake up to what's happening in the UK with this technology and to become outraged about it. This so-called trial by the police has been going on for four years now, and we've been campaigning, and we're still campaigning, for it to come to an end, because it is incredibly undemocratic. It's incredibly un-British to see a massive apparatus installed like this, erasing people's civil liberties and really changing the nature of society and the sense of freedom in the UK.
You know, Matt, it's interesting. We've outlined some of the dangers, and a lot of people who are watching live on YouTube right now are agreeing with you, and also pointing to other things. Emery, for example, says: "I think the technology on its own isn't the dangerous thing, it's the intent of the people who are using it. It could vastly improve our lives or turn our countries into police states." I just want to scroll down a little bit in this YouTube chat. He also goes on to say: "At the same time, the rich of Silicon Valley who are developing all these technologies are forbidding their own kids from using it, so I think this is a sign that these technologies should be regulated." Your thoughts on that?

Yes, two points. First, once these systems are built and deployed, we won't be able to rein the harm in, and that is exactly, unfortunately, what we've started to see in places like China. The history of surveillance in the United States and in other nations is a history of surveillance technologies being turned against people of color, against activists and against immigrants, and we can fully expect governments to do the same with face surveillance technology. That's exactly why a coalition came together in San Francisco, not just of people who understand technology, but of 25 different organizations, ranging from groups that represent immigrants' rights to racial justice to the homeless to even criminal defendants. A diverse coalition came together here and said: all of our lives depend on the freedom to walk down the street safely without being tracked; all of our lives depend on the freedom not to be logged into a government database because we're advocating for our own rights in this democratic society. And while San Francisco's leaders recognized that here, in the heart of the technology industry, they need to put safeguards in place for dangerous new technologies, what really drove this was the community, a diverse community, and that sort of movement, I think, is really important
point. And that is possible everywhere; it's not just something that can happen here in San Francisco. We're already seeing the domino effect: places here in California, but also across the United States, are considering similar bans. And I think the idea, in terms of potential police use, is that perhaps there could be a way, if the technology were accurate enough, and if there were enough insight into how these algorithms work and how decisions are being made, in a really granular way at each stage, that it might be possible to set limits on how you can query one of these databases or use one of these services, so that police are able to get their match, to get the thing that they need, without the cascading effects. But because, as Matt said, once the systems are set up they're there and they're persistent, it's difficult to know how to set those parameters, and I think that's why advocates are calling for this sort of pause, particularly within the US this week, but in general, because there needs to be some time for us as a society, and as a global community, to discuss how use restrictions might be possible, if they are possible.

You mentioned something that happened this week, so I want to let our audience in on it. This week the House Oversight and Reform Committee in the US held its first hearing on facial recognition technology, to examine its impact on civil rights and civil liberties. I want you to have a listen to the founder of the organization Algorithmic Justice League. She's speaking to Congress about these systems, and this clip in particular, caught on C-SPAN here in the US, shows her speaking with a representative. Take a look at this tweet, and I'll bring up the exchange here because it's so interesting. It starts with "are algorithms most effective on women," so I'm going to scroll down and have you listen to a little part of that.
I heard your opening statement, and we saw that these algorithms are effective to different degrees. So are they most effective on women? No. Are they most effective on people of color? Absolutely not. Are they most effective on people of different gender expression? No, in fact, they exclude them. So what demographic are they mostly effective on? White men. And who are the primary engineers and designers of these algorithms? Definitely white men.

So, Silkie, you're outside of the US, where you can see the discussion that's happening in the US. What do you make of these systems, and the inherent biases that some would say are built in?

It's a big problem, and it needs to be really carefully examined. We have similar concerns in the UK, and we have pressured the police to do some independent testing of the algorithms they're using, which come from a Japanese company, NEC. We've asked if they understand what biases might be inherent in the technology, and they have said, basically, that they're not interested. But there are also issues with what kinds of watch lists are being put together. When we first saw this technology being used, it was at Notting Hill Carnival, which is a black British celebration in London, and it was that community that was used as guinea pigs for this, two years in a row, which is just incredible. So it's, you know, a matter of not only how biased the technology is,
but also who the police are targeting with it. And I share the concern other guests have raised: even setting the technology issues aside, the better this becomes, the more perfect a tool for oppression it becomes as well.

Most definitely. And it's worth mentioning that I have a gauge of how much our audience is responding to each show we do here at The Stream, and this one has tons of Twitter threads and tons of comments on YouTube, so I think it's something a lot of people are concerned about, and many of their questions are left unanswered. Very quickly, I want to share this tweet with you before we move on to the next portion of the show. One viewer, who describes herself as an African feminist, says: "What worries me the most is that we're asking 'what can we do' questions about developing new technologies, rather than also asking why we are doing this and what is motivating us to do it. Not asking the latter enables the facade of value neutrality." That goes on into a very lengthy thread that you can check out on Twitter. For now, though, let's dive a little deeper into this conversation. Let's look at how the Chinese government is using a sophisticated facial recognition network to track its own citizens, with a focus on the minority Uighur Muslim community. Take a listen to this comment sent to us by Cindy Yu, a past guest on the show.
China's use of facial recognition technology fits into what you could call its intent to roll out a nationwide social credit system, where every citizen is rated on their trustworthiness based on information from big data, and facial recognition is a big part of that. One way we're seeing this play out is jaywalkers being recognized by cameras as they cross the road, with their identities displayed on big billboards across the road. In China's culture, this sort of public shaming can be very effective.

Matt, when you hear that from her, in China's context, is there a particular fear, or are they taking this a step further? Can you contextualize that for our audience?

China should really serve as a lesson, an instructive lesson, in what the United States, other nations, and frankly the Chinese government itself should avoid. One of the stories on the Chinese use of this technology focused on a mosque that, in the years before face surveillance, had been bustling at prayer hours; now that mosque is desolate and deserted. That illustrates very clearly the chilling effect that happens when people know that going outside means having your face scanned, your name logged into a government database, and maybe your identity placed on a watch list for government agents. But we would be fooling ourselves if we didn't acknowledge that the United States government has its own history of turning surveillance technologies against these kinds of communities. We have seen it with everything from license plate readers, which scan vehicles, to social media surveillance: the people disproportionately targeted by American governments, local and federal, are people of color and immigrants. We've seen Black Lives Matter activists tracked, and we're seeing ICE, the Immigration and Customs Enforcement agency, deploy similar kinds of tools right now. So it's really important to act and defend ourselves right now, and that's
exactly what San Francisco, and now a domino effect of communities, are doing about this particularly dangerous technology.

I'll say the same about the UK. I just want to point out that even in the trial phase in the UK, police have been using facial recognition against peace activists, not the most dangerous people, and against people with mental health problems as well. And we haven't even seen the full slide yet; by that I mean this is the start point, not the end point, because this technology is escalating in nature.

You know, I also think that China is a really important example in terms of thinking about whether or not what we know about their system is true. It's not that I particularly doubt that they could develop this sort of mass-scale, ubiquitous facial recognition; I think it's probably all true. But the source on it is the Chinese government. The source on their ability to spot someone in a crowd of 50,000 people at a concert, all those sorts of triumphs that they discuss, and the type of footage that you're seeing on the screen, is really all coming from them, with not a lot of vetting or independent auditing going on. So one of the dangers is that, if the system isn't as robust or as accurate as they say, maybe someone who wasn't even jaywalking gets put up on the billboard and gets publicly shamed, and that type of thing would be really difficult to bring to light. So, again, it shows the potential dangers either way: whether a system is working as intended or not, there are still big ramifications.

I think what's interesting is the idea of opting in and opting out, and what happens when you don't have that opportunity to opt in. So I want you to take a look at this tweet circulating online; our producers found it before the show. This is Matthew Brennan, who tweets: "Wow, China airport face recognition systems to help you check your flight
status and find the way to your gate. Note I did not input anything; it accurately identified my full flight information from my face." Now, that is pretty creepy to me. It's eerie, though I could see the argument that it makes things efficient. But I want you to have a listen to a clip from 2017 that shows another side of this, in which the head of an aid station in Shanghai explains how facial recognition software can help regular passers-by on the street.

That's a good thing, we think. Running this facial recognition system has reduced the time needed to do searches, lightened the workload of our staff, and made our searches more efficient. That lets us help people faster when they are unsure about their identity, and helps us as much as we can to find their relatives.

So when it comes to dementia patients, or vulnerable people who are outside and lost, some would say, well, facial recognition can help. Silkie, what's your take on the beneficial uses, and on opting in, but not having a choice to opt in?

It might be that there are some beneficial uses; I just haven't seen them yet, and inherently
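[Editor's note: one claim in the discussion above invites a quick sanity check, namely spotting a single person in a crowd of 50,000. Even a very accurate matcher generates mostly false alerts at that scale, because innocent faces vastly outnumber watch-list targets. The sketch below shows the base-rate arithmetic; the accuracy figures are hypothetical assumptions, not numbers from the broadcast.]

```python
# Illustrative base-rate arithmetic for face matching in a crowd.
# All numbers here are hypothetical assumptions, not from the transcript.

def expected_false_matches(crowd_size: int, false_match_rate: float) -> float:
    """Expected number of innocent people flagged when every face in the
    crowd is compared against a watch-list entry."""
    return crowd_size * false_match_rate

def alert_precision(crowd_size: int, false_match_rate: float,
                    true_positive_rate: float, targets_present: int) -> float:
    """Probability that any given alert is a real match, assuming
    `targets_present` watch-listed people are actually in the crowd."""
    true_alerts = targets_present * true_positive_rate
    false_alerts = (crowd_size - targets_present) * false_match_rate
    return true_alerts / (true_alerts + false_alerts)

# A seemingly strong system: 99% chance of catching a target and a
# 0.1% false match rate, with one target hidden in a 50,000-person crowd.
flagged = expected_false_matches(50_000, 0.001)   # about 50 innocent people flagged
precision = alert_precision(50_000, 0.001, 0.99, 1)  # each alert is roughly 2% likely real
```

Even with accuracy that sounds impressive, each alert in this scenario is far more likely to point at a bystander than at the target, which is why independent auditing of such claims matters.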
