Campaigners took the Los Angeles Police Department to court, forcing it to release the details of its predictive policing programs.

There are two layers to predictive policing. One is community- and location-based, where algorithms are used. The company PredPol developed that algorithm, which was owned by Jeffrey Brantingham, a professor of anthropology. This thing has a long history itself: it was created on the feeds coming directly from the war zones of Afghanistan and Iraq. The other piece is Operation LASER, which is a person- and location-based predictive policing program. LASER stands for Los Angeles Strategic Extraction and Restoration. The reason it's called LASER is that its creators said they wanted to go into the community with medical-type precision and extract tumors out of the community, the way a laser would. That's how they came up with the acronym: people as tumors. The exact science, I think, is not really that credible.

What LASER claimed to offer was a one-stop crime prediction shop. The pitch: to tell police not just where crime will occur, but also who might commit crimes in the future. The LAPD was using these technologies to decide where to deploy its police patrols, focusing resources on so-called crime hotspots.

So these are all the hotspots for a particular time period. Hotspots are created by the algorithm, by PredPol, where they use long-term or short-term crime history, and then they create these 500-by-500-square-foot hotspots.

On what basis? How are they deciding this?

To put it very bluntly, there's a lot of pseudoscience, and now it's being presented as though these computers are really neutral and can predict when a crime may happen.

But predictive policing doesn't just flag up a place. With LASER, it also sticks to a person. The LAPD
maintains something called the Chronic Offenders Bulletin. These bulletins are undisclosed reports on so-called persons of interest: people the police believe to be likely to break the law. The risk is calculated using a points-based formula drawing on data from police records, field interviews and arrest reports. This is pulled together and scored by algorithmic software created by Palantir, a defense contractor with close ties to the US military.

So how do you get yourself onto the LASER system?

These are the things that identify the risks. If you're stopped and a field interview card is filled in, that's one point. So if the police stop you, you've got one point immediately. This individual was stopped three times in the same day, so that's three points right there. If there had been a previous arrest with a gun, five points. Any violent crime, five points. Parole and probation, five points. And being identified as gang-affiliated, five points.

When it comes to the Chronic Offenders Bulletin, points can mean prison. But it's not just about locking people up. For Hamid, the data suggests increased police attention at the borders of a historically deprived area called Skid Row, which helps keep the poor contained from the more affluent neighborhoods nearby.

So this is like a beachhead. Think of it as the defense of the financial district from poor people. When we talk about hotspots, you will see the dirty divide: how the proximity of extreme wealth and extreme poverty coexist right here, about two blocks apart.

I meet Steve Richardson, who goes by his street name, General Dogon. He's a former prisoner and Skid Row resident who now works with the coalition campaigning for greater protection for the local community.
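The points formula described above can be written down as a toy scoring function. This is a minimal sketch, not the actual Palantir software: the weights follow the figures given in the transcript, and the field names are illustrative assumptions.

```python
# Toy sketch of the LASER-style points formula described above.
# Weights follow the transcript: 1 point per stop (field interview card),
# 5 for a prior gun arrest, 5 for any violent crime, 5 for parole or
# probation, 5 for alleged gang affiliation. Field names are invented.

def chronic_offender_score(record):
    score = record["stops"] * 1          # one point per field interview card
    if record["gun_arrest"]:
        score += 5
    if record["violent_crime"]:
        score += 5
    if record["parole_or_probation"]:
        score += 5
    if record["gang_affiliated"]:
        score += 5
    return score

# A person stopped three times in one day already carries 3 points,
# before any arrest history is counted at all.
print(chronic_offender_score({
    "stops": 3, "gun_arrest": False, "violent_crime": False,
    "parole_or_probation": False, "gang_affiliated": False}))  # → 3
```

Note how quickly the score compounds: the same stops that the formula counts as risk are themselves produced by where police choose to patrol.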
You guys have been doing work on this predictive policing stuff, right? What does that look like out here on the street, to people who live here?

Predictive policing rolls out in a lot of ways, because Skid Row is ground zero for all the experiments that get tested on poor folks. All the programs, the LAPD spy programs, everything they come out with, is first tested right here. The first one they launched, the Safer Cities Initiative, brought 110 extra police to Skid Row, making it the most policed community not only in America but second in the world, after Baghdad. We've got all kinds of patrols on Skid Row: cops on motorcycles, regular cars, detail cops, cops on horses, all these police in like a 15-block area, right smack in the middle of the hood. And new things continue to come out here all the time.

About 80 percent of the people here are Black, and almost everybody suffers from something, a disability, maybe physical or mental. The most arrested person on Skid Row was a woman, Annie Moody, arrested 118 times for violating 41.18(d). 41.18(d) is a municipal code that says you cannot sit, sleep or lie on a public sidewalk. So her only crime was that she was homeless, had nowhere to go, and was forced to sleep in public space. She got arrested 118 times
just for being in public space. And it was all, you know, based on a lot of predictive stuff like that.

And the point you, Dogon, and you, Hamid, are making is that this is a practice that goes way back, right? Over-policing in this community goes back decades, and then the information from that gets fed into the computer, and the computer turns around and says, go back and do some more of the same thing.

Right. And even before the information gets in, the algorithm is designed for policing, so the algorithm will create the outcomes that the agency wants to achieve. This is really the key point: the outcome the agency wants to achieve in this community is cleansing and banishment.

As we walk further along, the tents begin to thin out, as do the local residents gathered on the sidewalk. It's obvious we're approaching the outer limits of Skid Row, the hotspot boundary Hamid had pointed out earlier.

This is like a force field of hotspots that a person from Skid Row would be walking into. And this is where you will have more policing waiting for them: waiting to give people tickets, waiting to throw people against the wall, to intimidate and harass and demand that they leave the neighborhood.

A few weeks after we left Skid Row, the LAPD announced that it was canceling the LASER program. The pushback worked; police admitted the data was inconsistent. But the LAPD says the predictive policing tool PredPol is still in operation.

So let's think about the incentive structures of some of the predictive policing tools we've been talking about. What does it say about the incentives, and the problems we're going to have with these tools, that you've got counterinsurgency software essentially being used for law enforcement purposes? I hate to have such a sinister
interpretation, but I think it's about opening up new markets to sell this software to, and law enforcement has lately been a great market for lots of military technologies. Quite frankly, I think there's actually the opposite of an incentive to get it right. All predictive policing software has an incentive to make the sale with police, so their incentive is to make predictions that are as close as possible to what the police already believe is correct.

So given that it's really hard to know whether AI has been trained on representative data or not, if we have real reason to suspect, for example, that there might be bias, isn't there a question about whether the system should be used at all?

Well, I think that's the fundamental issue. We're seeing the deployment of all kinds of automated decision-making systems, or AI, however we want to characterize it, and we don't know the effects until after the fact. After the damage has been done is primarily how we're learning, quite frankly, about what doesn't work. And I think it goes far beyond bias. We're talking about aggregating data about us, building data profiles that foreclose certain types of opportunities to us. What's more dangerous, I think, in the digital age is that, you know, in the 1950s,
if you were Black and tried to get a mortgage at a bank and you were discriminated against, you were very clear about what was happening; that discrimination was not opaque. When it moves into a software modeling system, what you have instead is a banker who's like, you know, I'm sorry, Dr. Noble, you just can't have it, and I don't really know why. That lack of transparency is one of the things I think we're trying to contend with here, and this just becomes a wholly normalized process we don't understand. Or, you know, the models in actuarial science for determining whether you're going to pay more for insurance because you live in a particular zip code don't even account for these histories of racial segregation, housing covenants, real estate covenants. Just looking at the zip code doesn't tell us about the long history of discrimination that has sequestered people into particular zip codes. Those are the kinds of things that I feel become harder and harder to see over time.

I think one of the things that I find
worrisome is that the data being collected for these kinds of systems was, for the most part, collected for some completely different purpose; it just happens to be there. In policing, data is created by the police doing what they do: they're driving around, they're stopping people, they're occasionally arresting people, and so forth. That data gets produced and then is used in a predictive policing model. It's not collected for the predictive policing model; that's a second-order effect. It's used because the data is already there. And it turns out to be a terrible way to predict where future crime will be, because what police collect is not a random sample of all crime; they collect the data they can see. This is true in most of the places where people are applying it.

I think it is useful to detect where bias is happening, and simulation can be important; I think that's true. However, it doesn't necessarily allow people to have, again, this conversation: I have been discriminated against. It just leaves it to the expert analysis to make that discovery, when in fact there are a whole bunch of people who wanted to be homeowners, or wanted to move house, and they don't really understand why these decisions are happening.

So as a data scientist, what's your take on this? How do we build a kind of test for when it's appropriate to use machine learning at all, and when it's not?

The question should be: who bears the cost when a system is wrong? So if we unpack a particular system and say, OK, we're building a machine learning system to serve ads, and the ad we're serving is wrong: this customer is searching for sneakers but we served her boots. Oh dear, we were wrong. No one cares. That's a meaningless problem; the consumer couldn't care less. We get wrong ads all the time; we're trained to ignore them. Let's compare that to a system which makes a prediction about whether or not someone should get credit.
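The point above, that police records are not a random sample of crime, can be illustrated with a toy feedback-loop simulation. Every number here is invented for illustration; the sketch only shows the mechanism: patrols go where the records show crime, and the records grow where the patrols go.

```python
# Toy simulation of the sampling problem described above. Two districts
# have the SAME underlying crime rate, but crime only enters the records
# where patrols are sent, and patrols are sent where the records show
# the most crime. All numbers are invented for illustration.

TRUE_RATE = 0.3                     # identical real crime rate in A and B
PATROLS_PER_DAY = 10
recorded = {"A": 2.0, "B": 1.0}     # district A starts with a few more entries

for day in range(50):
    # Send every patrol to the district with the most recorded crime ...
    target = max(recorded, key=recorded.get)
    # ... so new crime is only ever observed, and recorded, there.
    recorded[target] += PATROLS_PER_DAY * TRUE_RATE

share_A = recorded["A"] / (recorded["A"] + recorded["B"])
print(round(share_A, 2))  # → 0.99: the records now "confirm" A is the hotspot
```

Although both districts are identical, district B's records never grow because no one is there to observe anything, so the model's prediction looks perfectly accurate to the agency that acts on it.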
In a credit-based system, if we're wrong, the consumer who should have gotten credit doesn't get it, or the consumer who should not have gotten credit does get it. In both cases, and in particular where someone who should have gotten credit does not get it, that consumer bears the cost of the error: she doesn't get whatever it was she needed the credit for, to buy a house or a car or something else. The company that failed to offer the loan may bear a small cost, but there are a lot of customers, so it doesn't really bear much of a cost. And when the customer bears the harm, we can predict that the harms will be greater, because the people deploying the system have little incentive to get it right.

We know that if people of color are over-policed and over-arrested, they are also likely to be over-sentenced. Machine learning isn't just used to predict crime; it's also used to decide whether a person should be given bail, or how long a sentence a prisoner serves. Criminal courts in the state of Florida use a predictive sentencing program called the Correctional Offender Management Profiling for Alternative Sanctions, or COMPAS. In 2016, journalists at the US news outlet ProPublica investigated COMPAS and discovered an apparent racial bias at the heart of its algorithm.
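The error-cost asymmetry described above can be made concrete with a toy calculation. All error rates and dollar figures here are invented for illustration; the point is who bears the cost, not the particular values.

```python
# Toy sketch of the asymmetry described above: who bears the expected
# cost of a wrong prediction. All rates and costs are invented.

def expected_costs(error_rate, consumer_cost, firm_cost):
    """Expected per-decision cost of a wrong prediction, split by bearer."""
    return {"consumer": error_rate * consumer_cost,
            "firm": error_rate * firm_cost}

# Ad serving: a wrong ad mildly annoys the consumer (cost ~0) and costs
# the firm a fraction of a cent, so nobody minds a high error rate.
print(expected_costs(0.25, consumer_cost=0.0, firm_cost=0.001))

# Credit: a wrongful denial can cost the applicant the house or car the
# loan was for, while the lender merely loses one of many customers.
print(expected_costs(0.25, consumer_cost=10_000, firm_cost=50))
# → {'consumer': 2500.0, 'firm': 12.5}
```

The deploying firm's expected loss is tiny compared with the consumer's, which is exactly why, as the speaker argues, the firm has little incentive to drive that error down.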
In their investigative report, one of the things they found in their hand review of all the records was that African-Americans were twice as likely to be predicted to commit future crime. I found incredibly interesting, for example, the story one of the reporters told of a young Black woman who had taken a bike from one of her neighbors' front yards and ridden it around. The owner of the bike said, bring that bike back, and so she did. But a neighbor called the police on her, and she spent 10 days in jail, and the COMPAS software gave her a score of 8 out of 10, saying she was likely to commit a crime again. Then they looked at a white man who had a history of violent crime, a history of being in and out of jail; the software gave him a 3, rating him less likely to reoffend.

Once again, the bias in society was revealing itself in the machine.
The risks of bias baked into machine learning aren't just confined to law and order. Upon release, prisoners must reintegrate into a world that is increasingly automated. Today, for them as for you and me, opaque computerized systems will help decide their access to state welfare, to private finance, and to housing. Take credit scores: these are shorthand for a person's financial trustworthiness. In many ways, credit scores are the gatekeepers to opportunity, and increasingly they're produced by algorithms fed on data blind to context and history.

If that credit report comes back with a low score, that means this individual is supposedly a high risk, so you begin to just go around in a circle: low credit score, criminal background, can't get housing. Because you don't have housing, you can't get a job, because the job you're applying for requires a permanent residence. Therefore you are stuck in this cycle of no opportunity, at the whim of a machine-driven system that decides on the basis of criteria that are unknown to you.

This is one of the darkest topics of our era. There are human biases in targeting on the battlefield, human biases in who gets loans, human biases in who is subject to arrest, and these human biases are horrible. Couldn't we fix it with algorithms that wouldn't be biased? But then it turns out the algorithms are perhaps worse. The algorithms have refined the worst of human cognition rather than the best, because we don't know how to characterize the best.
I went to the Work Rebooted conference in the heart of the tech industry, San Francisco, California, to see if AI could be used to bring out the best in human endeavor. Some people are going to do well; some people will do less well. I met Ben Pring, who heads the Center for the Future of Work at Cognizant, a multinational corporation specializing in IT services.

I know a lot of people are anxious about the whole notion of bias within the algorithm, and so one of the jobs we've speculated is going to be created is what we call an algorithm bias auditor, which could be a sort of morphing of the traditional kind of QA auditor role: to make sure there is no unconscious bias within the algorithms going into production environments within big businesses, so that people can reverse-engineer decisions made by software.

You do look at job opportunities opening up, but you've said that you also anticipate job losses in certain areas, including some that you think people haven't seen coming. There is a class of new software that's emerged in the last couple of years in the industry called robotic process automation, and you can get a team of 500 people down to 50 people. That's the reality of what's going to happen in big business: a lot of that kind of white-collar, skilled and semi-skilled, mid-level, mid-skill work is going to be replaced by this kind of software, and there's no denying that some people will be left behind in that transition.

So what other jobs do you think AI might open up in five or ten years' time? We came up with this job we call a walker-talker, which is this idea that in a lot of towns around the world, certainly where I live in Massachusetts, a lot of seniors are very isolated. So what if there was an app-based platform where
people in the neighborhood could log on: I've got a spare hour on a Tuesday afternoon or a Saturday morning; I could go and walk and talk with a senior in my neighborhood. So people living in the kind of gig economy, living a kind of portfolio-style set of jobs, maybe they drive a Lyft, maybe they rent their house through Airbnb, maybe they do things through TaskRabbit: what if they could literally monetize that spare time to go and walk and talk with a senior? That doesn't sound like a technology-based job, but it would always ride on an AI-infused platform.

In the same way that most of the people who do care work are women, and women of color. Guess who has been taking care of other people's kids since they were enslaved and brought to North America? Black women. This idea that somehow these historically oppressed, suppressed communities are now in some type of better situation because there's an app interface between them and the people who want that work done, and calling it a fascinating new gig opportunity, I think is just complete nonsense. The experience of marginalized people basically foretells what's to come for the entire population: degrees of control, lessening of autonomy, a real difficulty in confronting and sometimes resisting these systems.

Some say if you want to know what's to come with AI, you need to look to China. The Chinese want to be the primary innovation center for AI. AI is seen as a potential driver of more social instability, but at the same time the Chinese state thinks it can use this tool to quell social unrest.
China is home to 1.4 billion people. Its capital, Beijing, has more surveillance cameras than any other city in the world, and facial recognition technology is woven into everyday life: getting you into a bank or your residence, checking you out at a shopping till. With 800 million internet users and weak data protection laws, the Chinese state has access to colossal amounts of data, and China's credit scoring system aims to go far beyond finance.

There is this ambitious goal to have a national, unified social credit system that would assign a score to citizens, to judge whether their behavior was politically acceptable or socially desirable.

The plan is for all Chinese citizens to be brought into the social credit scoring system in 2020. It uses data on everything from financial records and traffic violations to use of birth control, and processes that data through algorithmic software to give people a score for their overall trustworthiness. A high social credit score can mean better access to jobs, loans, travel, and even online dating opportunities. A low one can mean being denied some of the modern benefits of citizenship.

Probably the most troubling aspect of social credit is not necessarily the social credit system itself, but the application of some of these facial recognition technologies to expand the surveillance state and to check the behavior of citizens in the western region of Xinjiang, where Muslim minorities, the Uighurs, have been disproportionately targeted: their location tracked 24/7, whether they're going to mosques, which areas they're traveling to. And that has been empowered, or is in the process of being empowered, by facial recognition algorithms connected through security integrators.

The Xinjiang autonomous region is home to China's Uighur population, an ethnic Muslim minority that has faced systemic forced assimilation. A small fraction of the Uighur resistance to this oppression has turned to violence, including attacks on civilians,
leading President Xi Jinping to embark on a so-called people's war on terror, aimed at stamping out Uighur separatism and imposing a secular ideology. New AI-led technologies, particularly facial recognition, are the latest weapon in Xi Jinping's crackdown.

Some reports have indicated that there was a database that tracked 2.6 million residents: tracked where they were going, with labels for sensitive locations, like whether they were going to a mosque or traveling to a particular part of Xinjiang. It was updated on a 24-hour basis, and the database had, I believe, more than 6 million records, so it was tracking these people in real time. Uighurs are now in re-education camps, so that's a pretty significant departure from normal life, where you're forced to study in a camp and repeat party mantras.

It's a stark picture of how artificial intelligence can go wrong: the Chinese government deploying AI to track and suppress its own minority populations. Facial recognition checkpoints in Xinjiang use deep learning technology to identify individual Uighurs, cross-checking them with data collected from smartphones to flag anyone not conforming to the Communist Party as unsafe, a threat to state security. Xinjiang has become a test bed for authoritarian AI.

This harsh system of control may seem a world apart from the West, but systems like social credit actually have some parallels. In some ways, if you think about the origins of some of the scores, the social credit coming from some of the major private businesses in China, how different is it really from an Experian or an Equifax, or one of these other kinds of credit rating agencies that also collect very granular data on Westerners, data that is then shared with all kinds of other entities and used to make consequential decisions? In their current operation, I would say that they're different.
I think the difference will be when it's not just your financial behavior but also your social and political behavior that gets observed. And oftentimes the social credit system becomes a projection of our own fears about what is happening in our societies in the West: it's not necessarily what is objectively happening in China that is important, but using what's happening in China as a way to project what we're afraid of.

So when I think about China's millions of Uighurs being tracked 24/7, and potentially put into re-education camps, I think about the Black community in the United States. I think about predictive policing, and I think, East and West, one of the problems and worries we've got with AI is the way it gets road-tested on communities of color. The way these technologies are being developed is not empowering people; it's empowering corporations.
