Half an hour after the country welcomed in 2023, air raid sirens rang out in Kyiv. The city's mayor says air defense systems are working to protect people. Charles Stratford has more. According to the foreign minister, Dmytro Kuleba, there were almost 30 suicide drone attacks on the capital and around 45 countrywide. He is saying that all those drones were intercepted, but of course that doesn't mean damage isn't caused to infrastructure and residential areas.

North Korea's leader is promising an exponential increase in the production of nuclear weapons. State media say Kim Jong-un has ordered new intercontinental ballistic missiles with nuclear capability.

Leaders around the world are paying tribute to former Pope Benedict XVI, who has died at the age of 95. Benedict led the Roman Catholic Church for just on eight years. He stepped down in 2013.

Spain, Canada and Australia have become the latest countries to impose COVID-19 checks on travelers from China. Morocco has gone a step further, imposing a ban on all arrivals from China.

Croatia has adopted the euro as of January the 1st. It comes nearly 10 years after it joined the European Union. It also becomes part of the Schengen zone, meaning unrestricted travel to other member nations.

There have been more protests in Bolivia's Santa Cruz region after the arrest of a right-wing opposition leader. Police were out in force after five days of violent confrontations with supporters of the jailed governor, Luis Fernando Camacho.

Security has been increased in Brazil's capital before the swearing-in of President-elect Luiz Inácio Lula da Silva on Sunday. Last week's foiled bomb plot by a supporter of outgoing leader Jair Bolsonaro has raised tension in Brasília. And Brazil is in a third day of national mourning for Pelé, widely known as the king of football. His wake will be held on Monday at the club stadium in Santos, where he first made his name.

Those are all the headlines. News continues here on Al Jazeera after All Hail the Algorithm.

As 2022 draws to a close, we reflect on the major stories that shaped it. Join Al Jazeera for a series of in-depth reports looking back at this year and ahead to 2023.

Trust is fundamental to all our relationships, not just with our family and friends. We trust banks with our money. We trust doctors with our really personal information. But what happens to trust in a world driven by algorithms? As more and more decisions are made for us by these complex pieces of code, the question that comes up is inevitable: can we trust algorithms?

From Google searches to GPS navigation, algorithms are everywhere. We don't really think too much about them, but increasingly governments, corporations and various institutions are using them to make decisions about who gets public services, who gets denied, how people are monitored and policed, how insurance is charged.

I want to start here in Australia, where an algorithm used by the government has resulted in more than 400,000 people being in debt to the country's welfare system, Centrelink. It's been called the robodebt scandal. Back in 2016, a decision was made to fully automate a key part of the Australian welfare system: the part where the earnings of low-income people are compared with the amount of government money they received. The government says they do this to ensure the right amount of financial assistance has been paid. While the data-matching algorithm, officially called the Online Compliance Intervention, had been in place since 2011,
any discrepancies it flagged had previously been investigated by a government employee first. With automation, all human checks were removed.

The government had instituted an algorithm that essentially said, let's match two lots of data together and see if people have a debt. Some of the maths was just bad, just plain wrong, like it was a spreadsheet mashing two cells together, and the cells didn't match up.

Asher Wolf is a journalist who has been reporting on the robodebt story since it broke. She's also an activist, one of the chief organizers of the Not My Debt grassroots campaign.

Often people didn't realize that this was automated in the first place, and it wasn't till we started getting people talking together on social media, on Twitter, that we realized: actually, it's the government that's got this wrong. It was almost like 100,000 people had been gaslighted into thinking they'd done the wrong thing, that it was their fault, and were now outraged when they realized that there was a fault in the actual algorithm, in the code.

The Australian government disagrees: we're doing this because we want to be more thorough, and we are recovering money for the taxpayer. "More checks" is a bit of an understatement. The old system resulted in around 20,000 discrepancy notices a year. But in the early days of the new automated system, that jumped to 20,000 a week. More than a million letters have been sent out by the algorithm, sometimes disputing government payments from as far back as seven years.

And what was worse was these systems were imposed on people with intellectual disabilities, with homelessness, with chronic health issues, people who were barely literate or not literate at all, people who didn't know how to use a computer, people living in remote communities without access to the internet, people who had no bloody clue how to deal with this sort of administrative, bureaucratic muddle.

David Digna was notified he had incorrectly declared his income from a teaching job while he was on a disability pension back in 2011. His robodebt: $4,088.

In essence, it's leveling an accusation toward you, that you've done the wrong thing. I wanted details of how they calculated what I owed, and I was told I couldn't have that. And the reason I was told was that the computer scoops up my personal information, then sources a piece of information via another agency, and another agency again, and they can't provide it to me because it comes from too many places. In other words, the algorithm is inscrutable. It's totally unknowable. Even the staff don't really understand it.

Can you tell me how much evidence or how much notification did they actually provide you, proving that there was a debt? They didn't provide me with anything other than this letter. And the other thing that I have is, finally, a text message came through to say, hi, the money you owe us is due in two days. The fact that you couldn't get any concrete evidence about this is how we have calculated your debt, here is what you earned and here are the hours you worked, that really... I found it shook any confidence that I had that the government will do the right thing, the fact that they couldn't prove to me that I owed the money.
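Asher Wolf's spreadsheet analogy can be made concrete. What follows is a minimal, hypothetical Python sketch, not the actual Online Compliance Intervention code, which has never been published: it shows how spreading an annual tax-office income total evenly across every fortnight, then re-running a simple income test, can raise a "debt" against someone who was paid correctly. The payment rate, income-free area, taper rate and earnings pattern are all assumed for illustration.

```python
# Hypothetical sketch, not the actual Centrelink/ATO system: it illustrates how
# averaging an annual tax-office income figure over every fortnight can
# manufacture a "debt" for someone whose income was earned outside the period
# they were on benefits. All rates, thresholds and amounts are illustrative.

FORTNIGHTS = 26
BASE_RATE = 700.0          # assumed fortnightly benefit payment (illustrative)
INCOME_FREE_AREA = 300.0   # assumed earnings allowed before the payment is reduced
TAPER_RATE = 0.5           # assumed 50c reduction per dollar over the threshold

def payment_for(income: float) -> float:
    """Benefit payable in one fortnight under a simple income test."""
    reduction = max(0.0, income - INCOME_FREE_AREA) * TAPER_RATE
    return max(0.0, BASE_RATE - reduction)

# Reality: on benefits for the first 13 fortnights with no income, then off
# benefits and working for the rest of the year, earning $30,000 in total.
on_benefits = 13
actual_income_while_on_benefits = [0.0] * on_benefits
paid = sum(payment_for(x) for x in actual_income_while_on_benefits)

# The averaging check: spread the $30,000 annual figure over all 26 fortnights,
# then recalculate what "should" have been paid during the benefit period.
averaged_income = 30_000.0 / FORTNIGHTS                  # about $1,154 per fortnight
recalculated = sum(payment_for(averaged_income) for _ in range(on_benefits))

print(f"actually paid (correctly):    ${paid:,.2f}")
print(f"recalculated under averaging: ${recalculated:,.2f}")
print(f"phantom 'debt' raised:        ${paid - recalculated:,.2f}")
```

In this toy example the person was paid the correct full benefit while they had no income at all, but the averaged figure makes it look as though they were overpaid in every one of those fortnights, which is exactly the kind of phantom debt campaigners described.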
And what really concerned me: you never find that you receive a letter in the mail, generated by an AI, that essentially says, hi, the government wants to let you know that we underpaid you by, you know, $5,000, or that you should have been eligible for these services, but we didn't tell you, therefore we're telling you now and we can back-pay you. Nobody gets back-paid. In fact, you're only eligible for back pay of, I think it's six weeks, with government services. But the government can robodebt you back for many, many years.

Automation, computerization, algorithmization, if that's even a word, are always sold to us as such a positive thing. All upside, no downside. As Australia's Department of Human Services put it, computerized decision making can reduce red tape, ensure decisions are consistent and create greater efficiencies for recipients and the department.

The problem is, how do you challenge a system that has no face and no name? Nobody signs the bottom of your letter to say, you know, I'm in charge of this. Good afternoon, welcome to the Department of Human Services. Even on a good day, you end up sitting on hold for a couple of hours trying to speak to a human.

The real question is, how has it come about that the government has either overpaid people by billions... because really the criminal waste is occurring at the government's end of the line. It's the government that's doing this. Otherwise you're saying 100,000 citizens had made mistakes. Well, if that's the case, then the system is too difficult for people to negotiate. So I'm not here shaking my fist at technology. It's not a digital fault, it's not a computer's fault. This system has been, you know, designed quite explicitly by government. Government's responsible for its failures, and government's really responsible for the hell that it's putting all sorts of welfare recipients through unfairly, by issuing them false debts.

This is something I heard from virtually everyone I spoke to about robodebt. They said, we're not against technology; it's not like algorithms are all bad. It's the people and the institutions designing this code we can't seem to trust. And this really gets to the heart of our relationship with algorithms. They're often complex, hidden behind walls of secrecy, with no way for those whose lives are actually impacted by them to probe them, because they've been kept off limits.

Despite all the criticism and even a formal inquiry, the Australian government stands by its algorithm and automation in the welfare system. We do have a compliance program in place. We have recovered around $300 million for the taxpayer. So the system is working, and we will continue with that system.

There are at least 20 different laws in Australia that explicitly enable algorithms to make decisions previously made by ministers or staff. We don't really know the full extent of how these are being applied, but there are places around the world where the use of algorithms is even more widespread. Like here in the United States, where algorithms are being used to make big decisions across everything from the criminal justice system to health and education. The United States has a longer history of algorithm use than many other countries. Silicon Valley is a big reason for that, of course, but also there's much less regulation here on how private companies and governments can collect and use data.
But for those studying the effects of algorithms on American society, one thing is clear: often it's the poor and marginalized who get the worst deal.

I'm on my way now to Troy in New York state to meet with Virginia Eubanks. She's the authority on everything to do with automating inequality, actually the title of one of her books. Virginia says America's poor and working class have long been subject to invasive surveillance and punitive policies. She writes about the prison-like poorhouses of the 19th century. Their bad conditions were thought to discourage the "undeserving poor" from supposedly taking advantage of the system.

What I see as being part of the digital poorhouse are things like automated decision-making tools, statistical models that make risk predictions about how people are going to behave in the future, or algorithms that match people to resources. And the reason I think of them as a digital poorhouse is because the decision that we made in 1820 to build actual poorhouses was a decision that public service systems should first and foremost be moral thermometers, that they should act to decide who is most deserving of receiving their basic human rights.

Virginia's studies into the automation of public services in the United States point to developments in the late sixties and seventies. Along with the civil rights movement came a push for welfare rights. People are forced to live in the most inhuman situations because of their poverty. African Americans and unmarried women, who were previously barred from receiving public funds, could now demand state support when they needed it. While technology was touted as a way to distribute financial aid more efficiently, it almost immediately began to serve as a tool to limit the number of people getting support.

So you have this moment in history where there's a recession and a backlash against social spending, and a social movement that's winning successes against discriminatory treatment. And there really is no way to close the rolls. They can't close the rolls the way they had in the past, which is just to discriminate against people. And that's the moment we see these tools start to be integrated into public assistance. I think it's really important to understand that history. I think too often we think of these systems as just simple administrative upgrades, sort of natural and inevitable. But in fact, they are systems that make really important, consequential political decisions for us. And they were, from the beginning, supposed to solve political problems, among them the power and the solidarity of poor and working people.

In the early 1970s, close to 50 percent of those living below the poverty line in the United States received some form of cash welfare from the government. Today it's less than 10 percent.

In public assistance, the assumption of many folks who have not had direct experience with these systems is that they're set up to help you succeed. They are not, in fact, set up to help you succeed. They're very complicated systems that are very diversionary, that are needlessly complex, and that are incredibly stigmatizing and emotionally very difficult. So it shouldn't then surprise us that a tool that makes that system faster and more efficient and more cost-effective furthers that purpose of diverting people from the resources that they need.
Having algorithms make decisions such as who gets financial aid, or who owes money back to the government, has caused concern among many different groups. But what's causing a full-on panic for some is the fact that algorithms are being used to actually make predictions about people. One of the most controversial examples is the Correctional Offender Management Profiling for Alternative Sanctions. It's a bit of a mouthful, but its short form is COMPAS, and it's an algorithm that's been used in courtrooms across the country to assist judges during sentencing. Now, of course, algorithms can't weigh up arguments, analyze evidence or assess remorse. But what they are being used for is to produce something known as a risk assessment score, to predict the likelihood of a defendant committing another crime in the future. This score is then used by judges to help them determine who should be released and who should be detained pending trial.

Now, the judge has to consider a couple of factors here. There's public safety and flight risk on the one hand, but then there are the real costs, social and financial, of detention on the defendant and on their family on the other. Now, historically, what happens is the judge looks into this defendant's eyes and tries to say, hey, you're a high-risk person or you're a low-risk person, I trust you or I don't trust you. Now, what algorithms are helping us do is make those decisions better.

The COMPAS algorithm was brought in to offset or balance out inconsistencies in human judgment, the assumption being, of course, that a piece of code would always be less biased and less susceptible to prejudice. However, COMPAS has faced several criticisms, primarily accusations of racial bias, inaccuracy and lack of transparency. In 2016, a man named Eric Loomis, sentenced to six years in prison, took his case to the Wisconsin State Supreme Court. His allegation was that the use of COMPAS violated his right to due process: it made it impossible for him to appeal his sentence, since the algorithm is a black box, impenetrable, unquestionable. Eric Loomis didn't get very far. The Supreme Court ruled the use of COMPAS in sentencing was legal. The verdict, however, revealed the ways in which the ever-increasing use of algorithms is being normalized.

The court had a funny argument, saying that nobody knows where the decisions are coming from, and so it's okay. You know, it's not that the state has some particular advantage over the defendant, but that everyone is at this sort of equal playing field, and it's not that there's an informational advantage for one side or the other. Now, to me, I find that somewhat dissatisfying. I do think that in these high-stakes decisions, particularly in the criminal justice system, we don't just want to have the equal playing field of no one knows. I think we need to have the equal playing field where everybody knows. We need to have this transparency built into the system.

For the record, Equivant, the company that sells the COMPAS software, has defended its algorithm. It points to research it commissioned showing the software meets industry standards for fairness and accuracy. Whether COMPAS, or most of the privately developed algorithms, meet acceptable standards for transparency is another question. Even when they are used in the provision of public services, algorithms are often closed to the public; they cannot be scrutinized. Regardless of that, Sharad says that in certain cases he would still be comfortable being judged by a robust algorithm.
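COMPAS itself is proprietary and, as the ruling underlined, a black box. Still, the general shape of a risk assessment score can be sketched. Below is a minimal, hypothetical Python illustration, not the COMPAS model: a few history-based features are combined into a probability of rearrest, which is then collapsed into the coarse low/medium/high band a judge actually sees. Every feature name, weight and threshold here is invented for illustration.

```python
# Hypothetical sketch of a pretrial risk-assessment score, not the COMPAS model
# (which is proprietary and not public). It shows the general shape of such
# tools: a handful of history-based features combined into a probability,
# which is then bucketed into a "low/medium/high" score for the judge.
import math

WEIGHTS = {
    "prior_arrests":         0.25,  # assumed weight per prior arrest
    "prior_fta":             0.60,  # assumed weight per prior failure to appear
    "age_under_25":          0.40,  # assumed weight if defendant is under 25
    "current_charge_felony": 0.30,  # assumed weight if current charge is a felony
}
INTERCEPT = -2.0                    # assumed baseline log-odds

def risk_probability(defendant: dict) -> float:
    """Logistic-regression-style probability of rearrest before trial."""
    log_odds = INTERCEPT + sum(WEIGHTS[k] * defendant.get(k, 0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-log_odds))

def risk_band(p: float) -> str:
    """Collapse the probability into the coarse score a judge actually sees."""
    return "high" if p >= 0.6 else "medium" if p >= 0.3 else "low"

defendant = {"prior_arrests": 3, "prior_fta": 1, "age_under_25": 1, "current_charge_felony": 0}
p = risk_probability(defendant)
print(f"estimated rearrest probability: {p:.2f} -> {risk_band(p)} risk")
```

In real tools the weights are learned from historical arrest records rather than set by hand, which is precisely where the concerns about inherited bias, raised below, come in.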
So I do think it's true that many of the people in the criminal justice system are the most disadvantaged, and the reality is they probably don't have a lot of say in their futures, in their fate, and how these algorithms are going to evaluate them. On whether this would happen if more powerful people were being judged by these algorithms, I don't know. Now, me personally, I would rather be judged by a well-designed algorithm than a human, in part because I believe the statistical methods for assessing risk in fact are better than humans in many situations, and they can at least, when well designed, eliminate a lot of these biases that human decision makers often exhibit.

The United States has a massive racial discrimination problem in public services. That's real. So it is really understandable when agencies want to create tools that can help them keep an eye on frontline decision making, in order to maybe identify discriminatory decision making and correct it. The problem is that that's not actually the point at which discrimination is entering the system. And this is one of my huge concerns about these kinds of systems: they tend to only understand discrimination as something that is the result of an individual who is making irrational decisions. The systems are not as good at identifying bias that is systemic and structural.

The promise of algorithms is that we can mitigate the biases that human decision makers always have. You know, we're always responding to the way somebody looks, the way somebody acts, and even if we try as hard as we can, and if we really have these good intentions of trying to just focus on what matters, I think it's exceptionally difficult. Now, that again is the promise of algorithms. The reality is much more complicated. The reality is that algorithms are trained on past human decisions, and they're built by fallible humans themselves. And so there's still this possibility that bias creeps into the development and application of these algorithms. But certainly the promise is that we can at least make the situation better than it currently is.

One of the things I'm really concerned about with these systems is that they seem to be part of a philosophy that increasingly sees human decision making as black box and unknowable, and computer decision making as transparent and accountable. And that to me is really frightening, because of course computer decision making is not as objective and not as unbiased as it seems at first glance. We build bias into our technologies, just like we build it into our children, right? We teach our technologies to discriminate. But on the other hand, people's decision making is actually not that opaque. We can ask people about why they're making the decisions they're making; that can be part of their professional development. And I think this idea that human decision making is somehow unknowable is a sort of ethical abandonment of the possibility to grow and to change that we really, really need as a society to truly address the systemic roots of racism and classism and sexism in our society. So it feels to me like we're saying, we'll never understand why people make discriminatory decisions, so let's just let the computer make them. And I think that's a mistake. I think that's a tragic mistake that will lead to a lot of suffering
for a lot of people.

So, going back to the question that started us on this journey: can we trust algorithms? Well, the biggest thing I've learned from speaking with Asher, Virginia, Sharad and many others is that I've actually got the question wrong. It isn't really so much about whether algorithms are trustworthy. It's more about the quality of the data that feeds them and the objectives of those designing and controlling them. Human biases, human imperfections, that's what we see reflected in our algorithms. And without better oversight, we risk reinforcing our prejudices and social inequalities.

What these programs do is shape the future by the past. And by the past, that's often things like stigma and bias and stereotypes and rejection and discrimination. And really what we need is to allow the future to be different from the past. Of course we can build better tools, and I see them everywhere that I go. But what makes a difference about good tools, about just tools, is building those tools with a broader set of values from the very beginning. So not just efficiency, not just cost savings, but dignity and self-determination and justice and fairness and accountability and fair process. And all of those things that we really care about as a democracy have to be built in at the beginning, from step one, in every single tool.

We're actually getting our hands on the data, we're analyzing the data. Now, one thing that we've done is we try to make as much of the data available as possible, to encourage people to look at it. One of our projects is called the Stanford Open Policing Project. We release lots of data on the criminal justice system. We release code for people to play with the data. And I encourage everyone to look at that and try to understand what's going on, and maybe they'll discover a pattern there themselves.

My biggest piece of advice is to never underestimate your own voice. You know, you might be fighting some machine, some computer system that you've never been able to see, but that has inflicted huge harm and suffering. But your words can make governments scared. Your voices combined can make senates and courts sit up and pay attention.

Together we can shape the way these tools are created and the ways that they impact us as a political community. If we want better outcomes from these systems, we have to claim our space as decision makers at these tables. And we can't do that if we think that these technologies are somehow gods. They're built just the way we build our kids. We build these technologies, and we have a right to be in dialogue with them.

Across the world, young activists and organizers are on the move. Generation Change meets the New Yorkers using alternative approaches to fight institutional racism and police brutality. This is indeed a nationwide problem that requires a systemic solution. Generation Change, on Al Jazeera.

We are all responsible. Even people far away are helping with the environmental problems in the Amazon, because they're consumers. I teach kids about the issues the oceans are facing today. I've been working in earnest, trying to find ways to get this language across to them. Kids, what do we do to the ocean, why, and what are you going to do to help? And women continuing their fight for equality and rights.
Someone destroys our country; someone needs to rebuild it.

In-depth analysis of the day's headlines from around the world. Whatever the deal was offered to them, they had to sign, because if they didn't they wouldn't get in. Frank assessments: do you think diplomacy still stands a chance? I'm not very upbeat about any kind of negotiation. Informed opinions: everybody tweets, everybody's on TikTok; TikTok doesn't vote. Heading into winter, it's going to be hell there pretty soon. Inside Story, on Al Jazeera.

He came from a wealthy background in Paris and became an artist against his family's wishes. He went on to bring a fresh perspective to Orientalist painting, falling in love with the culture, making Algeria his home and converting to Islam. Al Jazeera World tells the story of Nasreddine Dinet and his unique artistic work. The French Orientalist, on Al Jazeera this January.

Now on Al Jazeera: Pope Francis is to promote a message of peace and reconciliation while visiting the Democratic Republic of Congo and South Sudan, on his fifth visit to Africa as head of the Catholic Church.

Rigorous debates and unflinching questions: UpFront cuts through the headlines to challenge conventional wisdom. Immersive personal short documentaries: Africa Direct showcases African stories from African filmmakers. Can public-private partnerships solve some of the world's most pressing challenges? Government, business and civil society meet for the World Economic Forum. Senegal hosts the All Africa Music Awards, a celebration of talent and creativity from all corners of the African continent. January on Al Jazeera.

This is Al Jazeera. Hello, I am Sami Zeidan. This is the NewsHour, live from Doha. Coming up in the next 60 minutes: Croatia adopts the euro and joins the EU's borderless travel zone. We'll have a live report from Zagreb.