...of control back as well. We meet the robots with a sense of touch. So, how did you sleep last night? Did you sleep well? And Spencer looks at how a video game engine can make movies. We can move the Shard. Laughter. Which does make me feel more powerful. According to Britain's biggest cancer charity, someone is diagnosed with cancer here in the UK every two minutes. Despite massive progress, many treatments are still incredibly harsh and not always successful. However, we now have technologies like AI and robotics which are helping us to discover more effective drugs, with fewer side effects, more quickly. One in two of us will be diagnosed with cancer in our lifetime. One in two. For years, treatment has been centred around surgery, chemotherapy and radiotherapy, sometimes a combination. Cancer therapy has evolved over the last two decades, shifting away from chemotherapy for every patient and for every cancer to personalised treatment. In some cancers, we have really made great progress. 10 to 15 years ago, median melanoma survival rates were around six to nine months. Now, patients survive for years. Standard cancer therapies can be incredibly tough to go through, and one of the great new hopes for kinder treatment is immunotherapy. That's drugs that work alongside your own immune system to kill cancer cells. Scientists here at LabGenius in south east London are harnessing the power of robotics and AI. They're hoping it'll help them find new immunotherapy treatments and make existing ones better. There's a lot going on in here, some of it by automated machines, some of it by people. That's right. So, the different thing about what this machine is doing, compared with what you might envisage or see happening in a normal lab, is that all of the experiments have actually been designed by an algorithm. And all of the data you're using is your own data? That's absolutely right. And actually, that is a really, really key point. We want to build models that are predictive of certain biological features of interest, and the challenging thing for us is that the data required to train those models doesn't exist anywhere in the public domain. But it's not just these few machines I've seen. It's everything. Eva is a smart robotic platform, synthesising, purifying and testing molecules using machine learning and the clout that comes with cloud computing. And this room is the final bit of the lab process, to make sure that the molecules are tested for purity, that they can withstand changes in temperature and, crucially, that something has been created that can be produced at scale. LabGenius says that with its unique approach, it's more likely to find high-performing treatments faster. And they have tested that claim. We took a molecule that's currently in clinical trials. It's an immunotherapy drug? It's an immunotherapy, that's right, currently in clinical trials. We took the one that's furthest along. And we said, can we use this process to make molecules that are any better than this? We were hoping for a tenfold improvement in that molecule's ability to distinguish between healthy and diseased cells. But what we actually found is that these machine learning models were able to deliver molecules with a 400-times improvement in terms of their killing selectivity, so, really, orders of magnitude better. These are molecules that, as a protein engineer, you would never have sat down and designed yourself.
And I think that's something that's really quite differentiated and special around this fusion of human ingenuity and machine intelligence. But they want to take this to the next level. Cancer cells are densely covered in surface markers that distinguish them from normal cells. And it's these markers that molecules developed by LabGenius are targeting. They're called immune cell engagers. One part of the immunotherapy is targeting a very specific cancer protein, or receptor. The other part is inviting T cells to join the party and really attack cancer cells. So this is version two of targeted immunotherapy. It's a very pinpointed approach. You can attack much harder and with much less collateral damage. I particularly believe the preclinical development, which is now, through companies like LabGenius, much more condensed, much quicker, and, supported by the right regulatory framework, much, much quicker. You know, historically, drugs took ten years to be developed. It would bring us down to five years, and even quicker sometimes. Will we cure cancer? I think that's still an open question, but a better understanding of biology and a more targeted approach will help us get most of our cancer patients surviving, with a good quality of life, for many years to come. Wow, it really does feel like we are getting there with these cancer treatments. I know it's always going to be slow progress, but this is another example of the power of big data. Yes, and what really stuck with me from that day was the testing of drugs on healthy cells, because many experts have said to me the big challenge isn't creating drugs to kill cancer, it's developing tolerable drugs to kill cancer. Right. Of course. Yeah. OK. Well, next, Shiona McCallum has been looking at some new tech which could help to monitor cancer patients at home rather than in hospital. Shiona: Cancer has dominated Lynn's life for the last seven years. You get CT scans, you've got MRI scans, you've got blood tests, seeing the doctor, you've got all these other tests and things. And it does feel, at the very beginning, that every day there was another appointment coming through. It's exhausting, physically and mentally. You're just waiting and waiting and waiting. She spent a lot of time away from her beloved cat, Sunny, getting chemotherapy and blood tests in hospital, something she's always found stressful. I don't like needles. I don't like having bloods taken. You can be waiting two hours for the results. If the analysers break or something goes wrong, it can be three or four hours before you get your results for the blood tests, sitting in that blood room, getting more and more anxious, getting hotter and hotter. By the time I would go into the chair, I would probably... you know, I do faint. But Lynn has recently had access to a tech solution that she says has made a huge difference to her life. When I did the training session, it was so simple to follow and to use. This is the Liberty, a device which allows Lynn to do a quick finger prick at home instead of a full blood test at the hospital. You can do it the day before you have your treatment or the day before you go and see your consultant, when you want to do it. You've got that little bit of control over your treatment and your time. I can do my blood test and I can go and do something else whilst it's being analysed. And the one thing that I did like about it as well, I didn't have to keep going and being reminded that I had the cancer.
The kit has been designed by health tech company Entia and is the world's first remote patient monitoring solution for people undergoing cancer treatment. The Liberty has just had regulatory approval and is in the early stages of being rolled out to various hospitals. Here at The Christie in Manchester, they were one of the first places to try out the tech. Absolutely packed, an all too familiar sight in a hospital waiting room. If you could do the test at home, you'd be far more relaxed. You wouldn't be worried about the appointment times, the waiting rooms, particularly if you're feeling ill and tired. And it's just another worry for you, another anxiety whilst you're going through your treatment. What would it mean if you could do this process at home, using...? It'd be amazing. Wouldn't have to get childcare and wouldn't have to worry. You're more comfortable at home, aren't you? You've got your home comforts. Coming here can be quite scary sometimes. And reducing patients coming through the doors was one of the reasons this hospital took a punt on the tech. What we've done recently at The Christie is actually positioned phlebotomy units around the region. So we now have 10 or 12 units around the region, what we call bloods closer to home. So the patients can book in themselves to have their blood tests, but it still means we have to staff those units in order for the patients to be able to have the blood tests. If the patients were able to simply get those blood tests done, or do them themselves at home, that would result in significant efficiencies. And that's really important. We're not trying to distance ourselves from patients, we're trying to make their lives easier. And during the clinical trial, the medical teams found that the blood testing at home was just as good as in the lab. And then the next thing, of course, is, how acceptable is doing the bloods at home to the patients? And I'm very pleased to say that it's a highly acceptable approach to the patients that we've tested this in. They also found it easy to use. The results from the devices are easy to read. I can see the white blood cells, the neutrophils, the haemoglobin and the platelets all on the screen. And it meant a lot less of this, giving blood, which many patients find particularly difficult. So what is the ambition for this bit of tech? So now that we've got regulatory approval, we're really actively engaged with many NHS centres across the country and really looking at new clinical use cases that this solution can actually be applied to. But it's not too onerous for them, is it? Yeah, so everything that we've done at Entia has been designed to offer patients a very simple journey. I mean, we are not trying to medicalise patients further. We're really trying to take them out of the hospital setting, placing them back in the community, in their homes, in the place that they really want to be, and to provide them, then, with tools that are very simple and intuitive to use. So it looks like there will be more patients like Lynn who will get the chance to access the Liberty. But given the volume of cancer patients across the country, mass adoption is still a long way off. I just... fell in love with the machine, to be really honest with you, because it was just... it gave me a little bit of control back as well. Lara: TikTok has filed a lawsuit aiming to block a US law that would ban the video app in the country unless it's sold by Chinese-founded company ByteDance.
The company called the act an extraordinary intrusion on free speech rights. Ofcom has warned social media sites they could be banned for under-18s if they fail to comply with new online safety rules. The UK's media regulator has published draft codes of practice which require social media firms to have more robust age-checking measures. At the moment, teenagers and younger children up and down the country can experience harmful content on their social media feeds again and again, and this has become normalised. And that has to change. Researchers from MIT and Project CETI claim they're now a little closer to understanding how sperm whales communicate, all thanks to machine learning. The team used algorithms to decode the sperm whale's phonetic alphabet, revealing sophisticated structures in their communication, similar to our own phonetic system. And finally, Apple has just announced its thinnest product ever in the form of a new iPad Pro series. At just over 5mm, the devices will also come with a more powerful chip, as well as a stylus in the form of the Apple Pencil Pro. Robots have integrated into our lives in many different ways, taking over tasks that were once exclusively performed by humans. But one thing that has been limiting them is their inability to feel, not emotions, but touch. And a company in Edinburgh is working on bridging the gap. Skin is the biggest organ in the human body, and it's a vital organ. If you look at most of evolution, you'll see that almost every animal out there has some form of sense of touch. So the idea was really that at some point, touch will be the bottleneck. It will be the limiting factor in robotics, to get robots into unstructured environments in the real world. To give machines the subtle capabilities of human touch, they developed electronic skin by printing sensors onto different materials that can be applied to robots. They're fully flexible and printed and extremely thin, and they provide all of those sensations to a robot. If you want to, let's say, handle something very delicate, a strawberry or, in our case, you want to hold the hand of, you know, your grandmother in a hospital, or maybe a newborn baby, something like that, then you need a very highly precise sensor. Taking it a step further, these sensations can then be relayed back to humans again through haptic interfaces, such as a body suit or a glove. Despite its importance, touch has been overlooked in the field of robotics, making robots clumsy and unfit for a variety of basic tasks. The precise sensing capabilities of electronic skin enable robots to engage with objects and humans in a more gentle manner. By detecting changes in pressure and touch, they can navigate environments with increased safety, steering clear of collisions and avoiding unintended harm or damage. To demonstrate this, they are deploying Valkky, an avatar system equipped with electronic skin on its fingertips, to a hospital in Finland. There, it is operated by nurses to support them in caring for patients, especially those who are immunocompromised. After the pandemic, they have realised that there are a lot of problems that they would like to solve, that they would like to prepare for when something like this happens in the future. And this is where we came in and started figuring out, how can we build a robot, build a system, that can actually be useful in a hospital? Without this specific e-skin technology, the robot would be too dangerous to be near patients.
As a nurse, you come in, you sit down in a chair, you put on a VR headset. You then have a choice of either a VR controller with a couple of buttons and joysticks, or you use a haptic glove. So, how did you sleep last night? Did you sleep well? Are you experiencing any pain at the moment? Nurses can use the robot avatar to speak to patients, deliver food and medication and take physical measurements. We have a set of laser scanners that tell us how far away different obstacles are. For more spatial awareness, we have a 360-degree camera, which means that, through the virtual reality, the user can look around at any time, even see behind. We have a nice little feature of a rear-view mirror. Every time there is something happening behind the robot, you get to see that as well. Then we've got the heat camera as well, to be able to tell people's temperature. In a hospital, where the risk potential is high, these functionalities are the only reason they are allowed to run such a project. They also plan on using machine learning and the data sets they have gathered to automate some of the nurses' tasks in the future. Utilising AI, their aim is to develop autonomous and dextrous robots, capable of performing practical tasks such as repositioning patients or changing catheters. You can use as many cameras as you want to, but when it comes to actually detecting whether you are causing any harm, being able to sense touch is very important. And this is where we would like to cover the whole robot with electronic skin. The era of tactile intelligence in robotics has dawned. The sensation is still fundamentally very different from human touch, but integrating electronic skin is an emerging field that robots can now put their finger on. Indistinct radio chatter. Spencer: You know, there's something strange... in this neighbourhood. Music: Ghostbusters by Ray Parker Jr. But, no, this isn't the big new Ghostbusters movie currently hitting cinemas. This is a short film made using video game graphics. The software behind it is Unreal Engine, a system that allows video games developers to build 3D worlds where the physics, lighting and non-player characters all behave in a realistic way in real time. I ain't afraid of no ghost. The motivation behind the project originally was to see how far we could push real-time graphics, and really to see if we could shoot what would normally be a post visual effects shot, one that would potentially take weeks or months to process after the fact, live on a motion capture stage using current real-time gaming technology. And so Sony Pictures connected Unreal Engine up to their motion capture stage, called in Ghostbusters: Afterlife director Jason Reitman and asked him to play around. The Stay Puft Marshmallow Man was puppeted by a motion capture actor, and Reitman used a VCam, a kind of virtual camera, to record the action from a variety of angles that would have been pretty expensive to do in the real New York. He could shoot from street level, he could get up to building height, he could shoot from cranes, he could do drone shots. There's a great example of a happy accident during the shoot, where Jason stepped backwards into a building and actually really liked the shot through the window, so we ended up doing some shots through a building window, down onto a street. And those buildings weren't built from scratch. See, Unreal Engine comes with 20 square kilometres of ready-made city to work from, complete with autonomous inhabitants.
There's a system within the city where traffic is driving, coordinating at junctions and avoiding other vehicles in a completely organic way. We made some enhancements to that. Very specifically, we had the challenge of driving the Ecto through some busy streets and needing vehicles to pull out of the way of an emergency vehicle with sirens. So, we made some modifications for the vehicles to automatically detect when there was a siren from an emergency vehicle. If they could hear that coming, they would all pull to the side of the road, so the Ecto could drive straight through. Having full access to, and full control of, a few city blocks means that this kind of production can be achieved without a massive film crew having to shut down New York City. Johnny: Siren blares. Or London, for that matter. Yeah, we can move the Shard. Laughter. Which does make me feel more powerful. Laughter. We can move... Move... Yeah. Let's move the Tower of London. The Tower of London. That's impressive, isn't it? This is Flite, a full 15-minute film made in Unreal Engine by Oscar-winning visual effects director Tim Webber. Again, he was able to build a futuristic city, shoot from any angle and pick the weather, the time of day and lighting conditions in a way that would be impossible in the real world. We've ended up with an over-three-minute shot of a continuous action sequence, with our actress riding a hoverboard, with jet cops flying in, very, very carefully choreographed. If you wanted to shoot that for real, it would take you weeks. You couldn't shut the bridge down for weeks. You know, one long, continuous shot, I mean, it would be... even for a really, really big budget feature film, and we weren't a big budget film, that would be an exceedingly big challenge. This technology has come so far from the days when actors and directors would have to imagine what their CGI sets, which would take months to render after filming, would look like. As well as using the virtual camera, we sometimes did virtual scouting, exploring the locations in VR. So we'd put the headset on, and we could very quickly move around, and a few of us could be in the same space virtually at the same time. So that's the equivalent of being on a real film set and walking around and going, OK, I think we need to be over here, or I think we need to move that. Absolutely. Yeah, it's totally virtual, but you can move wherever you want. And even... we put the headsets on the actors so that they could explore the environment, too, so that when they were acting, they had a full understanding of the environment they were going to be in. Tim is building on his work on the film Gravity, the one that won him the Oscar, in which, for the most part, the only real things in the shot were Sandra Bullock's and George Clooney's faces. Framestore pioneered the use of LED walls to show the actors what they were performing against. And these days, the real-time abilities of Unreal allow the scenery to be completely dynamic and react to improvised and unprepared camera moves on the day of the shoot. However, Tim still thinks that the actors' performances themselves shouldn't be synthesised. The whole world is created in the computer. It's CGI except for the faces. Creating virtual faces is very challenging and expensive, and it's hard to make them properly engaging. It's very easy to fall into the uncanny valley. Faces aside, though, it does look like there's a new tool in the box for film-makers for whom a location shoot was previously just a pipe dream.
So, next time you want to wreck or save a city, who you gonna call? Ghostbusters! If ya all alone, pick up the phone. Electronic whirring. Engine revs. Ghostbusters! Why are we here on this roof? We could have been anywhere. Who's to say we really are on this roof? Anyway, that's all we've got time for. Thank you for watching. See you soon. Bye. Hello there. Settled and warm again on Friday, with plenty of late spring sunshine around and temperatures rising across the four nations into the low 20s Celsius. The warmth is set to last as we head through the weekend. Temperatures will remain above the seasonal average, warmest towards the east. And it's still dry for the vast majority of us on Saturday, just a scattering of showers, with the real breakdown happening on Sunday: heavy, thundery showers out towards the western half of the UK. Further east should stay largely dry. And here is the area of high pressure that's keeping these dry, settled conditions for the time being. It will eventually push further eastwards into Scandinavia, but we've got a bit of an easterly breeze, and that's been dragging some mist and low cloud in from the North Sea. But that will lift and clear across the south east of England and East Anglia through Saturday morning. Still maybe a hang-back of cloud towards parts of the Yorkshire and Lincolnshire coast, though. Lots of sunshine to start the day, and we'll keep the sunny skies for most through the afternoon. But a scattering of showers across Scotland pushing northwards, perhaps some heavy and thundery, but they'll be fairly isolated. It's still very warm, 24 degrees Celsius in Glasgow. Chance of a shower, too, across northern areas of Northern Ireland and north Wales. A little cooler towards these North Sea-facing coasts, with some of the cloud possibly lapping onshore again at times. 25 or 26 degrees Celsius in London and south east England. So the high pressure starts to push further eastwards as we head through Sunday. That allows these low pressure systems to roll in from the west. And this weather front will bring thickening cloud across the south west of England and western Wales on Sunday morning, some showers across the Western Isles and western Scotland, pushing into Northern Ireland, and the chance of some thunderstorms developing all across the western half of the UK. But it should stay drier further east. Again, there will be a lot of sunshine here, and once again we could see temperatures in the low to mid-20s Celsius. But cooler out towards the west, of course, underneath the cloud and with the eventual rain. And here comes that low pressure system swinging in as we head through Monday. It's going to give us quite widespread rain on Monday, especially through the afternoon, so expect it to turn a lot more showery as we head through next week. And there'll be a drop in temperature, too, so unsettled and cooler as we head through next week. Bye-bye for now. Live from Washington, this is BBC News. A US State Department report criticises Israel's conduct in Gaza but stops short of recommending the US halt weapons supplies. In Gaza, the territory's main UN aid agency warns it only has three days' worth of food remaining. And Russian forces launch a surprise cross-border attack on Ukrainian territory near the city of Kharkiv. Hello, I'm Caitriona Perry. You're very welcome. The US has released a report to Congress finding that Israel may have used American-supplied weapons in breach of international humanitarian law in some instances during the war in Gaza.
The document says, however, that the US government...