These hearings mark the first time that the select committee has taken live evidence in this way outside of the United Kingdom. I'd like to thank the witnesses from the different companies that we'll see today. A small piece of housekeeping at the start, for people unfamiliar with the work of a House of Commons select committee and more familiar with the work of a congressional committee: it is understood that anyone taking part in a proceeding like this answers questions honestly and truthfully, and it is an offense to mislead Parliament. We don't require you to swear an oath. We have a number of sessions to get through today, and I'm conscious we need to do it in good time and order. I'd ask members to be clear and direct with questions and to be clear what you're asking for, and I'd say to the witnesses too, because of the time constraint: answer the question put to you and don't launch into general statements of policy, and if a question is directed to you personally, answer that. If we can do that, we can keep to some sort of time. I would like to start the questioning, Mr. Gingras, and ask you: does Google regard campaigns of disinformation, of fake news, as being harmful to your customers?

First of all, thank you very much, Mr. Collins, for the opportunity to be here and address the committee and present evidence. There's no question that misinformation and its influence on our society is an issue, and it's an issue for Google. I should point out, as you know, the mission is to organize the world's information and make it universally accessible and useful. Billions of users take advantage of us every day and rely on us to provide the right information. This is not only important to us as a company; it's important to our users and to society. It's actually crucial to what we do. I often think of ourselves as being in the trust business. We will continue to retain the loyal usage of people around the world to the extent that we retain their trust with our work every day.
I agree, it's an important and significant issue, and we continue to make sure we're surfacing the best information from the best possible sources, and we do our best to ensure that untrustworthy information does not surface.

I'm glad you think it is important; I asked if you thought it was harmful to society and to your customers, the recipients of it.

It can be harmful, and misinformation arises in a broad range of areas, not just news and politics. People come to Google every day looking for medical information, and it's not hard in the medical area, if you're concerned about your health, to search for whether peach pits cure cancer and find content on Google that might suggest they do. That cuts across our surfaces when we look at content: how do we make sure people have the right information, information that is helpful and not harmful?

Given that, as you say, this disinformation is harmful to society and to individuals, what responsibility do you think companies like yours, and the other tech companies we'll hear evidence from today, have to protect your customers from being exposed to harmful disinformation?

We feel an extraordinary responsibility. If they don't trust us, they will stop using our products and the business will collapse. We believe strongly in having an effective democracy, in supporting free expression, and in sustaining a healthy ecosystem to make sure our citizens and customers have the knowledge and information they need to be good citizens.

There has been research done looking at the role the up next feature plays in providing fake information to users through recommendations. What steps are you taking to address that problem?

Thank you, and thank you to the committee. I'm Juniper Downs. Our recommendation engine was designed to give users the kind of content they want to see, and it works well for educational content, comedy, music, providing users with the kinds of content they want to see. News content makes up a fairly small percentage, less than 2%.
And there's work to do in terms of making sure we're surfacing the right news content to users. We have been investing to make sure that we demote low-quality content, and as some of the press coverage has shown over the past week, we still have work to do; there's progress to be made, and we recognize that and take responsibility for it.

YouTube supplied the intelligence committee with channels linked to St. Petersburg, 1,100 videos and 43 hours of content. Was the identification of those channels based on the analysis of the committee, or was that YouTube's own research looking for channels like this?

The security and integrity of our products is core to our business model. We cooperated with the investigation into whether there was interference in the U.S. election, and the channels that we discovered on YouTube connected to the Internet Research Agency were found through a thorough investigation that we conducted using our own resources. We publicly reported that information to Congress through the intelligence committees, where our general counsel testified back in the fall.

So the identification of those accounts was based on your research, and not on intelligence or information you were given?

Our research, and leads and intelligence provided to us by others, including other companies who were conducting similar investigations.

But what did you do yourself? That wasn't based on leads, but based on your ability to analyze how people used the platform.

We looked at the leads and went beyond that, looking at advertisements that had a connection to Russia and at organic content, to see if there were channels on YouTube connected to the Internet Research Agency that were not purchasing advertising but were uploading content.

The committee has written to Twitter and Facebook asking for analysis of whether Russian agencies were involved in the elections in the U.K.
And we asked whether YouTube or Facebook would conduct that research for us, and we hope we'll get an update from Twitter later. Would YouTube be prepared to conduct the same analysis for the U.K., looking for potential interference in elections?

We are happy to cooperate with the U.K. government's investigation into whether there was interference in elections in the U.K. We have conducted a thorough investigation around Brexit and found none: we found no evidence of our services being used to interfere in the Brexit referendum, and we are happy to cooperate.

I'm asking about the committee's investigation, and we're not looking necessarily for paid-for advertising linked to the election, but at the operation of channels or films that can be linked back to Russian agencies and that had a political, purposeful message.

We are happy to cooperate with the investigation.

So that's a yes?

Yes.

Thank you. If I can bring in Jo Stevens.

Thank you. You may be aware of a British journalist whom we met earlier this week. She was talking about the autofill, or autocomplete, search: when she typed "jew" into the search function, it would auto-complete it for her, and it came up with antisemitic searches. Why was a U.K. journalist able to identify the issue, and why didn't Google identify it previously?

I thank you for that question. These occurrences happen from time to time; many we catch ourselves, and others we don't. The autocomplete feature on Google Search is important. You come to search and begin to enter a query, and we offer suggestions based on what other users search for. It's a dynamic feature. Everything we work on, in terms of the corpus of expression on the web or the corpus of queries, is changing every day, and there are indeed malicious actors out there who will seek to game this as well.
So as we have constructed these systems, we continue to build defenses against this and to build mechanisms such that users inside or outside of the company have the agency to identify these things and bring them to our attention so we can correct them, and that's what we look to do. The terms that you might see in those suggestions are offensive to all of us, egregiously offensive, and we look to take care of them; it goes back to the trust of our users.

So your algorithms are developed on the data that is gathered; it is cyclical.

Yes.

It determines what phrases are picked up. So what safeguards are you putting in place to prevent the cyclical enhancement of hate content if people are searching using those terms?

Well, it's the continuous advance of our systems in terms of what kinds of terms and strings of words to look for, making sure they don't occur, and maintaining constant efforts to evaluate the results we serve. We have a large team of raters, 10,000 people, constantly at work assessing different results against queries: are we surfacing the right kind of results, and do they have authority? And we use those 100,000 bits of data to continue to train our systems. This is an ongoing effort. I would like to believe that the algorithms will be perfect, but we will continue to strive to make sure situations like that occur as infrequently as possible.

My final question: do you have an ethics policy that your developers work within, a framework, or do developers' inherent biases influence the algorithms they build?

We have a 160-page set of rater guidelines, public information, available for you to look at, that guides their assessments. So we work on top of that policy and constantly evolve it, as we constantly train those who have to apply those policies.

Sir, can I clarify: your raters, are they developers?

They're not necessarily developers.

So your developers, who build and develop the algorithms, do they have an ethics policy?

Yes.
We have an internal policy, the honest results policy, that prevents the engineering team and people like me from influencing the algorithms in inappropriate ways.

You're in the trust business. Trust is based on knowledge. Do your customers know what information you retain about them and how you use it?

I would hope so. We have made great efforts to provide transparency and control to our users about the information that we collect as they use our services, and we collect that information to make the services better for them. They can come to a control panel, and hundreds of millions of people do, to look at what information we're collecting and, for that matter, to change the settings. They can tell us not to collect certain kinds of information; that's there for them. It's important to maintain a dialogue with our users about the information we collect to provide better services to them. We never share that information with third parties; we never have and never will. It is crucially important that we protect that. It is part of the trust relationship that it is important for us to maintain with our users.

So you will give me all of the information that you hold about me?

You can come to Google and look at the information that we have about you in our different services.

And will you tell me how you use that information? For example, I'm a politician. Would you tell me how you, or I, could use that information for political purposes?

We don't use any information for political purposes, but we do try to be transparent about how we use that information. I'll give an example. Google Search is not personalized, except that we will tune the results based on where you are, if you are looking for a restaurant in London, for instance. Otherwise, it is important to us not to personalize search results.
We expect that what you want to see and find is any information that's out there in the corpus of expression, without us trying to guide you one way or the other based on what we think you might be interested in.

Do you market your capabilities to politicians?

We market various services, for instance our ads, to various folks who want to advertise using our products, and to some degree that helps them use data to target information.

Do you have a specific team who markets advertising to politicians?

I am not familiar enough with how our sales organization sells ads to know how we approach these things. We have advertising sales teams in countries around the world. I imagine some focus on different categories, but it is not in my area of expertise.

Is it within your expertise?

I want to reiterate how we use the information that we collect from users. The main principle is transparency and control. For YouTube, we do collect the watch history of signed-in users, but the user can pause and delete that watch history. We use it to improve the service for users. We were talking about recommendations: if we know a person is a lover of comedy or music, we can use that watch history to optimize the service. We don't sell the information to advertisers. We do provide aggregate data as we work with various constituents interested in using Google's ad products. That's the kind of information provided, not information about individual users.

Thank you. Just on that point: if you can provide such precise aggregate data, why do you find it difficult to identify bad content or fake news being filtered in?

Identifying and managing content on YouTube is the number one priority for us this year. It is mission-critical for the business: to our users, our creators and advertisers, and to us.
We invest in technology and in the people working on these issues; the executive team is engaged, and we meet for hours every week to figure out how to improve the systems, to make sure that the policies on YouTube are followed, and to quickly identify content that violates those policies and remove it. So this is a top priority for the business.

How much of the revenue do you reinvest in this way?

I don't have an exact percentage, but we have spent tens of millions of dollars fighting spam and abuse across our products and are committed to...

Ten million, across tens of millions? Tens of millions?

I'm sorry, I don't know the answer to that question.

You must have a rough idea.

I don't have the answer, but I want to reiterate that we will invest the necessary resources to address these issues, not only in terms of the people that we employ but the technology that we develop. We are continually staffing up, and we have seen good progress in our management of these issues. For example, over the past eight months, the work that we have done on violent extremism speaks for itself: we have removed over 100,000 videos, and the speed with which we are able to identify that content and remove it from the site has gotten faster and faster; we are at 70% of the content being removed, and that's the kind of progress that we demand of ourselves and continue to strive for across these issues as we move forward.

Let's see if we can work out what that percentage is. My suspicion is it might be small. I think many would say that you can sell the service, that's how you make your money, selling the service to microtarget people, and that the same tools must be able to identify harmful content. Of the content you do take down, how much of that is user referral of bad content, and how much of it is bad content you discover for yourself?

So the technology that we've developed to identify content that may violate our policies is doing more of the heavy lifting for us.
So in the area of violent extremism, we are at 98% of removed content being identified by the algorithms; that number varies from policy to policy, depending on how precise the technology is on those issues. When it comes to spam, identifying the misinformation being distributed through our services, in 2017 we removed over 130,000 videos for violating that policy, identified by our technology.

If you have evidence of someone who is being heavily exposed to, or is distributing, extremist content, or, you know, other forms of extreme content that may be a danger to children, what responsibility do you think you might have to share that knowledge with law enforcement agencies and people who want to take an active interest in those people?

We cooperate and meet regularly with law enforcement, and we share information. When it comes to user data, we do that through the mechanisms in place whereby law enforcement can request information from us pursuant to their investigations, and if we identify content on our services that poses an immediate risk to life, we disclose that information proactively. When it comes to the safety of children, we report to the National Center for Missing and Exploited Children any solicitation of minors or content of that kind; they in turn cooperate with law enforcement internationally to make sure those instances are followed up on.

I appreciate that you do that, but someone might say: if somebody had a family member who was the victim of a violent act, and from a person's interests on YouTube, the things they show interest in, you could say, okay, this is someone who is violent and dangerous; people might say, have you not got a responsibility to notify the authorities about those who give you cause for concern?

If we identify someone on the service posing a threat to individuals, and we do see instances where someone is threatening to perpetrate a violent attack, we would disclose that.

Thank you. Thank you very much.
It is along the same lines: I wanted to look at the ethics of Google, and to look at your up next button and the scoring system that you have for videos. Would it be right to say that usually the videos or the content with the highest score is the content that has had the most views? So the quantity, would you say? So hits are important to you, Juniper Downs?

It is quite complex and varies depending on the video being watched. One of the factors is content that is popular on YouTube, and there is associated content. So if there's an interest in a particular kind of knitting, there may not be a lot of highly popular videos about that type of knitting, but we'll recommend videos that are similar and provide more instruction on that type of knitting.

But popularity means lots of hits, doesn't it? Research suggested that the largest source of traffic for false news channels came from YouTube, that these videos tend to be recommended by the up next button, and more research demonstrated that a lot of them had a far-right bias. So I put it to you that if that's the case, you are indoctrinating society.

I'm not here to comment on the Guardian research, but when it comes to recommendations, we recognize that we have more work to do. Recommendation is a reflection of user interest. Our goal is to promote authoritative sources and to demote lower-quality content from less established sources. It is not only popularity; it is also drawing on sources that are well-established, known sources of news and putting them towards the top. Do we always succeed? We don't. The stock market fell, and our algorithms treated "market plunge" as a newsy query, because a lot of outlets were using "plunge" in their coverage, but didn't recognize "stock market crash." That was flagged to us, and we corrected it and made sure we were surfacing authoritative sources across both queries, but subtle language can cause our systems to misfire on occasion.
The Guardian fed 8,000 items into their software system, which tracks political disinformation campaigns, and in every case the largest source of traffic came from the YouTube up next algorithm. You are saying you don't agree?

I'm saying it is an area of progress. That research was done in the lead-up to the election. We've made progress since then in making sure that authoritative news surfaces to the top. We recognize some of the results are not where we want them to be as a product, so it is an area of investment for us.

At this point it seems that in choosing to rank videos, and to have a system of what is good and better, higher and lower, you are acting as an editor, which is what the editor of a newspaper does, and yet you are not calling yourselves editors but hosts. I wonder whether you might think that the description of what you are ought to be changed, and that the whole name for your platform ought to be changed, so that you take on more of the responsibilities, as would a bona fide newspaper, and would have to apply broadcast and newspaper regulations to yourselves. At the moment you are unregulated.

When it comes to the scale of our services, we have well over 400 hours of content uploaded by creators every minute; there have to be organizing principles by which we determine what to show when somebody comes to YouTube and searches for funny videos; there has to be an algorithm functioning. I think the question of whether we're a publisher gets to the question: do we take responsibility for the content on our site? And the answer is absolutely yes. It is a top priority for us to make sure we are providing a useful service to our users.

So you are exercising self-policing? Is that right?

We have a set of community guidelines that set the rules in place. They are dynamic and evolve in response to changing trends in the world and changing trends in the kinds of content and misuse of services we see.
We developed those guidelines because we are trying to maintain a certain kind of community for our users. That goes beyond what the law requires, and we follow the law: where content is unlawful in a given jurisdiction, we block that content for the relevant jurisdiction. I think the model of self-regulation that I just described, if you look at the forum cooperating with the European Commission on hate speech being removed, is a good model of how we can collaborate to ensure we are acting responsibly and quickly to deal with these content issues, and the Commission has acknowledged the progress.

You were described as the overlooked story of 2016, described as a disease on society and as affecting behavior; it is surprising you don't seem to register any of that.

We take the criticism seriously. Again, it is a top priority of the company to make sure we detect content that may violate our policies; we are watching for bad actors. The openness of YouTube has brought tremendous benefit to society: educational content, culture, the arts, music. The benefits are tremendous. It presents challenges, and we are committed to managing and dealing with them responsibly.

Thank you. We have been busy trying to work out the question of how much YouTube makes. The figures are not disclosed by the company, but estimates put it at about 10 billion U.S. dollars. Does that sound about right?

I can't confirm any of those numbers; it is not information that I have access to.

Let's say it was, and for easy math: if you are investing 10 million a year in making YouTube safer and the revenue is 10 billion, you are not likely to be satisfactory in addressing some of the concerns that Rebecca Pow raised.

I appreciate that the company will not disclose figures in terms of revenue.

How about a percentage? We would be grateful for that. Based on publicly available figures, that seems to be where they come out. Are we investing adequate resources in addressing these issues?
There's no constraint on the resources that we will put towards getting this right. We have invested tremendous time from our engineering teams and executive teams, and the trust and safety operation has nearly doubled in size over the past year. We take these issues seriously, and if we decide that we are unable to make our desired progress because we don't have adequate investment in the issue, we will invest more in addressing it.

If it is a multibillion-dollar business and tens of millions of dollars have addressed these concerns that have been well documented, that sounds like an unambitious program of investment, and given the huge size of the platform, people wonder whether that will be successful. I want to follow up with questions about the article in the Guardian. It quotes one of your former engineers, who says watch time was the priority that your algorithms gave to prioritizing what comes up, not necessarily truthfulness or decency or honesty. What about that allegation?

Watch time is an important metric because it demonstrates that we are providing a service that users love: they want to spend time on the product, and they are enjoying and finding the experience valuable. For certain kinds of content, watch time matters; for music, for example, I know I listen to hours of music on YouTube because it keeps delivering the music I want to hear. We know that for news and other verticals, the veracity of the content that we provide to our users is important, which is why the changes we made over the past year have been about making sure we are surfacing more authoritative content to users, getting them the information they are seeking, and demoting lower-quality content. We have dedicated news surfaces and a new breaking news shelf. We surface high-quality content near the top, the recommendations are limited to authoritative sources, and we invest in making sure that the users of YouTube have the skills they need to assess the content they are watching. So in the U.K.
we created a campaign committed to teaching 25,000 young people to check sources and spot bias in fake news. It's an effort that supports the whole ecosystem, and we work with publishers. We launched a program, Player for Publishers, which has 50 European news providers, where we provide them with technical support to have a video player embedded into their websites and work with them to optimize the content they produce for YouTube; their watch time doubled, and they appreciate the fact that their watch time increased through the investment in them.

So you do accept that there's a problem of bogus news and misleading news, and you accept that you have the responsibility, and the means, to address that?

We recognize that there's a problem of misinformation, and we are dedicated to making sure that we are promoting authoritative content when it comes to news.

And do you have a sense of moral responsibility to take those actions?

We have a sense of social responsibility, and it's a business priority for us. Again, the trust that users place in our services relies on a high-quality experience. If people are coming for news, they expect factual reporting from a variety of sources. We provide sources from diverse news outlets, and yes, because we recognize it is an important policy matter and an important business priority.

Paul Lewis confirmed the conclusions in relation to the United States presidential election, which suggested that YouTube was six times more likely to recommend videos favoring the Trump campaign than the Clinton campaign. Do you accept those figures?

I'm not here to comment on the particular methodology.

Go ahead, comment on it.

We did publicly say that we didn't agree with the methodology.

Why not? But you've got a bigger selection, because you've got the whole data set, so maybe you can run that methodology over your set, which is bigger, and consider whether it would be a more accurate solution. How about doing that?
Let me explain how the recommendation system functioned in the lead-up to the election. It is a reflection of user interest. To the extent that there was more content recommended about one candidate than the other, there was a lot more interest expressed in that candidate: we saw more searches, and trends across broadcast coverage of the U.S. election showed one candidate got more coverage. That's what users were looking for and interested in.

Your up next recommendations simply reinforce what's gone before, and that tends to spiral in a particular direction as opposed to providing balance.

This is an important question: how to ensure that we're providing information that users will want to watch and that is diverse. We do not build in political bias; we design the algorithms and products for everyone. We don't build bias, but there is a human element. People want to watch what they want to watch. It is hard, if someone isn't expressing an interest in a particular person, to just insert something that is the opposite of that and expect them to watch it. We see abandonment of the service when we do that. Humans have their own will, so we try to create a diversity of content and keep it topical enough that the user feels fulfilled.

If a video is getting hundreds of hits or thousands of hits and gets taken down, isn't that strange in reference to the business model?

I'm not sure I understand the question.

If a video is getting hundreds or thousands or millions of hits, and that's good for you, and you take it down when it is at the top of its game, would that not strike you as odd?

The community guidelines apply to all of our creators. We identify content when it is flagged by users or detected by our technology, and if videos violate the policies, we remove them immediately.

And what if the creators themselves take it down?

They have control over their content and are free to delete it at any time.

Thank you, chair.
You mentioned how much progress you have made, and you believe that your moves in this area are quite effective, yet we discovered that you're potentially not spending 1% of your turnover, and more recently, this morning, we have the front page of the Wall Street Journal, old-style journalism, which says that the recommendations you present are divisive, misleading and false content, despite changes the site made to highlight neutral fare, with potentially 70% of viewing time taken up with these recommendations. So in the light of that, why has your self-regulation failed? And how many chances do you need?

So, again, our recommendation engine was designed with the main use case of YouTube in mind: the things people come to YouTube and love the site for, comedy, music, cooking vlogs and beauty vlogs. The effort we put in to demote lower-quality content is a work in progress, and we will continue to invest in getting it better.

You maintain that it is quite effective?

Quite effective. News is less than 2% of our watch time, so it is not the majority use case of the platform.

So, like, 2%?

I think the systems are working better than they were six months, eight months ago, and so on. This is an area where we are investing to make sure we are providing the right news experience, and we are committed to doing better, and when we see the results in the Wall Street Journal article, we are not proud of them. That's not the experience we want to provide.

So you agree with this article; you dispute the Guardian's one, but this is a fair cop, in effect. Is this a fair cop?

I think that the results that are pictured in that article were taken from YouTube, and so to the extent that we look at those and think we can do better, we certainly look at those and think we can do better.

So one area where you do better is sport. For example, if I were to post the final touchdown of the Super Bowl, or, I think, a soccer goal, that would be taken down within minutes. It would be almost instant.
Why does it take so much longer, if it happens at all, when it comes to misinformation by foreign powers that are specifically looking to undermine the West?

When it comes to copyrighted material, we have invested a lot over the years to make sure we are protecting people's intellectual property rights. The reason it is faster is that rights holders provide us with a digital file of the copyrighted material, so we have that as the starting place: we know what content we are looking for, and we can match it quickly using technology. If we were provided a digital file of every misinformation video, we would be able to identify it quickly. Instead, our systems have to look for patterns, and bad actors are evolving and evading detection. It is a cat-and-mouse game, and we are trying to stay one step ahead.

So it is not about the money, and being sued, that you act quickly when it comes to copyright, but when it comes to misinformation by foreign actors, where the evidence is abundant and outright, it is effectively not in the same ballpark. Just a sporting analogy.

It is a top priority for us.

One percent?

I'm not sure that the financial figure behind this is the right metric, rather than the technology that we deploy, which is a hard thing to quantify, to identify this content with speed. If you tried to quantify, in people hours, how many hours of human endeavor are being saved by the technology, I think we would get to a vaster number. So in order to address these issues at scale, we need to invest in the technology that can find the content and enable us to act on it quickly.

Okay. I'm going to turn to you in one moment. Germany has regulated hate speech on social media ahead of its elections, in reaction to the immigration debate, and commentators have responded and said that there was a demonstrable reduction in the level and the effectiveness of the interference there, and also in the French election. Surely this is strong evidence that the way in which western democracies protect themselves is to regulate you?
So we work with elections and campaigns around the world to help them with digital tools to protect their websites and campaigns from any interference or phishing attempts, and so on, because we are quick to provide security support to campaigns. This is a service that we can provide, and we collaborate with others on that. In terms of the German law that was recently passed, there has been robust public debate about the risks and the benefits of that approach, and I think that since it was implemented in January there has been a lot of criticism of the way that it has played out. So I think that the debate is ongoing on whether that is the right approach.

Thank you. Mr. Gingras, your company was very kind to host us in New York, where evidence was presented of the stark impact on the local and national press of the sucking out of advertising by companies such as yourselves from the more traditional media. What happens when you talk about how you want to train journalists to enable them to use your platforms more effectively? What happens when you have only a few global players, the likes of the New York Times or the Washington Post or the British Broadcasting Corporation? It is pointless to have journalism training because there are so few of them left. What do you do to ensure that local journalism survives?

It is a very important question, and one that we are deeply concerned about as well. We should point out that the success of our efforts with Google Search and the ads business depends upon a rich ecosystem of knowledge out there, including at the local level. As we often say, our success is dependent on the success of the publishers. I will point out that clearly the publishing business in every dimension has certainly been changed and affected by the introduction of the internet, without question.
I will also point out, however, the various things that we are doing to assist in this. First of all, I should point out that we share 70-plus percent of display ad revenue with publishers throughout the world, to the tune of $12.5 billion. So that is one element, as well as providing them with the ad systems and tools that allow millions of publishers around the world to derive revenue and benefit from that. We drive traffic to many news sites, to the tune of well beyond 10 billion visits per month, which is valued by third parties at between 5 and 7 pence per visit, which is another 5 to 7 billion in revenue. But that is one way that the ecosystem has changed. Think about it with our own actions: I started out in this at the Providence Journal when I was a teenager.

Does the Providence Journal still exist?

Yes, it does.

Well, there are not that many that do, and thank you for the background, but basically the point is: what happens when the journalism landscape is decimated and people do not know what to trust in their area? Is there a sense of responsibility that you have to ensure that we have that trusted news?

We do feel a strong sense of responsibility, for the reasons I pointed out. It is important to society and to the nature of our business, and so we have mounted many, many efforts. My role at Google is, on the one side, to oversee our efforts on the surfaces of Search and Google News, and the other role is to mount the various efforts that we have to help enable the publishing services and the legacy publishers to go from the old marketplace of information to today's, where behaviors have changed. When you think about it, when I was young and looking for a used car, I went to the classified ads in the newspaper; if you were doing that in Birmingham, you would go to Gumtree. If I were looking for a house, I would go to the newspaper; in Liverpool, you would go to Zillow. It is even so with national news.
People went to the local newspapers for national news, and much of it may have come from wire services, but now people's behavior has changed and they go to the London Times, and so the marketplace has changed. So what we feel is part of our responsibility is: how do we help the news organizations, and enable them to figure out what kinds of products they can serve to their local communities? We do it in various ways. One is through deep collaboration with the publishing industry; it is important that we understand their challenges and they understand what we can do. We have mounted efforts like the subscription project: can we bring better tools to drive subscription revenues, and better data to provide them better opportunity in their markets, and help with tools for digital storytelling through the Google News Lab? We have trained thousands of journalists in the U.K., and we will continue to do more, and we have provided funding through the Digital News Initiative, £6 million in the U.K., to help folks like Trinity Mirror develop new news services in your area. So there is a lot of opportunity here to continue to develop, and we are seeing good signs of success around the world, but there is no question that there is more work to be done and more innovation that needs to take place to reach a sustainable ecosystem for local news publishers as well.

Thank you. Thank you. Thank you, Jim. A couple of follow-up questions, one for Mr. Mathison and one for Mr. Nate. Mr. Mathison, your business relies on people being on the site and viewing for as long as possible, hence the sophisticated recommendation engine. So how do the sophisticated recommendation engines differ between children and adults, and is the goal to get the user, regardless of age, to stay on the site consuming content for as long as possible?
When it comes to Google, we only allow users 13 and over; that is the limitation on the site. I think your question gets to this issue of public concern around tech addiction, and particularly young people's use of smartphones and social media. When I look at the goal of Google's services, it is to provide products that enhance people's lives and do not detract from them. So we believe that this question of how we can fulfill that goal better is a really important question for society. So we are investing in research to better understand these issues, and to better understand what kind of product design we can implement to ensure that we are providing a product that is an enhancement to people's lives and productivity. Looking at YouTube, we have a billion views of educational content, and this is such a tremendous opportunity, with the Khan Academy.

How do you differentiate between adult consumption and child consumption?

We do not look at adult consumption versus child consumption; the recommendations are based on the content watched and the content associated with the video or the assigned watch