Transcripts For CSPAN3 Instagram CEO Testifies On Kids Online 20240709

>> this meeting of the subcommittee will come to order, on the dangers of social media to children and teens. we really appreciate your being here, mr. mosseri, your response to our invitation is very welcome. i want to thank you and your team for your cooperation and i want to thank ranking member senator blackburn for being such a close partner in this work, as well as our chairwoman cantwell and ranking member roger wicker for their support as well, and all the members of our committee for being so engaged on this topic. as a note to start, i understand mr. mosseri has a hard stop at five, so i'm going to be strict on the five-minute time limit. i know everybody thinks of me as a very nice guy, but i'm going to be ruthless, at least attempting to be ruthless, as best any senator can be with his colleagues. in this series of hearings, we've heard some pretty powerful and compelling evidence about the dangers of big tech to children's health, well-being and futures. our nation is in the midst of a teen mental health crisis. social media didn't create it, but it certainly fanned the flames and it's fueled it. and if anybody has any doubts about the potential harmful effects of social media, the surgeon general issued a powerful report about the implications of social media, as well as video gaming and other technologies, on teen mental health, and that's part of the reason we're here. the hearings have shown that social media, in particular big tech, actually fans those flames with addictive products and sophisticated algorithms that can exploit and profit from children's insecurities and anxieties. and our mission now is to do something about it. we're here to do more than shake fists. we really are seeking solutions. and we welcome the voices and the vision of big tech itself in that effort. i believe that the time for self-policing and self-regulation is over. 
some of the big tech companies have said "trust us," and that seems to be what instagram is saying in your testimony, but self-policing depends on trust. the trust is gone. what we need now is independent researchers, objective overseers, not chosen by big tech but from outside, and strong, vigorous enforcement of standards that stop the destructive, toxic content that now too often is driven at kids and takes them down rabbit holes to dark places. the day before this hearing, instagram announced a set of proposals. these simple time management and parental oversight rules should have, could have been announced years ago. they weren't, and in fact, these changes fall way short of what we need, in my view. many of them are still in testing, months away. the roll-outs will be done at some point in the future, we don't know exactly when, and unfortunately, these announced changes leave parents and kids with no transparency into the black box algorithms, the 600-pound gorillas in those black boxes that drive that destructive and addictive content to children and teens. no effective warning or notice to parents when their children are spiraling into eating disorders, bullying or self-harm. nothing more than the bare minimum controls for parents and, of course, no real accountability to ensure parents and kids these safeguards will work. i'm troubled by the lack of effort on instagram kids. once again, this pause looks more like a public relations tactic brought on by the hearings, just as the announced changes seem to be brought on by these proceedings, announced just hours before your testimony. and we need real, serious review of those changes. the magnitude of these problems requires bold and broad solutions and accountability, which has been lacking so far. facebook's own researchers have been warning management, including yourself, mr. 
mosseri, for years about instagram's harmful impacts on teens' mental health and well-being, and a whistleblower who sat exactly where you are told us about those documents, about the research, the studies, which show that facebook knew. it did the research, it had the studies, but it continued to profit from the destructive content, because it meant more eyeballs, more advertising, more dollars. given those warnings, it seems inexcusable that facebook waited a decade to begin, and only to begin, figuring out that instagram needed parental controls. in the past two months, this subcommittee heard horrifying stories from countless parents whose lives and their children's lives have been changed forever. one father from connecticut wrote to me about his daughter, who developed severe anxiety in high school because of constant pressure from instagram. that pressure became so intense, following her home from school, following her everywhere she went, on into her bedroom in the evening, that she attempted suicide. fortunately, her parents stepped in and sought help and found a recovery program, but the experience continues to haunt her and her family. facebook's researchers call the fall into that kind of dark rabbit hole a perfect storm, that's the quote, perfect storm, created by its own algorithms that exacerbate downward spirals harmful to teens. again, facebook knows about the harm. it's done the research, the studies, the surveys repeatedly. it knows the destructive consequences of the algorithms and designs. it knows about the struggle with addiction and depression on instagram, but that data has been hidden, like the algorithms themselves. just yesterday, the surgeon general's report provided powerful documentation on how social media can fan those flames and fuel the fires of the mental health crisis that we face among teens. and it signals that something is terribly wrong. what really stuns me is the lack of action. 
in fact, just within the last two months, two months ago, this subcommittee heard testimony from facebook's global head of safety, ms. antigone davis. at that time, i showed her the pro-eating disorder content rampant on instagram. i demonstrated, through an experiment, how its algorithms will flood a teen with triggers and toxic messages in just hours after we created an account. this was glorification of being dangerously underweight, tips on skipping meals, images we could not, in good conscience, show in this room. it's been two months, so we've repeated our experiment. on monday, we created another fake account for a teenager and followed a few accounts promoting eating disorders, and again, within an hour, all of our recommendations promoted pro-anorexia and eating disorder content. two months ago, the global head of safety for facebook was put on notice by this subcommittee. nothing has changed. it's all still happening. and in the meantime, more lives have been broken, real lives with real families and futures, and you will hear from them yourself. we all know that if facebook saw a significant threat to its growth or ad revenue, it wouldn't wait two months to take action. so why does it take months for facebook to act when our kids face danger, when time is not on our side? time is not on our side. so no wonder parents are worried. in fact, parents are furious. they don't trust instagram, google, tiktok, all the big tech firms, and by the way, this is not an issue limited to instagram or facebook. parents are asking, what is congress doing to protect our kids? and the resounding bipartisan message from this committee is: legislation is coming. we can't rely on trust anymore, we can't rely on self-policing. it's what parents and our children are demanding. senator blackburn and i are listening to them, as are other members of the committee. we are working together. 
your proposal for an industry body asks parents yet again to trust us, we'll do it ourselves, but self-regulation relies on that trust, which has been squandered. we need to make sure that the responsibility is on big tech to put a safe product on the market. you can't conceal when products are harming kids. so the first imperative is transparency. we need real transparency into these 800-pound gorilla black box algorithms and addictive designs, and disclosure has to include independent, qualified researchers who will then tell the story to the public. we need to update our children's privacy laws. congress should pass the bipartisan children and teens' online privacy protection act, authored by senator markey, who is here today. i'm proud to be working with him on updating and expanding it. parents and children need more power and more effective tools to protect themselves on the platform. and that's why senator blackburn and i are working on a framework, and have made good progress, to enable that protection. there really should be a duty of care. the united kingdom has imposed it. it's part of the law there. why not here? that ought to be part of the framework of legislation we're considering, along with section 230 reform, which you make reference to in your testimony. the days of absolute, broad, unique immunity for big tech are over. and finally, enforcement. state authorities, federal authorities, law enforcement have to be rigorous and strong. so i hope that we will begin the effort of working together, but one way or the other, this committee will move forward. and again, i thank you for being here, i thank all of my colleagues for attending, and i ask for remarks by the ranking member. >> thank you, senator blumenthal, and welcome, everyone. we are appreciative that you are here today, mr. mosseri. we are grateful for your time and for your testimony. 
i do want to thank senator blumenthal and his team for their work. this is the fifth hearing that we have held dealing with the issues around big tech and the invasions of privacy, the lack of data security, the need for section 230 reforms, and looking very directly at the negative and adverse effects these social media platforms are having on our children. i will tell you that today, i am just a little bit frustrated. frustrated because this is the fourth time in the past two years that we have spoken with somebody from meta, as you are now calling yourselves, and i feel like the conversation continues to repeat itself ad nauseam, and when i go back to tennessee, i know the people there, lots of moms and dads and teachers and pediatricians, share this frustration, because they continue to hear from you that change is coming, that things are going to be different, that there are going to be more tools in the toolbox, that kids are going to be safer online, that privacy is going to be protected, and that data is going to be secure. but guess what? nothing changes. nothing. the chairman just talked about what happened with ms. davis when she came in and how we pointed all of this out, specifically what we had found, and yet, yesterday, what happened? the exact same thing. so i hope that you appreciate the frustration that the american public feels. they appreciate what the internet can do for them in bringing the world closer, but the applications that you are pushing forward, the social media, the addictive nature, the way this affects children, there is such a frustration that you turn a blind eye toward taking responsibility and accepting accountability for your platform, how you are structured, and how you use the data on these children. 
yesterday, at 3:00 a.m., which is midnight in silicon valley, you released a list of product updates that you said would raise the standard for protecting teens and supporting parents online. i'm not sure what hours you all keep in california, but where i'm from, the middle of the night is when you drop news that you don't want people to see, and maybe you thought that doing it in this manner would keep members of the subcommittee from seeing it right away and from raising concerns. because while i'm sure you know that we fully share the goal of protecting kids and teens online, what we aren't sure about is how the half-measures you've introduced are going to get us to the point where we need to be to truly protect teens and young adults online. for example, we know that social media is an integral part of teens' daily lives. according to the mayo clinic, 90% of teens between ages 13 and 17 use a social media platform, and 45% say they are online almost constantly. so while telling teens to take a break might seem helpful on the face of things, it's probably not going to get most teenagers to stop doing what they're doing and take a break. educational tools for parents can be helpful, but frankly, i'm more concerned about the things we know kids and teens are hiding from their parents. we know that facebook and instagram have encouraged teens to use secondary accounts and told them to be authentic, and we all remember what it was like to be a teenager. so while parents might gain some insight into what their teens do on their main accounts, what do they do about the accounts they don't even know exist? the ones that instagram is encouraging them to create. and instagram announced in july that it would default all teens onto private accounts when they sign up for the site, yet just yesterday, my team created an account as a 15-year-old girl and it defaulted to public. 
so, while instagram is touting all these safety measures, they aren't even making sure that these safety measures are in effect. for me, this is a case of too little, too late, because now there is bipartisan momentum, both here and in the house, to tackle these problems we are seeing with big tech. as senator blumenthal said, we are working on children's privacy, online privacy, data security, and section 230 reforms. this is the appropriate time to pass a national consumer privacy bill, as well as kid-specific legislation to keep minors safe online. we also need to give serious thought to how companies like facebook and instagram continue to hide behind section 230's liability shield when it comes to content like human trafficking, sex trafficking, drug trafficking, despite congress speaking clearly to this issue when we passed fosta and sesta a few years ago. mr. mosseri, there is a lot of work for us to do to improve the online experience and to protect our children and our grandchildren. i think it's best if we do this together, and i look forward to hearing your ideas and your testimony today. thank you for your appearance. >> thanks, senator blackburn. i'm pleased to introduce adam mosseri. he spent over 11 years at meta and oversees all functions of the instagram app, including engineering, product management, and operations. mr. mosseri, the floor is yours. >> chairman blumenthal, ranking member blackburn, members of the subcommittee, i'm adam mosseri, and i have served as head of instagram since 2018. over the last few months, the subcommittee has held a number of hearings on the safety and well-being of young people online. this is a critically important topic, as you said in your opening statement, and it's something that we think about and work on every day at instagram. the internet has changed how we all communicate. it's changed how we express ourselves, it's changed how we stay connected to people that we care about. it's also changed what it's like to be a teenager. 
teenagers have always spent time with their friends, developed new interests and explored their identities. today, they're doing those things on platforms like instagram, youtube, tiktok, and snapchat. i firmly believe that instagram, that the internet more broadly, can be a positive force in young people's lives. i'm inspired every day by teens on instagram and proud that our platform is a place where they can spend time with the people they care about, where they can start incredible movements, where they can find new interests, where they can even turn a passion into a business. i also know that sometimes, young people can come to instagram dealing with difficult things in their lives. i believe that instagram can help in those critical moments. it's one of the things our research has shown, and to me, this is the most important work that we can do, taking on complex issues like bullying and social comparison and making changes. now, i recognize that many in this room have deep reservations about our company, but i want to assure you we do have the same goal. we all want teens to be safe online. the internet isn't going away, and i believe there is important work we can do together, industry and policy-makers, to raise the standards across the internet to better serve and protect young people. but the reality is that keeping people safe is not just about any one company. an external survey just last month suggested more teens are using tiktok and youtube than instagram. that is an industry-wide challenge and requires industry-wide solutions and industry-wide standards. now, we have a specific proposal. we believe there should be an industry body that will determine the best practices when it comes to what i think are the three most important questions with regards to youth safety -- how to verify age, how to build age-appropriate experiences, and how to build parental controls. 
the body should receive input from civil society, from parents and regulators. the standards need to be high and the protections universal, and i believe companies like ours should have to earn section 230 protections by adhering to those standards. we've been calling for regulation for nearly three years now, and from where i sit, there's no area more important than youth safety. that said, i understand that developing policy takes time, so we're going to continue to push forward on the safety and well-being of young people online. on age verification, we're developing new technologies to address this industry-wide challenge, creating a menu of options to verify people are old enough to use instagram. we're using new technology to find and remove accounts belonging to those under the age of 13. we're also using technology to understand if people are above or below the age of 18 so we can create a more age-appropriate version of instagram for them. for example, adults can no longer message people under the age of 18 who don't follow them, and as of this week, we announced that people can no longer tag or mention teens who don't follow them as well. we also provide tools for parents. parents and guardians know what's best for their teens, and we're launching instagram's first set of parental controls in march of next year, allowing them to see how much time their teens spend on instagram and set time limits. we'll also give teens a new option to notify their parents if they report someone, giving their parents an opportunity to talk about it with them. as a father of three, i care a great deal about creating an online world that is safe for my children and that allows them to benefit from all the amazing things the internet has to offer. as the head of instagram, i recognize the gravity of my role in making this happen, not only for my kids but for generations to come. i'm hopeful that we can work together to reach that goal. thank you. >> thanks, mr. 
mosseri. first round of questions, again, five-minute rounds. just a short while ago, at our last hearing, tiktok, snapchat and youtube sat at that table, and they all committed to making internal research, algorithms and data sets about their effect on children and teens available to independent researchers. will you commit to doing the same? >> senator, we believe it's important to be transparent, both with ranking algorithms and data for research. i will commit to you today that we will provide meaningful access to data so that third-party researchers can design their own studies and make their own conclusions about the effects on the well-being of young people, and on ranking, i'll do all i can to explain how ranking works and find other ways for us to be transparent about algorithms. >> will you support a legal requirement that independent overseers and researchers not only have access to the data sets, but also check the way algorithms are driving content and recommend changes that you will adopt? >> senator, i'd be happy to have my office work with you on that. we do a number of things in this area already, providing information every month on the effectiveness of our algorithms in removing pornographic content from our systems. >> will you commit that the access be provided to an independent, separately funded body, not an industry body, as you've suggested, but that an independent overseer and researcher have that access? >> senator, on the specifics of how the body works, i'm not a legal expert, but yes, i believe there should be industry standards on both data and algorithms. >> because an industry body is not the government regulation that mark zuckerberg or others at facebook and elsewhere have called for. an industry body setting standards is not the same as an independent one. let me ask you, shouldn't children and parents have the right to report dangerous material and use and get a response? get some action? 
because we heard harrowing stories from parents who tried to report and have heard no response. my office made a report and got no response until cnn made the report to press relations. shouldn't there be an obligation that instagram will respond? >> senator, yes, i believe we try to respond to all reports, and if we ever fail to do so, that is a mistake we should correct. >> instagram is addictive. that's the view that has been repeated again and again and again by people who are experts in this field. parents know it, and for teens who see instagram's algorithms encouraging, for example, eating disorders, they find it almost impossible to stop. the uk code restricts instagram's use of addictive design. legally restricts its use of addictive design. shouldn't we have a similar rule in the united states? >> senator, respectfully, i don't believe the research suggests that our products are addictive. research actually shows that on 11 of 12 difficult issues teens face, teens who are struggling said instagram helps more than harms. now, we always care about how people feel about their experiences on our platform, and it's my responsibility as head of instagram to do everything i can to help keep people safe, and we're going to continue to do so. >> we can debate the meaning of the word addictive, but the fact is that teens go to the platform, find it difficult, maybe sometimes impossible, to stop, and part of the reason is that more content is driven to them to keep them on the site, to aggravate the emotions that are so seductive and ultimately addictive. the uk recognizes it. i am proposing that design restriction, the same ought to be done in the united states. let me ask you, will you commit to make the pause on instagram kids permanent? in other words, stop developing the site, an app for children under 13? 
>> senator, the idea of having a version of instagram for 10 to 12-year-olds was trying to solve a problem, the idea being that we know 10 to 12-year-olds are online, they want to use platforms like instagram, and it's difficult for companies like ours to verify age for those that are so young they don't yet have an id. the hope was always, or the plan was always, to make sure no child between 10 and 12 had access to any version of instagram, even one that was designed for them, without their parents' consent. so what i can commit to today is no child between the ages of 10 and 12, should we ever manage to build instagram for 10 to 12-year-olds, will have access to that without their explicit parental consent. >> i have more questions, but my time has expired. thank you for answering my questions, mr. mosseri. senator blackburn. >> thank you, mr. chairman. staying on instagram kids for a moment, i know you were doing research into 8-year-olds and pulling together data on 8-year-olds, and i assume that was in relation to instagram kids. so are you still doing research on children under age 13? >> senator, i don't believe we ever did research with 8-year-olds for instagram kids, and no, neither are we doing that today. we entirely paused the project. >> okay, and then if you were to completely remove that project, who would make that decision? >> senator, it was my decision to pause instagram kids. >> so it would be your decision to just do away with it? >> senator, i'm responsible for instagram, so yes, it would be my decision. >> okay. let's talk about jane doe versus facebook. i assume you can't get into the details of that because the supreme court is still deciding whether to take that case, but the petition, which alleges facebook enabled the trafficking of a minor on its platform, really raises some very serious questions and concerns about what we're seeing and how people are using instagram. so do you prohibit known sex offenders from creating instagram accounts? 
>> senator, human trafficking and any exploitation of children is abhorrent, and we don't allow it on our platforms. >> okay. do you require minors to link their accounts to a parent or guardian's account? >> senator, no. if you are over the age of 13, you can sign up for an instagram account, but we do believe that parental controls are incredibly important, which is why we're launching our first version in march of next year. >> okay. you know, yes, the controls are going to be vitally important, but an industry group is not going to give the controls that are needed, and probably not even an independent group, and that is why we'll do something with federal statute. also, i think it would be interesting to know how many people who have been indicted or convicted of human trafficking, sex trafficking, and drug trafficking were using instagram. could you all provide that number for us? >> senator, i'd be happy to talk to the team and get back to you -- >> that would be excellent. my staff created an instagram account for a 15-year-old girl and it defaulted to public. i mentioned that earlier. isn't the opposite supposed to happen? and have you considered turning off the public option altogether for minor accounts? >> senator, i appreciate the question. i learned of that just this morning. it turns out that, though we default those under the age of 16 to private accounts for the vast majority of accounts, which are created on android and ios, we missed that on the web and will correct it quickly. >> it defaulted to this setting: include your account when recommending similar accounts that people might want to follow. is this a feature that should remain on by default for minors? >> senator, we think it's important that no matter what your age, it's easy for you to find accounts that you're interested in. >> even if you're under 18? 
>> senator, i believe teenagers too have interests, and it should be easy for them to find -- >> teenagers have interests, yes, but what we're trying to address are the adverse and negative effects happening to children because they are on your platform. can adults not labeled as suspicious by you still find, follow, and message minors? >> senator, if your account is private and someone follows you, you have to approve it. so adults can ask to follow you, but you have the decision, or the ability to decide, whether or not they're allowed to. >> okay. in your testimony, you said you removed more than 850,000 accounts because they did not meet your minimum age requirement. these accounts were disabled because the users did not show verification that they were at least 13 years old. so why did you say you didn't want to know when jojo siwa said she had been on instagram since she was 8 years old? is that your general attitude toward kids who are on your platform? >> absolutely not, senator. we do a lot to try to identify those under the age of 13, and to -- >> but at that moment, when you responded, why didn't you use that as a teaching moment? >> senator, i would say it was a missed opportunity. >> indeed, it was a missed opportunity, and it sends the wrong message. it looked as if you were encouraging kids that want to be online stars to get on earlier and build their audience. this is a part of our frustration with you, with instagram, and with these platforms. thank you, mr. chairman. >> thanks, senator blackburn. >> thank you, mr. mosseri. i'm looking at this from the perspective of parents, and i guess i'll talk to parents, since so many of them have told me that they have done everything they can to try to get their kids off your product. 
kids who are addicted at age 10, and they are scared for their kids. they want their kids to do their homework and not get addicted to instagram, and yet we then find out that what your company did was to increase your marketing budget to try to woo more teens, from $64 million in 2018 to $390 million focused on kids this year. and so when i hear you're going to suddenly, with all your technological wizards, develop some kind of new way to check to see if young kids are on there, you could have been spending this money, $390 million, to do that for years. you have the money to do it. i think that we have diametrically opposed goals, those of parents and the goals of your companies. our kids aren't cash cows, and that is exactly what's been going on, because when you look at the marketing budget and you look at what your company has done, it's to try to get more and more of them onboard. and when i look at your company's quotes from one document, you, not you personally, but your company, viewed losing teen users as, quote, an existential threat, whereas parents are viewing their kids' addiction to your product and other products as an existential threat to their families. is it the truth that you've been increasing advertising money to woo more teen kids onto your platform? >> senator, no, i don't believe those statistics are correct. we increased our overall marketing budget between last year and this year, but i think you characterized the majority of it as focused on teens, and that's not true. >> okay. so have you viewed the kids as a feeder way for people to get into your product? have you not done things to get more teenagers interested in your product? are you not worried about losing them to other platforms? you better tell the truth, you're under oath. >> absolutely, senator. senator, we try to make instagram as relevant as possible for people of all ages, including teens. teens do amazing things on instagram every day. 
we also invest, i believe, more than anyone else, in keeping people, including teens, safe. we will spend around $5 billion this year alone and have over 40,000 people working on safety and integrity at the company. >> and do you think three hours a day is an appropriate amount of time for kids to spend on instagram? >> senator -- >> i'm asking this because just when you put out the new rules, that was an option for parents, three hours a day. is that a good use of kids' time? >> senator, i appreciate the question -- >> and it was in your safety tools that you just put out there. the first option given to kids, to parents, was three hours a day. i have them here, and i can put them in the record -- >> sorry, i was -- if i may, senator, i'm a parent, and i can understand that parents have concerns about how much screen time their kids have. i think every parent feels that way. i ultimately think it's a parent -- that a parent knows what's best for their teen, so the appropriate amount of time should be a decision by a parent about the specific teen. if one parent wants to set the limit at 10 minutes, and another parent wants to set the limit at three hours, who am i to say they don't know what's best for their children? >> and do you believe your company has invested enough in making sure that young children are not on the platform? when you know they're not supposed to be on there, and making sure you're registering and using all your technology, not just to increase your profits, but to make sure kids aren't on there? do you think you've done enough? >> senator, two things. one, yes, i believe we've invested more than anyone else, but i also believe it's still a very challenging industry-wide issue. 
i think there are a number of things we can do at the industry level to better verify age. specifically, i believe it would be much more effective to have age verification at the device level: have a parent who gives their 14-year-old a device tell the phone that their child is 14, as opposed to having every app (and there are millions of apps out there) try to verify age on its own. it should happen at the device level. we understand that might not happen, or might take time, and in the meantime we invest heavily in getting more sophisticated in how we identify the age of people under the age of 18. >> and is it true that someone in your company said it was an existential threat if you lost teen users? is that true or not, because we have a document that says that? >> senator, i assume that's true. >> so you understand what we're thinking up here, when it's our job to protect kids and we have parents calling our office and emailing us. one parent described it to me as a water faucet running and overflowing while she's sitting there with a mop, trying to figure out how to use all the tools you give them, tools she can't figure out how to use. so i think at some point the accountability is on you guys. that means everything from the privacy bills, to expanding the child protections online, to competition policy, because maybe if we had actual competition in this country, instead of meta owning you and owning most of the platforms and most of the back-and-forth for kids, maybe we could have another platform developed that would have the privacy protections that you have not been able to develop in terms of keeping teens off your platform who aren't even old enough to be on there. so that's what i think. some food for thought for all of you is the opposition to some of the competition policy, the pro-capitalism ideas, we have over in judiciary, and i'll hand it back to the chair. >> thank you, chairwoman, for your work on the judiciary committee.
i will ask a couple of questions, because we have a vote ongoing, so a number of my colleagues will be returning from the floor. you know, your suggestion for tech companies to earn section 230 protection has a certain appeal to me, since i'm the author of the earn it act along with senator graham. it's also the concept that underlies other proposals that we've made, but that's not government regulation. so the question is, will you support the uk's children's code, which instagram has to obey in the uk? shouldn't kids in the united states have protection as good as the kids in the uk enjoy? >> absolutely, senator. a few quick things. one is, i believe that is the age-appropriate design code, i'm forgetting the last letter of the acronym, and i believe it's something that we support, and we support safety standards for kids everywhere, including here in the u.s. i also, if you'll indulge me for a minute, view my proposal as an industry body that sets standards for safety, with input from civil society, policy makers, and parents, but once the standards are proposed they would be approved by policy makers like yourself. i also believe policy makers, or regulators, should make the decision whether any individual company like mine is adhering to those standards, so not just self-enforcement -- >> and then enforce them? seek lawsuits and damages? >> senator, we believe a strong incentive would be to tie some of the section 230 protections to adherence, and that could be a decision by regulators. >> so would the attorney general of the united states, or the attorney general of a state like connecticut, where i was attorney general, have the power to enforce those standards? >> senator, we believe in enforcement. specifically how to implement that enforcement is something we would like to work with your office on -- >> well, that's a simple yes or no. enforceability has to be part of your proposal. >> i agree, enforceability is incredibly important. without enforcement it's just words.
>> so you think the attorney general of the united states could enforce those standards, which means it would be written in the statute? >> senator, i don't know, i'm not a legal expert on whether the best way to enforce it would be through the attorney general, but in general, i think it should happen at the federal level, and that's something i would be happy to have my team -- >> would you favor private rights of action so individuals harmed could bring an action against meta? >> senator, i believe it's important that companies like ours are held accountable to high standards, but i believe the most important, or most effective, way of doing so is to define industry standards and best practices at the federal level ideally, and to seek enforcement, as you suggest. >> these are really yes or no questions. they're pretty clear, and i know you are knowledgeable about them. so i hope you will answer them more clearly in the answers that you provide in writing. i'm going to yield to senator -- >> yes, and i can do this on the record, but i did want, since you denied this idea that the marketing budget went from $67 million to $390 million and that much of the budget was allocated to wooing teens, that was reported by the new york times from internal documents from your company, so do you still deny that this is the fact? >> senator, occasionally there are reports that are inaccurate. in this case, i believe that article said that the majority of our budget was focused on teens, and i know for a fact that was not the case. >> much of the budget, is that accurate? >> senator, i don't remember off the top of my head -- >> then could you give me the percentage of the budget focused on teens? you must, as a business, be able to break it down that way. i'll ask it in writing. >> i'd be happy to follow up on that. >> okay. thank you. >> senator markey. >> thank you, mr. chairman. you know, thanks to your leadership, mr.
chairman, we have frances haugen's disclosures, which made quite clear that teen girls say when they feel bad about their bodies, instagram makes them feel worse, and that 6% traced their desire to kill themselves to instagram. that is your own research. yet faced with these frightening findings, did facebook back off its efforts to target children? no. just the opposite. facebook pursued plans to launch a version of the platform for even younger users: instagram kids. and that is appalling. i'm glad facebook has heeded my calls and paused these plans, but you have since publicly doubled down on instagram kids and said it is, quote, the right thing to do. your statement makes crystal clear that self-regulation is not an option for parents and children in the united states of america. instagram sees a dollar sign when it sees kids. parents should see a stop sign when they see instagram. do you support my bipartisan legislation with senator blumenthal, senator cassidy, and senator lummis to update the children's online privacy protection act and give 13, 14, and 15-year-olds control over their data? yes or no. >> senator, respectfully, it is important we're clear on what the research actually shows. any loss of life to suicide or any other reason is a tragedy, but that 6% number is inaccurate. it was 1% of teens who traced their thoughts back. anybody feeling worse about themselves is something we take incredibly seriously. you asked if i support the specific act you are proposing. i do strongly support federal regulation, not industry regulation, when it comes to youth safety. that said, if you move the age from 13 to 16, we know that 14 and 15-year-olds also want to be online. they also can lie about their age. you are going to make the challenge of age verification even more difficult. that said, i do believe 13 isn't a magic number. people's needs as they grow up evolve, and we should build age-appropriate experiences based on children's ages.
>> so will you give 13 to 15-year-olds the right to have all of their information that is being gathered online expunged? do you support legislation that would give parents and children the ability to have their records expunged? >> senator, you can already delete your account and all of your data. you should have that right whether you're a teenager or an adult. >> would you support national legislation that mandates that each parent and child be given the ability to expunge it? >> senator -- >> would you support legislation to do that, to make it mandatory? >> senator, i would support legislation that required companies like ours to allow people to delete their data, yes. >> okay. and just to make that a permanent protection that's on the books: would you support legislation to ban targeted ads toward children? >> senator -- >> toward teens and children? >> senator, we believe that anyone should always have an age-appropriate experience on instagram or any social platform, and that extends to ads. we have different rules for ads on instagram and on facebook. we only allow advertisers to target based on age, gender, and location. we don't allow certain types of ads, things like weight loss ads and dating ads for those under the age of 18, or alcohol-related ads for those under the age of 21. >> so do you support legislation that would ban targeting of ads to children, yes or no? >> senator, i believe it is valuable for ads to be relevant, but i do believe some measures need to be taken to keep children safe, which is why i would support something in the direction of what we do, which is to limit the targeting abilities of platforms. >> okay. well, again, yes or no? mandate or no mandate? that is the question. it is exactly why we have to make sure that facebook and instagram don't reserve the right to be able to target these kids. yes or no? >> senator, i'm trying to be specific about what i would support, which is what we built, which is limited targeting options. >> all right.
again, i just keep coming back to the fact that your answers are too vague, making it impossible for us to make these decisions in an informed, legislative way, and to do so in the very near future, which is what i think we have to do. the chilling truth, unfortunately, continues to be that in the absence of regulation, big tech has become a threat to our democracy, our society, and the children in our country. and let's just be clear: facebook, which owns instagram, opposes regulation. your idea of regulation is an industry group creating standards your company follows. that is self-regulation. that is the status quo. and that just won't cut it after the revelations that this subcommittee has made public. we do need laws. we need laws passed by this body. we have to ban targeted ads. we have to make sure that that is the law in our country, and everything that this subcommittee has unveiled continues to make that a necessity, including the testimony that you are delivering today. thank you, mr. chairman. >> thank you, senator markey. senator baldwin? >> so i -- sorry, i missed just a segment as i went over to vote, and this may have come up because it is work i did with senator klobuchar. but i joined senators klobuchar and capito in writing to meta for more information about how instagram is combatting eating disorder content and the harms it brings to users, particularly young people. in response to our question about how the platform is working to remove this content, meta indicated that it uses, and i am quoting now from the letter, quote, a combination of reports from our community, human review, and artificial intelligence, close quote, to find and take down material that violates your terms of service. the response further notes there are more than 15,000 human reviewers on staff. i also met with frances haugen, the former facebook employee whose disclosures spurred this series of hearings.
and when i asked her about what more instagram could do to remove content like this glorifying eating disorders, she argued that more human review is really the key. according to her, many community reports simply are not investigated, and artificial intelligence cannot successfully identify patterns, networks, and distribution points for problematic content. given that meta's platforms, according to your own data, have 3.6 billion monthly active users, 15,000 reviewers would seem to pale in comparison to the amount of content those 3.6 billion monthly active users could post. do you agree more human reviewers would help you be more successful at removing problematic content more quickly, and, if so, will your company commit to strengthening its investment in human review? >> thank you for the question. we have over 40,000 people, engineers and others, who work on safety and integrity. we are investing about $5 billion this year. at a high level, people are better at nuance and technology is often better at scale. i think the most effective thing we can do, not only for eating disorder content, which is tragic and a complicated societal issue, is to invest particularly on the technology side, and in both. we'll continue to do so. >> i know this hearing is focused mostly on youth and harmful content, but in how many countries is instagram available? >> senator, instagram is available in over 70 languages. i, unfortunately, don't know the number of countries off the top of my head, but i would be happy to get back to you with that number. >> okay. so 70 languages. that is the point i was going to get to. of those 40,000, or the 15,000 that was in the letter response, how many are language-specific, monitoring content in each of those 70 languages? >> thank you, senator. actually, i misspoke. we review content in 70 languages.
we have even more languages spoken by people who use instagram, and we're always looking to increase the number we cover. i apologize for the mistake. >> okay. so there are gaps, then, in terms of human review in those areas. >> senator, we're always looking to improve, not only through language coverage but through building more accurate classifiers and through improving the efficiency of our reviewers, because it helps keep people safer. >> i may have some follow-ups with regard to that, that are more specific. but we also all know that there is tremendous social pressure for kids to use social media platforms and services. while it is the industry standard to block or restrict access for those younger than 13, we know that younger teens and tweens are still signing up for social media accounts. i appreciate that your company decided to press pause on its proposed service focused on younger kids earlier this year. i understand from your announcement yesterday that meta is looking to introduce new parental controls and other tools for instagram in the coming months. but i am concerned about what you are doing today to keep kids under 13 off the platform. tell me more about what you are doing to strengthen age verification, and why, given the problems of which you are already well aware, and instagram and meta's experience with other services focused on younger kids like messenger kids, have you waited to institute more parental controls or other steps protecting young users? >> senator, on the parental controls question, i believe as a parent it is going to be more responsible to develop an age-appropriate version of instagram for those under 13, but i paused that project, and i took the exact work that team was building, which was parental controls, because no one was going to have access unless they had a parent's consent. we pivoted that to teens, because 13 isn't a magic number. you also asked how we verify the age of those under 13.
it is difficult, given young people of that age don't have an i.d. in most countries. we build what we call classifiers, which try to predict age, and we ask people to prove their age if it looks like they might be too young. we look at things like cultural norms in certain countries: "sweet 16" is a cultural norm here in the u.s., so do people say that on someone's birthday, and does it line up with the age they said? we get better over time as we get more signals, but i want to be very clear, it is not perfect, which is why i believe there are better industry-wide ways to solve age verification, because it is an industry challenge, not unique to facebook or instagram. >> thank you. chairman blumenthal will be back from his vote in a moment, and we're going to be starting our second round. we have some other members coming for their first round. in the meantime, i want to return to a question i asked you about human traffickers, sex traffickers, drug traffickers, and their use of your site. now, i know that in 2020 you reported over 21 million child sex abuse images on your platform. facebook sent these reports, and i thank you for doing that. that is the right thing to do. i am interested to know whether these child exploitation images and reports, when you make them, include the traffickers? do you include those violations when you make that report with these images? >> senator, i will have to check on that specifically and get back to you, but we do allow you to report an image or photo for violating any of our standards, and you can see how well we are doing at reducing the prevalence of those problems in our reports -- >> then get back to us and let us know if you are also reporting these individuals who are posting and sharing these sexual abuse images of children. >> absolutely, senator. >> thank you, madame chair. thanks for holding today's hearing.
the lack of transparency from big tech companies and the effect these companies have on consumers is concerning to the public, and rightfully so. because of the secrecy with which big tech protects its algorithms, we have little idea how they are used to suppress or amplify content and affect users without their knowledge. tomorrow the committee on which i serve as ranking member will take a closer look at the effects of this technology, and i look forward to hearing from the panel about the details of how algorithms and artificial intelligence are deployed on internet platforms to manipulate users, as well as the bigger picture about what the future might hold for corporations and governments when they know more about us than we know about ourselves. we must find ways to bring more transparency and accountability to the algorithms employed on internet platforms that select the content billions of people see every day. since hearing from the facebook whistle-blower, we now have more insight into instagram and facebook's troubling practices with regard to how they use algorithms. in my view, it's long past time for congress to enact legislation to ensure these companies are held accountable. there is also bipartisan support for shedding more light on the secretive content moderation processes big tech uses and for providing consumers more options when engaging with internet platforms, which is why i introduced two bipartisan bills to address these issues: the pact act and the filter bubble transparency act. i look forward to discussing these issues with you today, mr. mosseri. let me start by asking: does instagram use persuasive technology, meaning technology designed to change people's attitudes and behaviors? >> senator, i've worked on ranking and algorithms for years, and that is not how we work. we use ranking to connect people with the friends they find meaningful and to try to keep people safe.
>> "the wall street journal" revealed that instagram often ignored warnings about the harmful impact the platform has on users, particularly on girls. with that being said, do you believe consumers should be able to use instagram without being manipulated by algorithms designed to keep them hooked on the platform? would you support giving consumers more options when engaging on instagram's platform? for instance, providing a feed not curated by algorithms, or one that is in chronological order? >> senator, i believe it is important that people have control of their experience, so yes, i support giving people the option of a chronological feed. >> on the issue of section 230 reform, senator schatz and i have introduced legislation that would, among other things, require platforms like instagram to provide more due process to users regarding their moderation and censorship practices and to submit public transparency reports about content that has been removed or de-emphasized. do you believe this provision would help build trust with instagram's users? >> senator, we believe in more transparency and accountability, and we believe in more control. that is why we're currently working on a version of a chronological feed we hope to launch next year, and why we provide a number of ways for you to see how content moderation works on the platform. for instance, today you can go to the account center, i believe it is called account status, and see any content of yours that has been taken down. that is why we're working on more ways to give people more control over their experience and to create more transparency about how instagram works. >> do you believe algorithm explanation or transparency are appropriate policy responses? >> i believe very strongly in algorithmic transparency. i think it would be hard to find someone who has tried as much as i have to explain how ranking works. there are a number of ways to be more transparent.
one is to look at the outcomes of algorithms, as we do in our report; in other cases it is more appropriate to explain how they work instead of releasing millions of lines of code. >> could you just elaborate a little bit on, when you talk about giving consumers a feed not fed by an algorithm, or one that is in chronological order, you said that you're going to implement that policy beginning next year. how did you come to that decision, and, more specifically, what are the dates for that implementation? and maybe you could elaborate a little bit on what exactly that might look like. >> absolutely, senator. we have been focused for a few years now on how to give people more control over their experience. one idea we've experimented with publicly is called favorites, where you pick a subset of people you want to show up at the top of your feed. another one we've been working on for months is a chronological version of instagram. i wish i had a specific month to tell you right now, but we are targeting the first quarter of next year. >> we would like to take what you're proposing to do and codify it, and that is what the filter bubble act does. thank you, ma'am. >> senator, you are recognized. >> thank you very much, chair blackburn. i want to thank everyone for calling this important hearing, as well as our chair and ranking member. i'll be holding a hearing addressing dangerous algorithms and assessing the harms of technology in the communications, media, and broadband subcommittee, to discuss solutions to online content that spreads misinformation, threatens the well-being of our children, and promotes extremism. one of the lines of questioning i have today, based on questions that came earlier in this important hearing from other colleagues, concerns data retention and deletion. there was a line of questioning on that just now from one of my colleagues as well.
does instagram have in place practices to abide by the principle of data minimization, especially for sensitive personal information? >> yes, senator. >> can you provide those to the committee? >> i'd be happy to follow up with that. >> that is a yes? >> senator, yes. i'll get the exact details on what we do and follow up with the committee. >> i appreciate that. how long does instagram store data related to what websites users visit and what internal links they click from inside the app? >> i apologize. i do not know that offhand, but i'd be happy to get back to you with the specifics. >> do you know how long instagram stores location information for a user? >> senator, if you post a photo that has a location, it'll show that location on the photo, so that will be stored for as long as the photo is up. in other cases, i assume we have retention policies that are quite short. we'll get back to you with the specifics. it'll depend on the usage of the location data. >> when was the last time instagram reviewed and updated its data deletion and retention policies? >> senator, we operate as one company, so it would be for both instagram and facebook; there wouldn't be a specific retention policy for instagram. i don't know the last time the policies were specifically updated, but i can tell you we're constantly working on privacy, making sure we can do more to empower people to protect their own data, and that we're compliant with the increasing number of privacy regulations around the world. >> would you support federal privacy legislation that enforces data retention limits and data deletion requirements? >> senator, not only would we support that, we believe it is important for there to be privacy regulation here in the u.s., and that is something we've been very public about for years now.
>> when users request to download their data from instagram, are users given all the information that the company holds on them, including assumptions that the company has made about them based on their behavior? >> senator, when you download your data, we give you everything that is associated with you, as far as i know, but i want to make sure i double-check that. there are certain instances where, actually, i can't think of any exceptions, but i'll get back to you. >> the other way i would ask that question, and maybe it requires follow-up: is there any information that instagram does not share with users when they request their data? >> senator, we do our best to share all of your data when you ask to download it, and as we add new features, we add them to what we call download your data to make sure you have all of it. >> instagram has the option for users to request to delete the data that instagram holds, is that correct? >> yes. if you delete your account, we delete your data if you request it. >> is there any data instagram holds on a user after the user deletes their information? >> not that i know of, senator, no. >> is that something you can get back to me on as well? >> absolutely. but i can also assure you that we do all we can to delete all of your data if you ask us to. to do otherwise would be incredibly problematic. >> i appreciate that. there was a question i asked mr. zuckerberg back in 2017 about facebook's collection of data about nonusers, to which he responded that facebook did not collect nonuser information. facebook, about a week later, released a correction; mr. zuckerberg must have been mistaken or had a lapse in how he responded to that particular question. nonetheless, i really want to get to the bottom of that, especially with the rampant collection of data from individuals.
the last question, madame chair, which i will also submit for the record because i am out of time now, concerns the work that has to be done on non-english-language disinformation and misinformation. it is a huge problem, and i think it is important for facebook, instagram, meta (i get confused about which term i should be using here with the rebranding). >> i am comfortable with any term you would prefer. >> but it is important that we're able to get disaggregated data, as we requested in this hearing from a facebook witness, and we still have not received a response. i think it is very important for the committee's wishes, which were expressed in a bipartisan way, to be respected. >> senator, i appreciate that point. since we talked about that the other day, i've started to look into it. there are a number of ways we might be able to break out data around our community standards enforcement report. one possibility is what language the content is in. another possibility is what language the person who sees the content speaks. another possibility is what country. and so we're going to get back to you on what we think the most efficient and responsible thing we can do in this space is, and i'll personally make sure we do that and get back to you promptly. >> thank you, senator. senator lee? >> mr. mosseri, does instagram advocate weight loss or plastic surgery for teenage girls under the age of 18? >> sorry, senator. i apologize. i missed the second word of the question. >> does instagram advocate for, recommend, weight loss or plastic surgery for girls under the age of 18? >> absolutely not. we don't recommend eating disorder content to people of any age. >> very glad to hear that, but i beg to differ here. leading up to this hearing i heard a lot of complaints from people across utah and elsewhere who told me about inappropriate content available through the explore page on instagram, available specifically to children.
so i was encouraged to look into it myself. i had my staff create a fictitious account for a fictitious 13-year-old girl. the explore page yielded fairly benign results at first, when all we had done was create the account, knowing it was a 13-year-old girl's. but instagram also provided that same account, for this fake 13-year-old, with a list of recommended accounts to follow, including multiple celebrities and influencers. so we followed the first account that was recommended by instagram, which happened to be a very famous female celebrity. now, after following that account, we went back to the explore page, and the content quickly changed. you see, right at first, all that came up when we opened the account were some fairly benign hair styling videos. that's not what we saw after we followed this account, the account that was recommended by instagram itself. it expanded into all sorts of stuff, including content full of unrealistic standards for women, including plastic surgery, commentary on women's height, content that could be detrimental to the self-image of any 13-year-old girl. and if you need any evidence of the kinds of harms this can produce, you can look to the report that i issued through my joint economic committee team specifically on this topic last week. so, mr. mosseri, why did following instagram's top recommended account for a 13-year-old girl cause our explore page to go from showing really innocuous things like hair styling videos to content that promotes body dysmorphia, sexualized content for women, and otherwise content unsuitable for a 13-year-old girl? >> senator, eating disorders are very complicated and difficult issues in society. >> and complicated enough without a social media site recommending that content.
>> senator, i've personally spoken to teens in multiple countries around the world who use instagram to get support when suffering from things like eating disorders. we absolutely do not want any content that promotes eating disorders on our platform. we do our best to remove it. i believe, and i'll get back to you with the specifics, that it is roughly 5 in every 10,000 things viewed. my responsibility as the head of instagram is to get that number as close to zero as possible. we believe that every company, snapchat, tiktok, youtube, should be public, like we are, about exactly what the prevalence of that content is on their platforms. >> i get that, but i can only take your word for it here. i understand what you are saying about the overall numbers. that is not how this happened on the account. it was hair styling videos and innocuous stuff one minute, and the next minute, after we followed a famous female celebrity, it changed and went dark fast. it was not 5 in 10,000. it was rampant. the thing that gets me is that what changed was following this female celebrity account, and the female celebrity account was recommended to this 13-year-old girl. why are you recommending somebody follow that account with the understanding that by doing so you are exposing the girl to other sorts of things that are not suitable for any child? >> i appreciate the question, because it is an incredibly important and difficult space. if we recommended something we shouldn't have, i'm accountable for that. i am the head of instagram. you said a second ago you have to take my word for it, and i don't believe you should. our community standards enforcement report, starting this next quarter, the quarter we're in right now, is going to be independently audited by ernst and young, and we are committed to independent audits going forward. >> that is great, and i have independently audited it myself. i'll take your word for the 5 in 10,000 point.
what i am saying is it was decidedly not that on this page for this poor, unsuspecting, albeit fake 13-year-old girl. that is a concern. i'm running out of time. i am also running out of patience with a company that has told us over and over again, we're so concerned about your children, so concerned. we're commissioning a blue ribbon study or doing a review. stuff like this is still happening. meanwhile the tech transparency project, ttp, recently conducted another experiment. it demonstrated how minors can use their instagram accounts to search for prescription and illicit drugs and connect with drug dealers. according to ttp, it only took two clicks to find drug dealers on the platform. why are children's accounts allowed to search for drug content to begin with, and more important, why does that lead them to a drug dealer in two clicks? >> accounts selling drugs or any other drug related content are not allowed on the platform. >> apparently they are. >> respectfully, i don't think you can take one or two examples and indicate that is what happens more broadly. i want to be clear here. >> two clicks. it only took two clicks. >> senator, i'm not familiar with the specific report. i'm more than happy to look into it. i want to be clear, though. i have been talking about the community standards enforcement report a lot. i know it sounds like numbers, and i know behind every one of those numbers is a person experiencing something difficult. i don't want to sound calloused in any way. if there is room to improve, i embrace that. that is why we invest and believe in industry standards and accountability and are calling on the entire industry -- youtube, tiktok, snapchat -- to come together to set industry standards that are approved by regulators like here in the u.s. in order to make the internet safer not only for kids online but for everyone. >> there is a scene in the movie "monty python and the holy grail." there is a big fight. somebody concludes by saying, look. 
let's not bicker and argue about who killed who here. we have to reach the point where we realize some real bad stuff is happening, and you're the new tobacco whether you like it or not. you have to stop selling the tobacco, in quotation marks, to kids. don't let them have it. don't give it to them. >> thanks, senator lee. senator sullivan. >> thank you, mr. chairman, and thank you for this important series of hearings you and the ranking member have been holding. mr. mosseri, have you read the surgeon general's report that he issued yesterday, protecting youth mental health? >> i've started to read it. i haven't quite finished it. from what i've read so far it makes it clear teens in this country are struggling. >> let me get into it a little bit. i agree, very sobering reading. it mentions that in 2021, emergency room visits for suicide attempts by adolescent girls were up 51%. 51%. that's just shocking. and the surgeon general said our obligation to act is not just medical, it's moral. so the way i've read it, it is kind of a witch's brew of two things that are driving so much of these horrendous statistics related to mental health and suicide: the pandemic and the negative impacts of social media. one of the recommendations from the surgeon general is limiting social media usage. do you agree with that? >> senator, i'll answer that question, but first i want to be clear that i don't believe the research shows social media is driving the rise in suicides. >> why do you think the surgeon general of the united states, who just issued a 53-page report on mental health and teen suicides, said that we should limit social media to help get out of this crisis? >> senator, from what i've read of the surgeon general's report so far, it is about a number of different issues, not just suicides, so to make a connection between one problem he talks about and one of the recommendations he makes i think is a bit of a leap. 
>> let me ask my question again. the surgeon general of the united states makes recommendations to address what clearly is a mental health crisis for teenagers, particularly teenage girls, in america. one of his recommendations is to limit social media usage. so do you agree? >> senator, two things. >> answer the question. >> senator, i believe parents should be able to set limits for their children, because i believe a parent knows best, which is why we've developed, or are currently developing, parental controls to let parents not only see how much time their teens spend on instagram but set limits. >> this is a really important subject, because if we have experts saying we need to limit social media usage, which is what the surgeon general just said yesterday, to help address mental health issues, does that go against the business model of instagram or facebook or meta? isn't your business model to get more eyeballs for a longer time on social media? isn't that what you are about? >> senator, if people don't feel good about the time they spend on our platform, or if for any other reason people want to spend less time on a platform, i have to believe it is better for our business over the long run. >> do you make more money when people spend more time on your platform or less? >> on average we make more money when people spend more time on our platform, because we are an advertising business. >> right. so, but you agree with the surgeon general that people should limit their social media usage? my point is they seem to be in direct contradiction with each other. the surgeon general is saying we need to better the health of our young americans, and your basic business model proposition, they seem to be colliding with each other. >> senator, respectfully, i disagree. i think the important thing is to distinguish between the short term and the long term. 
over the long run it has to be better for us as a business for people to feel good about the time they spend on our platform. it has to be better for parents to have not only a meaningful amount of control but to be able to exercise that control over how much time teens spend on our platform. we take a very long view. >> do you have internal data relating to mental health and suicide and usage on your platform or facebook or meta? >> senator, we do research to make instagram better and safer. as a parent, that is exactly what i would want. i believe we lead the industry. >> you're not answering my question. >> i apologize. >> do you have internal data related to the issue of teen suicide and usage of your platform? >> i'm not sure i understand your question specifically, but yes, we do research and talk to third party experts and academics about difficult issues like suicide, which has inspired work like not allowing any content that talks about the methods of suicide, connecting people who seek out that type of content with expert backed resources, and, if someone looks like they are at risk of hurting themselves, proactively reaching out to local emergency services on their behalf, not only here in the u.s. but in a number of countries around the world. >> mr. chairman, just one final question. can i ask very quickly, i looked a little bit into this issue of your announcement of instagram for kids, and just that phrase kind of makes me nervous. it sounds like a gateway drug to more usage. why did you put a pause on that? are you going to permanently pause that? do you worry you're going to get kids hooked on more usage with instagram for kids? >> senator, the idea was trying to solve a problem. we know that 10 to 12-year-olds are online. the average age in this country when you get a cell phone is 10 or 11. we know they want to be on platforms like instagram, and it quite frankly wasn't designed for them. 
the idea was to give parents an option to give their child an age appropriate version of instagram where they could control not only how much time they spend but who they could interact with and what they could see. it was always going to be a parent's decision. now, i personally, as head of instagram, am responsible for instagram, and i decided to pause that project so we could take more time to speak to parents, experts, and policy makers to make sure we get it right. >> thank you, mr. chairman. >> thanks, senator sullivan. senator young? >> thank you, chairman. welcome, mr. mosseri. in ms. haugen's testimony she discussed how instagram generates self-harm and self-hate, especially for vulnerable groups like teenage girls. now, i happen to have three young daughters and two teenage daughters. this issue hits home to me, and it hits home to a lot of americans. you are here today, the head of instagram; you have an opportunity to tell your side of the story. and i do believe if we are not receiving some constructive, actionable, and bold measures to deal with what is popularly believed to be a serious and significant public health issue, congress will act, because our constituents insist that we act. we've held a lot of hearings now. we've done our best to educate ourselves. but frankly, since you run the platform, since you know the technology, since you spend so much time working on these matters, you could really help us. if you don't, we'll feel it imperative to act. that is just the reality of it. with that said, with that foundation laid, do you believe there are any short-term or long-term consequences from body image or other issues on your platform? >> senator, i appreciate the question. the research we have found shows many teens use instagram to get support when suffering from issues like body image issues. 
for 11 of 12 issues for teenage girls, and 12 of 12 issues for teenage boys, issues like body image, anxiety, depression, we found more teens who are struggling found instagram made things better than worse. the one exception was body image for teenage girls, which is why i personally, actually before we even did this research, started the social comparison team, whose research inspired ideas like take a break, which launched this week, and nudges, which we're currently working on, which encourage you to switch topics if you spend too much time on any one topic. i am not here to say there is any one perfect solution but to give an update on what the research says and what we are doing to make instagram safer. >> got it. i am familiar with nudges, somewhat familiar with behavioral science. i know that is something that can be used by the tech community to generate traffic. what is engagement based ranking, mr. mosseri? >> senator, i appreciate the question. i worked on ranking and algorithms for years. the term is often used to describe trying to connect people with content they find interesting. what we do when you open up instagram is look at all of the posts from all of the accounts that you follow and try to show you the ones you find most relevant, and we try to make sure to take out anything that might be against our community guidelines in order to keep people safe. at a high level, that is usually what people refer to when they say engagement based ranking, to the best of my knowledge. >> so is there a behavioral bias for teenage girls to look disproportionately at content that adversely impacts their self-image? >> not that i know of, senator. i do think it is important that teens don't have negative experiences on our platform. i do think it is important that we try to understand the issue that you're raising, which i appreciate, which is negative social comparison or body image social comparison. 
and we're trying to understand how we can best help and support those who might be struggling. >> so there is no negativity bias? just as adults have a negative news bias, which is why so much of the news and current events coverage online can be so caustic and corrosive to our public discourse, because so often people marinate in the negative. there is no similar bias that you have discovered, and i won't hear of one from any of your internal experts, pertaining to negative self-image for teenage girls? >> senator, i appreciate the question. i think it's important to call out that social media allows you to connect with anyone you're interested in. and in a world where the definition of beauty here in the united states used to be very limited and very focused on a very unrealistic definition of beauty, social media platforms like instagram have helped important movements like body positivity to flourish. so that if you are a teenage girl of color, or if you are a plus sized teenage girl, you can see models of color and plus sized models. it has helped diversify the definitions of beauty, and that is something that we think is incredibly important. i don't know of any specific bias, to answer your question very directly, but i want to call out that we help people reach a more diverse set of not only definitions of beauty but points of view and perspectives. >> do you have behavioral scientists who work internal to instagram? >> senator, we have data scientists who try to understand how people use the platform in order to make instagram both better and safer. >> and they would inform you, i presume, but i want to get you on the record, they would be informing you if they ever discovered a negativity bias in the research or in the behaviors of your user community as it relates to teenage girls and self-harm or self-hate, right? 
>> senator, i expect my data scientists, as i expect my researchers, to keep me abreast of any important developments with regard not only to safety but to instagram and the industry more broadly. >> thanks, senator young. >> thank you. >> senator? >> thank you, mr. chairman. companies like instagram are often designing technology to maximize the collection of our data and subsequently sell visibility into their users' private lives and interests. that's why, when a company called signal bought advertisements designed to show us the information it collects and sells about us, those ads were banned by instagram's parent company. it's the black box of highly secretive algorithmic systems that companies like instagram deploy that operates largely undetected by users and allows them to operate free of meaningful scrutiny. this is not an open source system. sunlight disinfects, and congress must not scroll past this critical moment without properly addressing the harms young people are encountering on these platforms. mr. mosseri, thank you for being here. in your testimony, you stated that instagram has limited advertisers' options for serving ads to people under 18 to age, gender, and location. but your testimony neglects to mention any similar prohibition on instagram's own machine learning ad delivery system. does instagram's machine learning ad delivery system target ads to children using factors other than age, gender, and location? >> senator, i appreciate the question. there is one ads delivery system for both instagram and facebook. we only allow advertisers to target those under 18 based on age, gender, and location. and overall, the system also uses teens' activity within the app to make sure that ads are relevant to them. >> okay. so you don't limit yourselves. you hold yourselves to a lower standard than your advertisers? >> senator, we do limit ourselves in that we don't use any off platform data. 
but we do also use activity in the apps to make sure ads are relevant. for instance, if i'm not interested in a specific band, because it is not in my part of the country or i don't like that music, it doesn't make sense to see that type of ad. >> i ask unanimous consent to enter into the record a report from fairplay that shows meta is still using an a.i. delivery system to target ads at children. >> without objection. >> thank you, mr. chairman. your head of safety and well-being recently stated that any one piece of content is unlikely to make you feel good or bad or negative about yourself; it is really when you're viewing, say, 20 minutes of that content, or multiple pieces of that content in rapid succession, that it may have a negative impact on how you feel. so to me, her statement says that your company knows that time spent on the platform increases the likelihood of real world negative impacts. so how do you square a business model that prioritizes user time and engagement with knowing there is a direct correlation between time and harm? >> senator, respectfully, using our platform more will increase any effect, whether positive or negative. we try to connect people with their friends, we try to help them explore their interests. we even try to help them start new businesses. but if people don't feel good about the time they spend on our platform, that is something that i personally take seriously, and it is why we build things like daily limits and why we're currently working on parental controls that are focused on time. >> does instagram make money from ads that are placed next to highly viewed and viral but also harmful content that violates the rules of your platform? >> senator, we don't allow content that violates our rules on the platform. we release publicly how effective we are at removing that content, and we receive revenue based on ads shown. >> is the money returned to the advertisers then? >> senator, not that i know of, no. 
>> is it your position that instagram will always comply with the laws of the country in which it operates? >> senator, we're going to do our best to always comply with the law. >> okay. will you commit to releasing those guidelines to members of this committee? >> senator, apologies. which guidelines? >> it would be guidelines related to how you comply. so let me give you an example. if an authoritarian regime submitted a lawful request for your platform to censor political dissidents, would you comply? and if not, what are your guidelines on something like that? let me give you some real world examples. say a government, let's say uganda, criminalizes homosexuality. if that government submitted a lawful request for data on users who are members of the lgbt community, would instagram comply? >> senator, we try to use our best judgment in order to keep people safe. i also believe that transparency on this specific issue is incredibly important. i will double check and get back to your office, but i believe we are also public, at a high level, about the incoming requests we get. >> so one of the reasons that i'm concerned about the fact that this is sort of secretive data collection based on a nonopen source algorithm is because it gives you that veil of secrecy. people don't know what information is being collected about them. yet a hostile government may be able to identify people like women being educated against the law, or someone who is homosexual in a country where homosexuality is punishable by death, and there are governments that do this. if you turn over that data, and it is collected by artificial intelligence, that artificial intelligence is not going to discern that it is putting a human being in danger. the platform's artificial intelligence, when not guided and not open sourced, can be a real problem. mr. chairman, thank you. i yield back. >> thanks, senator. senator cantwell. >> thank you, chairman blumenthal. 
thank you to you and senator blackburn for this fabulous hearing. i know we've had great attendance from members. i am so impressed by the questions that all of our colleagues have been asking, so i hope it will lead us to some good legislative solutions, and i appreciate mr. mosseri being here today. obviously a big, new day on the job. so i wanted to ask you specifically about privacy violations. do you believe that claims of privacy violations by kids should go to arbitration? that is, do you believe that when people are signing up for your service as 14-year-olds they understand that they are giving away their rights when they sign up for your service? >> senator, respectfully, i dispute the characterization that anyone gives away rights when they sign up for our service. i think privacy is incredibly important, and we do the best we can and invest a lot of resources in making sure we respect people's privacy. >> so if a child has suffered harm of this magnitude and they try to get those issues addressed, do you think they should be in arbitration? >> senator, i am not sure i understand the exact hypothetical, but i believe if a child is at risk specifically of hurting themselves -- >> no, no. one of my constituents was working with a mother whose 14-year-old daughter was groomed by adults on instagram and ultimately was lured into sex trafficking, taken across state lines for prostitution. under instagram's terms of service, instagram can argue a child's only recourse against instagram's failure to provide a safe environment would be in arbitration, with no open court, no discovery, no judge, no jury, no appeal. i'm asking you what you think about when real harm is created against children, and what should be the process? >> senator, that story is terrifying. we don't allow child or human trafficking of any kind. we try to be as public as we can about how well we do on difficult problems like that one. we believe that there should be industry standards. 
there should be industry wide accountability, and the best way to do that is federal legislation, which is specifically what i'm proposing today. >> what we're trying to get at is that when users, in this case particularly young children, are signing up for a service, what they are agreeing to with a check mark is binding arbitration with the company. so if there is a dispute about something that happened, and, yes, we've been considering privacy legislation, our colleagues here have been trying to protect young children in other ways and certainly found very egregious situations of late, the only recourse is binding arbitration with you as a company. we are saying that when there is something as egregious as a policy violation they should have other recourse. i am simply asking you if you believe they should have other recourse. >> senator, i appreciate the question, and i believe the most responsible approach in this area more broadly, not only for privacy but safety, is federal regulation here in the u.s. >> do you think everybody has to go through you or one of the other software companies to get redress? do you think the only redress consumers should have is through binding arbitration with the company? >> senator, i believe whatever the law states should apply to all companies like ours equally. >> i am asking what you think as a company. >> senator, i'm not familiar with the specifics. >> okay. i'll ask you for the record, and that way you'll get a little more time and you can consider it. these are serious issues about the fact that serious things are happening to children and the only redress they have is to go into binding arbitration with you. i think that while, hey, i don't like your service, or something happened, or you overcharged me, might be great for binding arbitration, serious harm to people i don't think should be sent to binding arbitration. 
back to the policy question, my colleagues have done a really good job asking about this. obviously people have been talking about the ability to make money off specific content, whether in what facebook described as the potential reach metric. i think we've been talking about that, right, people have been discussing that. are you aware of any inaccuracies in the potential reach metric? >> senator, i am not aware of specific inaccuracies, but we do the best we can to make sure advertisers understand reach before they spend, using tools like that. >> do you think there is hate speech not taken down that is included in that? would you agree with informing advertisers and the public how much hate speech there is and whether or not it's taken down? >> senator, respectfully, i believe the potential reach tool allows you to get a sense of how many people you will reach, which is different than how much hate speech content there is. specifically, in our community standards enforcement report you can see, i believe, that 3 in 10,000 pieces of content seen qualify as hate speech by our definition. >> do you think that advertising can be inaccurate or misleading based on certain metrics? >> senator, as an advertising business i believe it is in our interest to be as accurate as possible. i think whenever we make mistakes we know that undermines our credibility; advertising businesses are based on trust. >> right. they are also based on being truthful to your advertisers. what i am getting at is that ms. haugen came to testify and is saying facebook purposely made a decision to keep up metrics that drove more traffic even though the company knew it included things that were related to hate speech, that that certainly motivated more traffic, and, when presented with the information, the company and various members of the company decided to continue using that metric. 
and so what i'm saying is there could be instances where instagram has also continued to have advertisers not fully aware. so do you believe advertisers should be aware if there was any content that was related to hate speech, that they should be aware of what content they're being served with? >> senator, i believe advertisers should have access to data about how much hate speech is on the platform, just like everyone should have access to that kind of data. i'm not familiar with what you are specifically referencing with regard to her testimony, but it doesn't line up with any of my experience during my 13 years here at the company. that we would intentionally mislead advertisers would be a gross violation of trust and would inevitably come out and undermine our credibility. >> so you don't think there are any deceptive practices with advertisers that you are involved in at instagram? >> senator, not only do i not believe that, i think that would be -- >> so you think advertisers know everything about your algorithm and what it is attached to and giving them page views and information? >> senator, i believe deeply in transparency. i have spent an immense amount of time over the years not only trying to be transparent about how our algorithms work but also investing in making sure we are transparent about how much problematic content is on the platform. i believe you can see that in our community standards enforcement report, and i believe other companies like snapchat, tiktok, and youtube should do the same. >> my time is way over, mr. chairman, but i'll ask you questions on the record on this as well. because the point is, if companies are involved in deceptive practices with advertisers, and they haven't told them how they are artificially increasing their traffic, and it is related to something the advertisers aren't aware of, that is a deceptive practice. thank you, mr. chairman. >> thank you, senator cantwell. 
thanks for your excellent work on this issue and your help and support in these hearings. senator cruz. >> thank you for being here and testifying before the committee, and thank you for being here in person. as you are aware, i and many members of the committee have had significant concerns about instagram's practices and facebook's practices and big tech more broadly. in september 2021, "the wall street journal" published a series of investigative articles titled the facebook files. as you know, "the wall street journal" reported that researchers inside of instagram found 32% of teen girls using the product felt that instagram made them feel bad about their bodies. "the wall street journal" further reported that 13% of british users and 6% of american users traced their desire to kill themselves to instagram. those are deeply troubling conclusions. are you familiar with the research that was cited by "the wall street journal"? >> senator, yes. but if we're going to have a conversation about the research, i think we need to be clear about what it actually says. it actually showed that 1 out of 3 girls who suffer from body image issues found instagram makes things worse. that came from a slide with 23 other statistics where more teens found that instagram -- that doesn't mean it is not serious, and any one life lost to suicide is an immense tragedy, but on suicide it was 1% who traced their thoughts back to instagram, and i think it is important we are clear about what the research says. >> i am glad to see we found some common ground. you just said twice there that it is important to be clear about what the research said. i agree. at prior hearings i asked your colleagues repeatedly for copies of the research, and to my knowledge you have refused to produce it. will you commit now to produce the research to this committee so we can, as you just said, be clear about what the research says? >> senator, i really appreciate this question. 
it is incredibly important we are clear about research. i can commit personally to doing all i can to release the data behind the research. two challenges i need to let you know of are, number one, privacy in certain cases, and, two, in many cases we do not have the data anymore due to our data retention policies. but given that, i can also commit to you that we will provide meaningful access to data to third party researchers outside the company so they can draw their own conclusions and design their own studies to understand the effect of not only instagram but facebook on well-being, and other companies should do the same. >> in what format was this research communicated to you? you just referenced a slide that had bullet points. i would love to see that slide. you criticized this committee for not having the full contents of the research when you haven't given us the research. in what form did this research come to you? was it a powerpoint presentation? how was it memorialized and presented to you? >> senator, there are two forms the research comes in. the most important is the data, because it allows any researcher, i believe, outside the company to have access to draw their own conclusions. and on the specifics, we've made that slide public. but i believe the most important thing over time is that we provide regular access to meaningful data about social media usage across the entire industry to academics and experts to design their own studies and draw their own conclusions. that will take time, because many studies can often take years, but it is something i am personally very committed to. >> when you saw your own study finding a significant percentage of girls reported that instagram caused them to think about killing themselves, were you concerned by that finding? >> senator, just to clarify, i believe the study said they traced their thoughts, but, yes. 
i am concerned about anybody who feels worse about themselves. talking about any one individual, we're talking about people, not numbers. >> let's talk about numbers for a second. did instagram do anything to quantify how many teenage girls have killed themselves because of your product? >> senator, we do research and talk to third party experts about not only suicide but self-harm on a regular basis, and that research has inspired much of our work to make sure we have not only very clear policies -- >> did you quantify it or not? do you do research to estimate, to count how many teenage girls have taken their lives because of your product? >> senator, we do research to understand problems and identify solutions. in the case of suicide, to make sure we take down suicide related content from the platform, to connect those seeking out that type of content with expert backed resources, and to connect anyone who looks like they are at risk of hurting themselves with local emergency services. >> how did you change your policies as a result of this research to protect young girls? >> senator, i appreciate the question. we use research to not only change our policies but change our product on a regular basis. with regard to bullying, the research has inspired things like restrict, which allows you to protect yourself from someone harassing you without them knowing, and limits, because we learned that teens struggle during moments of transition. with suicide and self-injury, we learned that we need to be incredibly careful, because often teens suffering from these really difficult issues use instagram to find support, and we need to make sure they can find that support and talk about recovery while making sure we don't -- >> so big tech loves to use grand, eloquent phrases about bringing people together. 
but the simple reality, and why so many americans distrust big tech, is that you make money the more people are on your product and the more they are engaged. even if they're viewing content that is harmful to them, every eyeball means you're making money. and when your colleagues have been asked the same question, as a result of this research what policies did you change, this committee has been unable to get a straight answer about what is different. and i think the reason for that is that if you changed policies to reduce the eyeballs, you'd make less money. why is that inference not correct? >> senator, if people don't feel safe on our platform, if people don't feel good about the time they spend on our platform, over time they'll use other services. competition has never been fiercer, particularly here in the states, with youtube and tiktok and snapchat. so i have to believe that over the long run it is not only incredibly important that we keep people safe but that we make sure people feel good about the time they spend on our platform. >> so my time has expired, but i want to make sure i understand the commitment you've made to this committee. as i understand it, you have committed to providing this committee with the raw data from the research you did on users of your product, in particular on body image issues and tendencies toward suicide, and, also, with the power point presentations that memorialize that raw data. is that correct, that you will provide them to this committee? >> senator, what i am committing to you is to do everything i can do to release -- >> you can do that, right? is there a reason you can't do what i just said? >> senator, the challenge on the data, which i think is the most important thing, is that in many cases we no longer have it. >> how about power point presentations? will you give us the power point presentations? >> i think the most responsible thing to do is provide access to data to researchers outside the company. >> we'd like both. 
is there a reason you are hiding the power point presentations? you said you wanted maximum transparency. maximum transparency would be to show us the presentations that were prepared for you. presumably you had some reason to trust them, because they were prepared for your consumption. >> senator, i believe you already have the presentations, which is why we're focused on the data. we think that any researcher should be able to draw their own conclusions based on the raw data. that is the most important and most valuable part of the process. unfortunately, much of that data we no longer have due to data retention policies, which is why i am very committed to making sure we can allow access to meaningful engagement data to researchers outside the company in the future, to focus specifically on the effects of social media on well-being, and i call on the rest of the industry to do the same. >> your commitment is to provide all the data you have and the power point presentations summarizing it. >> senator, my commitment is to provide meaningful access based on what researchers request, because i think that is the most responsible thing over the long run. >> okay. we are requesting right now. >> senator, to do a study on the effects of well-being, sorry, the effects of social media on well-being, you'd have to design that study. so i would like to talk to the researchers. we've worked with pew, worked with harvard, worked with other organizations around the world. i would like to understand what specific data they would like access to. i can't just provide all the data we have. it's an untenable thing to do physically. >> the data that was the basis for the study quoted in "the wall street journal" report. that is the data i'm asking about. and the presentations that summarized the conclusions of that. >> senator, i would love to provide that data. 
i am personally working on trying to find out if there is any way we can provide it in a privacy safe way, and where we still have it. i think that is important. i've been working on that. i don't want to over promise and under deliver, which is why i am more focused on how we make sure researchers have access to data going forward. >> thank you, senator cruz. i just want to say, mr. mosseri, because you and i have discussed this point, the data sets are not enough. this answer is completely unsatisfactory. we want the studies. we want the research. we want the surveys. the whistle-blower has disclosed a lot of it. the answer you've given, very respectfully, simply won't cut it. it is in your files. if it was destroyed, we want to know about that, too. but information is absolutely the coin of the realm when we go about devising legislation. i must say there is a kind of disconnect here. senator sullivan asked you about content relating to suicide or self-harm. i think i'm quoting almost directly. you said there isn't any. well, we have a teen account with all the protections on, the filters, and we searched, quote, slit wrists. the results i don't feel i can describe in this hearing room, they are so graphic. that is within the past couple of days. i described to you an account that, in effect, looked at eating disorders and attracted the same deluge of self-harm and anorexia coaches. so i just feel that there is a real lack of connection to the reality of what's there in the testimony you're giving today, which makes it hard for us to have you as a partner, and maybe we need to have some kind of compulsory process. you know, senator cruz and i don't always agree, but on this point, on the need for information, i think you've heard here the bipartisan call for a reality check and for action. and the fact that this content continues to exist on the site despite your denials i think really is hard to accept. instagram has suggested as a solution here nudging teens. 
i don't know whether your kids have reached the teenage years yet. it takes more than a nudge to move teens. i am well beyond the teenage years with my four children. if you said to a teen on instagram fixated on eating disorders, why don't you try snorkeling in the bahamas, that nudge just won't work. instagram has real power here. it drives teens in a certain direction and then makes it very difficult for the teen, once in a dark place, to find light again and get out of it. so my question to you is, don't we need enforceable standards set by an independent authority, not an industry body, objective, independent researchers with full access to your algorithms? will you commit to support full disclosure of your algorithms and a commitment to an independent authority? >> senator, directionally we are very aligned and agree on the importance of transparency, not only with how ranking works. >> well, then you would make available all the studies like the ones frances haugen presented when she was here. >> i am confident we are more transparent than any other tech company in the industry. >> that is a pretty low bar, mr. mosseri. that's like being the best in the gutter, forgive me, in terms of transparency, because they committed to make available their algorithms but only after we pressed them to do it, and we are still awaiting full compliance. >> senator, we've been publishing research for years. we'll publish over a hundred things this year alone. i believe there is an immense amount of data in our quarterly reports. we'll start having them audited by ernst & young starting this quarter. i believe our ads library provides more transparency in advertising than any other advertising business in any industry, tech or otherwise. yes, i believe there is more to do. yes, i believe in federal legislation. i believe policy makers should be actively involved in that. i'm looking forward to having our teams work with yours on shaping exactly what that looks like. 
>> will you support the pact act? >> directionally, we believe strongly in accountability and transparency. i am unfortunately not familiar with every provision in the act but will have my team work with you on that. >> if you believe what you testified here, you would say, yes, i support the act. >> respectfully, i don't think it would be appropriate for me to commit to something i haven't read in full, but i really do want our team to work with yours. as i've said, we're calling for industry wide regulation. we believe it is incredibly important. it is why we're having this hearing today. it is why i appreciate these questions even though they are difficult at times, because we believe there is nothing more important than keeping teens safe online. i believe we need to come together. >> do you support prohibitions, bans, on advertising and marketing to teens for products that are illegal for them to consume? >> senator, i believe we already prohibit that. we don't allow ads for tobacco at any age, we don't allow ads for things like gambling for under 18s, and we don't allow ads for alcohol for under 21s. >> would you support legally enforceable prohibitions where you can be held liable? >> senator, yes. we support industry standards and accountability, and i believe part of the industry standards, as i call out in my testimony, includes age appropriate design, which will inevitably include content rules about what is appropriate. >> if you host child sexual abuse material, should the victims be able to sue you? >> senator, child exploitation is an incredibly serious issue. i believe an earlier senator mentioned how we collaborate on this. we're going to continue to invest more than anyone else in this space, and i believe that federal regulation is the best form of accountability with enforcement, to your point before. >> i'm going to turn to senator blackburn. 
i have a few more questions if we have time to get there. >> yes. on the question about -- i wanted to know if you reported traffickers. i think that is something important to do. you need to come back to us on that. also, you mentioned referring children to local authorities. you need to let us know how many we are talking about. is it in the hundreds? thousands? give us the numbers so we have the data. also, i'd like to know how long you hold the data from your research. you've mentioned you can't give us the data, and then you say, well, you may not have some of this data. we need to know how long you're holding this data on minors, those children you're data mining. i was glad to hear you admit you all are a big advertising business. i thought that was helpful to our discussion. you mentioned in response to senator lee's question that you hoped you did not sound calloused. we were talking about children that take their lives or that have lifelong problems because of what they have encountered on instagram. sir, i have to tell you, you did sound calloused, because every single life matters. every life matters. and this is why we need this research. it is also interesting that you basically give teen girls no recourse if they get into a dark spot using instagram. but we've got a lot of parents that come to us. this is why we're doing these hearings. they are concerned about how their children are going to be affected for the rest of their lives. i asked you yesterday, when we talked, if you ever talk to these parents whose children have taken their lives or have ended up having to have mental health services because of what they have encountered, and you said yes, you do. so i want to give you one minute to speak to parents who are struggling. their children have attempted suicide, or maybe some of them have taken their lives. so take the next minute and speak directly to these parents. 
as i told you yesterday, i've talked to a lot of parents, and they never heard one word from instagram or facebook or meta, and they are struggling with this. senator cantwell brought this up to you. so, sir, the next 60 seconds the floor is yours. speak. >> you can have longer than a minute if you like. >> speak to these parents, because we're talking to people that have never had any response from instagram, and you have broken these children's lives and you have broken these parents' hearts. the floor is yours. have at it. >> thank you, senator. senator, i am a father of three. to any parent who has lost a child, or even had a child hurt themselves, i can't begin to imagine what that would be like for one of my three boys. as the head of instagram, it is my responsibility to do all i can to keep people safe. i have been committed to that for years and i am going to continue to do so. whether or not we invest more than every other company doesn't really matter to any individual; if any individual harms themselves or has a negative experience on our platform, that is something i take incredibly seriously. i know i've talked a lot about parental controls. as i have said, i really truly believe a parent, or a guardian, knows what is best for their child. i also know a lot of parents are busy. i have three kids and have a lot of support, and i can't imagine what it would be like to have four kids or three kids and be a single parent working two jobs. i don't want to rely on parental controls. i think it is incredibly important that the experience is safe and appropriate for your age no matter what it is, 13, 15, 17. but if you have the time and the interest, i also think as a parent you have the right to be able to understand what your kids are doing online and should have control to shape that experience into what is best for them. if you don't have time, that is okay, too. 
it is my responsibility to do whatever i can to help keep not only young people safe on our platform but anybody who uses our platform. >> mr. mosseri, we are telling you children have inflicted self-harm. they are getting information that is destroying their young lives. we are asking you to have some empathy and take some responsibility, and it seems as if you just can't get on that path. so we are going to continue to work on this issue. i wish your response had been a more empathetic one. >> thanks, senator blackburn. i have one more question, and i think we'll make the 5:00 deadline. we want to be respectful of your time. i understand it is important to have your internal discussions and debate, as any company does, but these studies and research are really important for parents to make decisions. and i'm reminded of some work i did when i was state attorney general in connecticut. we were one of the first states, with the help of a company in connecticut named lego, to require warnings about small parts on toys, which i urged the legislature to do as state attorney general. then i fought the industry when it challenged that labeling, and we won, against challenges based on the commerce clause and other constitutional claims. the supreme court denied certiorari, and then the industry decided it wanted a federal standard because it didn't want to deal with state by state requirements. the point is that the law required companies to disclose risks. it encouraged them to compete over values that were positive and that promoted safety. that is the kind of competition we need in your industry, as senator klobuchar indicated earlier. we've been working to promote antitrust, so maybe there will be some competition on safety. but disclosure is effective, and sunlight is so important. my question to you is, would instagram support legal requirements on social media platforms to provide public disclosures about the risk of harm in content? 
>> senator, i would support federal legislation around the transparency of data, or on the access to data for researchers, and around the prevalence of content problems on the platform. i think all of those are important ways that parents, or anyone really, can get a sense for what a platform is doing and what its effects on people are. i believe deeply that transparency is important, which is why i am confident we'll continue to lead the industry in being incredibly responsible about what happens on our platform. >> you have said repeatedly you are in favor of one goal or another directionally. and i find that term really incomplete when it isn't accompanied by specific commitments, a yes or a no. we're going to move forward on this committee, directionally, with specifics. the kinds of baby steps you've suggested so far, very respectfully, are underwhelming. a nudge. a break. that ain't going to save kids from the addictive effects, and there is no question there are addictive effects of your platform. i think you will sense on this committee a pretty strong determination to do something well beyond what you've indicated you have in mind, and that's the reason i think self-policing based on trust is no longer a viable solution. so we thank you for being here today. we'll continue the effort to develop legislation, many of us, senator markey, senator thune, senator klobuchar, myself, working with senator blackburn, who has been a great partner in this effort. this hearing record will remain open for two weeks if you feel you want to supplement any of your answers. my colleagues who would like to submit questions for the record should do so by december 22nd, and we ask that your responses be returned to the committee as quickly as possible, in no more than two weeks after receipt. that concludes today's hearing. thank you very much for being here. i think we are very much on time. thank you. >> thank you, chairman blumenthal, and i appreciate your time. >> thank you. >> good to see you. 
>> take care. yeah. i'll see you on twitter. >> hello. how are you? 