Transcripts For CSPAN3 Instagram CEO Testifies On Kids Online 20240709




>> the response is welcome. i want to thank you and your team for your cooperation. i want to thank the ranking member for being a partner in this work, as well as our chairwoman and ranking member for their support. thank you to all the members of our committee for being so engaged on this topic. but understand, i will be strict on the time limit. i will be ruthless, or as ruthless as any senator can be with his colleagues. in this series of hearings, we have heard powerful and compelling evidence about the dangers to children's health, well-being and future. our nation is in the midst of a teen mental health crisis. social media did not create it, but it fed the fire. if anyone has any doubts on the effect, the surgeon general yesterday issued a powerful advisory about the implications of social media, gaming and other technologies on teen mental health. that is part of the reason we are here. the hearings have shown that social media, big tech in particular, actually fans those flames with sophisticated algorithms that can exploit children's insecurities and anxiety. our mission is to do something about it. we are here to do more than take steps. some of the big companies have said to trust us. that is seemingly what they are saying. the trust is gone. what we need now is independent researchers, objective overseers not chosen by big tech, and strong, vigorous enforcement of standards to stop the disruptive, toxic content that takes kids down rabbit holes to dark places. before this hearing, instagram announced a set of proposals, oversight tools that could have been announced years ago. they were not. these changes fall short of what we need. many of them are still in testing, months away. rollouts will be done at some point in the future. we do not know if that is the plan. unfortunately, these changes leave parents and kids with no transparency into the black box algorithms. 
600 pound gorillas in those black boxes that drop disruptive and addictive content on children and teens with no warning or notice to parents when their teenagers are spiraling into eating disorders and causing themselves harm. there is no real ability to ensure that the safeguards will work. i am troubled by the lack of answers. it looks more like a public relations tactic brought on by our hearing, just as these changes seem to be brought on by these proceedings, announced hours before your testimony. we need serious review of those changes. facebook's own researchers have been warning management, including yourself, for years about instagram's harmful impact on teens' mental health and well-being. they waited a decade to show that -- the testimony we have heard includes some horrifying stories of teens whose lives have been changed forever. one parent spoke to me today whose daughter had severe anxiety in high school because of constant pressure from instagram. it became so intense, following her at school, everywhere she went, even into her bedroom in the evening. she attempted suicide. fortunately, her parents stepped in and found a recovery program. the experience continues to harm her and her family. facebook researchers call this fall into a dark rabbit hole a perfect storm. that is the quote, "perfect storm." it exacerbates downward spirals. they know about the harm; they have done the research and studies and know the consequences of the algorithms for those who struggle with addiction and depression on instagram. that data has been hidden, like the algorithms themselves. just yesterday, the surgeon general's report provided powerful documentation on how social media can fan the flames of the mental health crisis among teens, and it signals something terribly wrong. what stuns me is the lack of action, not just within the last month. i showed -- that it was rampant on instagram, how the algorithms will flood a teen just hours after we created an account. 
this was glorification of being dangerously underweight, images we could not in good conscience show in this moment. it has been two months. we have repeated our experiment. on monday, we created an account for a teenager and followed accounts promoting eating disorders. all of our recommendations promoted pro-anorexia and eating disorder content. two months ago, the global head of public safety for facebook was put on notice by this committee. since then, nothing has changed. it is still happening. we all know that if facebook saw a threat to their -- or ad revenue, they would not take two months to take action. time is not on our side. no wonder parents are worried. they do not trust instagram, google, tiktok, big tech. this is not an issue limited to instagram or facebook. parents are asking, what is congress doing to protect our kids? the bipartisan message from this committee is, legislation is coming. we cannot rely on trust anymore. we cannot rely on self-policing. it is what parents and our children are demanding. we are listening to them. we are working together to make a proposal. we will do it ourselves. self-regulation [indiscernible] we need to make sure the responsibility is on big tech to put a safe product on the market. you cannot be allowed to conceal when products are harmful to kids. the first imperative is transparency. we need real transparency into these 800-pound gorillas, these black box algorithms. we need to update our children's privacy laws. congress should pass the bipartisan children and teens' online privacy protection act today. i am proud to work on updating and expanding it. children need more power and tools to protect themselves on the platform. that is why senator blackburn and i are working on a framework to enable that protection. the united kingdom has imposed it; it is part of a law there. why not here? that is part of the framework we are considering. 
you make a reference in your testimony: the days of broad, unique immunity for big tech are over. enforcement by several authorities, by law enforcement, has been rigorous and strong. i hope that we will begin the effort of working together, but one way or the other, this committee will move forward. i thank you for being here, and i thank my colleagues for attending. >> thank you, senator blumenthal. welcome to everyone. we are appreciative that you are here today. we are grateful for your time. i do want to thank senator blumenthal and his team for the hearings we have held on big tech and the invasion of privacy, looking very directly at the social media platforms and the negative and adverse effects they are having on children. i will tell you that today, i am a little bit frustrated, because this is now the fourth time in the past two years that we have spoken with someone from meta -- as you are now calling yourselves -- and i feel like the conversation continues to repeat itself. when i go back home to tennessee, people there -- moms, dads, teachers, pediatricians -- basic -- they share in this frustration because they continue to hear from you, the changes are coming, things will be different, there will be more tools in the toolbox. the kids will be safer online. their privacy is going to be protected. nothing changes. nothing. the chairman just talked about what happened with -- when she came in. yet, yesterday, what happened? the exact same thing. i hope that you appreciate the frustration that the american public feels. they appreciate what the internet can do for them in bringing the world closer, but with the applications that you are pushing forward, the social media, the addictive nature and the way this affects children, there is such a frustration that you turn a blind eye toward taking responsibility and accepting accountability for your platform, how you are structured and how you use the data on these children. 
yesterday at 3:00 a.m., midnight in silicon valley, you released a list of updates that you said would raise the standard for protecting teens and supporting parents online. i am not sure what hours you all keep in california, but where i am from, the middle of the night is when you drop news that you do not want people to see. maybe you thought that doing it in this manner would keep members of the subcommittee from seeing it right away and from raising concerns, because i am sure that you know about the bill for protecting teens online. for example, we know that social media is an integral part of teenagers' lives. according to the mayo clinic, 97% of teens between the ages of 13 and 17 use a social media platform, and 45% say they are online almost constantly. telling teens to take a break may seem helpful, but it is probably not going to get most teenagers to stop doing what they are doing and take a break. educational tools for parents can be helpful. i am more concerned about the things we know kids and teens are hiding from their parents. we know that facebook and instagram have encouraged teens to use secondary accounts and told them to be authentic. we all remember what it was like to be a teenager. while parents may gain some insight into what their teenagers do on their main account, what do they do about the accounts they do not even know exist, the ones that instagram is encouraging them to create? instagram announced in july that it would default teens to private accounts, yet yesterday my team created an account as a 15-year-old girl and it defaulted to public. while instagram is touting all these safety measures, they are not even making sure that the safety measures are in effect. for me, this is a case of too little, too late. now, there is bipartisan momentum both here and in the house to tackle these problems we are seeing with big tech. 
as senator blumenthal said, we are working on children's privacy, online privacy, data security and section 230 reforms. this is the appropriate time to pass a national consumer privacy bill as well as kid-specific legislation to keep minors safe online. we also need to give serious thought to how companies like facebook and instagram continue to hide behind section 230's liability shield when it comes to content like human trafficking, sex trafficking and drug trafficking. despite congress speaking to these issues when we passed a couple of bills a few years ago, there is a lot of work for us to do to improve the online experience and to protect our children and grandchildren. i think it is best if we do this together, and i look forward to hearing your ideas and your testimony today. thank you for your appearance. >> thank you. i am pleased to introduce adam mosseri. he has spent over 11 years at meta and oversees all functions of the instagram app, including engineering, product management and operations. >> apologies. my apologies. thank you, chairman blumenthal, ranking member, members of the subcommittee. i have served as the head of instagram since 2018. the subcommittee has held a number of hearings on the safety of young people online. this is an important topic, as you said in your opening statement. it is something that we think about and work on every day at instagram. the internet has changed the way people communicate, express themselves and stay connected to the people they care about. it has also changed what it is like to be a teenager. teenagers spend time with their friends, develop new interests and explore new identities. today they are doing those things on platforms like instagram, youtube, tiktok, etc. i believe that instagram and the internet can be a positive force in young people's lives. 
i am proud that our platform is a place where they can spend time with the people they care about, where they can start incredible movements, find interests and turn a passion into a business. i also know that sometimes young people come to instagram dealing with difficult things in their lives. i believe that instagram can help in those moments. research has shown this, and it is the most important work that we can do: taking on issues like bullying and comparisons and making changes. many in this room have deep reservations about our company. i want to assure you we have the same goals; we all want teenagers to be safe online. the internet is not going away, but i believe there is important work we can do together, industry and policymakers, to raise the standard across the internet to better serve and protect young people. the reality is that keeping people safe is not just about any one company. a survey suggested that more teens are using tiktok and youtube than instagram. this is a challenge that requires industry-wide solutions and standards. we have a specific proposal that we believe should be industry-wide when it comes to three important issues: how to verify age, how to build age-appropriate experiences and how to build parental controls. we should all have input, standards should be high and protection should be universal. i believe a company like ours should earn section 230 protections by adhering to those standards. we have been calling for regulation for nearly three years. there is no area more important than user safety. i understand that developing policy takes time. we will continue to push forward on the safety and well-being of young people online. on age verification, we are developing new technologies to address this challenge. we are creating a menu of options to allow people to verify that they are old enough to use instagram that extends beyond relying on an id card. we are developing new technology to remove accounts belonging to those under the age of 13. 
we are also using technology to understand more about users below the age of 18 so we can build a more appropriate and safe experience for them. adults can no longer message those under the age of 18 who do not follow them. they can no longer tag or mention teens that do not follow them as well. we also provide tools for parents. parents and guardians know what is best for their teenagers. we are launching the first set of parental controls in march of next year, allowing them to see how much time their teenagers spend on instagram and to set time limits. we will also give teenagers new options to notify their parents if they report someone, giving them an opportunity to talk about it with them. i care a great deal about creating an online world that is safe for my children and allows them to benefit from all the amazing things the internet has to offer. as the head of instagram, i recognize the gravity of my role in making this happen, not only for my kids but for generations to come. i am hopeful we can work together to reach that goal. thank you. >> i will take the first round of questions. a short time ago, -- was at that table and they all committed to making internal research, algorithms and data about the effects on children available to independent researchers. will you commit to doing that? >> we believe it is important to be transparent. i can commit to you that we will provide meaningful data so that third-party researchers can design their own studies and make their own conclusions about the effects on young people's well-being and about ranking. i am committed to doing all i can to explain how ranking works and to find other ways to be transparent about algorithms. >> will you support a requirement that independent overseers and researchers not only have access to the data but can also check the way algorithms are driving content and recommend changes that you will adopt? >> i would be happy to have my office work with you on that. that is an important one. 
we do a number of things in this area already. we provide information on the effects of algorithms -- >> will you commit to a legal requirement that access be provided, and that an independently appointed body -- not an industry body -- an independent overseer and researcher, will have access? >> on the specifics, i am not a legal expert, but yes, i think there should be requirements for how companies like ours are transparent about data and algorithms. >> an industry body is not the government regulation that mark zuckerberg and facebook and others have called for. it is not the same as an independent one. let me ask you, shouldn't children and parents have the right to report dangerous material and get a response? we have heard of researchers and parents who tried to report and heard no response. my office made a report and got no response until cnn took the report to your press relations. shouldn't there be an obligation that instagram will respond? >> yes. i believe we try to respond to all reports. if we fail to do so, that is a mistake that we should correct. >> instagram is addictive. that is the view that has been repeated again and again by people who are experts in this field. parents know it, and for teenagers who see instagram's algorithms encouraging eating disorders, they find it almost impossible to stop. should we have a rule similar to the u.k.'s in the united states? >> i do not believe the research suggests our products are addictive. it shows that on 11 of 12 difficult issues, teenagers who are struggling say instagram helps more than harms. we care about how people feel about their experience on the platform, and it is my responsibility to do everything i can to keep people safe. we will continue to do so. >> we can debate the meaning of the word addictive, but the fact is that teens who go to the platform find it difficult, maybe sometimes impossible, to stop. 
part of the reason is that more content is driven to them to keep them on the site, to activate the emotions that are seductive and, frankly, addictive. the u.k. recognizes it. i am proposing that the same design restriction be imposed in the united states. will you commit to making the pause on instagram kids permanent and stop developing an app for children under 13? >> the idea of building instagram for 10-to-12-year-olds was trying to solve a problem. we know that they are online, they want to use platforms like instagram, and it is difficult for companies like ours to verify age for those who are so young they do not have an id. the plan has always been that no child would have an instagram kids account without parental consent. should we ever manage to build that version of instagram, they will not have it without parental consent. >> thank you for answering my questions. >> thank you, mr. chairman. staying on instagram kids, i know you are doing research into -- and pulling together data on eight-year-olds, and i assume that that was in relation to instagram kids. are you still doing research on children under age 13? >> i do not believe we ever did any research with eight-year-olds for instagram kids. we are not doing that today. >> if you were to completely remove that project, who would make that decision? >> it was my decision to pause instagram kids. >> it would be your decision to do away with it? >> senator, i am responsible for instagram and it would be my decision. >> let's talk about jane doe versus instagram. i am sure you do not have details because they are still deciding whether to take the case. it alleges that instagram enabled the trafficking of a minor, and it raises concerns about what we are seeing and how people are using instagram. do you prohibit known sex offenders from creating instagram accounts? 
>> human trafficking and any exploitation of children is abhorrent, and we do not allow it on our platforms. >> do you require minors to link their accounts to a parent or guardian's account? >> no. if they are over the age of 13, they can sign up for an account. we do believe that parental controls are important, which is why in march of next year we are launching that program. >> the controls will be important, but an industry group will not give the controls that are needed, and probably not an independent one either; that is why we will do something with a federal statute. i think it would be interesting to know how many people involved in human trafficking, sex trafficking and drug trafficking who have been indicted or convicted were using instagram. could you provide that number? >> i would be happy to talk to the team and get back to you. >> excellent. my staff created an instagram account for a 15-year-old girl and it defaulted to public; i mentioned it earlier. the opposite is supposed to happen. have you considered turning off the public option altogether for minor accounts? >> i learned about this this morning. it turns out that we default minors to private accounts in the apps, but we missed one version, and we will correct it. >> when they created the account, it defaulted to the setting "include your account when recommending similar accounts people might want to follow." is this a feature that should remain on by default for minors? >> we think it is important that, regardless of your age, it is easy to find accounts you would be interested in. >> even if you are under 18? >> i believe teenagers have interests. >> yes, teenagers have interests, but what we are trying to address are the adverse and negative effects that are happening to children because they are on your platform. can adults not labeled as suspicious still find, follow and message minors? >> if your account is private and someone follows you, you have to approve it. 
adults -- you have the decision whether or not they can follow you. >> you said you removed accounts because they did not meet your minimum age requirement. these accounts were removed because the users did not provide identification showing that they were at least 13 years old. why did you say "i don't want to know" when jojo siwa told you she had been on instagram since she was eight years old? is that your general attitude toward kids who are on your platform? >> absolutely not. we do a lot to try and identify those under the age of 13 -- >> at that moment when you responded to her, why did you not use that as a teaching moment? >> i would say it was a missed opportunity. >> indeed, it was a missed opportunity, and it sends the wrong message. it looked as if you were encouraging kids who want to be online to get online earlier and to build an audience. this is a part of our frustration with you, with instagram and with these platforms. thank you, mr. chairman. >> thank you. i am looking at this from the perspective of parents. so many of them have told me that they have done everything they can to try and get their kids off your product, kids who are addicted at age 10, and they are scared and want their kids to do their homework and not get addicted to instagram. and yet we then find out that what your company did was to increase your marketing budget to try and get more teens, from $67 million in 2018 to over $300 million focused on kids this year. when i hear you say you will develop a new way to see if really young kids are on there, you could have been spending this money -- $390 million over the years -- you have the money to do it. i think parents' goals and the goals of your company are at odds. that is what is going on. when you look at the marketing budget and what your companies have done to try and get more and more of them on board, look at a quote: your company described losing teen users as an "existential threat." a threat to their families. 
my first question is, is it in fact the truth that you have been increasing advertising money to woo more teens and kids onto your platforms? >> no, i do not believe the statistics are correct. we increased our overall marketing budget, but i would not characterize it as directed towards teens. that is not true. >> have you not done things to get more teenagers interested in your product? are you not worried about losing them to other platforms? you better tell the truth; you are under oath. >> we try to make it as relevant as possible for people of all ages, including teens. teens do amazing things on instagram every day. we also invest more than anyone else, i believe, in keeping teenagers safe. we spent $5 billion this year alone and have over 40,000 people working on it. >> do you think three hours a day is an appropriate amount of time for kids to spend on instagram? i am asking because when you put up the new rules, that was an option for parents: three hours of time. it was in the tools that you just put out there. the first option given to kids and parents was three hours a day. i put that on the record. thank you. >> i am a parent. i can understand that parents are concerned about how much screen time their kids have; every parent feels that way. i think that the parent knows what is best for their teen. the appropriate amount of time should be decided by parents for a specific teen. if one parent wants to set the limit at 10 minutes and another wants to set it at three hours, who am i to say they do not know what is best for their children? >> do you believe your company has invested enough in identifying young children who are on the platform when you know they are not supposed to be on there -- making sure you are using technology not just to increase profits, but to make sure kids are not on there? do you think you have done enough? >> yes, i believe we have invested more than anyone else. 
i believe it is still a very challenging issue industry-wide. there are a number of things we can do industry-wide. i believe it would be much more effective to have age verification at the device level -- to have the parent who gives the 14-year-old the device tell the phone that the child is 14, rather than millions of apps out there each trying to verify age. we understand that may not happen, or it may take time. in the meantime, we will invest heavily in getting more sophisticated in how we identify the age of people under the age of 18. >> is it true that someone said it was an "existential threat" if you lost teen users? >> senator -- >> is that true? because we have a document. >> then i guess it is true. >> do you understand it is our job to protect the children? one parent explained to me it was like a water faucet going off, and she is standing there with a mop trying to figure out the tools you are trying to give her. i think i will -- at some point the accountability is with you guys. maybe, if we had actual competition in this country instead of meta now owning most of the platforms, maybe we could have another platform that would have the privacy protections that you have not been able to develop in terms of keeping teenagers off your platform who are not old enough to be on there. food for thought for all of you: the opposition to the pro-competition, pro-capitalism ideas that we have. i will send it back to the chair. >> thank you. thank you for your leadership on this issue in our committee. i will ask a couple of questions because we have a vote ongoing. your suggestion for tech companies to earn section 230 protections has a certain appeal to me because i am the author of the act along with senator graham. but that is not government regulation. the question is, will you support the u.k. children's code that instagram has to obey in the u.k.? shouldn't kids in the united states have protection as good as the kids in the u.k. enjoy? >> absolutely. a few quick things. 
one, i believe in age-appropriate design; it is something that we support. we support safety standards for kids everywhere, including here in the u.s. if you will indulge me, i would like to clarify: my proposal is an industry body that sets standards with input from civil society, policymakers and parents. once those standards are proposed, they would be approved by policymakers like yourself. i believe that policymakers or regulators should make the decision on whether any individual company like mine is adhering to the standards. >> and then enforce them, bring lawsuits? >> we believe that a strong incentive would be to tie some of the section 230 protections to that, and that could be a decision by regulators. >> would the attorney general of the united states, or maybe of a state like connecticut, where i was attorney general, have the power to enforce those standards? >> we believe in enforcement. the specifics are something we would like to work with your office on. >> it is a simple yes or no. >> i agree, enforceability is important. without enforcement it is just words. >> the attorney general of the united states could enforce the standards -- which means they would be within the statute? >> i do not know. i am not a legal expert. in general, i think it should happen at the federal level. it is something i would be happy to have my team work with you on. >> could individuals who are harmed bring an action against meta? >> i believe it is important that companies like ours are held accountable to high standards. the most effective way of doing so is to define industry standards and best practices at the federal level and to seek enforcement there. >> these are yes or no questions. they are pretty clear. i know you are knowledgeable. i hope you will answer them more clearly in writing than the answers you are providing here. >> i can do this on the record. 
you denied the claim that your marketing budget went from $67 million to $390 million and that much of the budget was allocated to wooing teens; that was reported on by "the new york times" from internal documents of your company. you still deny this as a fact? >> in this case, i believe the article said the majority of our budget was focused on teens, and i know for a fact that was not the case. >> much of the budget? >> i do not remember. >> can you give me the number, the percentage of the budget that was focused on teens? you must be able to break it down that way. that would be helpful. >> i would be happy to follow up on that. >> thank you, mr. chairman. 30% of teenagers say that when they feel bad about their bodies, instagram makes them feel worse. 6% of american teenage users trace the desire to kill themselves to instagram. that is your own research. faced with these findings, did facebook pull back? no, it is the opposite. they pushed their research on even younger users. that is appalling. you have publicly doubled down on instagram kids and said that it is "the right thing to do." instagram sees a dollar sign on these kids; parents should see a stop sign when they see instagram. will you support my bipartisan legislation with senator blumenthal to update the children's online privacy protection act to give 13-year-olds and 14-year-olds control over their data? yes or no? >> it is important that we are clear on what the research shows. any loss of life to suicide, for any reason, is a tragedy. the 6% figure is inaccurate; it was 1%. it is something we take seriously. you have my support. i do strongly support federal regulation, not industry regulation, when it comes to safety. but if you move the age from 13 to 16, we know that 13-year-olds and 14-year-olds want to be online and can lie about their age. you will make the challenge of age verification more difficult. i do not believe that 13 is a magic number. 
people's needs evolve as they grow, and above all we should build age-appropriate experiences based on age. >> will you give 13-year-olds to 15-year-olds the right to have all of their information expunged? do you support legislation that would give parents and children the ability to have their records expunged? >> you can already delete your account and data -- >> would you support national legislation that mandates that each parent and child is given the ability to expunge? >> senator -- >> would you support legislation to make it mandatory? >> i would support legislation that required companies like ours to allow people to delete their data, yes. >> just to make it a permanent protection. would you support legislation to ban targeted ads for children and teens? >> we believe that everyone should have an age-appropriate experience on instagram or any social platform. we have different rules for ads on instagram and on facebook. we only allow advertisers to target on age, gender and location. we do not allow things like weight-loss ads and dating ads for those under the age of 18, or alcohol-related ads for those under the age of 21. >> do you support legislation that would ban the targeting of ads to children? >> i believe it is valuable for ads to be relevant, but i believe some measures should be taken to keep children safe. i would support something in the direction of what we do: limit targeting abilities on platforms. >> yes or no? mandate or no mandate? that is the question. that is why we have to make sure that facebook and instagram do not reserve the right to be able to target these kids. yes or no? >> i am trying to be specific about what i would support. we built a limited targeting option. >> i just keep coming back to the fact that your answers are too vague to make it possible for us to make these decisions in a legislative way, and to do so in the very near future, which is what i think we have to do. 
the chilling truth, unfortunately, continues to be that in the absence of regulation, big tech has become a threat to our democracy, our society, and the children in our country. let's be clear: facebook and instagram's idea of regulation is an industry group creating standards that the company follows. that self-regulation, that status quo, won't cut it. the revelations this subcommittee has made public show we do need laws. we need laws passed by a governing body. we have to ban targeted ads and make sure that is the law in our country, and everything that this subcommittee has unveiled continues to make that a necessity, including the testimony you're giving today. >> thank you, senator. >> because i went over to vote, this may have come up already -- this is work i did with senator klobuchar -- but i joined senators klobuchar and capito in writing to meta about how instagram is combatting eating disorder content and the harms it brings to users, particularly young people. in response to our question about how the platform is working to remove this content, meta indicated it uses a combination -- i'm quoting from the letter -- of reports from our community and artificial intelligence to find and take down material that violates your terms of service. i also spoke with frances haugen, and when i asked her what more instagram could do to remove content like this glorifying eating disorders, she said that more human review is the key. according to her, many reports are not investigated, and artificial intelligence cannot successfully identify patterns and distribution points for problematic content. given that meta's platforms, according to your own data, have 3.6 billion monthly active users, 15,000 reviewers would seem to pale in comparison to the depth of content those 3.6 billion monthly active users would post. 
and if so, will your company commit to strengthening investment in human review? >> senator, thank you for the question. at a high level, people are better at nuance and technology is better at scale. we invest in both, and we are going to continue to do so. >> i know that this hearing is focused mostly on youth. in how many countries is instagram available? >> instagram is available in over 70 languages. i don't know the number of countries off the top of my head. it's available in 70 languages. >> how many language-specific reviewers are monitoring content in each of those 70 languages? >> i misspoke. we review content in 70 languages. there are more languages being used on the platform. >> so there are gaps, then, in terms of human review in those areas? >> senator, we're also looking to improve not only through language coverage but through building more accurate classifiers and through improving the efficiency of our reviewers, because it helps keep the platform safer. >> i may have some follow-ups that are more specific. but we also know there's tremendous social pressure for kids to utilize social media platforms and services. and while it is the industry standard to block or restrict access for those younger than 13, we know that younger teens and tweens are still signing up for social media accounts. i appreciate that your company decided to press pause on its service focused on kids, but i understand that meta is looking to introduce parental controls, and i'm concerned about what you are doing today to keep kids under 13 off the platform. tell me a little bit more about what you are doing to strengthen age verification and why, given the problems of which you are already well aware with services focused on younger kids, like messenger kids, you have waited to institute parental controls or other steps protecting young users. >> on the parental controls question, i believe as a parent it would be more responsible to develop an age-appropriate version of instagram for those under 13. 
but i paused that project, and we took the exact work we were doing -- no one would have access unless they had their parents' consent -- and pivoted it to teens, because 13 isn't a magical number. we also asked how we verify the age of those under age 13. it is difficult, given that they do not have ids. we build what we call classifiers and ask people to prove their age. we look at things like cultural norms in certain countries: sweet 16 is a cultural norm here in the u.s., so we look for whether people say that on someone's birthday and whether it lines up with the age they gave. we get better over time as we get more signals. i want to be very clear, it's not perfect, which is why i believe there are better industrywide ways to approach verification. it's an industry challenge, not unique to facebook or instagram. >> thank you. chairman blumenthal will be back in a moment. we will be starting our second round. we have some other members coming for their first round. in the meantime, i want to return to a question i asked you about those that are human traffickers, sex traffickers, drug traffickers, and their utilization of your site. i know that in 2020, you found over 21 million sex abuse images on your platform. facebook sent these to the national center for missing and exploited children, so i thank you for doing that. that was the right thing to do. i'm interested to know whether, in your report, you include traffickers and those violations when you make that report with these images. >> i will have to get back to you, but we do allow you to report any image that violates any standards, and you can see how well we are doing at reducing the prevalence. >> get back to us, and let us know if you are also reporting these individuals that are posting and sharing the sexual abuse images of children. senator, you're recognized for five minutes. >> thank you, madam chair. thank you for holding today's hearing. 
the lack of transparency from big tech companies, and the effect the companies have on consumers, is concerning to the public, and rightfully so. because of the secrecy with which big tech protects its algorithms and content moderation practices, we have no idea how they use them to amplify the behavior of users without their knowledge. tomorrow the committee on which i serve will take a closer look at the effects of this persuasive technology, and i look forward to hearing from the panel how algorithms and artificial intelligence are designed on internet platforms to manipulate users, as well as the bigger picture of what the future might hold for us. we must find ways to improve transparency and accountability in the algorithms deployed on internet platforms that billions of people use every day. we now have more insight into facebook and instagram's troubling practices with regard to how they use algorithms, and it is long past time for congress to enact legislation to hold these companies accountable. it is also time to provide more options when engaging with internet platforms, which is why i have introduced bills to address that. i look forward, in the time that i have, to discussing these issues with you, mr. mosseri. >> i've worked on ranking and algorithms for years. that's not how we work. we try to connect people with the friends they find meaningful and try to keep people safe. >> the wall street journal revealed that instagram often ignored warnings about the harmful impact of its platforms, particularly on girls. with that being said, do you believe consumers should be able to use instagram without being manipulated by algorithms that are designed to keep them hooked on the platform, and would you support giving consumers more options when engaging on instagram's platform -- for instance, providing consumers a feed that is not controlled by algorithms? >> we can give people the option of a chronological feed. 
>> a senator introduced legislation that would require platforms like instagram to provide for more due process for users regarding moderation and censorship practices, and to submit public transparency reports about content that has been removed or deemphasized. do you believe this would help to build trust with instagram's users? >> we believe in more transparency and accountability, and we believe in more control. that is why we provide a number of ways for you to see content moderation on our platform. you can go to the account center and see any of your content that has been taken down. that's why we will give people more control over their experience, including transparency about how instagram works. >> do you believe that explanations of algorithms or algorithm transparency are appropriate policy responses? >> i believe very strongly in algorithm transparency. i think it would be hard to find someone who has tried as much to explain how it works. there are a number of ways to be transparent. in some cases, the most effective is to look at the output of algorithms, like we do in our community guidelines enforcement reports. in other cases it's more appropriate to explain how they work instead of releasing the code. >> and could you just elaborate -- you talked about creating and giving consumers a feed that is not being fed to them by an algorithm, or that is in chronological order. you said you are going to implement that policy at the beginning of next year. how did you come to that, and, more specifically, what are the dates for that implementation? and if you could, elaborate on just exactly what that might look like. >> so we have been focused for a few years now on how to give people more control over their experience. one idea we experimented with publicly is called favorites: you can have people show up at the top of your feed. another thing we have been working on for months is a chronological version of instagram. right now we're targeting the first quarter of next year. 
>> and we would like to take what you are proposing to do and codify it. that is what the filter bubble transparency act does. >> thank you very much. i want to thank everyone for calling this hearing. tomorrow i will be convening a hearing titled "dangerous algorithms: addressing the harms of persuasive technology," and we will discuss solutions to online amplification that harms the physical and mental well-being of our children and promotes extremism. one of the lines of questions i have today, based on questions that came from other colleagues earlier in this important hearing, is around data retention and deletion. there was a line of questioning from one of my colleagues as well. does instagram have in place practices to abide by the principle of data minimization, especially for sensitive personal information? can you provide this to the committee? >> is that a yes? >> appreciate that. how long does instagram store data related to what websites users visit and what links they click from the app? >> i'll be happy to get back to you with the specifics there. >> do you know how long instagram stores location information for a user? >> if you post a photo with a location, that will be stored as long as the photo is up. for other cases, we have retention policies that are quite short. we will get back to you on specifics; it will depend on the usage of location data. >> when was the last time instagram reviewed and updated its data deletion and retention policies? >> senator, we operate as one company, so it would be for both instagram and facebook; there wouldn't be a specific retention policy for instagram. i don't know the last time the policies were specifically updated. i can tell you that we're working on privacy, making sure we can do more to help people protect their data. >> would you support federal privacy legislation that enforces data retention limits and data deletion requirements? 
>> senator, not only would we support that, we believe it's important for there to be privacy regulation here in the u.s., and that's something we've been very public about for years now. >> when users request to download their data from instagram, are users given all the information that the company holds on them, including assumptions that the company has made about them based on their behavior? mr. mosseri: senator, when you download your data, we give you everything that is associated with you, as far as i know, but i want to make sure i double-check that. there are certain instances where -- actually, i can't think of any exceptions, but i'll get back to you. >> the other way i would ask that question, and maybe it requires follow-up, is: is there any information that instagram does not share with users when they request their data? mr. mosseri: senator, we do our best to share all the data, or all your data, when you ask to download your data. as we add new features, we add them to what we call download your data to make sure you have all the data. >> instagram has the option for users to request to delete the data that instagram holds, is that correct? >> yes. and if you delete your account, we will delete your data if you request it. >> is there any data that instagram holds for a user after a user deletes their information? mr. mosseri: not that i know of, senator, no. >> is that something you can get back to me on as well? mr. mosseri: absolutely. but i can also assure you that we do all we can to delete all your data if you ask us to. to do otherwise would be incredibly problematic. >> i appreciate that. there was a question that i asked mr. zuckerberg back in 2017 about facebook's collection of data about non-users, to which he responded that facebook did not collect non-user information. facebook, about a week later, released a correction saying that mr. zuckerberg must have been mistaken or had a lapse in how he responded to that particular question. 
but nonetheless, i really want to get to the bottom of that, especially with the rampant collection of data from individuals. and the last point, madam chair, that i want to make part of the record is the work that has to be done on non-english-language disinformation and misinformation. it's a huge problem. i think it's important for facebook, for instagram, meta -- i get confused about which term i should be using here with the re-branding. i guess i'm not so good with it. mr. mosseri: i'm comfortable with any term you would prefer. >> -- that we're able to get the disaggregated data we requested in this hearing from a facebook witness, which we still have not been provided. and i think it's very important for the committee's wishes to be respected, which were made in a bipartisan way. mr. mosseri: senator, i appreciate that point. since we talked about that the other day, i started to look into it. there are a number of ways we might be able to break out data around our community standards enforcement report. one possibility is what language the content is in. another possibility is what language the person who sees that content speaks. another possibility is what country. and so we're going to get back to you on what we think is the most efficient and responsible thing we can do in this space, and i'll personally make sure that we do that and get back to you promptly. >> thanks, senator lujan. senator lee. >> mr. mosseri, does instagram promote weight loss or plastic surgery for teenage girls under the age of 18? >> i missed the second word of your question. >> does instagram recommend weight loss or plastic surgery for girls under the age of 18? mr. mosseri: absolutely not. we don't recommend eating disorder-related content to people of any age. >> very glad to hear that, but i beg to differ here. 
leading up to this hearing i heard a lot of complaints from people across utah and elsewhere who told me about inappropriate content available through the explore page on instagram, available specifically to children. and so i was encouraged to look into it myself. so i had my staff create a fictitious account for a fictitious 13-year-old girl. the explore page yielded fairly benign results at first, when all we did was create the account, knowing that it was a 13-year-old girl. but instagram also provided that same account, for this fake 13-year-old, with a list of recommended accounts that we should follow, including multiple celebrities and influencers. so we followed the first account that was recommended by instagram, which happened to be a very famous female celebrity. now, after following that account, we went back to the explore page and the content quickly changed. you see, at first, all that came up when we opened up the account were some fairly benign hairstyling videos. that's not what we saw after we followed this account, the account that was recommended by instagram itself. it expanded into all sorts of stuff, including content that was full of unrealistic standards for women -- plastic surgery, commentary on women's weight -- content that could be detrimental to the self-image of any 13-year-old girl. and if you need any kind of evidence on the kinds of harms that this can produce, you can look to the report that i issued through my joint economic committee team, specifically on this topic, last week. so, mr. mosseri, why did following instagram's top recommended account for a 13-year-old girl cause our explore page to go from showing really innocuous things like hairstyling videos to content that promotes body dysmorphia, sexualization of women and content otherwise unsuitable for a 13-year-old girl? what happened? mr. 
mosseri: eating disorders and eating disorder-related content are very complicated issues in society. >> and they're complicated enough without a social media site recommending them. mr. mosseri: senator, we know that a lot of -- i've personally spoken to teens in multiple countries around the world who use instagram to get support when suffering from things like eating disorders. we absolutely do not want any content that promotes eating disorders on our platform. we do our best to remove it. i believe -- i will get you the specifics -- it's roughly 5 in every 10,000 things viewed. my responsibility is to get that number as close to zero as possible. we believe that every company -- snapchat, tiktok, youtube -- should be public, like we are, about exactly what the prevalence of important content problems is on their platforms. >> right. i get that, and i can only take your word for it here. i understand what you're saying about the overall numbers, but that's not how it appeared on this account. that's not how it happened at all. it was hairstyling videos and innocuous stuff one minute; the next minute, after we followed a famous female celebrity, it changed, and it went dark fast. it was not 5 in 1,000 or 5 in 10,000. it was rampant. the thing that gets me is, what changed was following this female celebrity account, and that female celebrity account was recommended to this 13-year-old girl. so why are you recommending that somebody follow an account with the understanding that by doing that you're exposing that girl to all sorts of other things that are not suitable for any child? mr. mosseri: senator, i appreciate the question because it is an incredibly important and difficult space. if we recommended something that we shouldn't have, i'm accountable for that. i'm the head of instagram. but you said a second ago that you have to take my word for it, and i don't believe you should. 
our enforcement report, this next quarter -- this quarter we're in right now -- is going to be independently audited by ernst & young, and we're committed to doing independent audits going forward. >> that's great, and i'm glad it will be independently audited. what i'm saying to you is, i'll take your word for the 5-in-10,000 point, but it was not 5 in 10,000 on this page for this unsuspecting, albeit fake, 13-year-old girl. i'm running out of time. i'm also running out of patience with a company that has told us over and over and over again, we're so concerned about your children, we're so concerned, we're commissioning a blue ribbon study or we're doing a review -- and stuff like this is still happening. meanwhile, the tech transparency project recently conducted another experiment. it demonstrated how minors can use their instagram accounts to search for prescription and illicit drugs and connect with drug dealers. in fact, according to ttp, it only took two clicks to find drug dealers on the platform. so why are children's accounts even allowed to search for drug content to begin with, much less allowed to do so in a way that leads them to a drug dealer in two clicks? mr. mosseri: senator, accounts selling drugs or any other unregulated goods are not allowed on the platform. >> apparently they are. mr. mosseri: senator, respectfully, i don't think you can take one or two examples and say that's indicative of what happens on the platform more broadly. >> two clicks. it only took two clicks. mr. mosseri: senator, i'm not familiar with that specific report. i'm more than happy to look into it. but i want to be clear. i've been talking about the community standards enforcement report a lot. i know it sounds like numbers, but i know that behind every one of those numbers is a person who is experiencing something difficult. if there's room for us to improve, i embrace that. 
that's why we invest more than, i believe, anybody else -- $5 billion this year, over 40,000 people. we believe in industry standards and industry accountability, and that's why we're calling on the entire industry -- youtube, tiktok, snapchat -- to come together to set industry standards that are approved by regulators, like here in the u.s., in order to make the internet safer not only for kids online, but for everyone. >> there's a scene from the movie monty python and the holy grail. there's a big fight, and they conclude the discussion by saying, let's not bicker and argue about who killed who. i think we have reached a point where we realize some real bad stuff is happening, and you're the new tobacco, whether you like it or not. and you've got to stop selling the "tobacco" to kids. don't let them have it. don't give it to them. thank you. >> thanks, senator lee. senator sullivan. >> thank you for holding this important hearing and the series of hearings that you and the ranking member have been holding. have you read the surgeon general's report that he issued yesterday, protecting youth mental health? mr. mosseri: senator, i started to read it. i haven't finished it. from what i've read so far, it makes it clear that teens in this country are struggling -- >> i'll get into it a little bit. i agree, it is very sobering reading. it mentions that in 2021 emergency room visits for suicide attempts by adolescent girls were up 51%. 51%. i mean, that's just shocking. and the surgeon general said our obligation to act is not just medical, it's moral. so the way i read it, it's kind of a witches' brew of two things that are driving so much of these horrendous statistics related to mental health and suicide: it's been the pandemic and the negative impacts of social media. that's in the report. one of the recommendations from the surgeon general is limiting social media usage. do you agree with that? mr. mosseri: -- >> let me ask my question again. 
the surgeon general of the united states makes, as one of the recommendations to address what is clearly a mental health crisis for teenagers -- particularly teenage girls -- in america, limiting social media usage. so do you agree with it? mr. mosseri: senator, two things. one -- >> answer the question. mr. mosseri: senator, i believe parents should be able to set limits for their children. i believe a parent knows best, which is why we're currently developing parental controls that let parents not only see how much time their teens spend on instagram, but set limits. i also -- >> this is a really important question. if we have experts saying we need to limit social media usage, which is what the surgeon general just said yesterday, to help address mental health issues, does that go against the business model of instagram or facebook or meta? isn't your business model to get more eyeballs for a longer time on social media? isn't that what you're about? mr. mosseri: senator, if people don't feel good about the time they spend on our platform, or if for any other reason people want to spend less time on the platform, i have to believe it is better for our business over the long run. >> do you make more money when people spend more time on your platform or less? mr. mosseri: senator, on average, we make more money when people spend more time on our platform, because we're an advertising business. >> but you agree with the surgeon general that people should limit their social media usage? my point is, they seem to be in direct contradiction with each other -- what the surgeon general is saying we need to do to better the health of our young americans, and what your basic business model proposition is. they seem to be actually colliding with each other. mr. mosseri: respectfully, i disagree. over the long run, it has to be better for us as a business for people to feel good about the time that they spend on our platform. 
it has to be better for parents to not only have a meaningful amount of control, but to be able to exercise that control over how much time their teens spend on our platform. and we take a very long view on this. >> do you have internal data relating to mental health and suicide and usage of your platform, or facebook, or meta? mr. mosseri: senator, we do research to make instagram better and safer. as a parent, that's exactly what i would want. i believe we lead the industry and do more than anyone else. >> you're not answering my question. >> let me answer it. >> do you have internal data related to the issue of teen suicide and usage of your platform? mr. mosseri: i'm not sure i understand your question specifically. yes, we do research and talk to third-party experts and academics about difficult issues like suicide, which has inspired work like not allowing content that talks about methods of suicide, connecting people who seek out that type of content with expert-backed resources, and, if someone is hurting themselves, proactively reaching out to local emergency services on their behalf, not only here in the u.s. but in a number of countries around the world. >> one final question, if i can ask very quickly. i looked a little bit into this issue of your announcement of instagram for kids. just that phrase kind of makes me nervous. it sounds like a gateway drug to more usage. why did you put a pause on that, are you going to permanently pause it, and do you worry that you're going to get kids hooked on more usage with instagram for kids? mr. mosseri: senator, the idea was trying to solve a problem. we know that 10- to 12-year-olds are online. the average age when you get a cell phone in this country, i believe, is currently 10 or 11. we know they want to be on platforms like instagram, and instagram quite frankly wasn't designed for them. 
the idea was to give parents the option to give their child an age-appropriate version of instagram where they could control not only how much time they spent, but who they could interact with and what they could see. it was always going to be a parent's decision. i personally, as the head of instagram, am responsible for instagram, and i decided to pause that project so that we could take more time to speak to parents, to experts and to policymakers to make sure that we get it right. >> thank you, mr. chairman. >> thanks, senator sullivan. senator young, by webex. >> thank you, chairman. welcome, mr. mosseri. in ms. haugen's testimony she discussed how instagram generates self-harm and self-hate, especially for vulnerable groups like teenage girls. now, i happen to have three young daughters and two teenage daughters, and this issue hits home for me, but it hits home for a lot of americans. so you're here today. you're the head of instagram. you have an opportunity to tell your side of the story. and i do believe that if we're not receiving some constructive, actionable and bold measures to deal with what is popularly believed to be a serious and significant public health issue, congress will act, because our constituents insist that we act. we've held a lot of hearings now. we've done our best to educate ourselves. but frankly, since you run the platform, since you know the technology, since you spend so much time working on these matters, you could really help us. if you don't, we're going to feel an imperative to act. that's just the reality of it. so with that said, with that foundation laid, do you believe there are any short-term or long-term consequences of body image or other issues on your platform? mr. mosseri: senator, i appreciate the question. the research that we have found shows that many teens use instagram to get support when suffering from issues like body image issues. 
for 11 out of 12 issues for teenage girls, and for 12 out of 12 issues for teenage boys -- issues like body image, anxiety and depression -- we found more teens who were struggling said that instagram made things better rather than worse. the one exception was body image for teenage girls, which is why i personally, actually before we even did this research, started the social comparison team. it has inspired ideas like take a break, and nudges, which we're currently working on, which encourage you to switch topics if you spend too much time on any one topic. i'm not here to say there's any one perfect solution, but just to give an update on what the research says and what we're doing to make instagram safer. >> got it. i'm familiar with nudges. i'm somewhat familiar with behavioral science. i know that is something that can be harnessed by our tech community to generate traffic. what is engagement-based ranking, mr. mosseri? mr. mosseri: senator, i appreciate the question. i worked on ranking and algorithms for years. the term is used to describe trying to connect people with content that they find interesting. what we do when you open up instagram is, we look at all of the posts from all of the people and all of the accounts that you follow, and we try to show you the ones that you'll find most relevant, and we try to make sure to take out anything that might be against our community guidelines in order to keep people safe. at a high level, that's usually what people refer to when they say engagement-based ranking, to the best of my knowledge. >> so is there a behavioral bias for teenage girls to look disproportionately at content that adversely impacts their self-image? mr. mosseri: not that i know of, senator. i do think it's important that teens don't have negative experiences on our platform. i do think it's important that we try to understand the issue that you're raising, which i appreciate, which is social comparison, or body image social comparison. 
and we're trying to understand how we can best help and support those who might be struggling. >> so there is no negativity bias? just as adults have a negative news bias -- which is why so much of the news and current events coverage online can be so caustic and so corrosive to our public discourse, because so often people marinate in the negative -- there's no similar bias that you have discovered, or that i would hear of from any of your internal experts, pertaining to negative self-image for teenage girls? mr. mosseri: sorry to interrupt you, senator. i appreciate the question. i think it's important to call out that social media allows you to connect with anyone you're interested in. in a world where the definition of beauty here in the united states used to be very limited and very focused on a very unrealistic definition of beauty, social media platforms like instagram have allowed -- not allowed, but have helped -- important movements like body positivity to flourish, so that if you're a teenage girl of color, or if you are a plus-size teenage girl, you can see models of color and plus-size models. it has helped diversify the definitions of beauty, and that is something that we think is incredibly important. i don't know of any specific bias, to answer your question very directly. i want to call out that we help people reach a more diverse set of not only definitions of beauty but points of view and perspectives. >> do you have behavioral scientists who work internal to instagram? mr. mosseri: senator, we have data scientists who try to understand how people use the platform in order to make instagram both better and safer. >> and they would inform you, i presume -- they would be informing you if they ever discovered a negativity bias in the research or in the behaviors of your user community as it relates to teenage girls and self-harm or self-hate, right? mr. 
mosseri: senator, i expect my data scientists as i expect my researchers to keep me abreast of any important developments with regards not only to safety but to instagram and the industry more broadly. >> thanks, senator young. >> thank you. >> senator? >> thank you, mr. chairman. companies like instagram are often designing technology to maximize the collection of our data. and subsequently to sell visibility into users private lives and interests. that's why when a company called signal bought information to show us the information it collects about us, those ads were banned by instagram's parent company. it's the black box of highly secretive algorithmic systems that companies like instagram deploy that operate largely undetected by the user and which allow them to continue to operate free of meaningful scrutiny. this is not an open source system. sunlight disinfects. and congress must not scroll past this critical moment without properly addressing the harms young people are encountering on these platforms. mr. mosseri, thank you for being here. in your testimony you stated that instagram has limited advertisers options for serving ads to people under 18, to age , gender and location. but your testimony neglects to mention that any similar prohibition on instagram's own machine learning ad delivery system. does instagram's machine learning ad delivery system target ads to children using factors other than age, gender and location? mr. mosseri: senator, i appreciate the question. there's one ads delivery system both for instagram and facebook. we only allow advertisers to target those under 18 based on age, gender and location and overall activity that teens use within the app to make sure that ads are relevant to teens. >> you don't limit yourselves. you hold yourselves to a lower standard than your advertisers? mr. mosseri: senator, we do limit ourselves in that we don't use any off-platform data. 
but we do also use activity in the apps to make sure ads are relevant. for instance, if i'm not interested in a specific band because i'm in another part of the country or i don't like that type of music, it doesn't make sense for me to see that type of ad.

>> mr. chairman, i ask unanimous consent to enter into the record a report from fairplay that shows meta is still using an ai delivery system to target ads at children.

>> without objection.

>> thank you, mr. chairman. your head of safety and well-being recently stated that any one piece of content is unlikely to make you feel good or bad or negative about yourself; it's really when you're viewing, say, 20 minutes of that content, or multiple pieces of that content in rapid succession, that may have a negative impact on how you feel. so to me, her statement says that your company knows that time spent on the platform increases the likelihood of real-world negative impacts. so how do you square a business model that prioritizes user time and engagement with knowing there's a direct correlation between time and harm?

mr. mosseri: senator, respectfully, using our platform more will increase any effect, whether it's positive or negative. we try to connect people with their friends, we try to help them explore their interests. we even try to help them start new businesses. but if people don't feel good about the time they spend on our platform, that's something i personally take seriously. it's why we build things like daily limits and why we're currently working on parental controls that are focused on time.

>> does instagram make money on ads that are placed next to highly viewed and also harmful content that violates the rules of your platform?

mr. mosseri: senator, we don't allow content that violates our rules on the platform. we release publicly how effective we are at removing that content, and we receive revenue based on ads shown.

>> is the money returned to the advertisers then?

mr. mosseri: senator, not that i know of, no.

>> is it your position that instagram will always comply with the laws of the country in which it operates?

mr. mosseri: senator, we're going to do our best to always comply with the law.

>> ok. will you commit to releasing those guidelines to members of this committee?

mr. mosseri: senator, which guidelines?

>> it would be guidelines related to how you comply. so let me give you an example. if an authoritarian regime submitted a lawful request for your platform to censor political dissidents, would you comply? if not, what are your guidelines on something like that? let me give you real-world examples. if a government -- let's say uganda -- criminalizes homosexuality, and the government submitted a lawful request for data on users that are members of the lgbtq community, would instagram comply?

mr. mosseri: we try to use our best judgment in order to keep people safe. i believe that transparency on the specific issue is incredibly important. i will double-check and get back to your office. i believe we are public, at a high level, about incoming requests we get.

>> so one of the reasons that i'm concerned about this sort of secretive data collection based on a non-open-source algorithm is that it gives you that veil of secrecy. people don't know what information is being collected about them. yet if a hostile government is able to identify people like women who are being educated against the law, or someone who is homosexual in a country where homosexuality is punishable by death -- and there are governments that do this -- and you turn over that data, and it's collected by artificial intelligence, that artificial intelligence is not going to discern that it's putting a human being in danger. artificial intelligence, when not guided and not open sourced, can be a real problem. mr. chairman, thank you. i yield back.

>> thanks, senator. senator cantwell.
>> thank you, chairman blumenthal, and thank you for this hearing. i know we've had great attendance from members. i'm so impressed by the questions that all of our colleagues have been asking. i hope it will lead us to some good legislative solutions, and i appreciate mr. mosseri for being here today. obviously a big new day on the job. i wanted to ask you specifically about privacy violations. do you believe that claims of privacy violations by kids should go to arbitration? that is, do you believe that when people are signing up for your service, when they're 14-year-olds, they understand that they are giving away their rights when they sign up to your service?

mr. mosseri: senator, respectfully, i disagree with the characterization that anyone gives away their rights when they sign up for our service. i think privacy is incredibly important, and we do the best we can and we invest a lot of resources in making sure that we respect people's privacy.

>> if a child has suffered harm of that magnitude and they try to get those issues addressed, do you think they should be in arbitration?

mr. mosseri: i'm not sure i understand the exact hypothetical, but i believe if a child is at risk, specifically of hurting themselves --

>> no. one of my constituents is working with a mother whose 14-year-old was groomed by adults on instagram, ultimately was lured into sex trafficking and was taken across state lines for prostitution. under instagram's terms of service, instagram can argue a child's only recourse against instagram would be in arbitration, with no open court, no discovery, no judge, no jury, no appeal. i'm asking you what you think about when real harm is created against children, and what should be the process.

mr. mosseri: senator, that story is terrifying. we don't allow child or human trafficking of any kind. we try to be as public as we can about how well we deal with difficult problems like that one.
and we believe there should be industry standards, industrywide accountability, and the best way to do that is federal legislation, which is specifically what i'm proposing today.

>> on those points, what we're trying to get at is, when users -- in this case, particularly young children -- are signing up for a service, what they're signing up to with a checkmark is binding arbitration with a company. so if there's a dispute about something that happened -- we've been considering privacy legislation; our colleagues have been trying to protect young children in other ways, and we've found some very egregious situations of late -- the only recourse they have is to go into binding arbitration with you as a company. we're saying when there's something as egregious as a privacy violation, they should have other recourse. i'm asking you whether you believe they should have other recourse.

mr. mosseri: senator, i believe the most responsible approach in this area more broadly -- not only for privacy, but for safety -- is federal regulation in the u.s.

>> do you think everybody has to go through you or one of your other software companies to get redress? do you think that the only redress consumers should have is through binding arbitration with a company?

mr. mosseri: senator, i believe that whatever the law states should apply to all companies like ours equally.

>> i'm asking you what you think as a company.

mr. mosseri: senator, i'm not familiar with the specifics --

>> ok, i'm going to ask you for the record. that way, you'll get a little more time and you can consider it. these are serious issues about the fact that serious harms are happening to children and the only redress they have is to go into binding arbitration with you.
while that might be -- hey, i don't like your service, something happened, you overcharged me -- that might be fine for binding arbitration, but serious harm to people i don't think should be sent to binding arbitration. back to the advertising question for a second. my colleagues have done a good job of asking about this. but, obviously, people have been talking about the ability to make money off of specific content, including what at facebook was described as the potential reach metric. i think we've been talking about that. people have been discussing that. are you aware of inaccuracies in the potential reach metric?

mr. mosseri: senator, i'm not aware of any specific inaccuracies, but we do the best we can to make sure advertisers understand reach before they spend, using tools like that.

>> do you think there's hate speech that's not taken down that's included in that? would you agree to informing advertisers and the public how much hate speech there is, and whether it's taken down or not taken down?

mr. mosseri: senator, respectfully, i believe the potential reach tool allows you to get a sense of how many people you will reach, which is different from how much hate speech content there is specifically. in our community standards enforcement report, you can see that, i believe, 3 in 10,000 pieces of content seen qualify as hate speech, by our definition.

>> do you think that advertising can be inaccurate or misleading based on certain metrics?

mr. mosseri: senator, as an advertising business, i believe it's in our interest to be as accurate as possible. i think when we make mistakes, it undermines our credibility, and advertising businesses are based on trust.

>> right. and they're also based on being truthful to your advertisers. and so what i'm getting at: when ms. haugen testified, she said that facebook made a decision to keep metrics up to drive more traffic, even though the company knew they included things that were related to hate speech. that certainly motivated more traffic, and when presented with the information, various members of the company decided to continue using that metric. and so what i'm saying is there could be instances where instagram has also continued to leave advertisers not fully aware. do you believe advertisers should be aware if there was any content that was related to hate speech? that they should be aware of what content they're being served with?

mr. mosseri: senator, i believe advertisers should have access to data about how much hate speech is on the platform, just like everyone should have access to that kind of data. i'm not familiar with what you are specifically referencing with regard to her testimony, but it doesn't line up with any of my experience during my 13 years at the company that we would intentionally mislead advertisers. that would be a gross violation of trust, and it would come out and inevitably undermine our credibility.

>> you don't think there are any deceptive practices with advertisers that you're involved with at instagram?

mr. mosseri: senator, not only do i not believe that, i think that would be --

>> you think advertisers know everything about your algorithm and what it is attached to in giving them page views and information?

mr. mosseri: senator, i believe deeply in transparency. i've spent an immense amount of time over the years not only trying to be transparent about how our algorithms work, but also making apparent how much problematic content is on our platform. i believe you can see that in our community standards report, and i believe other companies should do the same.

>> i see my time is way over.
i'm going to ask you questions for the record on this as well. the point is, if companies are involved in deceptive practices with advertisers and they haven't told them how they're artificially increasing their traffic, and it's related to something that the advertisers aren't aware of, that is a deceptive practice. thank you, mr. chairman.

>> thank you, senator cantwell, and thanks for your excellent work on this issue and your help and support in these hearings. senator cruz?

>> thank you, mr. chairman. mr. mosseri, welcome. thank you for being here, thank you for testifying before the committee, and thank you for being here in person. as you're aware, i and many members of this committee have had significant concerns about instagram's practices, facebook's practices, and big tech more broadly. in september 2021, "the wall street journal" published a series of investigative articles titled the facebook files. and as you know, the wall street journal reported that researchers inside of instagram found that 32% of teen girls using the product felt that instagram made them feel bad about their bodies. the wall street journal further reported that 13% of british users and 6% of american users traced their desire to kill themselves to instagram. those are deeply troubling conclusions. are you familiar with the research that was cited by "the wall street journal"?

mr. mosseri: senator, yes. but if we're going to have a conversation about the research, i think we need to be clear about what it actually says. it actually showed that 1 out of 3 girls who suffer from body image issues found that instagram makes it worse. that doesn't mean it's not serious. any one life lost to suicide is an immense tragedy. on suicide, it was actually 1% who traced their thoughts back to instagram. and i think it's important that we're clear about what the research says.

>> i'm glad to see that we have found some common ground.
you just said twice it's important for us to be clear what the research said. i agree. at prior hearings i have asked your colleagues repeatedly for copies of the research, and to my knowledge, you have refused to produce it. will you commit now to produce the research to this committee so we can, as you just said, be clear about what the research says?

mr. mosseri: senator, i really appreciate this question, because it's incredibly important that we're transparent about research. i commit personally to doing all i can to release the data behind the research. the two challenges that i need to let you know of are, one, privacy in certain cases, and, two, in other cases we do not have the data anymore due to our data retention policies. but given that, i can also commit to you that we will provide meaningful access to data to third-party researchers outside the company, so they can draw their own conclusions and design their own studies to understand the effects of not only --

>> in what format was this research communicated to you? you referenced the slides that have bullet points. you criticize the committee for not appreciating the full contents of the research, when you have not given us the research. in what form did this come to you? how was it memorialized and presented to you?

mr. mosseri: senator, two forms. the most important is the data, because that allows any researcher -- and i'm committed to making sure researchers can have access to that -- to draw their own conclusions. then presentations, like powerpoints. we have made those public. i believe the most important thing over time is that we provide regular access to meaningful data about social media usage across the entire industry, so academics and experts can design their own studies and draw their own conclusions. i am concerned about anyone -- we are talking about people and not numbers --

>> let's talk about numbers.
did instagram do anything to quantify how many teenage girls have killed themselves because of your product?

mr. mosseri: we do research on suicide and self-harm, and that research has inspired much of our work to make sure we not only have clear policies --

>> did you quantify it or not? did you do research to estimate, to count, how many teenage girls have taken their lives because of your product?

mr. mosseri: senator, we do research to understand problems. in the case of suicide, to make sure we take down suicide-related content from the platform, and to connect anyone who looks like they are at threat of hurting themselves.

>> how did you change your policies as a result of this research to protect young girls?

mr. mosseri: senator, i appreciate the question. we use research to change not only our policies but our product on a regular basis. restrict allows you to protect yourself when someone is harassing you, because we learned teens struggle during moments of transition. on suicide and self-injury, we learned we have to be incredibly careful, because often teens suffering from these issues use instagram to find support, and we need to make sure they can find that support and talk about recovery.

>> big tech loves to use eloquent phrases about bringing people together, but the simple reality, and why so many americans distrust big tech, is you make money the more people are on your product. the more people are engaged in viewing content that is harmful, the more you are making money. when your colleagues have been asked the same question -- as a result of this research, what policies did you change? -- this committee has been unable to get a straight answer about what is different. and the reason for that is, if you changed policies to reduce the eyeballs, you would make less money. why is that inference not correct?

mr. mosseri: senator, if people don't feel safe on our platform, if they don't feel good about the time they spend on our platform, they will use other services. competition has never been stronger. i have to believe over the long run it is not only incredibly important that we keep people safe, but that they feel good.

>> my time has expired, but i want to make sure of the commitment you have made. as i understand it, you have committed to providing this committee with the raw data from the research you did on users of your product, in particular on body image issues and tendencies toward suicide, and also with the powerpoint presentations that memorialized that raw data. is that correct?

mr. mosseri: senator, i'm committing to do everything i can do --

>> is there a reason you can't do everything i just said?

mr. mosseri: the challenge on the data is that in many cases we no longer have it.

>> how about the powerpoint presentations?

mr. mosseri: i think the most responsible thing to do is to provide access to data to external researchers.

>> we would like both. is there a reason you are hiding the powerpoint presentations? maximum transparency would be showing the presentations that were prepared for you. presumably you had some reason to trust them, because they were prepared for your consumption.

mr. mosseri: i believe you have the presentations, which is why we are focused on the data. we think any researcher should be able to draw their own conclusions based on the raw data; that is the most important part of the process. unfortunately, much of the data we no longer have, which is why i am very committed to making sure we can allow access to meaningful engagement data to researchers outside of the company, to focus specifically on the effects of social media on well-being, and i'm calling for the rest of the industry to do the same.

>> your commitment is to provide all of the data you have and the powerpoint presentations.

mr. mosseri: our commitment is to provide meaningful access to data based on what researchers will need, because i think that is the most responsible thing for us to do.

>> we are requesting it right now.

mr. mosseri: to do a study on the effects of social media on well-being, you would have to design the study. i would like to talk to -- we have worked with pew, with harvard -- i would like to talk to the researchers and understand what specific data they would like access to. we can't just provide all the data we have; that's an untenable thing to do.

>> the data that was the basis for the study quoted in the wall street journal report -- that is the data i am asking about. and the presentations that summarized the conclusions of that data.

mr. mosseri: i would love to provide that data. i'm trying to find out if there is any way we can provide it in a safe way -- i think that's important, and i have been working on that. i do not want to overpromise and underdeliver, which is why i am more focused on making sure researchers have access to data going forward.

>> i just want to say, mr. mosseri, the datasets are not enough. this answer is, in my view, completely unsatisfactory. we want the studies, we want the research, we want the surveys. the whistleblower has disclosed a lot of them, and the answer you have given, very respectfully, simply won't cut it. is it in your files? was it destroyed? we want to know about that, because information is absolutely the coin of the realm when we go about devising legislation. i must say, there is a disconnect here. when senator sullivan asked you about content relating to suicide or self-harm -- i think i'm quoting directly -- you said there isn't any. well, we have a teen account, with all of the protections on. we searched "slit wrists," and the results, i don't feel i can describe in this hearing. they are so graphic. that is within the past couple of days. i described to you an account that looked at eating disorders, and it attracted the same deluge of self-harm, anorexia coaches.
so, i just feel there is a real lack of connection between the reality of what is there and the testimony you have given today, which makes it hard to have you as a partner. maybe we need to have some kind of compulsory process. senator cruz and i don't always agree, but on this point, on the need for information, i think you have heard here the bipartisan call for a reality check and for access. the fact that this content continues to exist on the site despite your denials, i think, really is hard to accept. instagram's suggestion as a solution is to nudge teens. it takes more than a nudge. i'm well beyond the teenage years with my four children, but if you said to a teen who was on instagram, fixated on eating disorders, why don't you try snorkeling in the bahamas, that just won't work. instagram has a real asymmetric power. it drives teens in a certain direction, and then makes it very difficult for the teen, once in a dark place, to find light again and to get out of it. my question to you is, don't we need enforceable standards set by an independent authority -- not an industry body -- objective, independent researchers, with full access to your algorithm? will you commit to support full disclosure of your algorithms and a commitment to an independent authority?

mr. mosseri: senator, directionally we are very aligned. we agree on the importance of transparency.

>> then you would make available all of the studies like the ones frances haugen presented to us.

mr. mosseri: i'm confident we are more transparent than any other tech company in the industry.

>> that's a pretty low bar. you are in the gutter in terms of transparency, because they committed to make available their algorithms, but only after we pressed them to do it. we are still awaiting full compliance.

mr. mosseri: we have been publishing research for years. we will publish 100 things this year alone.
i believe there is an immense amount of data in our quarterly reports. i believe we will start having them audited by ernst & young starting this quarter. i believe that our ad library provides more transparency than any other business in any industry, tech or otherwise. yes, i believe there is more to do; yes, i believe in federal legislation; yes, i believe policymakers should be actively involved, and i'm looking forward to having our teams work with yours on shaping what that looks like. directionally, we believe strongly in transparency and accountability. i am not familiar with every provision in that act.

>> if you believe what you have testified, you would say, yes, i support the act.

mr. mosseri: senator, respectfully, i don't think it would be appropriate for me to commit to something that i have not read in full, but i really do want our team to work with yours. as i said, we are calling for industrywide regulation. we believe it's important. it is why we are having this hearing, and it's why i appreciate these questions even though they are difficult, because we believe there is nothing more important than keeping teens safe online.

>> do you support a prohibition on advertising and marketing to teens for products that are illegal for them to consume?

mr. mosseri: senator, i believe we already prohibit that. in the case of those under 18, we don't allow ads for tobacco to any age, and we don't allow things like gambling, or alcohol for those under 21.

>> would you support legally enforceable prohibitions where you could be held liable?

mr. mosseri: senator, we support industry standards and accountability. part of the standards, as i call out in my testimony, include age-appropriate design, which will include content rules.

>> if you host child sexual abuse material, should the victims be able to sue you?

mr. mosseri: child exploitation is an incredibly serious issue. an earlier senator mentioned how we collaborate on this.
i believe we will continue to invest more than anyone else in this space, and i believe federal regulation is the best form of accountability, with enforcement.

>> i will turn to senator blackburn. i have a few more questions.

>> on the question of fact, i wanted to know if you report traffickers, and if that is important to do, you need to come back to that. you mentioned referring children to local authorities; you need to let us know how many we are talking about -- the hundreds, the thousands -- give us those numbers so we have that data. also, i would like to know how long you hold the data on your research. you mentioned you can give us the data and then say you may not have some of this data; we need to know how long you are holding this data on minors, those children you are data mining. you mentioned in response to senator lee's question that you hoped you did not sound callous. sir, you did sound callous. every single life matters. this is why we need this research. you basically give teen girls no recourse if they get into a dark spot using instagram. we have a lot of parents who come to us. this is why we are doing these hearings; parents are concerned. i asked you yesterday when we talked if you ever talked to these parents whose children have taken their lives or have ended up having to have mental health services because of what they have encountered, and you said yes, you do. i want to give you one minute to speak to parents who are struggling, whose children have attempted suicide, or maybe some of them have taken their lives. take the next minute and speak directly to these parents. as i told you, i talked to a lot of parents; they never heard one word from instagram or facebook or meta, and they are struggling with this. senator cantwell brought this up to you. for the next 60 seconds, the floor is yours.
speak to these parents, because we are talking to people that have never had any kind of response from instagram, and you have broken these children's lives, and you have broken these parents' hearts. the floor is yours.

mr. mosseri: thank you, senator. i'm a father of three. any parent who's lost a child or had a child hurt themselves -- i can't begin to imagine what that would be like with one of my three boys. as the head of instagram, it's my responsibility to do all i can to keep people safe. i have been committed to that for years, and i will continue to be. whether or not we invest more than every other company doesn't matter for any individual; if any individual harms themselves or has a negative experience, that is something i take seriously. i talked a lot about parental controls. i do believe a parent knows what is best for their child, but i also know a lot of parents are busy. i have three kids and i have a lot of support; i can't imagine having four kids or being a single parent. i do not want to rely on parental controls. i think it's incredibly important the experience is safe and appropriate for your age no matter what. if you have the time and interest, i also think as a parent you have the right to be able to understand what your kids are doing online, and you should have control to shape that experience and do what is best for them. and if you don't have the time, that is ok. it's my responsibility to do all i can to help keep young people safe, and anyone who uses our platform.

>> we are telling you children have inflicted self-harm; they are getting information that is destroying their young lives. and we are asking you, have some empathy and take some responsibility. it seems as if you just can't get on that path. we're going to continue to work on this issue. i wish that your response had been more empathetic.

>> i just have one more question, and i think we are going to make the deadline. we want to be respectful of your time.
i understand that it is important to have your internal discussions and debates, as any company does, but the studies and research are really important for parents to make decisions. i am reminded of some work i did when i was state attorney general in connecticut. we were one of the first states, with the help of a company in connecticut, to require warnings about small parts on toys, which i urged the legislature to do, and then the industry challenged that labeling. we won against challenges based on the commerce clause and other constitutional claims, and the supreme court denied cert. the industry then decided it wanted a federal standard, because it did not want to deal with state-by-state requirements. the point was, the law required companies to disclose risk. it encouraged them to compete over values that were positive and promoted safety. that is the kind of competition we need in your industry. as senator klobuchar mentioned earlier, we have been working on antitrust measures. some of us want to work on safety, on disclosure. sunlight is a disinfectant. would instagram support legal requirements on social media platforms to provide public disclosures about the risk of harm in content?

mr. mosseri: i would support federal legislation around the transparency of data, access to data for researchers, and around the prevalence of content problems. i think all of those are ways parents, or anyone, can get a sense for what a platform is doing and what its effects on people are. i believe deeply that transparency is important, which is why i'm confident we will continue to lead the industry on being incredibly transparent about what happens on our platform.

>> you have said repeatedly that you are "directionally" in favor of one thing or another. i find that term incomplete when it isn't accompanied by specific commitments, a yes or a no.
moving forward "directionally" without specifics -- the baby steps that you have suggested so far, very respectfully, are underwhelming. a nudge? a break? that isn't going to wean kids from the addictive effects. there is no question there are addictive effects of your platform. i think you will sense on this committee a pretty strong determination to do something well beyond what you have indicated you have in mind, and that is the reason i think self-policing based on trust is no longer a viable solution. we thank you for being here today. we're going to continue the effort to develop legislation, many of us working with senator blackburn. this hearing record will remain open for two weeks. if you feel you want to supplement any of your answers, or if my colleagues would like to submit questions for the record, do so by december 22. we ask that your responses be returned to the committee as quickly as possible. that concludes today's hearing. thank you for being here.

mr. mosseri: i appreciate your time.
