
Pressure and our willingness to fight, and to negotiate as if there is no fighting, if we are going to secure the release of these hostages alive and ultimately bring peace to Israel. We are going to leave the last few moments of this to hear the CEOs testifying on child exploitation. We have the CEO of Meta. We also have Evan Spiegel. This is live coverage on C-SPAN3. Members of the Senate Judiciary Committee are arriving here, along with the different CEOs, to talk about child sexual exploitation. The hearing comes amid growing concerns that social media is harming America's youth. The story notes that the U.S. Surgeon General issued a public health advisory saying that social media use presents a profound risk of harm for teen and kids' mental health. In written testimony, Meta CEO Mark Zuckerberg announced that Meta has introduced controls for how long children can use its services. We will hear from him and the CEOs of Snap, TikTok, Discord, and X in a few minutes. Now speaking is the chairman. The Senate Judiciary Committee will come to order. I thank all those in attendance. I want to preface my remarks by saying I have been in Congress for a few years. Senator Graham has as well. If you do not believe this is an idea whose time has come, take a look at the turnout here. Today, the Senate Judiciary Committee will continue its work on an issue on the mind of most American families: how to keep our kids safe from sexual exploitation and harm in the internet age. Online child sexual exploitation includes the use of online platforms to target and groom children and the production and endless distribution of child sexual abuse material, which can haunt victims for their entire lives and in some cases take their lives. Everyone will agree this conduct is abhorrent. I would like to turn to a video to hear directly from the victims and the survivors about the impact these crimes have had on them. I was sexually exploited on Facebook. I was sexually exploited on Instagram. I was sexually exploited on X. This is my daughter, Olivia. Look at how beautiful Miriam is. My son Riley died from suicide after being sexually exploited on Facebook. The child that gets exploited is never the same ever again. I reported this issue numerous times, and it took over a decade before anyone helped me. You may be able to tell I am using a green screen. Why is that? In the internet world, my past abusers can contact me. Fans of my abuse material as a child can find me and contact me. As a 17-year-old child, after being extorted for four consecutive years, I was strong enough to resist any more pictures, but there are dozens more who are not. We got a phone call to find my son was in his room and suicidal. He was only 13 years old at the time. He and a friend had been exploited online and trafficked. My son reached out to Twitter, now X. The response was: thank you for reaching out. We reviewed the content and did not find a violation of our policy, so no action will be taken at this time. How many more kids like Matthew? Like Olivia? Like Riley? How many more kids will suffer because of social media? Big tech failed to protect my child from sexual exploitation. Big tech failed to protect me from sexual exploitation. And we need Congress to do something for our children and protect them. It is not too late to do something about it. Online child exploitation is a crisis in America. In 2013, the National Center for Missing and Exploited Children, known as NCMEC, received 1,080 cyber tips a day.
Just 10 years later, the number of cyber tips has risen to 100,000 reports a day. That is 100,000 daily reports of child sexual abuse. In recent years, we have also seen an explosion in so-called financial sextortion, in which a person tricks a minor into sending photos or videos and threatens to release them unless the victim sends money. In 2021, NCMEC received a total of 129 reports of sextortion. 2021. In 2023, through the end of October alone, that number skyrocketed to more than 22,000. More than a dozen children have died by suicide after becoming victims of this crime. This disturbing growth in child sexual exploitation is driven by one thing: changes in technology. In 1996, the world's best-selling cell phone was the Motorola StarTAC. While groundbreaking at the time, the clamshell-style cell phone was not much different from a traditional phone. It allowed users to receive calls and send text messages, but that was about it. Fast forward to today. Smartphones are in the pockets of seemingly every man, woman, and teenager on the planet. Today's smartphones allow users to make calls and send texts, but they can also take photos and videos, support live streaming, and offer countless apps. With a touch of your finger, that smartphone can entertain and inform you, and it can also become a back alley where the lives of your children are damaged and destroyed. These apps have changed the ways we live, work, and play. But as investigations have detailed, social media and messaging apps have also given predators powerful new tools to sexually exploit children. Their carefully crafted algorithms can be a more powerful force in the lives of our children than even the best intentions of a parent. Discord has been used to groom, abduct, and abuse children. Meta's Instagram has helped promote a network of pedophiles. Snapchat's disappearing messages have been co-opted by criminals who financially extort young victims. TikTok has become a, quote, platform of choice for predators to access, engage, and groom children for abuse. And the prevalence of CSAM on X has grown as the company has gutted its safety workforce. Today, we will hear from the CEOs of those companies. Their companies have not only contributed to this crisis; they are responsible for many of the dangers our children face online. Their design choices, their failures to adequately invest in trust and safety, and their constant pursuit of profit over basic safety have put all our kids and grandkids at risk. Coincidentally, coincidentally, several of these companies implemented commonsense child safety improvements within the last week, days before their CEOs would have to justify their lack of action before this committee. But the tech industry alone is not to blame for the situation we are in. Those of us in Congress need to look in the mirror. In 1996, the same year the Motorola StarTAC was flying off shelves and years before social media went mainstream, we passed Section 230 of the Communications Decency Act. This law immunized the then-fledgling internet platforms from liability for user-generated content. Interesting. Only one other industry in America has immunity from civil liability. We will leave that for another day. For years, Section 230 has remained largely unchanged, allowing big tech to grow into the biggest industry in the history of capitalism without fear of liability for unsafe practices. That has to change.
Over the past year, this committee has unanimously reported five bills that would finally hold tech companies accountable for child sexual exploitation on their platforms. Unanimous. Take a look at the composition of the Senate Judiciary Committee and imagine, if you will: if there is anything we can agree on unanimously, these five bills are the object of that agreement. One of these bills is the Stop CSAM Act. It would let victims sue online providers who aid or abet child exploitation or who host or store CSAM. The stand against child sexual exploitation is bipartisan and absolutely necessary. Let this be a call to action that we need to get kids' online safety legislation to the President's desk. I now turn to the Ranking Member, Senator Graham. Thank you, Mr. Chairman. This committee will answer the call, every one of its members ready to work with you and our Democratic colleagues on this committee to prove to the American people that while Washington is certainly broken, there is a ray of hope, and it is here. It lies with your children. After years of working on this issue with you and others, I have come to conclude the following: social media companies, as they are currently designed and operated, are dangerous products. They are destroying lives, threatening democracy itself. These companies must be reined in, or the worst is yet to come. Brandon Guffey is a Republican state representative from South Carolina, from the Rock Hill area. To all the victims who came and showed us photos of your loved ones: don't quit. It is working. You are making a difference. Through you, we will get to where we need to go so other people will not have to show a photo of their family. The damage to your family has been done. Hopefully, we can take your pain and turn it into something positive so no one else has to hold up a sign. Mr. Guffey's son Gavin got online on Instagram and was tricked by a group in Nigeria that put up a young lady posing to be his girlfriend, and, as things go at that stage in life, he gave her some photos, compromising sexual photos. It turned out that she was part of an extortion group in Nigeria. They threatened the young man: if you do not give us money, we are going to expose these photos. He gave them money, but it was not enough. They kept threatening, and he killed himself. They then threatened Mr. Guffey and his other son. These were criminals by any known definition. Mr. Zuckerberg, you and the companies before us, I know you do not mean for it to be so, but you have blood on your hands. You have a product. You have a product that is killing people. When we had cigarettes killing people, we did something about it. Maybe not enough. You want to talk about guns, we have the ATF. Nothing here. There is not a thing anyone can do about it. You can't be sued. Now, Senator Blumenthal and Senator Blackburn have been the dynamic duo here. They found emails from your company where they warned you about this stuff, and you decided not to hire 45 people who could do a better job of policing this. So the bottom line is you cannot be sued. You should be. And these emails would be great for punitive damages, but the courtroom is closed to every American abused by all the companies in front of you. Of all the people in America we could give blanket liability protection to, this would be the last group. It is now time to repeal Section 230. This committee is made up of the ideologically most different people you could find. We have come together through your leadership, Mr.
Chairman, to pass five bills to deal with the problem of the exploitation of children. I will talk about them in depth in a little bit. The bottom line is all these bills have met the same fate. They go nowhere. They leave the committee and they die. Now, there is another approach. What do you do with dangerous products? You either allow lawsuits, pass statutory protections for consumers, or you have a commission of sorts to regulate the industry in question. They take your license away. They fine you. None of that exists here. We live in an America in 2024 where there is no regulatory body dealing with the most profitable, biggest companies in the history of the world. They cannot be sued, and there is not one meaningful law on the books protecting the American consumer. Other than that, we are in a good spot. So here is what I think is going to happen. I think after this hearing today, we are going to put a lot of pressure on our colleagues, on the leadership of the Republican and Democratic Senate, to let these bills hit the floor for a vote. And I am going to go down, starting in a couple of weeks, and make unanimous consent requests: take up the Stop CSAM Act, take up all the bills. Become famous. I am going to give you a chance to become famous. Now, Elizabeth Warren and Lindsey Graham have almost nothing in common. I promised I would say that publicly. The only thing worse than me doing a bill with Elizabeth Warren is her doing a bill with me. We have sort of parked that, because Elizabeth and I see an abuse here that needs to be dealt with. Senator Durbin and I have different political philosophies, but I appreciate what you have done on this committee. You have been great, so to all my Democratic colleagues, thank you very, very much. To all of my Republican colleagues, thank you very, very much. Save the applause for when we get a result. This is all talk right now. But there will come a day, if we keep pressing, when we get the right answer for the American people. What is that answer? Accountability. Now, these products have an upside. They have enriched our lives in many ways. Mr. Zuckerberg, you have created products I use. The idea, I think, when you first came up with it, was to be able to talk to your friends and family, to have a place where you can tell your friends and family about what is going on in your life. There is an upside here. But the dark side has to be dealt with. It is now time to deal with the dark side, because people have taken your idea and turned it into a nightmare for the American people. They have turned it into a nightmare for the world at large. TikTok. We had a great discussion about how maybe Larry Ellison could protect American data from Chinese Communist influence. But TikTok's representative in Israel quit the company, because TikTok is being used in a way to basically destroy the Jewish state. This is not just about individuals. I worry that in 2024, our democracy will be attacked again through these platforms by foreign actors. We are exposed. And AI is just starting. So to my colleagues, we are here for a reason. This committee has a history of being tough, but also of doing things that need to be done. This committee has risen to the occasion. There is more that we can do, but to the members of this committee, let's insist that our colleagues rise to the occasion also. In the 118th Congress, we have the votes that can fix this problem.
All you can do is cast your vote at the end of the day, but you can urge the system to require others to cast their vote. Mr. Chairman, I will continue to work with you and everybody on this committee to have a day of reckoning on the floor of the United States Senate. Thank you, Senator Graham. Today, we welcome five witnesses, whom I will introduce: Jason Citron, the CEO of Discord, Inc.; Mark Zuckerberg, the founder and CEO of Meta; Evan Spiegel, the co-founder and CEO of Snap Inc. I will note for the record that Mr. Zuckerberg and Mr. Chew are appearing voluntarily. I am disappointed that our other witnesses have not done that. The others are here pursuant to subpoenas, and Mr. Citron only accepted service after U.S. Marshals were sent to Discord's headquarters at taxpayer expense. I hope this is not a sign of a lack of commitment to addressing the serious issue before us. After the witnesses are sworn in, each witness will have five minutes to make an opening statement. Senators will then ask questions in rounds of seven minutes each. I expect to take a short break at some point during questioning to allow the witnesses to stretch their legs. If anyone is in need of a break at any time, please let my staff know. Before I turn to the witnesses, I would also like to take a moment to acknowledge that this hearing has gathered a lot of attention, as we expected. We have a large audience, the largest I have seen in this room. I want to make clear, as with other Judiciary Committee hearings, we ask people to behave appropriately. I know there is high emotion in this room, for justifiable reasons, but I ask you to please follow the traditions of the committee. That means no standing, shouting, chanting, or applauding witnesses. Disruptions will not be tolerated. Anyone who does disrupt the hearing will be asked to leave. The witnesses are here today to address a serious topic. We want to hear what they have to say. I thank you for your cooperation. Can all the witnesses please stand to be sworn in. Do you affirm that the testimony you are about to give before the committee will be the truth, the whole truth, and nothing but the truth, so help you God? Let the record show that all witnesses have answered in the affirmative. Mr. Citron, please proceed with your opening statement. Good morning. My name is Jason Citron, and I am the co-founder and the CEO of Discord. We are an American company with about 800 employees living and working in 33 states. Today, Discord has grown to more than 150 million monthly active users. Discord is a communications platform where friends hang out and talk online about shared interests, from fantasy sports to writing music to video games. I have been playing video games since I was five years old, and as a kid, that is how I had fun and found friendship. Some of my fondest memories are of playing video games with friends. We built Discord so that anyone can build friendships playing video games, from Minecraft to Wordle and everything in between. Games have always brought us together, and Discord makes that happen today. Discord is one of the many services that have revolutionized how we communicate with each other in the different moments of our lives: iMessage, Zoom, Gmail, and on and on. They create communities, accelerate commerce, healthcare, and education. Just like with all technology and tools, there are people who exploit and abuse our platforms for immoral and illegal purposes.
All of us here on the panel today, and throughout the tech industry, have a solemn and urgent responsibility to ensure that everyone who uses our platforms is protected from these criminals, both online and off. Discord has a special responsibility to do that, because a lot of our users are young people. More than 60% of our active users are between the ages of 13 and 24. That is why safety is built into everything we do and is essential to our mission and our business. And, most of all, this is deeply personal. I am a dad with two kids. I want Discord to be a product that they use and love, and I want them to be safe on Discord. I want them to be proud of me for helping to bring this product to the world. That is why I am pleased to be here today to discuss the important topic of the online safety of minors. I have submitted written testimony that provides a comprehensive overview of our safety programs. Here are a few examples of how we protect and empower young people. First, we put our money into safety. The tech sector has a reputation for larger companies buying smaller ones to increase user numbers and boost financial results. But the largest acquisition we have ever made at Discord was a company called Sentropy. It did not help us expand our market share; it uses AI to help us identify, ban, and report criminals and bad behavior. It has actually lowered our user count by getting rid of bad actors. Second, you have heard of end-to-end encryption, which blocks anyone, including the platform itself, from seeing users' communications. It is a feature on dozens of platforms, but not on Discord. That is a choice we have made. We do not believe we can fulfill our safety obligations if the messages of teens are fully encrypted, because encryption would block our ability to investigate a serious situation and, when appropriate, report it to law enforcement. Third, we have a zero-tolerance policy on child sexual abuse material, or CSAM. We scan images uploaded to Discord to detect and block the sharing of this abhorrent material. We have also built an innovative tool, Teen Safety Assist, that blurs sensitive images and helps teens block and report unwelcome conversations. We are also developing new semantic hashing technology, called CLIP, for detecting novel forms of CSAM. Finally, we recognize that improving online safety requires all of us to work together, so we partner with nonprofits, law enforcement, and our tech colleagues to stay ahead of the curve in protecting young people online. We want to be the platform that empowers our users to have better online experiences, to build true connections and genuine friendships, and to have fun. Senators, I sincerely hope today is the beginning of an ongoing dialogue that results in real improvements in online safety. I look forward to your questions and to helping the committee learn more about Discord. Thank you, Mr. Citron. And Mr. Zuckerberg? Members of the committee, every day teens and young people do amazing things on our services. They use our apps to create new things, express themselves, explore the world around them, and feel more connected to the people they care about. Overall, teens tell us this is a positive part of their lives, but some face challenges online, so we work hard to reduce the potential harms. Being a parent is one of the hardest jobs in the world.
Technology gives us new ways to communicate with our kids and connect to their lives, but it can also make parenting more complicated, and it is important to me that our services are positive for everyone who uses them. We are on the side of parents everywhere working hard to raise their kids. Over the last eight years, we have built more than 30 different tools and features so parents can set time limits for their teens on our apps, see who they are following, or report someone for bullying. For teens, we have added nudges to remind them when they have been using Instagram for a while or when it is late and they should go to sleep, and we have added ways to hide words or people without those people finding out. By default, accounts for those under 16 are set to private, have the most restrictive content settings, and cannot be messaged by adults they do not follow or are not connected to. With so much of our lives spent on mobile devices and social media, it is important to look into the effects on teen mental health and well-being. I take this very seriously. Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes. A recent National Academies of Sciences report evaluated over 300 studies and found that the research, quote, did not support the conclusion that social media causes changes in adolescent mental health at the population level, end quote. It also suggested that social media can provide significant positive benefits when young people use it to express themselves and connect with others. Still, we will use this research to inform our road map. Keeping young people safe online has been a challenge since the internet began, and as criminals evolve their tactics, we have to evolve our defenses too. We work to find bad actors and bring them to justice, but the difficult reality is that no matter how much we invest or how effective our tools are, there is always more to learn and more improvements to make. We remain ready to work with members of this committee and parents to make the internet safer for everyone. I am proud of the work that our teams do to improve online child safety on our services and across the entire internet. We have around 40,000 people overall working on safety and security, and we have invested more than $20 billion since 2016, including around $5 billion in the last year alone. We have many teams dedicated to child safety, and we lead the industry in a lot of the areas we are discussing today. We have built technology to tackle the worst online risks and share it to help the whole industry get better, like Project Lantern, which helps companies share data about people who break child safety rules, and we are founding members of Take It Down, which helps prevent nude images of young people from being spread online. We also go beyond legal requirements and use sophisticated technology to proactively discover abusive material, and as a result, we find and report more inappropriate content than anyone else in the industry. As the National Center for Missing and Exploited Children put it this week, Meta goes above and beyond to make sure there are no portions of its network where this type of activity occurs. I hope we can have a substantive discussion today about legislation that delivers what parents say they want: a clear system for age verification and control over what apps their kids are using. Three out of four parents want app store age verification, and four out of five want parental approval whenever teens download apps.
We support this. Parents should have the final say on what apps are appropriate for their children and should not have to upload their ID every time. That is what app stores are for. We also support setting industry standards on age-appropriate content and limiting signals for advertising to teens to age and location, not behavior. At the end of the day, we want everyone who uses our services to have safe and positive experiences. Before I wrap up, I want to recognize the families who are here today who have lost a loved one or lived through some terrible things that no family should have to endure. These issues are important for every parent on every platform. I am committed to continuing to work in these areas, and I hope we can make progress today. Thank you. Mr. Spiegel? Chair Durbin, Ranking Member Graham, and members of the committee, thank you for moving forward important legislation to protect children online. I am Evan Spiegel, the co-founder and CEO of Snap. We created Snapchat, an online service used by more than 800 million people worldwide to communicate with their friends and families. I know many of you have been working to protect children online since before Snapchat was created, and we are grateful for your long-term dedication to this cause and your willingness to work together to help keep our communities safe. I want to acknowledge the survivors of online harms and the families who are here today who have suffered the loss of a loved one. Words cannot begin to express the profound sorrow I feel that a service we made to bring people happiness and joy has been used to cause harm. I want to be clear: we understand our responsibility to keep our community safe. I also want to recognize the many families who have worked to raise awareness of these issues, push for change, and collaborate with lawmakers on legislation like the Cooper Davis Act, which can help save lives. I started building Snapchat with my co-founder Bobby Murphy when I was 20 years old. We designed Snapchat to solve some of the problems we had experienced online when we were teenagers. We did not have an alternative to social media. Pictures shared online were permanent and subject to popularity metrics, and it did not feel very good. We built Snapchat differently because we wanted a way to communicate that was fast, fun, and private. A picture is worth a thousand words, so people communicate with images and videos. We do not have public likes or comments when you share your story with friends. Snapchat is private by default, meaning people have to opt in and choose who can contact them. When we built Snapchat, we made images delete by default. Like prior generations who enjoyed the privacy of phone calls, which are not recorded, our generation has enjoyed sharing moments that may not be picture-perfect but can instead convey emotion without permanence. Even though images and videos are deleted by default, we let everyone know that they can be saved by the recipient. When we take action on illegal content, we also retain the evidence for an extended period, which allows us to support law enforcement and hold criminals accountable. To help prevent the spread of harmful content on Snapchat, we use a combination of automated processes and human review. We apply our content rules consistently and fairly across all accounts. We audit samples of our enforcement actions for quality to verify that we are getting it right.
We also proactively scan for known child sexual abuse material, drug-related content, and other kinds of harmful content, and we deactivate it, preserve the evidence, and report it to the relevant authorities for further action. Last year, we made 690,000 reports to NCMEC, leading to more than 1,000 arrests. We also removed 2.2 million pieces of drug-related content and blocked 705,000 associated accounts. Even with our strict privacy settings, content moderation efforts, and law enforcement collaboration, bad things can still happen when people use online services. That is why we believe that people under the age of 13 are not ready to communicate on Snapchat. We strongly encourage parents to use the device-level parental controls on iPhone and Android. We use them in our own household, and my wife approves every app that our 13-year-old downloads. For parents who want more visibility and control, we have built Family Center, where you can review who your teen is talking to, review privacy settings, and set limits. We have worked for years with members of this committee on the Kids Online Safety Act and the Cooper Davis Act, which we are happy to support. We want to support broader legislation. No legislation is perfect, but some rules of the road are better than none. Much of the work we do to protect people on our service would not be possible without the support of our industry partners, government, nonprofit organizations and NGOs, and, in particular, law enforcement and first responders who have committed their lives to keeping people safe. I am extraordinarily grateful for their efforts to prevent criminals from using online services to perpetrate their crimes. I feel a deep obligation to give back and to make a positive difference, and I am grateful to be here today as part of this vitally important democratic process. Members of the committee, I give you my commitment that we will be part of the solution for online safety. We will be honest about our shortcomings, and we will work continuously to improve. Thank you, and I look forward to your questions. Thank you, Mr. Spiegel. Mr. Chew? Members of the committee, I appreciate the opportunity to appear before you today. My name is Shou Chew, and I am the CEO of TikTok, an online community of more than 1 billion people worldwide, including well over 170 million Americans who use our app every month to create, to share, and to discover. Now, although the average age on TikTok in the U.S. is over 30, we recognize that special safeguards are required to protect minors, especially when it comes to combating all forms of CSAM. As a father of three young children myself, I know that the issues we are discussing today are horrific and the nightmare of every parent. I am proud of our efforts to address the threats to young people online, from our commitment to protecting them, to our industry-leading policies, use of innovative technology, and significant ongoing investment in trust and safety to achieve this goal. TikTok is vigilant about enforcing its 13-and-up policy and offers an experience for teens that is much more restrictive than the one you and I would have as adults. We make careful product design choices to help make our app inhospitable to those seeking to harm teens. Let me give you a few examples of long-standing policies unique to TikTok; we did not adopt them last week. First, direct messaging is not available to any users under the age of 16. Second, accounts for people under 16 are automatically set to private, along with their content.
Furthermore, their content cannot be downloaded and will not be recommended to people they do not know. Third, every teen under 18 has a screen time limit automatically set to 60 minutes. And, fourth, only people 18 and above are allowed to use our livestream feature. I am proud to say that TikTok was among the first to empower parents to supervise their teens on our app with our family pairing tools. This includes setting screen time limits, filtering out content, and more. We made these choices after consulting doctors and safety experts who understand the unique stages of teenage development, to ensure we have the appropriate safeguards to prevent harm and minimize risk. Now, safety is one of the core priorities that defines TikTok under my leadership. We currently have more than 40,000 trust and safety professionals working to protect our community globally, and we expect to invest more than $2 billion in trust and safety efforts this year alone, with a significant part of that in our U.S. operations. Our robust community guidelines strictly prohibit content or behavior that puts teenagers at risk of exploitation or harm, and we vigorously enforce them. Our technology helps us quickly identify potential CSAM and other material that breaks our rules, and it automatically removes the content or elevates it to safety professionals for further review. We also moderate direct messages for CSAM and related material and use third-party tools like PhotoDNA and Take It Down to combat CSAM and prevent that content from being uploaded to our platform. We continually meet with parents, teachers, and teens; in fact, I sat down with a group just a few days ago. We use their insights to strengthen protections on our platform, and we also work with leading groups like the Technology Coalition. The steps we are taking to protect teens are a critical part of our larger trust and safety work, as we continue our voluntary and unprecedented efforts to build a safe and secure data environment for U.S. users, ensuring that our platform remains free from outside manipulation, and implementing safeguards on our content recommendation and moderation tools. Keeping teens safe online requires collaborative effort as well as collective action. We share the commitment to protecting people online, and we welcome the opportunity to work with you on legislation to achieve this goal. Our commitment is ongoing and unwavering, because there is no finish line when it comes to protecting teens. Thank you for your time and consideration today. I am happy to answer your questions. Thanks, Mr. Chew. Ms. Yaccarino? Thank you. Ms. Yaccarino, can you check if your microphone is on? How is that? Maybe that is better. My apologies. Let me start over. Chair Durbin, Ranking Member Graham, and esteemed members of the committee, thank you for the opportunity to discuss X's work to protect the safety of minors online. Today's hearing is titled a crisis; it calls for immediate action. As a mother, this is personal, and I share the sense of urgency. X is an entirely new company, an indispensable platform for the world and for democracy. You have my personal commitment that X will be active and a part of this solution. While I joined X only in June of 2023, I bring a history of working together with governments, advocates, and NGOs to harness the power of media to protect people. Before I joined, I was struck by the leadership steps this new company was taking to protect children. X is not the platform of choice for children and teens.
We do not have a line of business dedicated to children. Children under the age of 13 are not allowed to open an account. Less than 1% of U.S. users on X are between the ages of 13 and 17, and those users are automatically set to a private default setting and cannot accept a message from anyone they do not approve. In the last 14 months, X has made material changes to protect minors. Our policy is clear: X has zero tolerance toward any material that features or promotes child sexual exploitation. My written testimony details X's extensive policies on content and actions that are prohibited, including grooming and the blackmail of victims of CSE. We have also deployed more tools and technology to prevent bad actors from distributing, searching for, and engaging with CSE material. If CSE content is posted on X, we remove it. And, now, we also remove any account that engages with CSE content, whether it is real or computer-generated. Last year, X suspended 12.4 million accounts for violating our CSE policies. This is up from 2.3 million accounts that were removed by Twitter in 2022. In 2023, 850,000 reports were sent to NCMEC, including our first-ever auto-generated report. This is eight times more than what was reported by Twitter in 2022. We have changed our priorities. We have restructured our trust and safety teams to remain strong and agile. We are building a trust and safety center of excellence in Austin, Texas, to bring more agents in house to accelerate our impact. We are applying to the Technology Coalition's Project Lantern to make further industry-wide impact. We have also opened up our algorithms for increased transparency. We want America to lead in this solution. X commends the Senate for passing the REPORT Act, and we support the SHIELD Act. It is time for a federal standard to criminalize the sharing of non-consensual intimate material. We need to raise the standards across the entire internet ecosystem, especially for those tech companies that are not here today and not stepping up. X supports the Stop CSAM Act. The Kids Online Safety Act should continue to progress, and we will continue to engage with it to ensure it protects freedom of speech. There are two additional areas that require everyone's attention. First, as the daughter of a police officer, law enforcement must have the critical resources to bring these bad offenders to justice. Second, with artificial intelligence, offenders' tactics will continue to evolve and grow more sophisticated. Industry collaboration is imperative here. X believes that freedom of speech and platform safety can and must coexist. We agree that now is the time to act with urgency. Thank you. I look forward to answering your questions. Thank you very much, Ms. Yaccarino. Now we will go into rounds of questions, seven minutes each for the members. I would like to make note of your testimony, Ms. Yaccarino: you were the first social media company to publicly endorse the Stop CSAM Act. It is our honor, Chairman. Thank you for doing that. I will still be asking some probing questions, but let me get down to the bottom line here. I am going to focus on whether you should be civilly liable if you intentionally or knowingly host or store child sexual abuse materials, or, secondly, intentionally or knowingly promote or aid and abet a violation of child sexual exploitation laws. Is there anyone here who believes you should not be held civilly liable for that kind of conduct? Mr. Citron? Good morning, Chair. You know, we very much believe that this content is disgusting.
And that there are many things about the Stop CSAM Act bill that are very encouraging. We very much encourage adding more resources to the tip line and modernizing it by getting resources to NCMEC, and I am very open to having a conversation with you and your team. I would sure like to do that, because if you intentionally or knowingly host or store CSAM, I think you ought to be civilly liable. I cannot imagine anyone who would disagree with it. Yes. It is disgusting content. Mr. Spiegel, I want to tell you, I listened closely to your testimony here, and it has never been a secret that Snapchat is used to send sexually explicit images. In 2013, early in your company's history, you admitted this in an interview. Do you remember that interview? Senator, I do not recall the specific interview. You said, quote, you would go up to people and be like, you can try this application, you can send disappearing photos, and they would say, oh, for sexting? Senator, when we first created the application, it was called Picaboo, and the feedback we received from people was that they were using it to communicate visually; we later renamed the application Snapchat. As early as 2017, law enforcement identified Snapchat as the pedophile's go-to application. The case of a 12-year-old girl shows the danger. Over two and a half years, a predator sexually groomed her, sending her sexually explicit images and videos over Snapchat. The man admitted he only used Snapchat and not any other platforms because he, quote, knew the chats would go away. Did you and everyone else at Snap really fail to see that the platform was a perfect tool for sexual predators? Senator, that behavior is disgusting and reprehensible. We have reporting tools for people who have been harassed or have seen inappropriate content, and they can report it. We typically respond to those reports within 15 minutes so we can provide help. When L.W. sued, her case was dismissed under the Communications Decency Act. Do you have any doubt that, had Snap faced civil liability for facilitating sexual exploitation, the company would have implemented even better safeguards? Senator, we already work extensively to proactively detect this type of behavior. We make it very difficult for predators to find teens on Snapchat. There are no public friends lists, no public photos. When we recommend friends for teens, we make sure they have several mutual friends in common before making that recommendation. We believe that is important to preventing predators from misusing our platform. Mr. Citron, according to Discord's website, Discord takes a proactive and automated approach to safety only on servers with more than 200 members. Smaller servers rely on community moderators to define and enforce behavior. How do you defend an approach to safety that relies on groups of fewer than 200 sexual predators to report themselves for things like grooming, trading of CSAM, or sextortion? Chair, our goal is to get all of that content off of our platform and, ideally, prevent it from showing up in the first place, or prevent people from engaging in these kinds of horrific activities. We deploy a wide array of techniques that work across every surface on Discord. As I mentioned, we recently launched something called Teen Safety Assist, which is for teen users. It kind of acts like a buddy that lets them know if they are in a situation or talking with someone that may be inappropriate, so they can report that to us and block that user. Mr. Citron, if that were working, we would not be here today. This is an ongoing challenge for all of us.
That is why we are here today. 15% of our company is focused on trust and safety, and this is one of our top issues. That is more people than we have working on marketing and promoting the company. We take these issues very seriously. It is an ongoing challenge, and I look forward to collaborating with nonprofits to improve our approach. I certainly hope so. Mr. Chew, your organization, your business, is one of the more popular ones among children. Can you explain to us what you are doing in particular, and whether you have seen any evidence of CSAM in your business? Yes, Senator. We have a strong commitment to trust and safety. As I said in my opening statement, I intend to invest more than $2 billion in trust and safety this year alone. We have 40,000 safety professionals working on this topic. We have a specialized child safety team to help us identify horrific issues and material like the ones you have mentioned. If we identify any on our platform through detection, we will remove it and report it to the authorities. Why is it that TikTok is allowing children to be exploited into performing commercialized sex acts? Senator, I respectfully disagree with that characterization. Our livestreaming product is not for anyone below the age of 18. We have taken action to identify anyone who violates that, and we remove them from using that service. At this point, I am going to turn to my Ranking Member, Senator Graham. Thank you. Mr. Citron, you said we need to start a discussion. To be honest with you, we have been having this discussion for a very long time. We need to get a result, not a discussion. Do you agree with that? Ranking Member, I agree this is an issue that we have also been very focused on since we started our company in 2015. Are you familiar with the EARN IT Act? A little bit, yes. Do you support it? We — like — yes or no? We are not prepared to support it today. But do you support the CSAM Act, the Stop CSAM Act? Or the SHIELD Act? We believe that the CyberTipline — yes or no? We believe that the CyberTipline — I will take that to be a no. The Project Safe Childhood Act, do you support it? We believe that — I will take that as a no. The REPORT Act, do you support it? Ranking Member Graham, we look very much forward to having conversations with you and your team. Do you support removing Section 230 liability protections for social media companies? I believe that Section 230 is in need of an update. It is a very old law. Do you support removing it so people can sue if they are harmed? There are many downsides — thank you very much. So here you are. If you are waiting on these guys to solve the problem, you will die waiting. Mr. Zuckerberg — I am trying to be respectful here. The son of the representative from South Carolina got caught up in an extortion ring in Nigeria, using Instagram. He was shaken down, and he killed himself. It is terrible. No one should have to go through something like that. Do you think they should be allowed to sue you? I think that they can sue us. I think you should be able to — and you can't. So the bottom line here, folks, is that this committee is done with talking. We passed five bills unanimously, in their different ways. Look at who did this: Graham-Blumenthal, Durbin-Hawley, Klobuchar-Hawley. I mean, we have found common ground here that is just astonishing, and we have had hearing after hearing, Mr. Chairman, and the bottom line is I have come to conclude, gentlemen, that you are not going to support any of this.
Linda — how do you say your last name? Yaccarino. Do you support the EARN IT Act? We strongly support the — no. No. No. In English, do you support the EARN IT Act? Under it, you can actually lose your liability protection if you do not follow best business practices; you do not get it no matter what you do. So, for the members of the committee, it is now time to say that the people holding up the signs can sue on behalf of their loved ones. Nothing will change until the courtroom door is opened to victims of social media. What percentage is that of what you made last year? Senator, it is a significant and increasing investment. $2 billion is what percent of your revenue? Senator, we are not ready to share our financials in public. It is a lot — if you make $100 billion. If you tell us you are going to spend $2 billion, great. How much do you make? It is all about eyeballs. Well, our goal is to get eyeballs on you. And this is not just about children. The damage being done — do you realize, Mr. Chew, that your TikTok representative in Israel resigned yesterday? Senator, I am unaware of that. He said, I resigned from TikTok: we are living in a time in which our existence as Jews in Israel, and Israel itself, is under attack and in danger — from that platform and from Iranian-backed terror groups, including the Houthis in Yemen. Senator, I need to make it very clear that Hamas content and hate speech are not allowed on our platform. Why did he resign? Why did he quit? Senator, we — do you know why he quit? My question is, he quit, and I am sure he had a good job. He gave up a good job because he thinks your platform is being used to help people who want to destroy the Jewish state. And I am not saying that you want that. Mr. Zuckerberg, I do not think you, as an individual, want any of these harms, but I am saying the product you have created, with all its upsides, has a dark side. Mr. Citron, I am tired of talking and having discussions. We all know the answer. Here is the ultimate answer: stand by your product. Defend your practices. Open up the courthouse door. Until you do that, nothing will change. Until these people can be sued for the damage they are doing, it is all talk. I am a Republican who believes in free enterprise, but I also believe that every American who has been wronged should have somebody to go to, to complain. There is no commission to go to that can punish you. There is not one law on the books, because you oppose everything we do, and that has to stop. How do you expect people in the audience to believe that we are going to help their families if we do not have some system, or combination of systems, to hold these people accountable? For all the upside, the dark side is too great to live with. We do not need to live this way as Americans. Senator Klobuchar is next. Thank you very much, Chairman, and thank you, Ranking Member. I could not agree more. For too long, we have seen the social media companies turn a blind eye when kids joined these platforms. They have used algorithms and provided a venue, maybe not knowingly at first, for dealers to sell deadly drugs like fentanyl. The head of our Drug Enforcement Administration said they basically have been captured by the cartels in Mexico. I strongly support, first of all, the Stop CSAM Act, and nothing is going to change unless we open up the courtroom doors. I think the time for immunity is done, because I think money talks even stronger than we talk. Two of the five bills are my bills. One, with Senator Cornyn, is the SHIELD Act, and I do appreciate the support of X.
There have been over 20 suicides of kids attributed to online sextortion in the last year. For those parents and those families, this is about their own child, but it is also about making sure this does not happen to other children. I know, because I have talked to these parents. Bridget lost her teenage son after he took a fentanyl-laced pill that he purchased on the internet. This woman's son, Alexander, was only 14 when he died after taking a pill he did not know was actually fentanyl. We are starting a law enforcement campaign, going to schools with the sheriffs and law enforcement. One way to stop it is at the border and at points of entry, but we know that people who get fentanyl are getting it off these platforms. Meanwhile, social media platforms generated $11 billion in revenue in 2022 from advertising directed at children and teenagers, including nearly $2 billion in profits derived from users age 12 and under. When a Boeing airplane lost a door, nobody questioned the decision to ground the fleet of 700 airplanes. Why aren't we taking the same type of decisive action on the danger of these platforms when we know these kids are dying? We have bills that have passed through this committee — a committee incredibly diverse when it comes to political views — and they should go to the floor. We should do something about liability, and we should turn to some of the other issues that a number of us have worked on when it comes to the charges for app stores and when it comes to monopoly behavior. But I will stick with this today. Twelve fentanyl cases investigated over five months have direct ties to social media. Between 2012 and 2022, CyberTipline reports of online child exploitation increased from 415,000 to more than 32 million. And, as I noted, at least 20 victims died by suicide in sextortion cases. I will start with you, Mr. Citron. The SHIELD Act includes a provision that would help ensure protection and accountability for those who are threatened by these predators. Young kids take a picture and send it in, and they think they have a new girlfriend or boyfriend. It ruins their lives, and some kill themselves. Can you tell me why you are not supporting the SHIELD Act? We think it is important that teenagers have a safe experience on our platform. I think the portions that strengthen law enforcement and hold bad actors accountable — so you are saying you may support it? We would like to have conversations with you. We do welcome legislation and regulation. This is an important issue for our country, and we have been prioritizing safety — I am more interested in whether you support it. There has been so much talk and popcorn throwing, and I am tired of this. It has been 28 years since the internet, and we have not passed any of these bills. It is time to pass them. And the reason they have not passed is the power of your companies. Your words matter. Mr. Chew, I am a co-sponsor of the Stop CSAM Act along with the lead Republican, which, among other things, empowers victims by making it easier for them to have tech companies remove the material and related imagery from their platforms. Why do you not support this bill? We largely support it. I think the spirit of it is aligned with what we want to do, and if this legislation is passed, we will comply. Mr. Spiegel, we talked ahead of time. I appreciate your company's support for the Cooper Davis Act. It will allow law enforcement to do more when it comes to fentanyl. A teenager from Hastings — I mentioned his mother is here —
suffered from migraines; when he purchased what he thought was a Percocet, he got a counterfeit pill laced with fentanyl. As his mother said, all the hopes and dreams we had for him were erased in the blink of an eye, and no mother should have to bury her kid. Talk about why you support the Cooper Davis Act. We strongly support it, and we believe it will help the DEA go after the cartels and get dealers off the streets to save more lives. Do the others support that bill? Last, Mr. Zuckerberg: in 2021, the Wall Street Journal reported on internal Meta research documents asking, why do we care about tweens, and answering its own question by citing Meta's internal emails: they are a valuable but untapped audience. At a Commerce hearing, I asked Meta's head of safety why children ages 10 to 12 are so valuable, and she said, we do not knowingly attempt to recruit people who are not old enough to use our app. When the state attorneys general brought their case, they said that statement was not accurate. A few examples: she received an email from the Instagram research director saying they are investing in targeting young ages, 10 to 12. An employee said they are working to recruit Gen Alpha before they reach their teenage years. An internal email said, you agree that children under 13 will be critical for increasing the rate of acquisition when users turn 13. Square that with what I heard in the testimony at the Commerce hearing, that they were not being targeted, and, again, I would ask, as I asked the other witnesses, why your company does not support the Stop CSAM Act or the SHIELD Act. We had discussions internally about whether we should build a kids' version of Instagram. We have not moved forward with that, and we have no plans to do so. I cannot speak to the emails that you cited, but it sounds like internal deliberations. Overall, my position is that I agree with the goals. There are most things in these bills that I agree with; there are some things I would do differently. We want to give parents control over the experience, and I am happy to go into detail. I think these parents will tell you that this stuff — the parental controls — has not worked. They do not know what to do. It is hard, and that is why we are coming up with other solutions that we think are more helpful, including this idea of getting something going on liability. I believe, with the resources that you have, that you could do more than you are doing, or these parents would not be sitting behind you in this Senate hearing room. I do not think parents should have to upload an ID or prove they are a parent in every single app that their children use. The easier place to do this is in the app stores themselves. Apple already requires parental consent when a child makes a payment within an app. It should be trivial to pass a law requiring parents to have control, and to offer consent, any time a child downloads an app. The research we have done shows that the vast majority of parents want that, and that is the kind of legislation that would make this easier. I remember a mother telling me that, with all these things she is supposed to be able to do, she cannot figure them out. It is like a faucet overflowing, and her kids are being exposed to material. We have to make it simpler for parents so they can protect their kids, and I do not think this is the way to do it. I think the answer is what Senator Graham has talked about: opening up the halls of the courtroom so it puts it on you guys to protect these parents and protect these kids, and passing some of these laws that make it easier for law enforcement. We will try to stick to the seven-minute rule.
It did not work very well, but I will try to give additional time on the other side as well. Senator Cornyn. There is no question that your platforms are very popular, but we know that while we have an open society and a free exchange of information, there are authoritarian governments and criminals who will use your platforms for the sale of drugs, for extortion, and the like. Mr. Chew, I think your company is unique among the ones represented here today because of its ownership by a Chinese company, and I know there have been some steps you have taken to wall off the data collected in the United States. But the fact of the matter is that, under Chinese law, all information accumulated by companies in the People's Republic of China is required to be shared with the Chinese intelligence services. The initial release of TikTok was in 2016. The commitment you made was in 2021, and the data was allegedly walled off in 2023. What happened to the data that TikTok collected before that? Senator, thank you. American data is stored on American soil, by an American company, overseen by American personnel. You are right in pointing out that over the last three years we have spent billions of dollars on Project Texas. It is an unprecedented project to wall off U.S. data from the rest of our staff. I am asking about the data that you collected prior to that. Yes, I talked about this a year ago. We are beginning a phase where we will hire a third party to verify the deletion work, and we will go into, for example, employees' work laptops to delete that data. Was the data collected by TikTok prior to Project Texas shared with the Chinese government pursuant to the national intelligence laws? Senator, we have not been asked for any data, and we have never provided it. Your company is unique because you are undergoing a review. Senator, there are ongoing discussions, and a lot of the Project Texas work came out of discussions with many agencies. That review is designed to examine foreign investments for national security risks, right? Yes, I believe so. And your company is being reviewed by this interagency committee at the Treasury Department for potential national security risks. This review is of an acquisition that was done many years ago. It has been many years, and there have been a lot of discussions around how our systems work. We have had a lot of robust discussions about a lot of detail. 63% of teenagers, I understand, use TikTok. Does that sound right? Senator, I cannot verify that. We know we are popular among many age groups. We are aware we are popular. You reside in Singapore with your family, correct? Yes, I reside in Singapore, and I work here in the U.S. as well. Do your children have access to TikTok in Singapore? Senator, if they lived in the U.S., I would give them access to the under-13 experience. In Singapore, do they have access to TikTok, or is it restricted by domestic law? We do not have the under-13 experience there. In the U.S., we were deemed a mixed-audience app, and we created the under-13 experience. A Wall Street Journal article contradicts what your company is stating. Employees on Project Texas say that user data, including user emails, dates of birth, and IP addresses, continues to be shared with ByteDance staff; ByteDance is a Chinese company. Do you dispute that? Yes, Senator. There are many things in that article that are inaccurate. This is a voluntary project; we have spent billions of dollars, there are thousands of employees involved, and it is difficult — Why is it important that the data collected from U.S. users be stored in the United States? Senator, this was built in response to some of the concerns that were raised by this committee and others.
But that was because of concerns that the data that was stored in china can be accessed by the Chinese Communist party, according to the National Intelligence laws, correct . Senator, we are not the only company that does business that has chinese employees. We are not the only company in this room that hires chinese nationals. To address these concerns with move data into our infrastructure and build a 2000 person team to oversee the database. And then, we opens up to third parties like oracle to give them thirdparty validation. This is unprecedented. I think we are unique in taking more steps to protect user data in the u. S. You disputed the wall street journal story published yesterday. Are you going to conduct any sort of investigation. Are you going to dismiss the article . We will not dismiss them. We have ongoing security inspections not only by our own personnel but by third parties to ensure that the system is rigorous and robust. No system is perfect, but what we need to do is make sure we are improving it and testing it against people who try to bypass it. If anybody breaks our policies within the organization, we take disciplinary action against them. Senator . I would like to start by thanking the families that are here today. All the parents were here because of the child theyve lost. All the families that are here because you want us to see you and to know your concerns. You have contacted each of us expressing your grief and loss in passion and concern. The audience that is watching cannot see this. They can see the witnesses, but this room is packed as far as the eye can see. And when this hearing began, many of you held up pictures of your beloved and lost children. I benefit from and participate in social media as too many members of the committee and the nation and the world. There are now a majority of people on earth participating in and benefiting from one of the platforms youve launched or you lead or you represent. We have to recognize that there are no positives to social media. It has transformed modern life, but it has also had a huge impact on families and children. Theres a whole series of bills championed by members of this committee that try to deal with the trafficking and illicit drugs and trafficking in illicit child sexual material. Things that are facilitated on your platform that may lead to self harm or suicide. So weve heard from several of the leaders. The chair and ranking and the experience senators. The frame we look at this is Consumer Protection when there is some new technology we put in place regulations to make sure it is not overly harmful, as my friend senator klobuchar pointed out. One door flew off of one airplane and no one was hurt, but the whole fleet was grounded and a federal fit for Purpose Agency did an immediate safety review. Im quite a point not to the other pieces of legislation that i think are urgent that we take up and pass, but the core question of transparency. If you are a Company Manufacturing a product that is allegedly addictive and harmful , one of the first things we look to is safety information. We try to give our constituents , our consumers, warnings, labels, that help them understand what are the consequences of this product and how to use it safely or not. As youve heard some of my colleagues, if you sell an addictive, defective, harmful product in this country in violation of regulations, you get sued. 
And what is distinct about platforms as an industry is most of the families who are here are here because there were not sufficient warnings and they cannot effectively sue you. So let me dig in for a moment if i cant. Each of your companies voluntarily discloses information about the content and the safety investments you make and the actions you take. There was a question by senator graham earlier about tiktok. He said you invest 2 billion in safety. My background memo siddur revenue is 85 billion. You are investing 5 billion in safety. So what matters, what matters is the relative numbers and the absolute numbers. If theres anyone in this world who understands data, its you guys. I want to walk through whether or not these voluntary measures of disclosure of content and harm are sufficient. I would argue that we are here because they are not. Without better information, how can policymakers know whether the protection you testified about, the new initiatives, the starting programs, the monitoring and the takedowns are actually working. How can we understand how big these problems are without measuring and reporting data . Mr. Zuckerberg, you referenced the National Academy of science study that said at the population level theres no proof about harm to Mental Health. It may not be at the population level, what im looking at a room with hundreds of parents of lost children. Our challenge is to take the data and make decisions about protecting families and children from harm. Let me ask about what your companies do or do not report, and i will particularly focus on your content policies around selfharm and suicide. I will ask a series of yes no questions. And what im getting at is do you disclose enough. Mr. Zuckerberg, from your policy prohibiting content about suicide or selfharm, do you recorded estimate of the total amount of content, not a percentage, of the overall. Not a prevalence number, but the total amount of content on your platform that violates this policy . And do you report the total number that selfharm or suicide promoting content gets on your platform . Yes, senator. We pioneered a Quarterly Reporting on Community Standards and enforcement across all these categories harmful content. We focus on prevalence, but you mentioned, because what we are focused on is what percent of the content that we take down mr. Zuckerberg, you are very talented and i have very little time left. I just want to get an answer to question. Not as a percentage of the total. Its a huge number. But report the actual amount of content and the amount of views selfharm content receives . I believe we focus on prevalence. Correct, you do not. Ms. Yaccarino, do reported or you dont . Senator, we have less than 1 of our users that are between the ages of 13 and 17. Do report the absolute number accounts weve taken down in 2023. Almost 1 million posts down in regard to Mental Health. Mr. Chew, do you disclose the number of appearances of these types of content and how many are viewed before they are taking down . Senator cotton we disclose the number based on each category and how many were taken down before it was recorded. Senator we do disclose. I have three more questions i would go through if i have unlimited time. I will submit them to the record. Platforms need to hand over more content about how the algorithms work, what the content does and what the consequences are. Not at the aggregate. But the actual numbers of cases so we can understand the content. In closing, i have a bipartisan bill. 
The platform accountability and Transparency Act cosponsored by several senders. It is in front of the commerce committee. It would set reasonable standards for disclosure and transparency to make sure we are doing our job based on data. Yes, there is a lot of emotion in this field. But if we can legislate responsibly about the management of the content on your platform, we need to have better data. Is there anyone of you willing to say now that you support this bill . Mr. Chairman, let the record reflect silence. We are on the first of two roll calls. Please understand that theres no disrespect. There doing their job. Thank you, mr. Chairman. Tragically, survivors of sexual abuse are often repeatedly victimized and revictimized over and over again by having nonconsensual images of themselves on social media platforms. There was a study that pointed out there was one instance of csam that reappeared more than 490,000 times after it had been reported. We need tools to deal with this. We need, frankly, was to mandate standards so this doesnt happen so we have a systematic way of getting rid of this stuff. There is literally no justification and no way of defending this. One tool that i think would be particularly effective is a bill that i will be introducing later today, and i invite all my Committee Members to join me. Is called the protect act. It would require websites to verify age and verify they perceived consent of any and all individuals appearing on their site and pornographic images and it would require platforms that have meaningful processes for an individual to have images of him himself or herself removed in a timely manner. What might it take for a person to have those images removed, say, from x . It sounds like what you are going to introduce into law, in terms of ecosystemwide and user consent sounds exactly like part of the philosophy of why we are supporting the shield act. And no one should have to endure images being shared online. Without that, without laws in place, and its fantastic anytime a company, as you have described with yours, wants to take those steps. Its very helpful. They can take a lot longer than it should, and sometimes it does , to the point where someone had images shared. 490,000 times after it was reported to the authorities. That is deeply concerning. Yes, the protect act would work in tandem with the shield act. Mr. Zuckerberg cutlets turn to you next. As you know, i feel strongly about privacy and one of the best protections for an individuals privacy online and involves encryption. We know that a great deal of grooming and sharing of csam happens to occur on endtoend encrypted systems. Does meta allow them to use encrypted underaged. People under 18 . We allow people under the age of 18 to use whatsapp. Do you have a bottom level age in which they are not allowed to use . I dont think we allow people under the age of 13. What about you, mr. Citron . Do you have do you allow kids to have accounts to access encrypted messaging . Discord is not allowed for children under the age of 13 and we do not use encryption to protect messages. We feel its important to respond to Law Enforcement requests and we are also working on proactively building technology, we are working with a nonprofit to identify a grooming classifier and identify these conversations so we can intervene and give those teenagers tools to get out of that report those conversations and those people to Law Enforcement. 
It can be harmful , especially if you are on the site were children are being groomed and exploited. If you allow children on an end toend encryption enabled app, that can prove problematic. Lets go back to you, mr. Zuckerberg. Mr. Graham announced it will restrict all teenagers from access to eating disorder material, suicidal ideation, selfharm content and that is fantastic. What is on, what im trying to understand, why is it that instagram is only restricting its restricting access to sexually explicit content, but only for teenagers ages 1315. Why not restricted for 16 and 17yearolds as well . My understanding is we dont allow sexually explicit content for people of any age. Are prevalence metrics suggest, 99 of accounts we removed were identifies in using ai system. The other thing you asked about was selfharm content, which is what we recently restricted. I think the state of the science is shifting a bit. Previously we believed that what people were thinking about selfharm, it was important for them to express that and now more of the thinking in the field is it is better to not show that content at all. Is there a way for parents to make a request on what their kids cant see or not see on your site . There are a lot of Parental Controls. I dont think we currently have a control around topics, but we do allow parents to control the time that children are on the site and a lot of it based on monitoring and understanding what the experiences. Mr. Citron, 17 of minors use discord have had online sexual interactions on your platform. 10 have those interactions with someone that they believed to be an adult. Do you restrict minors from accessing discord servers that post pornographic material on them . Senator, we do restrict that and discord does not recommend content to people. We dont have a feeder algorithm that boosts content. We allow adults to share content but we dont allow teenagers to access that. Welcome, everyone. We are in this hearing because of a collective your platformsat policing themselves. We hear about it in congress with drug dealing facilitated across platforms. We see it and we hear about it here in congress with harassment and bullying that takes place across platforms. We see it in hear about it here in congress with respect to child pornography, Sexual Exploitation and blackmail and we are sick of it. It seems to me theres a problem with accountability because these conditions continue to persist. In my view section 230, which provides immunity from lawsuits is a significant part of that problem if you look were bullies have been brought to heel, whether it is dominion finally getting justice against fox news after a Long Campaign to try to discredit the election manufacturer or whether it is the mothers and fathers of the sandy hook victims finally getting justice against infowars and its campaign of trying to get people to believe that the massacre of their children was a fake put on by them or even now, more recently, with a writer getting a significant settlement. It courtroom proves to be the place where these things get sorted out. I will just describe one case, if i may. It is doe v twitter. A compilation video of multiple csam videos surfaced on twitter in 2019. A concerned citizen reported that video on december 25th, 2019. Christmas day twitter took no action. The plaintiff, then a minor in high school in 2019 became aware of this video from his classmates in january of 2020. You are a High School Kid. Suddenly there is that. 
That is a day that is hard to recover from. Ultimately, he became suicidal. He and his parents contacted Law Enforcement and twitter to have these videos removed. On january 21st and 22nd 2020 and twitter ultimately took down video on january 30th. Once federal Law Enforcement got involved. That is a foul set of facts. When the family sued twitter for all of those months of refusing to take down the explicit video of this child, twitter invoked section 230, and the District Court ruled that the claim was barred. Theres nothing about that set of facts that tells me that section 230 performed any Public Service in that regard. I would like to see very substantial adjustments to section 230 so that the honest courtroom, which brought relief and justice to e. Jean carroll after months of defamation, which brought silence, justice and peace to the parents of the sandy hook children after months of defamation and bullying by info awards and alex jones, and which brought significant justice and in and to the campaign of defamation by fox news to a Little Company that was busy just making election machines. My time is running out. I would like to have each of your Companies Put in writing what exemptions from the protections of section 230 you would be willing to accept, bearing in mind, the fact situation in doe v twitter and the damage that was done to that family by the nonresponsiveness of this enormous platform over months and months and months. Again, think of what it is like to be a High School Kid and have that stuff in the Public Domain and have the company that is holding it out there in the Public Domain react with disinterest. Will you put that down in writing for me . Five yeses. Done. Senator cruz. Thank you. Social media is a powerful tool, but we are here because every parent i know, and every parent in america is terrified about the garbage that is directed at our kids. I have two teenagers at home. The phones they have are portals to predators, viciousness, bullying, selfharm , and each of your companies could do more to prevent it. Mr. Zuckerberg, in june of 2023 the wall street journal reported that instagrams recommendation systems were actively connecting pedophiles to accounts that were advertising the sale of child sexual abuse material. In many of those cases accounts appeared to be run by underage children themselves, using code words and emojis to advertise illicit material. In other cases, the accounts included the victim was being trafficked. I know instagram has a team that works to prevent the abuse and exploitation of Children Online. What was particularly concerning about the wall street journal expose was the degree to which instagrams own algorithm was promoting the discoverability of victims for pedophiles seeking child abuse material. In other words, this material was not just living in the dark corners of instagram. Instagram was helping them find it by promoting graphic cash tax hashtags, including preteen sex two potential buyers. Instagram had the following morning scream to individuals who were searching for child abuse material. These results may contain images child sexual abuse, and then you gave users two choices. Get resources or see results anyway. Mr. Zuckerberg, what the were you thinking . All right, senator. The basic Science Behind that is when people are searching for something that is problematic, its often helpful to rather than block it to help direct them to something that could be helpful for getting them to get help. 
I understand get resources. In what saying universe is there a link for see results anyway. We might be wrong. We try to trigger this morning. When we think theres you might be wrong. How many times with this display . I dont know. Why dont you know . I dont know the answer. You know what . Its interesting you say you dont know off the top of your head, because i asked in june of 2023 in your oversight letter and your Company Refused to answer. Will you commit within five days to answer this question for this committee . Not i will follow up. I know how lawyers write statements they are not going to answer. Will you tell me how many this times this was display . I will personally look into it. Let me ask you this. How many times did it Instagram User who got this morning that you are seeing images of child sexual abuse, how many times did they click on see results anyway. Im not sure if we stored that, but i will look into that. What followup did instagram do when you had a potential pedophile clicking on i would like to see child pornography. What did you do next. An important piece of context is any context we think that is called a question. What did you do next . When somebody clicked, you may be getting child sexual abuse images and they clicked, see results anyway. What was your next step . You said you might be wrong. Did anybody examine if it was child sexual abuse material . Did anyone try to protect that child . What did you do next . Senator, we take down anything that we think is sexual did anybody verify if it was child sexual abuse material . I dont know did you report the people that wanted . Do you want me to answer the question . I want you to answer the question im asking. That is one of the factors that we use in reporting. Weve reported more people and done more reports like this to the National Center of missing and exploited children. We go out of our way to do this and weve made more than 26 million reports, which is more than the rest of the industry combined. You need to do much more to protect children. Mr. Chew, i want to turn to you. Are you familiar with the National Intelligence law that all citizens shall support and cooperate with National Intelligence effort in accordance with the law and shall protect National Intelligence with secrets they are aware of. Yes, im familiar with us. Tiktok is owned with bytedance. Are they subject to the law . Tiktok is not available in Mainland China and as we talked about in your office, project texas put this out of reach. Bytedance is subject to the law. Subject to the law that says shall protect National Intelligence work secrets they are aware of, it compels people subject to the law to lie to protect those secrets. Is that correct . I cannot comment on that. What i said because you have to protect those secrets . Tiktok is not available in Mainland China. But it is controlled by bytedance, which is subject to this law. You said earlier, and i wrote this down, we have not been asked for any data by the Chinese Government, and we have never provided it. I told this when we met last week, i dont believe you. And i will tell you the American People do not either. If you look at what is on tiktok in china, you are promoting to kids science and math videos, educational videos, and limit the amount of time they can be on tiktok. In the United States you are promoting to kids selfharm videos and antiisrael propaganda. Why is this such a dramatic difference. Senator, that is not accurate. 
You have a company that is essentially the same but it promotes beneficial materials instead of harmful materials. That is not true. We have a lot of science and math content. Okay. Let me point to this, mr. Chew. There was a report recently that compared hashtags on instagram to those on tiktok. And what ms. Yaccarino. The differences were striking. For Something Like Hashtag Taylor swift or hashtag trump, they found 2 for everyone on tiktok. The difference jumps to 81 for the hashtag to bat and it jumps 571 two Tiananmen Square and it jumps to 1741 to hong kong protest. Why on instagram can people put up a Hashtag Hong Kong protest hundred 74 times compared to tiktok. What censorship is tiktok doing fundamentally, few things happen. Not all videos carry hashtags. Thats the first thing. The second thing is you cannot select few words why the difference between taylor swift and Tiananmen Square . There was a massive protest during the time. What im trying to say is why would there be a minimal difference on taylor swift and a massive difference . Senator , can you rap up . Answer the question. I think your analogy is flawed. There is an obvious difference. Senator blumenthal. Thank you. Mr. Zuckerberg , one of your top leaders in september she was global ahead of safety. And you know that she came before a subcommittee on Consumer Protection. And she was testifying on behalf of safety on behalf of facebook. And she told us, facebook is committed to doing everything to do everything to protect their privacy, safety, and well being on our platform. And she said its a safety where where we are investing heavily. We know that statement was untrue. We know it from an internal email that weve received. He was the head of Global Affairs and he wrote the memo to you. Which you received, correct . It was written to you. I cannot see the email. I will assume that you got it correct. He summarized the policy. He said, we are not on track to succeed for our core wellbeing topics. Problematic use. And ssi, meaning suicidal self injury. He said we need to do more and we are being held back by a lack of investments. This has the date of august 28th , just weeks before that testimony from antigone davis. Correct . Im not sure what dates the testimony was. Those are the date on the email. He was asking you, pleading with you for resources to back up the narrative to fulfill the commitment. In effect, antigone davis was making promises that nick craig was trying to fulfill and you rejected that request for 45 to 84 engineers to do wellbeing for safety. We know you rejected it from another memo, his assistant, tim colburn, who said, nick did email mark to emphasize his support, but it lost out to other pressures and priorities. We have done the calculation commitment in real terms and you rejected that request because of other pressures and priorities and that is an example from your own internal documents of failing to act and it is the reason why we can no longer trust meta and, frankly, any of the social medias. This is in fact a grade their own homework and the public and particularly those in the room know we can no longer rely on social media to provide a kind of safeguard that children and parents deserve. That is the reason why passing the kids Online Safety act is medically important. Mr. Zuckerberg do you believe you have a constitutional right to lie to congress . No. But let me clarify. Let me clarify for you. 
In a lawsuit brought by hundreds of parents and some in this very room alleging that you made false and misleading statements concerning the safety of your platform for children, you argued and not just one pleading, but twice in december and then in january that you have a constitutional right to lie to congress and do you disavow that filing in court . s i dont know what filing you are talking about, but i would like the opportunity to respond to the previous things you shared as well. I do have a few more questions. Let me ask others who are here, because i think it is important to put you on record, who will support the kids Online Safety act . Yes or no . There are parts that it is a yes or no question and i will run out of time so i am assuming the answer is no if you can answer yes. We very much think great. We strongly supported and implemented many of its core provisions. Think you and i welcome that support along with microsofts support. If something changes, we can support it. In the present form, you supported . We do have some concerns. I will take that as a no. We support kosa and we will make sure it accelerates and will continue to offer community for those seeking that voice. We support the age appropriate content standards but would have yes or no . Do you support the kids Online Safety act . These are nuanced questions. I am asking whether you will support it or not. The basic spirit and idea is right and there are some ideas that i would debate. Unfortunately i dont think we can count on social media as a group or big tech to support this measure and in the past it has been opposed by armies and lawyers and lobbyists who were prepared for this fight, i was very very glad that we have parents here because tomorrow we will have an advocacy day. The folks who really count, and the people who support this measure in this room will be going to their representatives and their senators and their voices and faces will make a difference. Senator schumer is committed that he will work with me to bring this bill to a vote and then we will have real protection for children and parents online. Thank you. Thank you, senator blumenthal. We do have a vote and you havent voted yet also and you are next and i dont know how long it will be open but i will turn it over to you. Thank you. Let me start with you, mr. Zuckerberg. Did i hear you say there is no link between Mental Health and social media use . What i said is i think it is important to look at the science and i know that people widely talk about this and they say it is something that has already been proven and i think the bulk of the Scientific Evidence doesnt support that. Really. Let me remind you of some of the science of your own company. Instagram studied the effect of your platform and teenagers and let me redo some quotes on this and Company Researchers found instagram was harmful for a sizable percentage of teenagers, most notably, teenage girls and here is a quote from your own study. We make body image issues worse for 1 and 3 girls and heres another one, they blame for increases in the rate of anxiety and depression and this reaction was unprompted and consistent across all groups and that is your study. We do try to understand the feedback and how people feel about the services. 
Your own study says that you make life worse for one in three teenage girls and you increase anxiety and depression and that is what it says and you are testifying to us in public that there is no link and you have been doing this for years and for years you have been testifying under oath there is absolutely no link that your product is wonderful and full speed ahead while internally you do know full well is a disaster for teenagers and you keep right on doing what you are doing. That isnt true. Let me show you some other facts i know you are familiar with. Those are faxon its not a question. Those are not facts. Here are some more facts and here is some information from the whistleblower who came before the senate and testified under oath who worked for you, a senior executive. Here is what he showed he found when he studied your products. This is girls between the ages of 13 and 15 with 37 of them reported that they have been exposed to nudity on the platform in the last seven days and 24 said they experienced unwanted sexual advances and they have been propositioned in the last seven days and 17 said they encountered selfharm content that push them in the last seven days. I know you are familiar with the statistics because he sent you an email where he lined it out and we have a copy of it here and my question is, who did you fire because of that . We studied all of this because its important and we want to improve our services. You studied it and there was no linkage just said. I said you mischaracterized. 37 of teenage girls between 13 and 15 were exposed between unwanted nudity in a week on instagram and you know about it and who did you fire. Who did you fire . I dont think that who did you fire . I wont answer that. You didnt fire anybody . Its not appropriate to do you know who is sitting behind you . You have families from across the nation whose children are either severely harmed or gone and you think it is appropriate to talk about steps that you took or the fact that you didnt fire a Single Person . Have you compensated any of the victims . These girls . Have you compensated them . I dont believe so. Why not . Dont you think they deserve some compensation for what your platform has done . Help with Counseling Services or issues that your services caused. Our job is to build tools to keep people safe. Will you compensate them . We make sure we build industry leading tools and to build tools that empower people. You didnt take any action or fire anybody or compensated a single thing and let me ask you this and there are families of victims here today and have you apologized to them . And would you like to do so now . They are here and you are on National Television and would you like now to apologize to the victims who have been harmed . Show them the pictures . Would you like to apologize for what you have done to these good people . Why, mr. Zuckerberg, why should your company not be sued for this . Why is it that you hide behind a liability shield and you shouldnt be held accountable and shouldnt you be held accountable . Will you take personal responsibility . Senator, i think i have already answered this. Will you take responsibility . I think my job and the job of this company is to build the best tools we can to keep our communities safe. We are doing an industry where your product is killing people and you personally commit to compensating the victims. Will you set up a Compensation Fund with your money . This isnt a complicated question. 
Will you set up a victims Compensation Fund with your money, the money made on these families sitting behind you ask yes or no . Senator, my job it does sound like a no. Your job is to be responsible for your company is done and you have made billions of dollars in the people sitting behind you. You have nothing to help them and you have done nothing to compensate them and you have done nothing to put it right and you could do so today and you should. And before my time expires, mr. Chew. Your platform, why should your platform not be banned in the United States of america . You are owned by a Chinese Communist company, a Company Based in china and the editor inchief of your Parent Company is a communist Party Secretary and your company has been surveilling americans for years and this is for more than 80 internal tiktok meetings, china based employees of your company have repeatedly access nonpublic data of United States citizens and they have tracked journalists and properly gaining access to their ip addresses and an attempt to identify whether they are writing negative stories about you. Your platform is basically an espionage arm for the Chinese Communist party and why shouldnt you be banned in United States of america . I disagree. Many of you what you said is explained in its use by 170 million americans. But every one of those americans are in danger from the fact that you track their keystrokes and app usage and location data. We know that all of that information can be accessed by chinese employees who are subject to the Chinese Communist party and why shouldnt you be banned in this country . A lot of what you describe is correct that we dont. It is 100 accurate. Do you deny repeatedly that american data has been accessed by employees in china . This cost us billions of dollars to stop that and we built the project to do that. According to a report from wall street journal from yesterday, it hasnt been stopped and even now workers without going through official channels have access to the private information of american citizens including their birthdate, ip address and more and that is now. As you know, the media doesnt always get it right. With the Chinese Communist party does . We spent billions of dollars to build this and it is a vigorous and robust and unprecedented and i am proud of the work that they are doing, the work they are doing to protect the data. It is not protected at all and it is subject to communist Chinese Party inspection and review and your app unlike anybody else sitting here and heaven knows we have problems with everybody here, your app like any of those is subject to the control and inspection of a foreign hostile government that has actively tried to track the information of the whereabouts of every american and it should be banned in the United States of america for the security of this country. Thank you, mr. Chairman. Thank you. 
As we have heard, children face all sorts of dangers in social media from Mental Health harm to Sexual Exploitation and even trafficking and since trafficking is a serious problem in my home state of hawaii especially for native hawaiian victims, that social media platform are being used to facilitate this as well as the creation and distribution concerning but it is happening and for example several years ago a merger was stationed and sentenced to 15 years in prison for producing this is part of the online exploitation of a minor female and he began communicating with this 12year old girl through instagram and then he used snapchat to send her sexually explicit photographs and solicit such photographs from her and he later used these two black mail her. Just last month, the fbi arrested a neonazi cult leader and hawaii who lured victims to his server and he used it to share images of extremely disturbing child sexual abuse material interspersed with neo nazi imagery and members of his Child Exploitation and hate group are also present on instagram, snapchat, x and tiktok all of which they used to recruit potential members and victims and in many cases including the ones i mentioned, your company and it played a law role in Law Enforcement investigating them but at the time, so much damage had been done and this is about how to keep children safe online and with all of your testimony to seemingly impress the safeguards for young users and do you try to limit the time they spend or require parental consent and you have all of these tools and yes, trafficking and exploitation online and of your platform continues to be rampant. Nearly all Companies Make money through advertising, specifically by selling the intention of your users and your product is your users. And as a made up Product Designer wrote in an email, young ones are the best ones and you want to bring people to your Service Young and early. In other words, do this early and Research Published last month estimates that they make an astounding 41 by adjusting to users under 18 and with tiktok the 35 and seven of the 10 largest servers attracting many paid users are four games used primarily by teams by children. And all of this is to say that social Media Companies, yours and others, make money by attracting kids to your platforms but ensuring safety doesnt make money but it costs money. And if you will continue to attract kids to your platform, you have an obligation to make sure they are safe on those platforms because the Current Situation is untenable and that is why we have this hearing. And to ensure safety that costs money and your companies cant continue to profit off of young users only to look the other way when those users are children and harmed online. We have had a lot of comments about section 230 protections and i do think we are definitely getting in this direction and some of the other things we have talked about in this committee we talked about limiting the Liability Protections for you and last november, the subcommittee heard testimony in response to one of the questions about how to make sure that social Media Companies focus more on child safety and he said, and i am paraphrasing a bit and he said, what will change their behavior is at the moment that Mark Zuckerberg declares earnings. 
And these are those that have to be declared and he said last quarter we made 34 billion in the next thing you have to say is how many teenagers experienced unwanted sexual advances on this platform and will you commit to reporting measurable child safety data on your Quarterly Earnings report and calls . It is a good question and we actually already have a Quarterly Report we issue and do a called answer questions for how we enforce Community Standards that includes not just the child safety issues. Is that a yes . We have a separate call we do this on. You have to report your earnings and do you report this the data and numbers by the way because percentages dont really tell that full story but will you report the number of teenagers and sometimes you dont know that are not because they claim to be adults but will you reported the number of underage children on your platform to experience unwanted kinds of messaging that harm them . Will you commit to those numbers when you make the Quarterly Report . I am not sure it would make as much sense to do that in that filing but we do it publicly so everybody can see it and we have to follow up and talk about that and the specific thing and some of the ones you mentioned around underaged people and or services we dont allow people under the age of 13 teen but if we do find that, we remove them and am not saying that people dont lie. I wont be able to say that we wont be able to count how many there are because fundamentally, if we identify that somebody is under age we remove them. I think that is important that we get actual numbers because these are real human beings and that is why all of these parents are here because each time a young person is exposed to this kind of unwanted material and they get hooked, it is a danger to that individual. I am hoping that you are saying you do report this kind of information and it is made public and i think i am hearing that you do. I do think we report more publicly on our enforcement than any other company in the industry and we are very supportive of that. I am running out of time, zuckerberg, but i will follow up with what exactly it is you do report. And again for you when they automatically place accounts of young people and you testified on this on the most restrictive privacy and content sensitivity sections and yet teenagers are able to opt out of these safeguards and is that right . It isnt mandatory that they remain on the settings . They can opt out . Yes, senator. We default teams into a private account teenagers. But some want to be creators and have content they share broadly and i dont think that is something that should be banned blankly . Why not. I think it should be mandatory that they remain on the more restrictive settings. A lot of teenagers create Amazing Things and with the right supervision and parenting and controls i think that that i dont think that is the type of thing you want to not allow anyone to be able to do. My time is up. But i do have to say there is an argument that you have made everything we are proposing and i do share the concern that i have about the blanket limitation of liabilities that we provide all of you and i think that has to change and that is on us, congress, to make that change. Thank you. Lets cut to the chase. As tiktok under the influence of the Chinese Communist party . No. We are private. 
You can say that they are subject to the 2017 National Security law which requires Chinese Companies to turn over information to the Chinese Government and you concede that . Senator there is no question and you can see the that earlier. Any company has to follow the local laws. Isnt the case that they also have internal communist Party Committee . All businesses have to follow local law. Your Parent Company is subject to the National Security law that requires that and it has its own internal Chinese Communist Party Committee and you answer to that Parent Company, but you expects us to believe you arent under their influence . This is what we built. Used to work for the menu or the cfo for them . Correct. In april 2021 while you were the cfo the Chinese Communist Party Internet Investment Fund purchased a 1 stake in the main chinese subsidiary and the Technology Company and in return for that socalled 1 share the party took one of three seats at that Subsidiary Company and that is correct . It is for the chinese business. That deal was finalized on april 30 of 2021 and isnt it true that you were appointing that ceo the very next day on may 1 of 2021 . It is a coincidence. That the Chinese Communist party took its board seat in the very next day you were appointed to ceo as tiktok and that is a coincidence. It really is. It is. Okay. And then before this you at another company . I used to work around the world. Where did you live . I lived in china . I lived in beijing and i worked there for about five years. You live there for five years and is that the case that they were sanctioned by the Us Government in 2021 for being a communist Chinese Military company . I am here to talk about i cant remember. The biden ministration never reverse those sanctions just like they reversed that and it was sanctioned as a Chinese Communist military company. So you say as you often say you live in singapore and of what nation are you a citizen . Singapore. Are you a citizen of any other nation . No. Have you applied for citizenship anywhere else . I a did not. Do you have a passport from singapore or any other nations . No. Your wife is an american citizen and your children . Correct. Have you applied for american citizenship x not yet. Have you ever been a member of the Chinese Communist party . I am singaporean. No. Have you ever been affiliated with the Chinese Communist party . No. You said earlier that what happened at 10 square in june 1989 was a massive protest and did anything else happen in 10 square . I think it is well documented that was a massacre. It wasnt indiscriminate slaughter of hundreds of citizens and you agree with the Trump Administration and biden ministration that the Chinese Government is committing genocide against the weaker people . I think its important that anybody who cares about this topic any topic it is a simple question that unites both parties and our country and governments around the world is the Chinese Government considering or committing genocide against the weaker people . Yes or no. You are a worldly well up welleducated man and is the Chinese Government committing genocide against the Government People . Yes or no . You are here to give testimony that is truthful and honest and complete and let me ask you this. Joe biden said that the president of china was a dictator and you agree . Senator, i wont comment on world leaders. Why wont you answer the simple questions . Its not appropriate. Are you scared you will lose your job . I disagree. 
Are you scared you will be arrested and disappear the next time you go to china . You will find the content critical of china freely and ticktock back. Okay. Lets look what tiktok is doing to americas youth and does the name mason aidens ring a bell . You may have to give me more specifics. He was a 16yearold and after a breakup he went on your platform in search of inspirational quotes and positive affirmations and instead he was served up numerous videos glamorizing suicide and he killed himself by gun and what about the name cheese, a 16yearold who saw more than 1000 videos on your platform about violence and suicide until he took his own life by stepping in front of a train and are you aware his parents are suing tiktok for pushing their son to take his own life . I am aware. Okay. And finally as the federal trade commission sued tiktok during the Biden Administration . I cant talk about are you currently being sued by the federal trade commission. I cant talk about that. I think i have given you my answer. The answer is no and the company for ms. Yaccarino is being sued in mr. Zuckerbergs company is what the Chinese Company is not in are you familiar with the name christina . She was a paid advisor with your Parent Company and she was advised on how to sue mr. Zuckerbergs company. And the public reports indicate that your lobbyist had been there more than 40 times and how many times did your Company Visit last year . I dont know. Are you aware that the Biden Campaign at the Democratic National committee is on your platform and they have tiktok accounts . We encourage people to come on. They will let their staff using their personal phones. We encourage everyone to join. All of these companies are being sued and you arent and they have a former paid advisor, your parent talking about how they can sue mr. Zuckerbergs company and joe bidens Reelection Campaign is on your platform. Let me ask you. Have you or anybody else it tiktok communicated with or coordinated with the biden ministration and the Biden Campaign the Democratic National committee to influence the flow of information on your platform . We work with anyone, any creators and it is all the same process. We have a company that is a tool for the Chinese Communist party poisoning the minds of americas children and in some cases driving them to suicide and at best the biden ministration is taking this in collaboration with. Thank you. We will take a break now and can take advantage of and this break will be 10 minutes and please do your best to return. We have a break here in this hearing on child Sexual Exploitation and Bloomberg News writes the committee is considering several proposals that seek to hold Tech Companies accountable and strengthen protection for young users and stop children Sexual Exploitation online get a myriad of groups and Civil Liberties organizations have criticized many of the proposed measures is flawed and counterproductive arguing they would worsen Online Privacy and safety in advance and there is a handful of Companies Including tiktok and meta facing lawsuits in california claiming that companies were negligent and ignore the potential harm with platforms created for teenagers and as we heard, this is a brief break and we will stay here and watch as we wait for the hearing to continue. 
A live look here at the Senate Judiciary hearing room as we wait for members to return and for durbin to gavel into session and senators considering legislation to help Children Online and vanguard rights that currently under u. S. Law website forms are largely shielded from legal liability that is shared on their website and also liking to set up more rules to increase Online Safety and the story said new laws have been stymied by politically divided washington and intense lobbying by big tech and one proposal is the kids Online Safety act which aims to protect children from algorithms that may trigger anxiety or depression and another idea would require social media platforms to verify the age of account holders and bar children under the age of 13 from signing up and we may hear more about possible solutions when the hearing continues live here on cspan3. A live look at the Senate Judiciary hearing room as we wait for members to return and senator durbin to gavel the meeting into session and today they are considering legislation to help protect Children Online and vanguard rights that currently under United States law platforms on the web are largely shielded from legal liability with content shared and lawmakers would like to set up more rules to increase Online Safety in the story said new laws have been stymied by washington and big tech and one proposal is the kids Online Safety act which aims to protect children from algorithms that could trigger anxiety or depression and another idea requires social media platforms to verify the age of account holders and completely bar children under the age of 13 from signing up and we may hear more about possible solutions when the hearing continues live here on cspan3. The Senate Judiciary committee will resume and we have had nine senators was not asked questions yet and we will turn first to senator padilla. Thank you. Once again i am one of the few senators with younger children and i do lead with that because as we have this conversation today, my children are now all in the teenager or preteen category and with their friends i see this issue very up close and personal and in that spirit i want to take a second to acknowledge and thank parents in the audience today many of whom have shared their stories with our offices and i do credit them for finding strength through their suffering and their struggle and channeling that into the advocacy that is making a difference and i do think all of you thank all of you. I appreciate the challenges that parents and caretakers and School Personnel and others are taking to help them navigate this world of social media and technology in general and the services that children are growing up with provide them unrivaled access to information beyond what previous generations have experienced. Including learning opportunities and socialization and much more and we do clearly have a lot of work to do to better protect our children from the predators and headed terry behavior that these technologies have enabled. And yes, mr. Zuckerberg, it includes exacerbating the Mental Health crisis in america. 
And nearly all teenagers we know have access to smart phones and the internet and use the internet daily and while guardians have primarily responsibilities for caring for children, it does take a village so society as a whole including leaders in the Tech Industry must prioritize the health and safety of our children and here specifically platform by plat from and witnessed by witness and part of the parental tools, how many minors have caretakers that have adopt did this and if you dont have that say that quickly and provided. We can follow up with you and. How do you make sure that young people in guardians are aware of the tools you offer . We make it clear for teams on our platform what teams what may be clear to you may not be clear to the public. This feature the team safety assist which helps teenagers keep themselves safe in addition to that sent by them it cant be turned off and we also market to the users directly in our platform and relaunch the Family Center and create a promotional video and everybody opened the app. The data we are requesting. Across all of these services from instagram beyond, how many minors use these applications and how many of those have to have a caretaker that has adopted the parental supervision tool you offer . s sorry i can follow up with that. It would be helpful for you to know as a leader of your company and same question how can ensuring that young people and guardians are aware of what you offer . We run pretty extensive Advertising Campaigns on our platforms and outside and we work with creators and organizations like girl scouts to make sure this is probably there is broad awareness. How many minors use this and of those how many have caretakers that register with your there are 200,000 parents use Family Center and about 400,000 link their account to their parents. So this sounds like a big number but a smaller percentage and what should they be aware of of the tools you offer . We create a banner for Family Center on the users profile so the accounts we believe to her parents can see the entry point easily. How many minors are in tick tock and how many of those have a user that uses those tools . I have to get back to ms. Numbers and we were one of the first two do what we call family with parents and you use the teenagers qr code and it allows you to do to set time limits and filter out keywords and turnout a more restricted mode and were always talking to parents and i met a group of parents and teachers last week to talk about this more and we can provide that. How many minors use x or with Safety Measures or guidance for caretakers . Thank you, senator. Less than 1 of all u. S. Users are between the ages of 13 and 17. This is of 90 million u. S. Users. So hundreds of thousands . They are all important and being a 14monthold company we have prioritize the Safety Measures and we have just begun to talk about and discuss what we can do to enhance those with Parental Controls. Let me continue with a followup question. And this is keeping parents informed of the nature of the services and there is a lot more we need to do but for todays services will Many Companies offer a broad range of user empowerment tools it is helpful to understand whether they find this helpful and i appreciate your sharing this and how your advertising this but have you conducted any assessments of how these are impacting the use . 
Our intention is to give teenagers tools capabilities to keep them safe and help and we launched teen safety assist last year and i dont have the study of the top of my head but we will follow up. My time is up and i will have these questions for each of you but we will save this for the record on some of the assessment. Thank you all for being here mr. Spiegel. I see you hiding down there. What is Yada Yada Yada mean . I am not familiar with the term. Very uncool. Can we agree with what you do and not what you say and Everything Else is just cottage cheese . Yes, senator. Do you agree with that . Speak up. Dont be shy. I listened to you today and i heard a lot and i have heard you talk about the reforms you made and i do appreciate that and i have heard you talk about the reforms you are going to make. But i dont think you will solve the problem and i think congress will have to help you and i think the reforms to some extent will be like putting paint on rotten wood. And i am not sure you will support this legislation. I am not. The fact is that you and some of your and annette colleagues who are not here, you are not a companies. You are very very powerful. And you and some of your colleagues who are not here have block to everything we have tried to do in terms of reasonable regulation in terms of privacy to Child Exploitation. And, in fact, we have a new definition of recession and we know we are in a recession when google has to lay off 25 members of congress and that is what we are down to. Were also down to this fact, that your platforms are hurting children. Im not saying theyre doing some good things, but theyre hurting children. And i know how to count vote, and if this bill comes to the floor of the United States senate, it will pass. What were going to have to do, and i say this with all the respect i can muster is convince my good friend senator schumer to go to amazon, buy spine online and bring this bill to the senate floor. , and the house will then pass it. Thats one persons opinion, but i doubt it. Mr. Zuckerberg, let me ask you a question, a little philosophical here. I have to hand it to you, you have you have convinced over 2 billion people to give up all of their personal information, every bit of it, in exchange for getting to see what their High School Friends had for dinner saturday night. Thats pretty much your business model, isnt it . Thats not how i would characterize it. We give people the ability to connect with the people they care about and to engage with the topics that they care about. And you take this information, this abundance of personal information, and then you develop algorithms to punch peoples hot buttons and steer to them information that punches their hot buttons again and again and again to keep them coming back and to keep them staying longer. And as a result, your users see only one side of an issue. So to some extent your platform has become the killing field for the truth, hasnt it . Senator, i disagree with that characterization. We build ranking recognitions because people have a lot of friends and a lot of interests, and they want to make sure that they see the content thats relevant to them. Were trying to make a product thats useful to people and make our services as possible for people to connect to the people they care about and the interests they care about. But you dont show them both sides. You dont give them balanced information. You just keep punching their hot buttons, punching their hot buttons. 
You dont send them balanced information so people can discern the truth for themselves, and you rev them up so much that so often your platform and others becomes just cesspools of snark, where nobody learns anything, dont they . Well, senator, i disagree with that. I think people can engage in the things that theyre interested in and learn quite a bit about those. We have done a handful of different experiments and things in the past around news and trying to show content on, you know, diverse set of perspectives. I feel theres more that needs to be explored there. But i dont think that we can solve that by ourselves. Im sorry to cut you off, mr. President , but im going to run out of time. Do you think your users really understand what theyre giving to you, all of their personal information and how you process it and how you monetize it, do you think people really understand . Senator, i think people understand the basic terms. I actually think let me put it another way. We spent a couple years, we talked about this, does your user agreement still suck . Im not sure how to answer that, senator. Can you still have a dead pot body in all that legalese where nobody can find it . Senator, im not quite sure what youre referring to. But i think people get the basic deal of using these services. Its a free service. Youre using it to connect with the people you care about. If you share something with people, other people will be able to see your information. Its inherently and if youre putting something out there to be shared publicly over the private set of people, you know, youre inherently putting it out there. So i think people get that basic part of how mr. Zuckerberg, youre in the foothills of creepy. You track people who arent even facebook users, you track your own people, your own users who are your product, even when theyre not on facebook. I mean, im going to land this plane pretty quickly, mr. Chairman. I mean, its creepy. And i understand you make a lot of money doing it, but i just wonder if our technology is greater than our humanity. I mean, let me ask you this final question. Instagram is harmful to young people, isnt it . Senator, i disagree with that. Thats not what the Research Shows on balance. That doesnt mean that individual people dont have issues and that there arent things that we need to do to help provide the right tools for people. But across all the research that weve done internally, the survey that the senator previously cited, you know, there are 12 or 15 different categories of harm that we asked teens if they felt at instagram whether its worse or better. And across all of them, except for the one that senator hawley cited, more people said using instagram i gotta land this plane, mr. Zuckerberg. We just have to agree to disagree. If you believe that instagram i know im not saying its intentional. But if you agree that instagram if you think that instagram is not hurting millions of our young people, particularly young teens, particularly young women, you shouldnt be driving. It is. Thanks. Senator butler. Thank you, mr. Chair. And thank you to our panelists who have come to have an important conversation with us. Most importantly, i want to appreciate the families who have shown up to continue to be remarkable champions of your children and your loved ones for being here, and in particular to california families that i was able to just talk to on the break, the families of sammy chatman from los angeles and danielporto from santa clarita. 
They are here today and doing some incredible work to just not protect the memory and legacy of their boys but the work that they are doing is going to protect my 9yearold. And that is indeed why were here. There are a couple questions that i want to ask some individuals. Let me start with a question for each of you. Mr. Citron, have you ever sat with a family and talked about their experience and what they need from your product . Yes or no. Yes, i have spoken with parents about how we can build tools to help them. Mr. Spiegel, have you sat with families and young people to talk about your products and what they need from your products . Yes, senator. Mr. Chew. Yes. I just did it two weeks ago, for example i dont want to know what you did for the hearing prep, mr. Chew. I just wanted to know as an example. Senator, its an example. In terms of designing the product that you are creating. Mr. Zuckerberg, have you sat with parents and young people to talk about how you design products for your consumers . Yes. Over the years, ive had a lot of conversations with parents. You know, thats interesting, mr. Zuckerberg, because we talked about this last night, and you gave me a very different answer. I asked you this very question. Well, i told you that i wasnt that i didnt know what specific processes our company has. No, mr. Zuckerberg. You said to me that you had not. I must have misspoke. I want to give you the room to misspeak, mr. Zuckerberg, but i ask you this very question. I ask all of you this question, and you told me a very different answer when we spoke. But i wont belabor it. Can i a number of you have talked about im sorry, x, ms. Yaccarino, have you talked to parents directly, young people about designing your product . As a new leader of x, the answer is, yes, ive spoken to them about the behavioral patterns because less than 1 of our users are in that age group, but, yes, i have spoken to them. Thank you, maam. Mr. Spiegel, there are a number of parents whose children have been able to access Illegal Drugs on your platform. What do you say to those parents . Well, senator, we are devastated that we cannot to the parents. What do you say to those parents, mr. Spiegel . I am so sorry we have not been able to prevent these tragedies. We work very hard to block all search terms related to drugs from our platform. We proactively look for and detect drugrelated content. We remove it from our platform, preserve it as evidence, and then we refer it to Law Enforcement for action. Weve worked together with nonprofits and with families on education campaigns because the scale of the epidemic is extraordinary. Over 100,000 people lost their lives last year, and we need those to know one pill can kill. That reached 260 million times on snapchat. Mr. Spiegel, there are two fathers in this room who lost their sons. They are 16 years old. Their children were able to get those pills from snapchat. I know that there are statistics, and i know that there are good efforts. None of those efforts are keeping our kids from getting access to those drugs on your platform. As california company, all of you, ive talked with you about what it means to be a Good Neighbor and what californian families and all families should be expecting from you. You owe them more than just a set of statistics. And i look forward to you showing up on all pieces of these legislation, all of you, showing up on all pieces of legislation to keep our children safe. Mr. Zuckerberg, i want to come back to you. 
I talked with you about being a parent to a young child who doesnt have a phone and, you know, is not on social media at all. And one of the things i am deeply concerned with, as a parent to a young black girl, is the utilization of filters on your platform that would suggest to young girls utilizing your platform that they are not good enough as they are. I want to ask more specifically, and refer to some unredacted Court Documents, that reveal that your own researchers concluded that these face filters that mimic Plastic Surgery negatively impact youth Mental Health. Why should we believe, why should we believe that you are going to do more to protect young women and young girls when you give them the tools to affirm the selfhate that is spewed across your platforms? Why is it that we should believe that you are committed to doing anything more to keep our children safe? Theres a lot to unpack there. There is a lot. We give people tools to express themselves in different ways, and people use face filters and different tools to make media and photos and videos that are fun or interesting across a lot of the different products that Plastic Surgery filters are good tools to express creativity? Senator, im not speaking to that. Skin lightening tools are tools that express creativity? This is the direct thing im asking about. Yeah. I am not speaking to any specific one of those. I think the ability to kind of filter and edit images is generally a useful tool for expression. For that specifically, im not familiar with the study youre referring to, but we did make it so that were not recommending this kind of content to i made no reference to a study, but to Court Documents that revealed your knowledge of the impact of these types of filters on young people, generally young girls. I generally disagree with that characterization. With Court Documents? I havent seen any documents. Okay. Mr. Zuckerberg, my time is up. I hope that you hear what is being offered to you and are prepared to step up and do better. I know this Senate Committee is going to do our work to hold you to greater account. Thank you, mr. Chair. Senator tillis. Thank you, mr. Chair. Thank you all for being here. I dont feel like im going to have an opportunity to ask a lot of questions, so im going to reserve the right to submit some for the record. But weve had hearings like this before. Ive been in the senate for nine years. Ive heard hearings like this before. Ive heard horrible stories about people who have died, committed suicide, been embarrassed. Every year we have an annual flogging, every year. And yet what has materially occurred over the last nine years? Do any of you all, just a yes or no question, do any of you all participate in an Industry Consortium trying to make this fundamentally safe across platforms? Yes or no, mr. Zuckerberg? Yes. Theres a variety of organizations. Do you participate? Which organizations? Does anyone here not participate in an industry consortium? I actually think it would be immoral for you all to consider it a strategic advantage to keep private something that would secure all these platforms and avoid this. Do you all agree with that? That anybody who would be saying you want ours because ours is the safest, and that you as an industry realize this is an existential threat if we dont get it right, right? I mean, youve got to secure your platforms. Youve got to deal with this. Do you not have an inherent mandate to do this? 
Because it would seem to me if you dont, youre going to cease to exist. I mean, we can regulate you out of business if we wanted to. The reason im saying this, it may sound like a criticism, but its not a criticism. I think we have to understand that there should be an inherent motivation for you to get this right. Or congress will make a decision that could potentially put you out of business. Heres the reason i have a concern with that, though. I just went on the internet while i was listening intently to all the other members speaking, and i found a dozen different platforms outside of the United States, 10 of which are in china, two of which are in russia. Their daily average active membership numbers in the billions. People say you cant get on chinas version of tiktok. It took me one quick search on my favorite Search Engine to find out exactly how i could get on this platform today. And so the other thing that we have to keep in mind, i come from technology. I could figure out, ladies and gentlemen, i could figure out how to influence your kid without them ever being on a social media platform. I can randomly send texts and get a bite, and then find out an email address and get compromising information. It is horrible to hear some of these stories, and ive had these stories occur in my hometown down in north carolina. But if we only come here and make a point today and dont start focusing on making a difference, which requires people to stop shouting and start listening and start passing language here, the bad actors are just going to be off our shores. I have another question for you all. How many people, roughly, and if you dont know the exact number its okay, how many people, roughly, do you have looking 24 hours a day at these horrible images and filtering them out? Just go real quick with an answer down the line. Its most of the 40,000 of our people who work on this. And, again? We have 2,300 all over the world. We have 40,000 trust and Safety Professionals around the world. We have approximately 2,000 people dedicated to trust and safety and content moderation. Our platform is much smaller than these folks. Its hundreds of people, and looking at content is 80 percent of our work, as i mentioned. These people have a horrible job. Many of them have to get counseling for all of the things they see. We have evil people out there, and were not going to fix this by shouting and talking past each other. Were going to fix this with yall being at the table and coming closer to what i heard one person say, supporting a lot of the good bills, like the one i hope senator blackburn mentions when she gets a chance to talk. Guys, if youre not at the table and securing these platforms, youre going to be on it. And the reason why im not okay with that is that if we ultimately destroy your ability to create value and drive you out of business, the evil people will find another way to get to these children. And i do have to admit, i dont think my mom is watching this one, but we cant look past the good that is occurring. My mom, who lives in nashville, tennessee, and i talked yesterday, and we talked about a Facebook Post that she made a couple of days ago. We dont let her talk to anybody else. That platform connects my 92yearold mother with her grandchildren and great grandchildren. It helps a kid who feels awkward at school get together with people and relate to people. Lets not throw out the good because we havent altogether focused on rooting out the bad. 
Now, i guarantee you, i could go through some of your governance downtowns and find a reason to flog every single one of you because you didnt place the emphasis on it that i think you should. But at the end of the day, i find it hard to believe that any of you people started this business, some of you in your college dorm rooms, for the purposes of creating the evil that is being perpetrated on your platforms. But i hope that every single waking hour youre doing everything you can to reduce youre not going to be able to eliminate it. I hope theres some enterprising young tech people out there today that are going to go to parents and say, ladies and gentlemen, your children have a deadly weapon. They have a potentially deadly weapon, whether its a phone or a tablet. You have to secure it. You cant assume that theyre going to be honest and say that they are 16 when theyre 12. We all have to recognize that we have a responsibility to play, and you guys are at the tip of the spear. So i hope that we can get to a point to where we are moving these bills. If you got a problem with them, state your problem. Lets fix it. No is not an answer. And know that i want the United States to be the beacon for innovation, to be the beacon for safety and to prevent people from using other options that have existed since the internet has existed to exploit people. And count me in as somebody that will try and help out. Thank you, mr. Chair. Thank you, senator tillis. Next is senator ossoff. Thank you, mr. Chairman, and thank you to our witnesses today. Mr. Zuckerberg, i want to begin by just asking a simple question which is do you want kids to use your platform more or less . Well, we dont want people under the age of 13 using do you want teenagers 13 and up to use your platform more or less . Well, we would like to build a product that is useful and that people want to use. My time is going to be limited. Do you want them to use it more or less, teenagers, 13 to 17 years old, do you want them using meta products more or less . Id like them to be useful enough that they want to use them more. You want them to use it more. I think herein we have one of the fundamental challenges. In fact, you have a fiduciary obligation, do you not, to try to get kids to use your platform more . It depends on how you define that. We obviously are a business, but mr. Zuckerberg, our time is its not itself evident that you have a fiduciary obligation to get your users, including users under 18 to use and engage with your platform more rather than less, correct . Over the long term. But in the near term, we often take a lot of steps, including we made a change to show less videos on the platform that reduced the amount of time by more than 50 million hours. Okay. But if your shareholders ask you, mark, i wouldnt mr. Zuckerberg here, but your shareholders might be on a firstname basis with you, mark, are you trying to get kids to use meta products more or less, you would say more, right . I think over the longterm were trying to create the 10k you filed with the s. E. C. , a few things you want to note, hee are some quote, and this is a filing you signed, correct . Yeah. Our Financial Performance has been and will continue to be significantly determined by our success in adding, retaining and engaging active users. Heres another quote. If our users decrease their level of engagement with our products, our revenue, Financial Results and business may be significantly harmed. Heres another quote. 
We believe that some users, particularly younger users, are aware of and actively engaging with other products and services similar to or as a substitute for ours. It continues, in the event that users increasingly engage with other products and services, we may experience a decline, in which case our business would likely be harmed. You have an obligation as the chief executive to get your team to get kids to use your platform more. Senator. Is that not selfevident? You have a fiduciary obligation to your shareholders to get kids to use your platform more? I think the thing thats not intuitive is that the direction is to make the products more useful so that people want to use them more. We dont give the teams running the instagram feed or the facebook feed a goal to increase the amount of time people spend. But you want your users engaging more and using the platform more. And i think this gets to the root of the challenge, because the overwhelming view of the public, certainly in my home state of georgia, and weve had some discussions about the underlying science, is that this platform is harmful for children. I mean, you are familiar with, not just your platform, by the way, social media in general, the 2023 report from the Surgeon General about the impact of social media on kids Mental Health, which cited evidence that kids who spend more than three hours a day on social media have double the risk of poor Mental Health outcomes, including depression and anxiety. Are you familiar with that Surgeon General report and the underlying study? I read the report, yes. Do you dispute it? No. But i think its important to characterize it correctly. I think what he was flagging in the report is that there seems to be a correlation, and obviously the Mental Health issue is very important. So its something that needs the thing is, everyone knows theres a correlation. Everyone knows that kids who spend a lot of time, too much time, on your platforms are at risk. And its not just the Mental Health issue. Let me ask you another question. Is your platform safe for kids? I believe it is. But theres hold on a second. Theres a difference between correlation and causation. Because were not going to be able to get anywhere. We want to work in a productive, open, honest and collaborative way with the private sector to pass legislation that will protect americans, that will protect American Children above all, and that will allow businesses to thrive in this country. If we dont start with an open, honest, candid, realistic assessment of the issues, we cant do that. The first point is you want kids to use the platform more. In fact, you have an obligation to. But if youre not willing to acknowledge its a dangerous place for children. The internet is a dangerous place for children, not just your platform, isnt it? Isnt the internet a dangerous place for children? I think it can be. Yeah. Theres both great things that people can do, and theres harms we need to work on. Its a dangerous place for children. There are families here who have lost their children. There are families across the country whose children have engaged in selfharm, who have experienced low selfesteem, who have been sold deadly pills on the internet. The internet is a dangerous place for children, and your platforms are dangerous places for children. Do you agree? I think that there are harms we need to work on. Okay. Im not going to. I think. Why not? Why not just acknowledge it? 
Why do we have to do the very careful i disagree with the characterization. That the internet is a dangerous place for the children . I think youre trying to characterize our product as inherently dangerous inherent or not, your products are places where children can experience harm, they can experience harm to their Mental Health, they can be sold drugs, they can be preyed upon by predators. You know, they are dangerous places. And yet you have an obligation to promote the use of these platforms by children. All im trying to suggest to you, mr. Zuckerberg, and my time is running short, is that in order for you to succeed, you and your colleagues here, we have to acknowledge these basic truths. We have to be able to come before the American People, the american public, the people in my state of georgia and acknowledge the internet is dangerous, including your platforms. There are predators lurking. There are drugs being sold. There are harms to Mental Health that are taking a huge toll on kids quality of life. And yet you have this incentive not just you, mr. Zuckerberg, all of you have an incentive to boost maximize use, utilization and engagement, and that is where Public Policy has to step in to make sure that these platforms are safe for kids so kids are not dying, so kids are not overdosing, so kids are not cutting themselves or killing themselves because theyre spending all day scrolling instead of playing outside. And i appreciate all of you for your testimony. We will continue to engage as we develop this legislation. Thank you. Senator from tennessee. Thank you, mr. Chairman. Thank you, you four, to each of you for coming. And i know some of you had to be subpoenaed to get here. But we do appreciate that you all are here. Mr. Chew, i want to come to you first. Weve heard that youre looking at putting a headquarters in nashville. And likewise in Silicon Valley and seattle. And what youre going to find probably is the welcome mat is not going to be rolled out for you in nashville like it would be in california. There are a lot of people in tennessee that are very concerned about the way tiktok is basically building dossiers on our kids, the way they are building those on their virtual you. And also this ad information is held in china, in beijing, as you responded to senator blumer thal and i last year in reference to that question. And we also know that a major music label yesterday said they were pulling all of their content off your site because your issues on payment, on Artificial Intelligence and because of the negative impact on our kids Mental Health. So we will see how that progresses. Mr. Zuckerberg, i want to come to you. We have just had senator blumenthal and i, of course, have had some internal documents and emails that have come our way. One of the things that really concerned me is that you referred to your young users in terms of their lifetime value of being roughly 270 per teenager. And each of you should be looking at these kids. Their tshirts theyre wearing today say im worth more than 270. Weve got some standing up in those tshirts. [ applause ] now, and some of the children from our state, some of the children, the parents that we have worked with, just to think whether it is becca schmidt, david mollick, sarah glad, emily schote, would you say that life is only worth 270 . What could possibly lead you i mean, i listen to that i know youre a dad, im a mom, im a grandmom, and how could you possibly even have that thought . It is astounding to me. 
And i think this is one of the reasons that states, 42 states, are now suing you because of features that they consider to be addictive that you are pushing forward. And in the emails that weve got from 2021, that go from august to november, there is the staff plan that is being discussed. Antigone davis, alex schultz, and adam mosseri are all on this chain of emails on the wellbeing plan. And then we get to one, nick did email mark to emphasize his support for the package, but it sounds like it lost out to various other pressures and priorities. See, this is what bothers us. Children are not your priority. Children are your product. Children you see as a way to make money. And when it comes to protecting children in this virtual space, you made a conscious decision, even though nick clegg and others were going through the process of saying this is what we do. These documents are really illuminating. And it just shows me that growing this business, expanding your revenue, what you are going to put on those quarterly filings, that was the priority. The children were not. Its very clear. I want to talk with you about the pedophile ring, because that came up earlier, and the wall street journal reported on that. And one of the things that we found out was, after that became evident, you didnt take that content down, and it was content that showed that teens were for sale and were offering themselves to older men. And you didnt take it down because it didnt violate your Community Standards. Do you know how often a child is bought or sold for sex in this country? Every two minutes. Every two minutes a child is bought or sold for sex. Thats not my stat. That is a TBI stat. Now, finally, this content was taken down after a congressional staffer went to metas global head of safety. So would you please explain to me and to all these parents why explicit predatory content does not violate your platforms terms of service or your Community Standards? Sure, senator. Let me try to address all of the things you just said. It does violate our standards. We work very hard to take it down. You didnt take it down. Well, weve reported, i think, more than 26 million examples of. You didnt take it down until a congressional staffer brought it up. It may be that in this case we made a mistake and missed something. I think you make a lot of mistakes. But lets move on. I want to talk with you about your instagram Creators Program and about the push, which we found out about through these documents, that you are actually making because you want to bring kids in early. You see these younger tweenagers as a valuable but untapped audience, quoting from the emails, and suggest teens are actually household influencers who can bring their younger siblings onto your platform, onto instagram. Now, how can you ensure that instagram creators, your product, your program, does not facilitate illegal activities when you fail to remove content pertaining to the sale of minors? And it is happening once every two minutes in this country. Senator, our tools for identifying that kind of content are industry leading. That doesnt mean were perfect. There are definitely issues that we have. But we continue. Mr. Zuckerberg, yes, there is a lot that is slipping through. It appears that youre trying to be the premier sex trafficking site. Of course not, senator. Senator, thats ridiculous. No, it is not ridiculous. We dont want that content on your platforms. Why dont you take it down? We do take it down. 
We are here discussing you all need to work with us. No, youre not. You are not. And the problem is weve been working on this senator welch is over there. Weve been working on this stuff for a decade. You have an army of lawyers and lobbyists that have fought us on this every step of the way. You worked with netchoice, the cato institute, the Taxpayers Protection Alliance and chamber of progress to actually fight our Bipartisan Legislation to keep kids safe online. So are you going to stop funding these groups . Are you going to stop lobbying against this and come to the table and work with us . Yes or no. Senator, we have [ applause ] senator, we have yes or no. Of course well work with you on the legislation. Okay. The door is open. Weve got all these bills. You need to come to the table, each and every one of you need to come to the table. And you need to work with us. Kids are dying. [ applause ] senator welch. I want to thank my colleague, senator blackburn, for her decade of work on this. I actually have some optimism. There is a consensus today that didnt exist, say, 10 years ago that there is a profound threat to children, to Mental Health, to safety. Theres not a dispute. That was in debate before. Thats a starting point. Secondly, were identifying concrete things that can be done in four different areas. One is Industry Standards. Two is legislation. Three are the courts. And then four is a proposal that senator bennet, senator graham, myself and senator warren have to establish an agency, a Governmental Agency whose responsibility would be to engage in this on a systemic regular basis with proper resources. And i just want to go through those. I appreciate the Industry Standard decisions and steps that youve taken in your companies. But its not enough, and thats what i think youre hearing from my colleagues. Like, for instance, where there are layoffs is in the trusted the trust and verify programs. Thats alarming because it looks like there is a reduction in emphasis on protecting things, like you just added, ms. Yaccarino, 100 employees in texas in this category. And how many did you have before . The company is just coming through a significant restructuring. So weve increased the number of trust and safety employees and agents all over the world by at least 10 so far in the last 14 months. And we will continue to do so, specifically in austin, texas. Mr. Zuckerberg, my understanding is that there have been layoffs in that area as well, theres added jobs there at twitter, but at meta, have there been reductions in that . There have been across the board, not really focused on that area. I think our investment is relatively consistent over the last couple of years. We invest almost 5 billion in this work last year, and i think this year will be on the same order of magnitude. Another question thats come up is when its the horror of any user of your platforms, somebody has an image on there thats very compromising, often of a sexual nature. Is there any reason in the world why a person who wants to take that down cant have the very simple sameday response to have it taken down . Ill start with twitter on that. Im sorry, senator, i was taking notes. Could you repeat the question . Well, theres a lot of examples of a young person of finding out about an image that is of them and really compromises them and actually can create suicidal thoughts, and they want to call up or they want to send an email and say take it down. 
Why is it not possible for that to be responded to immediately? We all strive to take down any type of violative content or disturbing content immediately. At x we have increased our capabilities and our reporting processes. If im a parent or im a kid, and i want this down, shouldnt there be methods in place where it comes down? You can see what the image is. Yeah. An ecosystemwide standard would actually improve and enhance the experience for users across all our platforms. There is actually an organization that i think a number of the companies that are up here are a part of, called take it down. So you all are in favor of that. Thats going to give some peace of mind to people, all right? It really, really matters. I dont have that much time. So weve talked about the legislation, and senator whitehouse had asked you to get back with your position on section 230, which ill go to in a minute. I would welcome each of you responding as to your companys position on the bills that are under consideration in this hearing, all right? Im just asking you to do that. Third, the courts. This big question of section 230, and today im pretty inspired by the presence of the parents, who have turned their extraordinary grief into action in hopes that other parents may not have to suffer what is for them, for everyone, a devastating loss. Senator whitehouse asked you all to get back very concretely about section 230 and your position on that. But its an astonishing benefit that your industry has that no other industry has. They just dont have to worry about being held accountable in court if theyre negligent. So you got some explaining to do. And im just reinforcing senator whitehouses request that you get back specifically about that. And then finally, i want to ask about this notion, this idea of a federal agency that is resourced and whose job is to deal with Public Interest matters that are really affected by big tech. Its extraordinary what has happened in our economy with technology, and your companies represent innovation and success. But just as when the railroads were ascendant and were in charge and ripping off farmers because of practices they were able to get away with, just as when wall street was flying high but there was no one regulating before the blue sky laws, we now have a whole new world in the economy. And, mr. Zuckerberg, i remember you testifying in the energy and commerce committee, and i asked you your position on the concept of a federal regulatory agency. My recollection is that you were positive about that. Is that still the case? I think it could be a reasonable solution. There are obviously pros and cons to doing that versus the current structure of having different regulatory agencies focused on specific issues. But because a lot of the things trade off against each other, like one of the topics we talked about today is encryption, and thats really important for privacy and security. Right. Can we just go down the line? Im at the end. Senator, i think the Industry Initiative to keep those conversations going would be something that x would be very, very proactive about. If you talk about the shield act, the stop csam act, or the Project Safe Childhood act, i think our intentions are clear to participate in what is needed here. Senator, we support National Privacy legislation, for example. So that sounds like a good idea. We just need to understand what it means. All right. Mr. Spiegel. 
Senator, well continue to work with your team, and wed certainly be open to exploring the right regulatory body for the technology. The regulatory body is something you can see has merit. Yes, senator. And mr. Citron? Were very open to working with you, our peers, and anybody for making the internet a safer place. As you mentioned, this is not a one platform problem. Right. We do look to collaborate with other properties. Thank you, all. Mr. Chairman, i yield back. Thank you, senator welch. Well, were going to conclude this hearing. Thank you all for coming today. You probably have your scorecard out there. Youve met at least 20 members of this committee and have your own impressions of their questioning, their approach and the like. But the one thing i want to make clear as chairman of this committee for the last three years is that this was an extraordinary vote on an extraordinary issue a year ago. We passed five bills unanimously in this committee. You heard all the senators. Every spot on the political spectrum was covered, and every single senator voted unanimously in favor of the five pieces of legislation weve discussed today. It ought to tell everyone who follows capitol hill and washington a pretty stark message. We get it. And we live it. As parents and grandparents, we know what our daughters and sons are going through. They cannot cope, they cannot handle this issue on their own. Theyre counting on us as much as theyre counting on the industry to do the responsible thing. And some may leave with their own impressions of the companies represented here. Thats your right as an american citizen. You also must help keep the spotlight on us to do something, not just to hold a hearing and bring out a good, strong crowd of supporters for change, but to get something done. No excuses. No excuses. Weve got to bring this to a vote. Look, what i found in my time in the house and senate is that thats the moment of reckoning. Speeches notwithstanding, press releases and the like, the moment of reckoning is when you call a vote on these measures. Its time to do that. I dont believe theres ever been a moment in americas wonderful history where a business or industry has stepped up and said regulate us, put some legal limits on us. Businesses exist by and large to be profitable. And i think that we gotta get beyond that and ask, profitability at what cost? Senator kennedy, our republican colleague, asked, is our technology greater than our humanity? I think that is a fundamental question that he asked. What i would add to it is, are our politics greater than technology? Were going to find out. I want to thank a few people before we close up here. Ive got several staffers who worked so hard on this, alexandria gelbert, thank you very much to alexandria. Jeff hanson. [ applause ] Last point ill make, mr. Zuckerberg, is just a little advice to you. I think your Opening Statement about Mental Health needs to be explained, because i dont think it makes any sense. There isnt a parent in this room whos had a child thats gone through an Emotional Experience like this that wouldnt tell you and me, they changed right in front of my eyes. They changed. They hole themselves up in their room. They no longer reach out to their friends. Theyve lost all interest in school. These are Mental Health consequences that i think come with the abuse of this right to have access to this kind of technology. So i see my colleague. You want to say a word? I think it was a good hearing. I hope something positive comes from it. Thank you all for coming. 
The hearing record is going to remain open for a week for statements and questions submitted by senators by 5:00 p. M. On wednesday. Once again, thanks to the witnesses for coming. The hearing stands adjourned. 

