We will come to order, please. This meeting will come to order. Thank you. This committee gathers to discuss what the technology industry is doing to remove violent and extremist content from their platforms. This is a matter of serious importance to the safety and wellbeing of our nation's communities. I sincerely hope we can engage in a collaborative discussion about what more can be done within the jurisdiction of this committee to keep our communities safe from those wishing to do us harm. We will hear from representatives of several of the world's largest social media companies. Over the past two decades, the United States has led the world in the development of social media and other services that allow people to connect with one another. Open platform providers like Google, Twitter, and Facebook, and products like Instagram and YouTube, have dramatically changed the way we communicate. They have been used positively, allowing like-minded groups to come together and to shed light on abuses of power around the world. But no matter how great the service these platforms provide, they can be used for evil at home and abroad. On August 3, 2019, 20 people were killed and more than two dozen were injured in a mass shooting at an El Paso shopping center. Police have said that they are reasonably confident that the suspect posted a manifesto to a website called 8chan prior to the shooting. The moderators removed the user's original post, though others continued sharing copies. Following the shooting, President Trump called on social media companies to work in partnership with local, state, and federal agencies to develop tools that can detect mass shooters before they strike. I certainly hope we talk about that challenge today. Sadly, the El Paso shooting was not an isolated incident. On March 15, 51 people were killed in shootings at two mosques in Christchurch, New Zealand. The perpetrator filmed the attacks using a body camera and livestreamed the footage to Facebook followers, who began to re-upload the footage to other sites. Facebook and other platforms moved quickly to block access to the footage. Facebook stated that it removed 1.5 million videos of the massacre within 24 hours of the attack, and that 1.2 million of those videos were blocked before they could be uploaded. Like the El Paso shooter, the Christchurch shooter also uploaded a manifesto to 8chan. The 2016 shooting at the Pulse nightclub in Orlando, Florida, killed 49 people and injured 53 more. The Orlando shooter was radicalized by ISIS and other jihadist propaganda sources online. Days after the attack, the F.B.I. stated that investigators were highly confident that the shooter was self-radicalized through the internet. According to an official in the investigation, analysis of the shooter's devices revealed that he had consumed "a hell of a lot" of jihadist propaganda, including videos of ISIS beheadings. Family members of victims brought a federal lawsuit against three social media platforms under the Anti-Terrorism Act. The circuit court dismissed the lawsuit on grounds related to whether this constituted an act of international terrorism under the Act. With 3.2 billion internet users worldwide, this committee recognizes the challenge facing social media companies and online platforms as they act to remove content threatening violence from their sites. There are questions about whether tracking of users' online activity invades individuals' privacy, denies due process, or violates constitutional rights. Removal of threatening content may also impact an online platform's ability to detect warning signs. The First Amendment offers strong protections against restricting certain speech, which undeniably adds to the complexity of our task.
I hope these witnesses will speak to these challenges and how their companies are navigating them. In our connected internet society, misinformation, viral fake news, deep fakes, and online conspiracy theories have become the norm. This hearing is an opportunity for witnesses to discuss how their platforms go about removing content and material that threatens violence and poses a potential and immediate danger to the public. Our witnesses will also discuss how their content moderation processes work. This includes addressing how human review or technological tools are employed to remove or limit violent content, and how they coordinate with law enforcement officials at the federal, state, and local levels to protect our neighborhoods and communities. We would like to know how they are coordinating with law enforcement when violent or threatening content is identified. And finally, I hope witnesses will discuss how Congress can assist in ongoing efforts to remove content promoting violence from online platforms, and whether best practices or codes of conduct in this area would help increase safety both online and off. So I look forward to hearing testimony from our witnesses and engaging in a constructive discussion about potential solutions to a pressing issue. And I am delighted at this point to recognize my friend and Ranking Member, Senator Cantwell. Thank you, Mr. Chairman, for holding this hearing. Across the country we are seeing a surge of hate. As a result, we need to think much harder about the tools and resources that we have to combat this problem online and offline. While the First Amendment to the Constitution protects free speech, speech that incites violence is not protected, and we should review and strengthen laws to make sure we stop the online behavior that does incite violence. In testimony before a Senate committee in July, the Director of the F.B.I. said that white supremacist violence is on the rise. He said the F.B.I. takes this extremely seriously and has made over a hundred arrests so far this year. We are seeing this in my state. Over the last several years, we have seen a shooting at a Jewish community center, the shooting of a Sikh man in Kent, Washington, a bombing attempt at the Martin Luther King Jr. Day parade in Spokane, and over the last year we have seen a rise in attacks on both synagogues and mosques. The rise in hate across the country has also led to multiple deadly shootings, including at the Tree of Life congregation in Pittsburgh, the Pulse nightclub in Orlando, and the Walmart in El Paso. The shooter at one high school posted an image of himself and guns on Instagram prior to the attack on his fellow students. The killer in El Paso published a white supremacist, anti-immigration manifesto. My colleague just mentioned the live streaming of the Christchurch shooting, a horrific incident that included content promoting violence against Muslims. These human lives were all cut short by the deep hatred and extremism that we have seen become a growing problem. This is a particular problem on the dark web, where we see websites that host 24/7, 365 hate rallies. Adding technology tools to mainstream websites to stop the spread of these dark websites is a start, but this needs to go further. I believe calling on the F.B.I. and the Department of Justice to make sure that we are working across the board, on an international basis, with companies as well, to fight this issue is an important thing to be done. We don't want to push people off social media platforms only for them to then be on the dark web, where we have less visibility into them.
We need to do more at the Department of Justice to shut down these dark web sites, and social media companies need to work with us to make sure we are doing this. I do want to mention, just last week, as there was discussion in Washington about initiatives, that the state of Washington has passed three gun initiatives by a vote of the people, closing loopholes related to background checks and gun sales and enacting extreme risk protection laws, voted on by a majority in our state and successfully passed. Representatives from various companies of all sizes in the technology industry sent a letter supporting passage of bills requiring background checks, so I very much appreciate that, and your support of extreme risk protection laws to keep guns out of the hands of people who a court has determined would be dangerous in possession of them. This morning we look forward to asking you about ways in which we can better fight these issues. I want us to think about ways in which we can all work together to address these issues. I feel that, working together, there are successful tools we can deploy in trying to fight the extremism that exists online. Thank you, Mr. Chairman, for the hearing. Thank you very much. We will now hear oral testimony from our four witnesses, and your entire statements will be submitted for the record without objection. We ask you to limit comments at this point to five minutes. Ms. Bickert, you are recognized. Thank you for being here. Ms. Bickert Thank you, Chairman Wicker, Ranking Member Cantwell, distinguished members of the committee. Thank you for the opportunity to be here today, and to answer your questions and explain our efforts in these areas. My name is Monika Bickert, Facebook's Vice President for Global Policy Management and counterterrorism. I am responsible for our rules around content on Facebook and our company's response to would-be attempts by terrorists to use our services. On behalf of everyone at Facebook, I would like to begin by expressing my sympathy and solidarity with the victims, families, communities, and everybody affected by the recent, terrible attacks across the country. In the face of such acts, we remain committed to assisting law enforcement and standing with the community against hate and violence. We are thankful to provide a way for those affected by this horrific violence to communicate with loved ones, organize events for people to gather and grieve, raise money to help support communities, and begin to heal. Our mission is to give people the power to connect with one another and build community, but we know people need to be safe in order to build that community, and that is why we have rules in place against harmful conduct, including hate speech and inciting violence. Our goal is to ensure Facebook is a place where people can express themselves, but where they are also safe. We are not aware of any connection between the recent attacks and our platform, but we certainly recognize we all have a role to play in keeping our community safe. That's why we remove content that encourages real-world harm. This includes content involving violence or incitement, promoting or publicizing crime, coordinating harmful activities, or encouraging suicide or self-injury. We don't allow any individuals or organizations who proclaim a violent mission, advocate for violence, or are engaged in violence to have any presence on Facebook, even if they are talking about something unrelated.
This includes organizations and individuals involved in or advocating for terror activities, domestic and international, organized hate, including white supremacy, white separatism or white nationalism, or other violence. We also don't allow any content posted by anyone that praises or supports these individuals or organizations or their actions. When we find content that violates our standards, we remove it promptly. We also disable accounts when we see severe or repeated violations, and we work with law enforcement directly when we believe there is a risk of physical harm or a direct threat to public safety. While there is always room for improvement, we already remove millions of pieces of content every year for violating our policies, and much of that is before anyone has reported it to us. Our efforts to improve our enforcement of these policies are focused on three areas. First, building new technical solutions that allow us to proactively identify content that violates our policies. Second, investing in people who can help us implement these policies. At Facebook, we have over 30,000 people across the company working on safety and security efforts. This includes over 350 people whose primary focus is countering hate and terrorism. Third, building partnerships with other companies, civil society, researchers, and governments, so that together we can come up with shared solutions. We are proud of the work we have done to make Facebook a hostile place for people advocating violence, but the work will never be complete. We know bad actors will continue to attempt to skirt detection with more sophisticated efforts, and we are dedicated to continuing to advance our progress. We look forward to working with the committee, regulators, others in the tech industry, and civil society to continue this progress. Again, I appreciate the opportunity to be here today, and look forward to your questions. Thank you. Thank you very much. Mr. Pickles. Mr. Pickles Chairman Wicker, Ranking Member Cantwell, members of the committee, thank you for the opportunity to discuss these important issues. Twitter has publicly committed to improving the collective health, openness, and stability of public conversation on our platform. Our policies are designed to keep people safe on Twitter, and they continuously evolve to reflect the realities of the world we operate in. We are working faster and investing more to remove content that detracts from a healthy conversation before it is reported. Tackling terrorism and extremism and preventing attacks requires responses from every quarter, including from social media companies. Let me be clear: Twitter is incentivized to keep terrorist and violent content off our service, both from a business standpoint and under current legal frameworks. Such content doesn't serve our business interests, breaks our rules, and is fundamentally contrary to our values. Communities in America and around the world have been impacted by mass violence, terrorism, and violent extremism with tragic frequency in recent years. These events demand a robust public policy response from every quarter. We acknowledge technology companies have a role to play. However, it is important to recognize content removal alone can't solve these issues. I would like to outline Twitter's key policies in this area. First, Twitter takes a zero-tolerance approach to terrorist content. Individuals may not promote terrorism, engage in terrorist recruitment, or engage in terrorist acts. We have suspended over 1.5 million accounts for violations of rules connected to terrorism,
and we continue to see more than 90% of these accounts suspended through our own proactive measures. In the majority of cases, we take action at the account creation stage, before the account has even tweeted, and the remaining 10% is identified through user reports and partnerships. Secondly, we prohibit use of Twitter by violent extremist groups, defined in our rules as groups that, in statements on or off the platform, use or promote violence against civilians to further their cause, whatever the ideology. Since the introduction of this policy in 2017, we have taken action on over 186 groups globally, suspending over 2,000 unique accounts. Thirdly, Twitter doesn't allow hateful conduct on our service. An individual on Twitter is not permitted to threaten, promote violence against, or direct attacks at people based on protected characteristics. When these rules are broken, we take action to remove the content and will permanently remove those promoting terrorism or violent extremism. Fourthly, our rules prohibit the selling, buying, or facilitating of transactions in weapons, including firearms, ammunition, and explosives, or instructions on making weapons, explosive devices, or 3D-printed weapons. We will take appropriate action on any accounts engaging in this activity, including permanent suspension where appropriate. Additionally, we prohibit the promotion of weapons and weapon accessories globally through our paid advertising policies. Collaboration with our industry peers and civil society is critically important to address the threats of terrorism globally. In June 2017, we launched the Global Internet Forum to Counter Terrorism. This facilitates, among other things, information sharing, technical cooperation, and research collaboration, including with academic institutions. Twitter and technology companies have a role to play in addressing violence and in ensuring our platforms can't be exploited by those promoting violence. But this cannot be the only public policy response, and removing content alone won't stop those determined to cause harm. Quite often, when we remove content from our platforms, it moves those views and ideologies into the dark corners of the internet, where they cannot be challenged and held to account. As our peer companies improve their efforts, and content continues to migrate to less-governed platforms and services, we are committed to learning and improving; every part of the online ecosystem has a part to play. Addressing mass violence requires a whole-of-society response, and we welcome the opportunity to work with industry peers, government institutions, legislators, law enforcement, academics, and civil society to find the right solutions. Thank you. Mr. Slater. Mr. Slater Chairman Wicker, Ranking Member Cantwell, distinguished members of the committee, thank you for the opportunity to appear before you today. My name is Derek Slater, Global Director of Information Policy at Google. In that capacity, I lead a team advising the company on public policy frameworks to deal with online content, including hate speech, extremism, and terrorism. Before I begin, I would like to take a moment on behalf of everyone at Google to express our horror at the tragic attacks in El Paso, in Ohio, and elsewhere, and to express our condolences.
Google's services were not involved in these incidents, but we have engaged with the White House, Congress, and governments around the globe on the steps we are taking to make sure that our platforms are not used to support hate speech or incite violence. In my testimony today, I will focus on three key areas where we are making progress. First, how we work with government and law enforcement. Second, our efforts to prohibit the promotion of products causing damage, harm, or injury. Third, enforcement of our policies around terrorism and hate speech. First, Google engages in ongoing dialogue with law enforcement agencies to understand the threat landscape and respond to threats to the safety of our users and the public. When we have a good-faith belief that there is a threat to life or of serious bodily harm made on our platform in the United States, the Google cybercrime investigation group will report it to the Northern California Regional Intelligence Center. That center will quickly get the report into the hands of officers who can respond. The cybercrime investigation group is on call 24/7. We are committed to working with government, civil society, and academia. Since 2017, we have done this through the Global Internet Forum to Counter Terrorism. Recently it introduced joint incident protocols to respond to emerging or active events. Second, we take the threat posed by gun violence in the United States very seriously, and our advertising policies have long prohibited the promotion of weapons, ammunition, or similar products causing damage, harm, or injury. We also prohibit the promotion of instructions for making guns, explosives, or other harmful products, and we employ proactive and reactive measures to ensure our policies are appropriately enforced. We are constantly improving enforcement procedures, including enhancing automated systems and updating incident management and manual review procedures. Third, on YouTube we have rigorous policies and programs to defend against the use of our platform to spread hate or incite violence. Over the last three years, we have invested heavily in machines and people to quickly identify and remove content violating our policies. This includes machine learning technologies to enforce our policies at scale, hiring over 10,000 people across Google to detect, review, and remove content, employing experts who proactively look for new trends, improving escalation pathways for NGOs and governments to let us know about content, and finally, going beyond removal by actively creating programs to promote beneficial counterspeech, like the Creators for Change program. This broad work has led to tangible results. Over 87% of the 9 million videos we removed in the second quarter of 2019 were first flagged by automated systems. More than 80% of those were removed before they received a single view. Videos violating our policies generate a fraction of a percent of views on YouTube. We are constantly looking for new ways to improve our policies. YouTube recently further updated its hate speech policy. The policy specifically prohibits videos alleging a group is superior in order to justify segregation or exclusion based on qualities like age, gender, caste, religion, sexual orientation, or veteran status. We have seen a 5x spike in removals and channel terminations under the hate speech policy. We take safety seriously and value our collaborative relationship with law enforcement and government agencies. We want to be responsible actors and part of the solution.
As the issues evolve, we will continue investing in the people and technologies needed to meet the challenge. We look forward to collaborating with the committee as it examines these issues. Thank you for your time, and I look forward to your questions. Thank you very much. Mr. Selim, your group prefers to be known as ADL these days, correct? Mr. Selim The Anti-Defamation League goes by ADL for short. We appreciate you being with us today, and we are happy to receive your testimony. Mr. Selim Thank you for the opportunity to be here with the distinguished members of the committee. My name is George Selim, Senior Vice President for Programs at ADL, the Anti-Defamation League. For decades, ADL has fought against bigotry and antisemitism, exposing extremist groups and individuals who spread hate and incite violence. Today, ADL is the foremost non-governmental authority on domestic terrorism, hate groups, and hate crimes. I have served in several roles in the government's national security apparatus, at the Department of Justice and the Department of Homeland Security, on the National Security Council, and now outside government on the front lines of combating antisemitism and all forms of bigotry at ADL. In my testimony I would like to share with you key data, findings, and analysis, and urge the committee to take action to counter a severe national security threat: the threat of online white supremacist extremism that endangers our communities. The alleged El Paso shooter posted a manifesto to 8chan prior to the attack. He expressed support for the accused shooter in Christchurch, New Zealand, who also posted on 8chan. Before the massacre in Poway, California, the alleged shooter posted a link to his manifesto citing the terrorists in New Zealand and the Pittsburgh Tree of Life attack. Three killing sprees, three white supremacist manifestoes. One targeted Muslims, another targeted Jews, and a third targeted immigrant communities. One thing these three killers had in common was 8chan, an online platform that has become the go-to for many bigots and extremists. Unfettered access to online platforms, both fringe and mainstream, has significantly driven the scale, speed, and effectiveness of these forms of extremist attacks. Our research shows domestic extremist violence is trending up, and antisemitic hate is trending up. FBI and DOJ data show similar trends. The online environment today amplifies hateful voices worldwide and facilitates coordination, recruitment, and propaganda that fuels the extremism that terrorizes our communities, all of our communities. Whether from the government, the private sector, or civil society, immediate action is needed to prevent the next tragedy that could take innocent lives. ADL has worked with the platforms at this table to address hate and its rampant nature online. We have been part of the conversations to improve terms of service, content moderation programs, and better support for individuals experiencing hate and harassment on those platforms. We appreciate this work greatly, but much more needs to be done. ADL has called on the companies at this hearing, and many others, to be far more transparent about the nature of hate on their platforms. We need meaningful transparency to get actionable information to policymakers and stakeholders. But the growth of hate and extremist violence won't be solved by addressing issues online alone. We urge this committee to take immediate action. First, our nation's leaders must clearly and forcefully call out bigotry in all of its forms at every opportunity.
Second, law enforcement must make enforcing hate crime laws a top priority. Our communities need Congress to act in a range of ways, notably to address domestic terrorism and extremism and to create extensive, comprehensive reporting, similar to that required in the Domestic Terrorism Data Act. Our federal legal system lacks the means to prosecute a white supremacist terrorist as a terrorist. Congress should explore whether it is possible to craft a rights-protecting domestic terrorism statute. Any such statute would need to include specific, careful congressional and civil liberties oversight to ensure the spirit of such protection is faithfully executed. In addition, the State Department should examine whether certain foreign white supremacist groups meet the criteria for designation as foreign terrorist organizations. We look forward to social media companies expanding terms of service, exploring accountability and governance challenges, aspiring to greater transparency in how you address these issues, and partnering with civil society groups to help in all these efforts. ADL stands ready, with the government and private sector, to better address all forms of threats online. This is an all-hands-on-deck moment to protect our communities. I look forward to your questions. Thank you. Thank you, Mr. Selim. Ms. Bickert, Mr. Pickles, Mr. Slater: on your platforms, how do you define violent content? How do you define extremist content? Ms. Bickert Thank you, Mr. Chairman. We will remove any content that celebrates a violent act or the serious physical injury or death of another person. We will also remove any organization that has proclaimed a violent mission or is engaged in acts of violence. We also don't allow anybody who has engaged in organized hate to have a presence on the site, and we remove hate speech. Hate speech we define as an attack on a person based on his or her characteristics, like race, religion, sexual orientation, or gender. We list them in our policies. It is harder to define extreme than violent, isn't that correct? Ms. Bickert We see people use that word in different ways. What we do is, any organization that has proclaimed a violent mission or has documented acts of violence, we remove them. It doesn't matter the reason; we don't allow the violence, period. Mr. Pickles, what is your platform's definition of extreme? Mr. Pickles Similar to Facebook, we agree that extremism itself is subjective. In some cases, people could just be extremely active on issues. We have a three-stage test for violent extremist groups. The test is: we identify their stated purpose, publications, or actions as extremist; they are engaged in violence or promoting violence as a way to further their cause; and they target civilians. We have those three stages, the ideology, the violence, and the targeting of civilians, because we believe that framework allows us to protect speech and debate but also remove violent extremists from the platform. We also have a broader framework covering threats of violence, calls to arms, and wishes of violence against other people, which is not dependent on ideology. Mr. Slater, can you add any nuances? Mr. Slater Broadly similar. We ban designated foreign terrorist organizations from using our platform, as well as hate speech, so along broadly similar lines. Mr. Selim has suggested that your three platforms need to be more transparent. What do you say to that, Mr. Slater? Mr. Slater Thank you, Chairman.
I think that transparency is the bedrock of the work we do, particularly around online content, trying to help people understand what the rules are and how we are enforcing them. It is something we need to get better at, and I look forward to working with this committee and Mr. Selim and others. In the last year we have provided on YouTube a Community Guidelines enforcement report, where you can see how many videos we removed in a quarter, for what reasons, and which were flagged by machines versus users, and we break that down by violent extremism, hate speech, health and safety, and other issues. We think this is a key area that we look forward to improving. Mr. Selim, before I ask Ms. Bickert and Mr. Pickles to respond, perhaps you could help them understand how you frankly don't believe they are quite transparent enough at this point? Mr. Selim Thank you. To be clear, the point I am making on transparency is to make sure there are more clearly delineated categories between the point that Mr. Slater was making, in terms of what the machines or algorithms remove or stop from going up in the first place, and what users on any of these platforms flag as a violation of the terms of service. There are degrees of inconsistency across the platforms at this table, as well as others. To get a holistic picture of what a certain issue might be, what individuals might flag versus what algorithms pull down, there are inconsistencies in that. So when we ask for transparency, we are looking for a much more balanced approach in that, across all the platforms. Mr. Pickles, is he touching on something that has a point? Mr. Pickles Absolutely. I think that balance, particularly for companies investing in technology, understanding what came down because a person saw it and reported it versus because technology found it, is very important. We publish a breakdown across six policy areas; the number of user reports we receive is about 11 million every year, and 60% of the content we remove is because technology found it, not because of a user report. So telling that story in a meaningful way is a challenge. What is the percentage at Facebook, Ms. Bickert? Ms. Bickert Mr. Chairman, when it comes to violent content and terror content, more than 99% of what we remove is flagged by our technical tools. By artificial intelligence? Ms. Bickert Some of it is artificial intelligence. Some is image matching. So for known videos, we use software to reduce the video to a digital fingerprint, and we can stop uploads of that video again. We have worked with ADL for years on this, and transparency is key; we would all agree. For the last year and a half we have published not only our detailed guidelines for exactly how we define hate speech and violence, but also reports on exactly how much we are removing, by category, and how much of that, like Mr. Pickles said, is flagged by technical tools before we get user reports. Thank you very much. Senator Cantwell? Thank you, Mr. Chairman. Mr. Selim, I think you mentioned 8chan, but what do you think we need to do to monitor incitement on 8chan and other dark websites? Mr. Selim You can really approach this from two categories.
There are a number of increased measures, some of which I noted in my written statement submitted to the committee, that these companies and others can take to create a greater degree of transparency and standards, so we can have a really accurate measure of the types of hatred and bigotry that exist in the online environment writ large. As a result of that increased or better data, we can make policies that apply to content moderation, terms of service, et cetera. So really, having good data is a framework for better policies and better applications of content moderation programs. So you say there is more they can do, the social media companies? Mr. Selim Yes. There is much more they can do. I see in your statement you included auditing and third-party evaluations for that transparency. As I mentioned in my opening, you don't want to basically drive us all to a dark web we have less access to. What more should we be doing, together, to address the hate taking place on these darker websites, too? Mr. Selim A number of measures. The first is having our public policy start from a place where we are victim-focused. We know, whether it is Pittsburgh, Poway, El Paso, or any of the other cities members of the panel and committee members have mentioned, that we need policies that combat extremism and domestic terrorism with the goal of preventing other such tragedies. We need to start from a place with a better understanding of hate crimes, bias-motivated crimes, et cetera. When we start from that place, we can make better policies and programs at the federal, state, and local levels of government, and in private industry as well. One of the reasons I will be calling on the Department of Justice to ask what more we can do: several years ago, Interpol, Microsoft, and others worked on trying to address child pornography internationally, to better police those crimes online. I would assume that the representatives today would be supportive, maybe helpful, maybe even financially helpful, in trying to address these crimes that they view today as hate crimes on the dark side of the web. Do I have any responses from the companies here? Ms. Bickert Thank you, Senator Cantwell. This is something that across the industry we have worked on for the last few years, in a manner very similar to how the industry came together against child exploitation online, for industry to create a sort of no-go zone for terrorist and violent content. As part of that, we train hundreds of smaller companies on best practices and make technology available to them. The reality is, the bigger companies are often able to build tools that stop videos at the time of upload. That is much harder for smaller companies, which is why we provide technology to them. We have 14 companies involved in a sharing consortium, so we can help even small companies stop content at the time of upload. I appreciate that, and I agree with Mr. Selim that there is more you can do on your own side. Setting that aside for a moment, what do you think we should do about 8chan and the dark websites? What do you all think we should do? Ms. Bickert I can tell you what we do on Facebook. We ban any link connecting to 8chan, or anywhere else these manifestoes have appeared. The manifestoes for the El Paso shooting and Poway were not available through Facebook. What more do you think government and law enforcement, working together, should do, besides what you do? Anybody else? Mr. Pickles To follow up on Mr. Selim's point, if criminal activity is happening, a law enforcement response is primary.
If people are promoting violence against individuals, then a law enforcement intervention should be looked at. If we can strengthen our cooperation with law enforcement, we can make sure information sharing is as strong as it needs to be to support those interventions. So you believe we need more law enforcement resources addressing the issue? Mr. Pickles It is a question of both resources and the framework. There was a paper from George Washington University last week looking at the statutory framework around these spaces; there are opportunities to strengthen it. That's a conversation to have. I definitely believe you need more law enforcement resources on this issue. I look at what progress we made with Interpol and the tech industry on other issues. I think this is something, and I hear that, more resources. Thank you all very much. Thank you. Senator Fischer? Thank you, Mr. Chairman. In June, Senator Thune held a subcommittee hearing on persuasive design. As we mentioned, Facebook, YouTube, and Twitter are engineered to capture, track, and keep our attention, whether through predictions of the next video to keep us watching or what content to push to the top of our newsfeeds. When social media platforms fail to block extremist content online, this content doesn't just slip through the cracks. It is amplified, and amplified to a wider audience. We saw that during the Christchurch shooting. The terrorist's Facebook Live broadcast was up for one hour, as confirmed by the Wall Street Journal, before it was removed. It gained thousands of views during that timeframe. Ms. Bickert, how do you confront the increased risk from how your algorithms boost content while gaps still exist in getting dangerous content off of the platform? You touched on that a little bit in your response to Senator Wicker, but how are you targeting solutions to address that specific tension that we see? Ms. Bickert Senator, thank you for the question. It is a real area of focus. There are three things we are doing. Probably the most significant is technological improvements, which I will come back to in a second. Second, making sure we are staffed to very quickly review reports that come in. The Christchurch video, once that was reported to us by law enforcement, we were able to remove within minutes. That response time is critical to stopping the virality. Finally, partnerships. We have hundreds of safety and civil society organizations we partner with. If they are seeing something, they can flag it for us through a special channel. Back to technology briefly. With the horrific Christchurch video, one of the challenges for us was that our artificial intelligence tools did not spot the violence in the video. What we are doing going forward is working with law enforcement agencies, including in the U.S. and U.K., to gather videos that can be helpful training data for our technical tools. That is one of the many efforts we have made to try to improve machine learning technologies, so we can stop the next viral video at the time of upload. When you talk about working with law enforcement, you said law enforcement contacted you. Is that reciprocal? Do you see something show up, and then you in turn try to get it to law enforcement as soon as possible, so individuals can be identified? What is the working relationship there? Ms. Bickert Absolutely, senator. We have a team, our law enforcement outreach team. Any time we identify a critical threat of harm, we reach out proactively to law enforcement agencies. We do that regularly.
Also, when there is some sort of mass violence incident, we reach out to them even if we have no indication that our service is involved at all. We want to make sure they know how to submit emergency requests to us. We respond around the clock in a very timely fashion, because we know every minute is critical in that situation. I am a former prosecutor myself, so these things are personal to me. I know that the platforms represented today have increased your efforts to take down this harmful content. As we know, there are still shortfalls that exist in getting that response made in not just a timely manner, but one that's truly going to have an effect. Mr. Slater, when it comes to liability, do social media platforms, do you, need more skin in the game, so you can ensure better accountability and be able to incentivize some kind of timely solution? Mr. Slater Thank you, senator, for the question. If you look at the practices we are investing in, certainly from our perspective, we are getting better over time. The current legal framework strikes a reasonable balance. It provides protection from liability that would otherwise go too far and be overbroad, but it is a sword and not just a shield, giving us the legal certainty we need to invest in these technologies to detect, monitor, review, and remove this sort of content. In that way, the legal framework continues to work well. Mr. Selim, can you comment on this as well? Do you think that there is enough legal motivation for social media platforms to prioritize some kind of solutions out there? That is what this hearing is about, to find solutions to the online hate that I think continues to grow. Mr. Selim When thinking through the issues of content moderation, the authorities that exist within the current legal frameworks that reside within the companies represented at this table are sufficient for them to take action on issues of content moderation, transparency, reporting, et cetera. So there certainly is a degree of legal authority that affords these companies and others the opportunity to take any number of measures. Ms. Bickert, in your testimony, you say that Facebook Live will ban a user for 30 days for a first-time violation of its platform policies. Is that enough? Can users be banned permanently? Would that be something to look at? Ms. Bickert Senator, thank you for the question. One serious violation will lead to a temporary removal of the ability to use Live. However, if we see repeated serious violations, we will take that person's account away. We do that not just with hate and inciting content, but with other problems as well. Senator Blumenthal? Thank you, Mr. Chairman. Thank you all for being here today, and thank you for outlining the increased attention and intensity of effort that you are providing to this very profoundly significant area. I welcome doing more and better, but I would suggest that even more needs to be done, and it needs to be better, and you have the resources and technological capabilities to do more and better. The question Senator Fischer asked of you about incentives, your answer was that they have the authority, which provides them with the opportunity. The question is, really, don't they need more incentives to do more and do it better, to prevent this kind of mass violence that may be spurred by hate speech appearing on a site, or that may in fact signal violence to come? I want to highlight that 80% of all perpetrators of mass violence provide clear signals and signs that they are about to kill people.
That is the reason Senator Graham and I have a bipartisan measure to provide incentives for more states to adopt extreme risk protection order laws, which will in fact give law enforcement the information they need to take guns away from people who are dangerous to themselves or others. So that information is critically important to preventing mass violence, but also suicides and domestic violence, and the keys and information and signals often appear on the internet. In fact, this past December, in Monroe, Washington, a clearly troubled young man made a series of antisemitic rants online, bragging on video about planning to shoot up a [expletive] school while armed with an AR-15 style weapon, and posting on Facebook that he was shooting for 30 Jews. Fortunately, the ADL saw that post. It went to the FBI, and the ADL's vigilance prevented another Parkland or Tree of Life attack. Craig Gutenberg of Coral Springs, Florida, met with me yesterday to talk about a similar incident involving a young man in Coral Springs, who said he would shoot up the high school there. Law enforcement was able to forestall that using an extreme risk protection order statute. My question is, to Facebook, Twitter, and Google: what more can you do to make sure that these kinds of signs and signals involving references to guns, which may not be hate speech but are references to possible violence with guns or use of guns, are made available to law enforcement? Ms. Bickert, Mr. Pickles, Mr. Slater? Ms. Bickert Thank you, Senator Blumenthal. One of the biggest things we can do is engage with law enforcement to find out what is working in our relationship and what isn't. That is a dialogue that over the past years has led to us establishing a portal through which they can electronically submit requests for content with legal process, and we can respond very quickly. What are you doing proactively? I apologize for interrupting, but my time is limited. Proactively, what are you doing with the technology you have to identify the signs and signals that somebody is about to use a gun in a dangerous way, that someone is dangerous to themselves or others and is about to use a gun? Ms. Bickert Senator, we are using technology to identify any of those early signs, including of gun violence, but also of suicide or self-injury. And you report it to law enforcement? Ms. Bickert We do. In 2018, we referred many cases of suicide or self-injury, where we detected them using artificial intelligence, to law enforcement, so they were able to intervene and in many cases save lives. We have a very similar approach. When we have a credible threat of someone being a risk to others or themselves, we work with the FBI to make sure they have the information they need. Mr. Slater Similarly, when we have a good-faith belief of a credible threat, we will proactively refer it to the Northern California Regional Intelligence Center, who will route it out to the right authorities. Because my time has expired, I will ask each of you, if you would, to please give me more details in writing, as a follow-up, on what identification signs you use, what kinds of technology, and how you think it can be improved, assuming that Congress approves, as I hope it will, the emergency risk protection order statute to provide incentives for more than just the 18 states that have them now, but for others to do the same. Thank you. Thank you, Senator Blumenthal. Senator Thune? I thank all of you for being here today.
Your participation is appreciated, as this committee continues oversight of the difficult task each of your companies face: preserving openness on your platforms while seeking to manage and thwart the actions of those who use your services to spread extremist and violent content. Last Congress, we held a hearing looking at terrorist recruitment propaganda online, and discussed the cross-sharing of information between Facebook, Microsoft, Twitter, and YouTube that allowed each of those companies to identify potential extremism faster and more efficiently. I would direct this question to the panel and ask: how effective is that shared database? Ms. Bickert Senator Thune, thank you for the question. Through the shared database, we have more than 200,000 distinct hashes of terror propaganda. I speak for Facebook only, but that has allowed us to remove a lot more than we otherwise would have been able to. Mr. Pickles I would add that, since that hearing, the reassuring thing is that we don't just share hashes; we now share URLs. If we see a link to a piece of content like a manifesto, we are able to share that across the industry. Furthermore, after Christchurch we recognized we needed to improve. We now have real-time communications in a crisis, so industry can talk to each other in real time operationally, covering not only content but also situational awareness. That partnership also now involves law enforcement, which wasn't there when we had the last hearing. So it is about broadening new programs to develop that further. Broadly, I would say look at how we have improved over time. Systems are not perfect. We always have to evolve to deal with bad actors, but on the whole we are doing a better and better job, in part because of technology sharing and information sharing, removing this sort of content before it has wide exposure or is viewed widely. Senator, I would only add that the threat environment we are in has evolved over the last 24 to 36 months, and the tactics and techniques that these platforms and others use need to keep pace with the evolving nature of the terrorist landscape online, whether it be foreign or domestic, and with the environment today. As a follow-up, are there partnerships to specifically identify mass violence? Ms. Bickert One of the things we have done over time is to expand the mandate of the Global Internet Forum to Counter Terrorism. We relatively recently included mass violence incidents, and we are now sharing through our protocols a broader variety of incidents. Mr. Slater, YouTube's automated recommendation system has come under criticism for potentially steering users toward increasingly violent content. I led a subcommittee hearing on the use of persuasive technologies in internet platforms, algorithm transparency, and algorithmic content selection. I asked Google's witness at that hearing several questions for the record about YouTube that were not thoroughly answered, and I would say that providing complete answers for the record is essential as we look to combat many of the issues discussed here today. I would like your commitment to provide thorough responses to any questions you might get for the record. Do I have that? Mr. Slater To the best of our ability. OK. I would like to explore the nexus between persuasive technologies and today's topic. What percentage of YouTube video views are a result of YouTube automatically suggesting or playing another video after the user finishes watching a video?
Mr. Slater I don't have a specific statistic, but I can say that the purpose of our recommendation system is to show people videos that they may like or that are similar to what they watched before. At the same time, we recognize the concern about recommendations of borderline content, that is, content that is not removed but brushes right up against the lines. We have introduced changes this year to reduce recommendations of those sorts of borderline videos. If you could get the number, I assume you have that somewhere, that has to be available, and furnish it for the record. The question specifically is, what is YouTube doing to address the risk of some of these features, which as you know are potentially pointing the user in the direction of increasingly violent content? Mr. Slater The change we made in January to reduce those recommendations, it is early days, but it is working well. Recommendations of that content have been reduced by 50% just since January. As the systems get better, we hope that will improve, and I am happy to discuss that further. Thank you. Senator Blackburn, followed by Senator Scott. Thank you, Mr. Chairman. I want to thank each of you for being here this morning and for talking with us. This committee has looked at this issue of algorithms and their utilization for some time, and we are going to continue to do so. Looking at content, the extremist content that is online, is certainly important. We know there are a host of solutions that are out there, and we need to come to an agreement and understanding of how you are going to use these technologies to really protect our citizens. Social media companies are in a sense open public forums, where people can interact with one another, and they should be. Part of your responsibility is to have an effective cop on the beat and be able to see what is happening, because you are looking at it in real time. But what has unfortunately happened many times is that you don't take an objective, consistent view. You get a subjective view. This is problematic, and it leads to confusion among the public that is using the virtual space for entertainment, for their transactional life, for obtaining their news. So indeed, as we look at this issue, we are looking for you to approach it in a consistent and objective manner, and we welcome the opportunity to visit with you today. I have a couple of things I wanted to talk with you about. We have all heard about these third-party facilities, where contractors are working long hours looking at grotesque and violent images, and they are doing this day in and day out. Talk a little bit about how you transition from that to using modern technologies, and what Facebook is going to do in order to move away from that and minimize harm. You talked about how you have 30,000 employees working on safety and security, and there are third-party entities working on this. So let's talk about that impact on the individuals, and talk about the use of technologies to speed up this process and make it more consistent and accurate. Ms. Bickert Thank you for the question, senator. Making sure we are enforcing our policies is a priority for us. Making sure that our content reviewers are healthy and safe in their jobs is paramount. One of the things we do is make sure we are using technology to make their jobs easier and limit the amount and types of content they have to see. A couple of examples. With child exploitation videos, with graphic violence, with terror propaganda, we are now able to use technology to review a lot of that content so that people don't have to.
Let me ask you this, and I am sorry to interrupt, but we need to move forward. The reviewers, are they all located in Palo Alto, or are they scattered around the country or the globe? Ms. Bickert Of the more than 30,000 people we have working in safety and security, some are engineers or lawyers. The content reviewers, of whom we have more than 15,000, are based around the world. All right. Go ahead. Ms. Bickert For any of them, not only are we using technology, but even when we can't make a decision on the content using technology alone, there are things we can do, like muting the audio or separating a video into still frames, that can make the experience better for the reviewer. Mark Zuckerberg, in a Washington Post op-ed, called for us to define lawful but awful speech. Tell me how you think you could define lawful but awful speech and not overreach or infringe on First Amendment rights. Ms. Bickert One of the things we are looking for is clarity on the actions governments want us to take. We have our policies that lay out clearly how we define things, but we don't do that in a vacuum. We do that with a lot of input from civil society organizations and academics around the world, but we also like to hear views from government, so we can make sure we are mindful of all the different considerations. I am out of time. Mr. Pickles, I will submit a question to you for the record. Mr. Selim, I have one I will send to you. Mr. Slater, I always have questions for Google, so you can depend on me to get one to you. We do hope you all are addressing your prioritization issues as well. With that, Mr. Chairman, I will yield back. Thank you very much. Senator Scott? Thank you for being here today. I am glad we are here today to have a meaningful conversation about what is happening in our nation. It is time to face the fact that our society has produced an underclass of primarily violent young men who place no value on human life. They live purposeless lives of anonymity and digital dependency, acting out evil desires, sometimes with racial hatred. As you know, when I was governor we had the horrible shooting at the school in Parkland. Within three weeks, we passed historic legislation, including the protection orders Senator Blumenthal was talking about. We did that with law enforcement, mental health counselors, and educators to come up with the right solution. With regard to the shooting at Parkland, the killer had a long, long history of violent behavior. In September 2017, the FBI learned someone with the username nikolas cruz had posted on a YouTube video, "I am going to be a professional school shooter." In addition, he made other threatening comments on various platforms. The individual on whose video Cruz posted this reported it to the FBI. Unfortunately, the FBI closed the investigation after 16 days without ever contacting Nikolas Cruz, claiming they were unable to identify the person making the comment. Unfortunately, we now have 17 innocent lives lost because of Nikolas Cruz. Mr. Slater, how is a platform like YouTube, owned by Google, not able to track down the IP address and identity of the person who made the comment? When did YouTube remove the comment? Did YouTube report the comment to law enforcement? If you did report the comment, did you follow up? What was the process, and was there any follow-up to see if there was corrective action? Mr. Slater It was a horrendous event. We strive to be vigilant and to invest heavily to proactively report when we see an imminent threat.
I don't have details on the specific facts you are describing. Looking ahead, Parkland was an event that did spur us to reach out to law enforcement to talk about how we can do this better. That is part of why we reached out to work with the Regional Intelligence Center, to make sure that when we do have these good-faith beliefs we can go to a one-stop shop to get them to the right law enforcement locally, rather than trying to find the right people ourselves. In the last month, there was an incident where PBS was streaming the NewsHour on YouTube and somebody put a threat in the live chat. We referred it to the Regional Intelligence Center, they referred it to the Orlando police, and the person was taken into custody. That is not to say things are perfect. We look forward to working with you and law enforcement on that, and I think we will continue improving over time. Will you give me the information on who was contacted, when, and when it was taken down? I cannot get an answer on what anybody did with regard to this, what YouTube did, what the FBI did; nobody has talked about it. If you will give me that information. Are you comfortable that if another Nikolas Cruz puts something up, you have the process to contact somebody and there will be a follow-up? Mr. Slater Our processes are getting better all the time. It is an area with an evolving challenge, because technology evolves and tactics evolve. I will be happy to follow up and get more information on how they operate and how we work together. How can Nicolas Maduro, who is committing genocide, who is withholding clean water, food, and medicine, still have a Twitter account with 3.7 million followers? Mr. Pickles You highlight the behavior involved, and the question for us, as a public company that provides a public space for dialogue, is whether someone is breaking the rules on our service. We recognize there are world leaders who have Twitter accounts in countries where Twitter is blocked. We take the view, we hope, that the dialogue that person's presence on the platform starts contributes to solving the challenges you outlined. He has been doing it for a long time, and it is not getting better in Venezuela. Mr. Pickles It is a good illustration of the role of technology companies relative to other parts of the policy response. If we remove that account, it will not change that. We need to bear in mind how other levers come into play. I disagree. He talks about things and continues to act like he is a world leader, and he is a pariah. It seems to me what you are doing is allowing him to continue doing that. Mr. Pickles His current account has not broken a rule. When he breaks a rule, he will be treated the same as every other user. We will take action when necessary. I would be happy to work with the senator from Florida on this issue. I think we are not doing enough. The specific case I mentioned, the Rohingya and what happened on Facebook, is another example; I am happy to work on this issue with you. Thank you. There is a vote. I am shocked to hear they are going to leave it open until 11:30. That is generally what happens. Senator Duckworth? Thank you, Mr. Chairman. While I appreciate the hearing on the intersection of extremism and social media, many would agree the hearing is another data point in congressional hand-wringing on gun violence. According to the Gun Violence Archive, since 2019 began, 250 days ago, we have witnessed 318 mass shootings in the U.S., more than one per day. Mass shootings are those where at least four people are shot, excluding the shooter. After 20 children and six adults lost their lives at Sandy Hook in 2012, many officials, including myself, declared an end to congressional inaction.
Since that day, our nation has endured 2,226 mass shootings. Think about that number. We should be focused on ways to stop gun violence, but instead the focus is on social media. I am not going to say there is no connection, but every other country on the planet has social media, video games, online harassment, crime, and mental health issues, and they do not have mass shootings like we do. Nothing highlights Congress's inability to address this crisis like witnessing 318 mass shootings in 260 days and then holding a hearing on extremism in social media.

This is a chart from the Digital Marketing Institute that, according to their website, highlights the average number of hours social media users spend on platforms like Facebook and Twitter. The United States is in the middle of the pack in time spent online. Do you agree that America's use of social media is not especially unique on a per capita basis? Are you aware of specific trends that would explain the amount of gun violence in the U.S.?

Could you describe it for us, because some of us cannot see the details?

This is how much time — the average number of hours — social media users spend on social media each day via any device. The arrow points to the U.S. The highest is the Philippines, the lowest is Japan, and the U.S. is in the middle. I have a four-and-a-half-year-old and an 18-month-old. Hand her the iPhone and she knows how to get to YouTube Kids and go right to what she wants to watch. On social media usage, would you agree the U.S. is in the middle of the pack compared to the rest of the world?

Yes, according to the study, which I am not familiar with.

In other words, are you aware of trends on your platform that would explain the amount of gun violence in the U.S.?

Our platform reflects the fact that over 80 percent of our users are outside of the U.S.

I think your image speaks for itself. You brought up the role that video games can play in online hate and harassment. I agree any dissemination of hate must be addressed, regardless of the platform used, but if a meaningful connection between video games and gun violence existed, you would think the widespread use of video games in Japan and South Korea would reflect that connection, correct?

I think there is something to be said about the availability of guns in the U.S.

If you look at the amount of time folks in Japan and South Korea spend on video games, it is far greater than anywhere else; we are third. Then you look at the number of victims of gun violence for every 100,000 people — here is the U.S. We are not the biggest users of video games. Would this be accurate?

I have not read this study, but I have one data point. According to a report looking at extremist-related murders and homicides over the past decade, research shows 73 percent of extremist-related murders and homicides were committed with firearms.

To the extent you are making the point that extremists with weapons result in violence and homicide, we have the data that backs up that point. The world is full of individuals who use social media platforms to disparage others, to cast false equivalencies, and to question facts. The anonymity of online platforms helps spread hate. But our use of social media, video games, and other variables does little to explain the 2,226 mass shootings since Sandy Hook. The internet has emboldened and empowered hate by allowing individuals to build online communities and share their ideas. It is our weak gun laws that allow that hate to become lethal. There is a clear and undeniable connection between the number of guns in the U.S. and the number of gun deaths in our communities. This is the number of guns per 100 people.
This is the number of gun-related deaths per 100,000 people, compared with the rest of the world — some of whom use more social media than we do, some of whom engage in more video games than we do. We are saturated in weaponry designed for war but made available to anyone who attends a local gun show. A shooter can have a 100-round drum. I did not have a 100-round drum in Iraq. Yet you can find them at gun shows. Ninety percent of Americans agree Congress should expand background checks. Sixty percent agree banning high-capacity ammunition clips is what we need to do. This is not controversial. It is well past time Leader McConnell brings the House-passed bipartisan background checks bill up for a vote. I hope Leader McConnell will also allow votes on the Keep Americans Safe Act, the Disarm Hate Act, and the Domestic Terrorism Prevention Act. Each of these bills would keep our children and neighbors safer, and I hope my Republican colleagues will join in these bipartisan efforts. I yield back.

Let's do this: if you would reduce those three charts to a size we can copy, they will be admitted into the record at this point in the hearing, without objection. Senator Young?

Thank you. I want to thank all of our panelists for being here. I appreciate your testimony and your answering our questions as we work to collaborate in curbing online extremism, which I understand to be one of multiple causes we can cite as we think about the issue of mass-casualty events and extremist events more generally. The nation is wrestling with mass violence, extremism, and questions of responsibility, including digital responsibility, for some of these events. In my state of Indiana, Hoosiers in Crown Point recently experienced how a person can become radicalized over the internet, something I know many of your companies have studied and are working on. In 2016, a Crown Point man was arrested and convicted for planning a terrorist attack after becoming radicalized by ISIS over the internet. Thankfully, the FBI and the Indianapolis Joint Terrorism Task Force intervened before any violent attack occurred. However, that is not always the case; we have seen this across the country. That is why it is critically important we continue working collaboratively, knowing the products and platforms you provide offer incredible value to consumers and were never intended for this purpose. It is our responsibility in Congress, and definitely your responsibility, to make sure we monitor how the great value you provide can be used in an illicit, improper, dangerous, and nefarious manner. In one minute or less, because I have three minutes left, I would request that the representatives from Google, Facebook, and Twitter tell us why Americans should be confident that each of your companies is taking these issues seriously, and why Americans should be optimistic about your efforts going forward. Google, please.

Mr. Slater: I will start by pointing to the YouTube Community Guidelines enforcement report, which details each quarter how many videos are removed, the reasons why, and how much was flagged first by machines. We remove violent content with a combination of technology and people. Technology gets better at identifying patterns; people deal with the necessary nuance. We are getting better at taking the content down faster, before people see it. Nine million videos were removed in the second quarter of this year, most of them flagged first by machines, and 80 percent were removed before a single view. We are generally getting better at removal before wide viewing. We are already seeing advancements in machine learning, not just in this area but broadly. Machine learning gets more data and learns from mistakes.
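The machine-first review flow Mr. Slater describes — most uploads flagged by automated systems, high-confidence violations removed before a single view, uncertain cases routed to human reviewers — can be illustrated with a toy triage loop. This is a minimal sketch only; the classifier score, thresholds, and queue names below are hypothetical placeholders, not Google's actual system.

# Toy illustration of machine-first triage: a model score routes each upload
# to automatic removal, human review, or publication. All values are
# hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Upload:
    upload_id: str
    score: float  # model-estimated probability of violating content

REMOVE_THRESHOLD = 0.95   # high confidence: remove before anyone views it
REVIEW_THRESHOLD = 0.60   # uncertain: hold for a human reviewer

def triage(upload: Upload) -> str:
    if upload.score >= REMOVE_THRESHOLD:
        return "removed_before_view"
    if upload.score >= REVIEW_THRESHOLD:
        return "human_review_queue"
    return "published"

# Example: three uploads with hypothetical model scores.
for u in [Upload("a", 0.99), Upload("b", 0.72), Upload("c", 0.05)]:
    print(u.upload_id, triage(u))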
Mr. Slater: Those systems will get better.

Why would you be optimistic?

Ideally, the systems will get better. Will they be perfect? No, but they will evolve. There is also reason for optimism based on the collaboration between all of us here today.

Facebook?

Ms. Bickert: The first thing I will say is that Facebook will not work as a service if it is not a safe place. This is something we are aware of every day. If we want people to build community, they have to know they are safe. The incentive is there to make sure we are doing our part. One of the things we have on our team of more than 350 people who are primarily dedicated to the job of countering terrorism and hate is expertise. I lead this team, and my background is more than a decade as a criminal prosecutor. The people I have hired onto the team have backgrounds in law enforcement and in studying terrorism and radicalization. It is something people work on at Facebook because it is what they care about; they are not simply assigned to work on it while at Facebook. This is bringing in expertise. Like my colleagues, we have taken steps to make what we are doing very transparent. The reports we publish show a steady increase in our ability to detect terror, violence, and hate much earlier — when it is uploaded to the site and before anybody reports it to us. More than 99 percent of the violent videos and terrorist propaganda we remove from the site we are finding ourselves, before anybody reports it to us.

Twitter?

Mr. Pickles: We can be optimistic. A few years ago, at the peak of the so-called Islamic caliphate, people challenged our industry to do more and be better. Ninety percent of the terrorist content that Twitter removes is detected through technology. I look at independent academics who talk about that community being decimated on Twitter. I look at the collaboration between our companies, which did not exist when I joined Twitter. All of those areas have driven better technology, faster response, and a much more aggressive posture toward bad actors, and that now shows benefits in other areas. We can also take confidence in the fact that no one will tell this committee our work is done. Every one of us will leave knowing we have more to do and that we can never rest. We have to keep at it.

I could spend five days, maybe five years, on this; I only have five minutes, and I am already one minute over. Mr. Chairman, I will go vote.

I will not let them close the vote until you have asked your questions.

Thank you for holding this hearing, and I want to thank the witnesses for being here to talk about this very real and difficult issue. The rise of extremism online is a serious threat. The internet has proven a valuable tool for extremists connecting with one another through various forums to spread hate and dangerous ideologies. While we are here to focus on the proliferation of extremism online, which is incredibly important, we must not lose sight of the fact that violent individuals who find communities online to fuel their hatred have acted in the name of hate. We cannot ignore the fact that the absence of sensible, commonsense gun safety measures like background checks is allowing individuals to access dangerous weapons far too easily. We know the majority of Americans support that. I represent the great state of Nevada. As we approach, unfortunately, the two-year anniversary of the October shooting in Las Vegas, the deadliest mass shooting in modern American history, we know coordination with and between law enforcement is more important than ever.
The Southern Nevada Counterterrorism Center, also known as our fusion center, is an example of a dynamic partnership between 27 different law enforcement agencies to rapidly and accurately respond to terrorist and other threats. With Las Vegas hosting nearly 15 million tourists and visitors each year, the center is responsible for preventing countless crimes of terrorism. Can you please discuss with us your coordination efforts with law enforcement when violent or threatening content is identified on your platform? And what would you need from us as a legislative body to promote and facilitate this partnership, to keep our communities safe from another shooting like the one in October?

That attack was incredibly tragic, and our hearts are with those who have suffered. Our relationship with law enforcement is an ongoing effort. We have a team that trains law enforcement so they understand how they can best work with us; that is something we do proactively. Where there is a mass violence incident, we reach out to law enforcement immediately — even if we are not aware of any connection between our service and the incident, we want them to know they can reach us. We have an online portal where they can submit legal process, including emergency requests, so we can respond quickly. And we proactively refer imminent threats to law enforcement whenever we find them.

I want to echo those sympathies for your constituents who were victims of that horrible tragedy. The lessons we have learned since that event have continued to inform our thinking — not waiting for the ideological intent to be known before acting. One of the challenges is that you may look for an organizational affiliation before calling something a terrorist attack; we act first to stop people misusing our services. We do cooperate with law enforcement to refer credible threats. The companies met with a number of agencies yesterday to discuss how we can further deepen our collaboration. One of the questions is that there is a huge amount of information within the law enforcement community that might help us understand the threats, the trends, the situational awareness — and how more of that information can be shared.

Could you tell us some of the tools you may need to help you better cooperate to protect our communities?

That was the subject of the meeting yesterday; we had a productive conversation. I similarly echo the concern and sympathy. We are similar in the ways we proactively cooperate with law enforcement to refer credible threats and to respond to valid requests, including emergency disclosure requests.

I see my time is up. I will submit a question for the record about combating violent antisemitism online. We have votes, so I appreciate your time and commitment to working on and solving this issue. Thank you.

Thank you. Let me start with a simple yes or no question — I do not mean this to be a trick question; you can answer yes or no with a brief caveat if you need. I would like to hear from each of the three of you. Do you provide a platform that you regard and present to the public as neutral in the political sense?

Yes. Our rules are politically neutral.

So you aspire to political neutrality?

We want to be a service for political ideals across the spectrum. We craft our rules, and they are enforced, without ideology included.

Similarly, we craft our services without regard to political ideology. We are not neutral against terrorism or violence.

I appreciate you pointing that out; that is not what I am talking about, and it leads into the next question I wanted to raise.
That is important. The work you are doing in that arena is important, and it is important for anyone occupying this space to be conscious of it. You do a service to those who access your platforms by removing things like pornography and terrorism advocacy. There is a lot of debate that surrounds this issue and the legal framework around it. Section 230 of the Communications Decency Act has received criticism. It protects a website from being held liable as the publisher of information provided by another information content provider. Significantly, Section 230's Good Samaritan provision gives you the promise that you will not be held liable for taking down this type of objectionable content, whether it is constitutionally protected or not. The same is true for each of our witnesses: each of you represents a private company, and each of you is accountable to your consumers and within your company. This means that, in some sense, you have incentives to provide a safe and enjoyable experience on your respective platforms. I have a question about Section 230, particularly the Good Samaritan provision. Does it help you in your efforts to swiftly take down things like pornography and terrorist content from your platforms? Would it be more difficult without the legal certainty that Section 230 provides?

Absolutely. It is critical to our efforts in safety and security.

I would say it has been critical to the leadership of American industry in the information sector.

Absolutely, yes.

To that point, imagine a world where this is taken away, where those provisions no longer exist. Large companies like yours might still be able to — and probably would — filter out this content, between the artificial intelligence capabilities at your disposal and the human resources that you have. I suspect you could and would do your best to perform the same function. But what about a startup, a company trying to enter the space that each of your companies entered when they were created not very many years ago? What would happen?

Thank you for that question. It reminds me of industry conversations involving smaller companies before we formed the Global Internet Forum to Counter Terrorism in June of 2017. We were having closed-door sessions with companies large and small to talk about the best ways to combat terrorism online, and companies were concerned about liability. Section 230 is important in being able to proactively act on and assess content.

It is a fundamental part of maintaining a competitive online ecosystem; without Section 230, the ecosystem is less competitive. It is part of the reason we have been a leader in economic growth, innovation, and technological development, and other countries without it have suffered. Study after study has shown that.

If it were to be taken away — all three of your companies are not exactly known for being a small business or a business with a modest economic impact — you can identify with the concern I am expressing. If we took that away, you might be able to keep up with what you need to do, but would it be harder for someone to start a new search engine company or a new tech platform, somebody starting out in the same position your companies were in a couple of decades ago? Would that be extraordinarily more difficult?

I think it would create problems for innovators of all stripes — small and medium-size businesses potentially struggling to get their arms around that significant change to the fundamental legal framework of the internet.

My time has expired. I want to thank our committee chairman for holding this hearing. It is a vital conversation for us to be having.
We need to be taking a hard look at how we address the rising tide of online extremism and its real-world consequences in our country. I have questions for you on this important topic, but I want to echo some of what my colleagues have already said, which is that there is much more the Senate must do to address gun violence, whether or not it is connected to hatred on the internet. More than 200 days ago, the House of Representatives passed a bipartisan universal background check bill. This commonsense gun safety measure has an extraordinary level of public support, and it deserves a vote on the Senate floor. We cannot simply have hearings; we have to act to reduce gun violence.

Mr. Selim, the ADL Center on Extremism has closely studied hate crimes and extremist violence in this country. Is it fair to say there has been an alarming increase in bias-motivated crimes, including extremist killings, in the last several years?

That is accurate.

In the case of extremist killings, what role do you feel access to firearms has played?

As I briefly alluded to earlier, and to expand on what I was mentioning: according to a recent report on extremists across the ideological spectrum who committed murders or homicides in the U.S., 73 percent of those killings were committed with firearms.

What impact do you believe the increase in hate crimes has on the minority communities and members who have been the targets of these attacks? And let me add to that question: one of the unique aspects of a hate crime is that it not only victimizes the targeted victims, but strikes fear among those who share the same characteristics as the victim or victims.

In the last 24 months, we saw calendar year 2017 bring a 50 percent increase in antisemitic incidents across the country, and DOJ's own hate crime data shows an increase in hate crimes. We continue to see these troubling statistics year after year. It is imperative — and both my submitted written testimony and my oral testimony speak to this — that we enhance and enforce hate crime laws and protections for victims.

I am an original cosponsor of Senator Bob Casey's legislation, the Disarm Hate Act, which would bar those convicted of misdemeanor hate crimes from obtaining firearms. Do you agree the measure would keep guns out of the hands of individuals who may engage in extremist violence?

Yes, and thank you for your leadership, and to all members who have supported the legislation.

I appreciate the efforts the social media companies have described to combat online extremism, including providing transparency to their users and the general public. It is critically important to understand how you are addressing problems within your existing services and platforms, but I would like to learn more from you about how you are thinking about this issue as you develop and introduce new products. A lot of us feel the approach of rapidly introducing a new product and assessing the harms later is a problem. Let me ask how you plan to build combating extremism into the next generation of ways in which individuals engage online.

Safety by design is an important part of building new products at our company. One of the things we have built in the past five years is a new product policy team. Their responsibility is to make sure they are aware of products and features being built, to explain to the engineers — who are thinking of all the wonderful ways those products can be used — the abuse scenarios, and to make sure we have reporting mechanisms and other safety features in place. We are in a very adversarial space; bad actors will change their behavior. When we make a policy decision, one of the key things we consider is how it can be used against us.
How will people change their behavior and circumvent the policy? We put that into our learning and share it with smaller companies — about 200 small companies around the world — so they understand the challenges. It is invaluable.

Our trust and safety teams are at the table with product managers and engineers from the conception of an idea through development and eventual release. From the ground up, it is safety by design.

I want to thank the witnesses. I will be taking over as chair, and I will call on myself for the next round of questions. Let me ask all of you: your companies, your technology, are famous for algorithms that seem to have the ability to pinpoint what people want. You can put an email out, or somebody talks about your interest in yellow sweaters, and the next thing you know you have ads popping up for yellow sweaters. Who knows how that happens, but to a lot of us it happens, and it is pretty impressive. If your algorithmic technology is so good at pinpointing things like that, particularly as it relates to ads, what are the challenges in directing that kind of technology to help us with what has been talked about on both sides of the aisle — that the people committing this kind of violence are, in particular, disaffected young males? Are there not signs — things you can do with the technology you use so well in other spaces — to provide more warning of this kind of violence from the kinds of individuals who already have a profile online? Are you working on that?

Thank you for the question. Technology plays a huge role in what we are doing to enforce our safety policies. In the area of terrorism, extremism, and violence, it is not just the matching software we have to stop organized terror propaganda videos; we are using artificial intelligence and machine learning to get better at identifying new content we have not seen before that may promote or incite violence. Where there is a credible threat of imminent physical harm, we send that to law enforcement. The systems are getting better.

Are you using the algorithms and advanced technology you use in other spaces to identify those threats?

There is cross-learning among the companies.

Is it a priority of yours?

Absolutely.

For all of the companies?

Investing in technology to find terrorist content is absolutely a priority.

Yes.

If I may add to this part of the conversation as somebody who has researched and studied these issues for nearly two decades: the environment has changed significantly. White supremacist terrorists in the U.S. do not have training camps in the same way that foreign terrorist groups do. Their training camp — where they learn and coordinate with one another — is the online space. Everything that has been said about machine learning, technology, and artificial intelligence continues to disrupt that environment and make it an inhospitable place for individuals who want to promote violent content.

This is a bigger policy question. All of your companies face a tension: you want eyeballs, more clicks, more time on the platform. Whether it is Facebook, Google, or Twitter, I think there are studies showing the number of young men and women — young girls — who feel a sense of loneliness from their time online. There is indication that among teenagers, suicide rates are increasing for young girls. We are all dealing with the opioid epidemic; we are looking back wondering how we did that, how we got to this position that began in the 90s, and 72,000 Americans died of overdoses last year. We are looking back and asking how this happened.
From a policymaking perspective, are we going to look back in 20 years and ask how we addicted a bunch of young Americans to looking at their iPhones eight hours a day? Twenty years from now we will see the social, physical, and psychological ramifications, and we may be kicking ourselves and wondering why we allowed that to happen. It worries me. You have a tension, because you want more face time — young teenagers spending seven hours a day staring at their iPhones — because that helps revenue. Do you worry that 15 or 20 years from now we will be in the same spot we were with opioids, wondering what we did to our kids? Your power has negative implications for what is happening in society right now.

As a mother, I take questions of wellness seriously. This is something we look at, and we talk to wellness groups to make sure we are crafting products and policies in the best long-term interest of the people we want to connect. We have seen social media be a tremendous place of support for those thinking of harming themselves, struggling with opioid addiction, or getting exposed to hateful content. We are exploring and developing ways of linking people up with resources. We are doing that for opioid addiction, for thoughts of self-harm, and for people asking about or searching for hateful content. I think this can be a positive thing for overall wellness.

We have similar programs in place for opioids and for people using terms referencing self-harm or suicide; we provide them with a prompt pointing to resources. That is what we have rolled out. And we recognize that things like digital literacy are issues we need to invest in, to make sure that people using our services have the skills and awareness to use them well. Our CEO has committed the company to looking at the health of the conversation — not just the metrics you reference, but much broader metrics measuring the health of the conversation rather than revenue.

Thank you. Senator Cruz?

Thank you. I will say thank you to my friend from Alaska for sharing this deep void and longing in your heart; you will be getting the yellow sweater for Christmas. Mr. Slater, I want to start with you. I want to talk about Project Dragonfly. In 2018, it was reported that Google was developing a censored search engine under the alias of Project Dragonfly. In response to those concerns, Alphabet shareholders requested the company publish a human rights impact assessment by October 30 of this year examining the actual and potential impacts of a censored Google search in China. During the shareholder meeting on June 19, the proposal for the assessment was rejected; the board of directors encouraged shareholders to vote against the proposal. Alphabet commented that Google has been open about its desire to increase its ability to serve users in China and other countries, that it considered a variety of options to offer services in a way consistent with its mission, and that it gradually expanded offerings to consumers in China. I want to start with clarity: has Google ceased any and all development and work on Project Dragonfly?

Mr. Slater: To my knowledge, yes.

Has Google committed to forgoing future projects that may be named differently but would be focused on developing a censored search engine in China?

We have nothing to announce at this time. Whatever we would do, we would look carefully at things like human rights. We work with the Global Network Initiative to evaluate how our principles, practices, and products comport with human rights and law.

Roughly contemporaneously, Google said it did not want to work with the U.S. Department of Defense.
How do you justify having been willing to work with the Chinese government on complex projects, including artificial intelligence, while not at the same time being willing to help the Department of Defense develop ways to minimize civilian casualties through better AI? How do you reconcile those approaches?

Mr. Slater: As we have talked about today, we do partner with law enforcement and the military in certain ways, offering some of our services. We draw responsible lines about where we want to be in business, including limitations on getting into the field of building weapons and so on.