
Section 230 of the Communications Decency Act has enabled that ecosystem to evolve by giving online companies the ability to moderate content without equating them to the publisher or speaker of that content. It has enabled the creation of communities where millions and billions of people come together and interact. Today, this committee will be examining the world that Section 230 has enabled, both the good and the bad.

I'd like to thank the witnesses for appearing before us today. Each of you represents an important perspective related to content and the ecosystem. Many of you bring up concerns about this complex issue, and I agree it is a complex issue, including how we should regulate disinformation and hate speech. Like too many communities, my hometown of Pittsburgh has seen what unchecked hate can lead to. Almost a year ago, our community suffered the most deadly attack on Jewish Americans in our nation's history. The shooter did so after posting antisemitic comments, before finally posting that he was going in. A similar attack occurred in New Zealand, and the gunman streamed his despicable acts on social media sites. Many websites didn't move fast enough, and the same algorithms that help celebrity selfies go viral helped spread a heinous act. In 2016 we saw similar issues when foreign adversaries used these platforms to disseminate disinformation and foment doubt and distrust in our leaders and institutions. Clearly, we all need to do better, and I would strongly encourage the witnesses before us that represent these online platforms, and the other major platforms, to step up. The other witnesses on the panel bring up serious concerns with the kind of content available on your platforms and the impact that it is having on society. Some of those impacts are very disturbing. You must do more to address these concerns. That being said, Section 230 doesn't just protect the largest platforms or the most fringe websites.
It enables comment sections on individual blogs, lets people leave honest and open reviews, and allows free and open discussion about controversial topics. The kind of ecosystem that has been enabled has enriched our lives and our democracy. The ability of individuals to have their voices heard, particularly in marginalized communities, cannot be understated. The ability of people to post content that speaks truth to power has created political movements that have changed the world we live in. We need to recognize the incredible power these platforms have for good. I want to thank you, again, for being here. I'd like to yield the balance of my time to my good friend Ms. Matsui.

Thank you, Mr. Chairman. I want to thank the witnesses. In 2018, Mark Zuckerberg said, "It was my mistake and I'm sorry," about allowing Russia to influence the 2016 presidential election. Fast forward 555 days: it seems Mr. Zuckerberg has not learned from his mistakes. Facebook will continue to accept ads even as they push falsehoods and lies, making it fertile ground for election interference in 2020. The decision should not be a difficult one. The choice between deepfakes, hate speech and online bullies on one side and a facts-driven debate on the other should be easy. If Facebook doesn't want to play referee about the truth in political speech, then they should get out of the game. I hope this hearing produces a robust discussion, because we need it now more than ever. Mr. Chairman, I yield back.

The gentlelady yields back. We recognize Mr. Latta for five minutes for his opening statement.

Thank you very much to our witnesses for appearing before us, and welcome to the hearing on content moderation and Section 230 of the Communications Decency Act. We began last session examining how Congress should look at the law, and at accountability and transparency for the hundreds of millions of Americans using the internet today, whose interests are closely tied to Section 230. The stakeholders affected range from large to small companies, as well as academics and researchers.
Let me be clear: I'm not advocating that Congress repeal the law or enact changes that could lead to its death by a thousand cuts. Before we discuss whether we should make modifications to the law, we should understand how we got to this point. It's important to realize that when it was written, the telecom portion of the law included other prohibitions on obscene or lewd content, provisions that were notably struck down by the Supreme Court. But the Section 230 provisions remain. The CDA gave interactive computer services, like America Online, the ability to proactively take down offensive content. As Chris Cox stated on the House floor: we want to encourage people like Prodigy, like America Online, like the new Microsoft network, to do everything possible for us, the consumer, to help us control, at the front door of our house, what comes in and what our children see. It is unfortunate that the courts have adopted such broad interpretations of the shield without platforms having to demonstrate that they are doing everything possible, as the congressman envisioned. Numerous platforms have hidden behind the shield to avoid litigation without having to take responsibility. Not only are Good Samaritans sometimes selective in taking down harmful or illegal activity, but the protections have been interpreted so broadly that platforms can skate without accountability. That's not to say all platforms never use the tools afforded by Congress. Many do great things; many of the bigger platforms remove billions, and that's with a B, of accounts annually. But they are the exception, not the rule. Today, we'll dig deeper into how platforms remove content, whether it's with the tools provided by Section 230 or with their own self-constructed terms of service. I hope this hearing encourages an open discussion of the law's intent, and of whether platforms can be held reasonably accountable for activity on their platforms without drastically affecting innovative startups. With that, Mr. Chairman, I yield back.
This is a joint hearing between our committee and the Committee on Consumer Protection and Commerce, and I would like to recognize the chair of that subcommittee, Ms. Schakowsky, for five minutes.

Today the internet certainly has improved our lives in many, many ways, enabling people to actively participate in society, education and commerce. Section 230 of the Communications Decency Act has been at the heart of United States internet policy for over 20 years. Many say that this law allowed free speech to flourish, allowing the internet to grow into what it is today. In the early days, it was intended to encourage online platforms to moderate user-generated content, to remove offensive, dangerous or illegal content. The internet has come a long way since the law was first enacted. The amount and sophistication of user postings has increased exponentially. Unfortunately, the number of Americans who report extreme online harassment, which includes stalking and cyberbullying, has as well: 37% of users say they've experienced it this year. Likewise, extremism, hate speech, election interference and other problematic content are proliferating. The spread of such content is problematic, that's for sure, and it actually causes some real harm that multibillion-dollar companies like Facebook, Google and Twitter can't or won't fix. And if this weren't enough cause for concern, businesses are attempting to use Section 230 as a liability shield for activities that have nothing to do with third-party content or content moderation policy. In a recent Washington Post article, Uber executives seemed to be opening the door to claiming vast immunity from local traffic liability based on Section 230. This would represent a major unraveling of 200 years of social contract, community governance and professional standards. Also at issue is the Federal Trade Commission's Section 5 authority over unfair and deceptive practices.
The FTC has built Section 5 cases on how websites handle user-generated content, but enforcement of terms-of-service violations involving third-party content may also be precluded by 230 immunity. I also want to talk a little bit about injecting 230 into trade agreements. It seems to me that we've already seen that in the Japan trade agreement, and there is a real push to include it now in the United States-Mexico-Canada trade agreement. There is no place for that. I think that the laws in these other countries don't really accommodate what the United States has done with 230. The other thing is that we are having a discussion right now, an important conversation, about 230, and in the midst of that conversation, because of all the new developments, I think it is just inappropriate right now, at this moment, to insert this liability protection into trade agreements. As a member of the working group that is helping to negotiate that agreement, I am pushing hard to make sure that it just isn't there. I don't think we need to have 230, or any adjustment to it, in trade agreements. All of the issues that we're talking about today indicate that there may be a larger problem: that 230 is no longer achieving its goal of encouraging platforms to protect their users. Today, I hope that we can discuss holistic solutions. I am not talking about eliminating 230, but about taking a new look at it in light of the many changes that we are seeing in the world of big tech right now. I look forward to hearing from our witnesses about 230 and how it can be made even better for consumers. And I yield back.

Thank you. The gentlelady yields back. The chair recognizes the ranking member of the committee.

Good morning. Welcome to today's joint hearing on online content management. My priority is to protect consumers while preserving the ability for small businesses and startups to innovate. In that spirit, today we are discussing online platforms and Section 230 of the Communications Decency Act.
In the early days of the internet, two companies were sued for content posted on their websites by users. One company sought to moderate content on its platform; the other did not. In deciding these cases, the courts found the company that did not make content decisions was immune from liability, but the company that moderated content was not. It was after these decisions that Congress created Section 230. Section 230 is intended to protect, quote, "interactive computer services" from being sued while also allowing them to moderate content that may be harmful, illicit or illegal. This liability protection has played a critical and important role in how we regulate the internet, allowing small businesses and innovators to thrive online without the fear of frivolous lawsuits from those looking to make a quick buck. Section 230 is also largely misunderstood. Congress never intended to provide immunity only to websites that are neutral. Congress never wanted platforms to simply be neutral conduits; in fact, it wanted platforms to moderate content. The liability protection extended to allow platforms to make good-faith efforts to moderate material that is obscene, lewd, excessively violent or harassing. There is supposed to be a balance to Section 230: small internet companies enjoy a safe harbor to innovate and flourish online, while companies are also incentivized to keep the internet clear of offensive and violent content by being empowered to act and clean up their own sites. The internet revolutionized freedom of speech by providing a platform for every American to have their voice heard and to access an infinite amount of information at their fingertips. Medium and other online blogs provide a platform for anyone to write. Wikipedia provides free, in-depth information on almost any topic you can imagine through mostly user-generated and user-moderated content. Companies that started in dorm rooms and garages are now global powerhouses.
We take great pride in being the global leader in tech and innovation. But while some of our biggest companies have grown, have they matured? Today it is often difficult to go online without seeing harmful, disgusting or illegal content. To be clear, I fully support free speech, and society benefits from open dialogue and free expression online. I know there have been calls for big government to mandate or dictate free speech or ensure fairness online, and these calls are coming from both sides of the aisle. While I share some of the concerns that others have expressed and that are driving these policy proposals, I do not believe the proposals are consistent with the First Amendment. Republicans successfully fought to repeal the FCC's fairness doctrine for broadcast regulation during the 1980s, and I strongly caution against advocating for a similar doctrine online. It should not be the FCC's, the FTC's or any government agency's job to moderate free speech online. Instead, we should continue to provide oversight of big tech in their use of Section 230 and encourage responsible content moderation. How do we ensure they are responsibly earning their liability protection? We want companies to benefit not only from the shield but also to use the sword Congress afforded them to rid their sites of harmful content. I understand it's a delicate issue and certainly a nuanced one. I want to be very clear: I'm not for gutting Section 230; it's important for consumers and entities in the internet ecosystem. Misguided and hasty attempts to amend or repeal Section 230 for bias or other reasons could have unintended consequences for free speech and for the ability of small businesses to provide new and innovative services. At the same time, it is clear we have reached a point where it is incumbent upon us as policymakers to have a serious and thoughtful discussion about achieving the right balance on Section 230. I thank you for the time, and I yield back.

The chair recognizes the chairman of the full committee for five minutes for his opening statement.
Thank you, Chairman. The internet is one of the single greatest human innovations, providing expressive community and economic opportunity, with trillions of dollars exchanged online every year. One of the principal laws that paved the way for the internet to flourish is Section 230 of the Communications Decency Act, which is part of the Telecommunications Act of 1996. We enacted the section to give platforms the ability to moderate their sites and protect consumers without excessive risk of litigation, and to be clear, Section 230 has been an incredible success. In the 20-plus years since Section 230 became law, the internet has grown far more complex and sophisticated. In 1996 the global internet reached 36 million users, less than 1% of the world's population, and only one in four Americans reported going online every day. Compare that to now, when all of us are online almost every hour that we are not sleeping. Earlier this year the internet passed 4.39 billion users worldwide. Here in the U.S. there are about 230 million smartphones that provide Americans access to online platforms. The internet is a central part of our economic fabric in a way that we could not have dreamed of when we passed the Telecommunications Act. But with that complexity and growth, we have seen the darker side of the internet grow. Online radicalization has spread, leading to mass shootings in schools, churches and movie theaters. International terrorists are using the internet to groom recruits. Platforms have been used for the illegal sale of drugs, including those that sparked the opioid epidemic. Foreign governments and fraudsters have polluted platforms with disinformation campaigns, using new technology like deepfakes, designed to sow civil unrest and disrupt democratic elections. There are constant attacks against women, people of color and other minority groups. Perhaps most despicable of all is the horrendous sexual exploitation of children online. In 1998 there were 3,000 reports of material depicting the sexual abuse of children online.
Last year, 45 million reports of such photos and videos were made. While platforms are now better at detecting and removing this material, recent reporting shows law enforcement officers are overwhelmed by the crisis. These are issues that we cannot ignore, and tech companies need to step up with new tools to address these serious problems. Each of these issues demonstrates how online content moderation has not stayed true to the values underlying Section 230 and has not kept pace with the increasing importance of the global internet. There is no easy solution for keeping this content off the internet. As policymakers, we have our ideas of how we might tackle the symptoms through content moderation online while also protecting free speech, but we must seek to fully understand the breadth and depth of the internet today, how it has changed, and how it can be made better, and we have to be thoughtful, careful and bipartisan in our approach. It's with that in mind that I was disappointed that Ambassador Lighthizer, the U.S. Trade Representative, refused to testify today. The U.S. has included language mirroring Section 230 in the United States-Mexico-Canada Agreement and in the U.S.-Japan trade agreement. The ranking member and I wrote to the ambassador in August raising concerns about why the USTR has included this language in trade deals even as we debate these issues across the nation, and I was hoping to hear his perspective on why he believes that was appropriate. Including provisions in trade agreements that are controversial to Democrats and Republicans alike is obviously not the way to get support from Congress. Hopefully the ambassador will be more responsive to bipartisan requests in the future. And with that, Mr. Chairman, I will yield back.

The gentleman yields back. The chair would like to remind members that, pursuant to the committee rules, all members' written opening statements shall be made part of the record.

Can mine be made part of the record?

I apologize; the chair yields to my good friend, the ranking member. Times have changed.
[laughter] Thank you, Mr. Chairman, and I want to welcome our witnesses. Thank you for being here; this is important work. I'll tell you at the outset that we have another subcommittee meeting upstairs, so we'll all be bouncing in between, but I have your testimony and look forward to your comments. This is without question a panel of experts in this field; we're blessed to have you here. Last Congress we held a significant hearing that jumpstarted the discussion on the state of online protections, the legal basis of the modern internet ecosystem, and of course the future of content moderation and how algorithms determine what we see online, an issue our constituents want to know more about. Today we will undertake a deeper review of Section 230 of the Communications Decency Act, a portion of the 1996 Telecommunications Act. In August of this year, the chairman and I raised the issue of the appearance of export language mirroring Section 230 in trade agreements. We did that in a letter to the U.S. Trade Representative, Robert Lighthizer. We expressed concerns about this internet policy being taken out of the context of its intent, and said that in the future the United States Trade Representative should consult our committee before negotiating on these very issues, rather than cherry-picking just a portion of the law. I want to go back to the trade case. I thought the letter to the ambassador was going to send the right message. I don't want to blow up USMCA; I have voted for every trade agreement, and I am a big free trader, but we're getting blown off on this and I'm tired of it. Then we found out it's in the Japan agreement. Clearly they're not listening to our committee or to us. We are serious about this matter, and I hope the USTR hears that this is a real problem, so take note.
If we refer to Section 230 as "the words that created the internet," as has been popularized by some, we're already missing the mark, since that word count leaves out the Good Samaritan obligation. We should start talking about that section as the 83 words that can preserve the internet. All the sections and provisions should be taken together, not apart, and many of our concerns could be addressed if companies just enforced their terms of service. But for better context, I believe a history lesson is in order. Today's internet looks far different than it did when CompuServe messages dominated the internet in the '90s. While the internet is more dynamic and content-rich now, there were problems in its infancy managing the vast amounts of speech online. Chris Cox, a former member of this committee and author of the legislation, called it out on the House floor during debate on his amendment: no matter how big the army of bureaucrats, it's not going to protect my kids, because I don't think the federal government will get there in time. So Congress recognized then, as we do now, that we need companies to step up to the plate and curb illegal content on their platforms. The internet is not something to be managed by government. The act bestowed on providers and users the ability to go after illegal and inhumane content without being held liable. While the law was intended to empower, we have seen social media platforms slow to clean up their sites while being quick to claim immunity for such content. In some cases, internet platforms shirk responsibility for the content on their platforms. The liability shield as expanded through common law has obscured the central bargain that was struck: internet platforms with user-generated content are protected in exchange for making good-faith efforts to moderate harmful content. So let me repeat, for those that want to be included in the interactive computer services definition:
Enforce your own terms of service. I look forward to a discussion on distinguishing hateful speech from illegal content, how we should think of CDA 230 protections for small platforms versus large ones, and how various elements of the ecosystem shape what consumers see or don't see. Thank you for having this hearing. I look forward to getting the feedback from the witnesses, but I have to go vote in the other hearing.

It appears the administration doesn't listen to you guys either. I'll let my statement speak for itself; clearly, we will find out if they're listening or not. I will reiterate that, pursuant to the committee rules, all members' written statements will be made part of the record.

We now want to introduce our witnesses for today's hearing: Mr. Steve Huffman, CEO of Reddit, welcome. Ms. Danielle Citron, professor of law at Boston University School of Law. Dr. McSherry, legal director of the Electronic Frontier Foundation, welcome. Ms. Gretchen Peters, executive director of the Alliance to Counter Crime Online. Ms. Katherine Oyama, head of intellectual property policy for Google. And Dr. Farid, professor at the University of California at Berkeley. We want to thank you for joining us today, and we look forward to your testimony. At this time the chair will recognize each witness for five minutes to provide opening statements. Before we begin, I'd like to explain our lighting system. In front of you is a series of lights. The light will initially be green. The light will turn yellow when you have one minute remaining; please wrap up your testimony at that point. When the light turns red, we cut your microphone off. We don't, actually, but please try to finish before then.

Chairpersons, ranking members, members of the committee, thank you for inviting me. My name is Steve Huffman, and I'm the cofounder and CEO of Reddit. I'm here to share why 230 is critical to the open internet. Reddit moderates content in a fundamentally different way from other platforms: we empower communities, and that model relies on 230.
Changes to 230 pose existential threats not just to us but to thousands of startups across the country, and they would destroy what competition remains in our industry. A college roommate and I started Reddit in 2005 as a forum to find news and interesting content. Since then, it's grown into a vast community-driven site where people find not just news and a laugh but perspective and a sense of belonging. Reddit is communities that are created and moderated by our users, a model that's taken years to develop, with lessons learned along the way. I left the company in 2009, and for a time Reddit lurched from crisis to crisis over the questions of moderation we're discussing today. In 2015 I came back because I realized the vast majority of our communities were providing an invaluable experience, but Reddit needed better moderation. The way Reddit handles moderation is unique in the industry: a model akin to our democracy, where everyone follows rules, self-organizes, and ultimately shares some responsibility for how the platform works. First, we have our content policy, the fundamental rules that everyone on Reddit must follow. Think of these as our federal laws. We employ a group collectively known as the Anti-Evil Team to enforce these policies. Below that, each community creates its own rules. These rules, written by our volunteer moderators, are tailored to the unique needs of their communities and tend to be more specific and complex. The self-moderation our users do is the most scalable solution to the challenges of moderating content, and individual users play a crucial role as well. They can vote up or down on any piece of content and report it to our Anti-Evil Team. Through this system, users can accept or reject any piece of content, thus turning every user into a moderator. The system isn't perfect. It's possible to find things on Reddit that break the rules, but its effectiveness has improved.
Analysis has shown our approach to be largely effective in curbing bad behavior, and when we investigated Russian attempts at manipulating our platform, we found that less than one percent of what they tried made it past our routine defenses of banning, community moderation, and downvotes from everyday users. We constantly evolve our content policy, and since my return we've made a series of updates addressing violent content, pornography, controlled goods and harassment. These are just a few of the ways we've worked to moderate in good faith, which brings us to the question of what Reddit would look like without 230. For starters, we would be forced to defend against anyone with enough money to bankroll a lawsuit. It's worth noting the cases most commonly dismissed under 230 are regarding defamation. An open platform where people are allowed to voice opinions would be a prime target for these, enabling censorship through litigation. Even targeted changes to 230 will create a regulatory burden on the industry, benefiting the largest companies by placing a significant cost on smaller competitors. We have 500 employees and a large user base, more than enough to be considered a large company, yet we are an underdog compared to our nearest competitors, public companies that tend to be 100 times our size. But we recognize that there is truly harmful material on the internet, and we are committed to fighting it. It's important to understand that, rather than helping, even narrow changes to 230 can undermine the power of community moderation. Take the opioid epidemic, which has been raised in discussions on 230. We have many communities where users struggling with addiction can find support to help them on their way to sobriety. With a carve-out in this area, these communities become too risky, forcing us to close them down. This would be a disservice to the people struggling, but this is exactly the type of decision that restrictions on 230 would force on us. Section 230 is a uniquely American law.
It's a balanced approach that has allowed the internet to flourish while also incentivizing good-faith attempts to mitigate the unavoidable downsides of free expression. While these downsides are serious and demand the attention of us in industry and you in Congress, they do not outweigh the overwhelming good that 230 has enabled. I look forward to your questions.

You're recognized for five minutes.

Thank you for having me, and for assembling such a thoughtful group with me on the panel. When Congress adopted Section 230 over 20 years ago, the goal was to incentivize tech companies to moderate content. Although Congress of course wanted the internet, as they could imagine it at the time, to be open and free, they also knew that openness would risk offensive material, and I'm going to use their words. So what they did was devise an incentive: a legal shield for Good Samaritans who are trying to clean up the internet, accounting for failures of both under-filtering and over-filtering of content. The purpose of the statute is clear, but its interpretation hasn't followed the words, so what we've seen is courts massively overextending Section 230 to sites that are irresponsible in the extreme and that produce extraordinary harm. We've seen the liability shield applied to sites whose entire business model is abuse. Revenge porn operators, and sites whose whole business is serving users fake sex videos, get to enjoy the immunity. And interestingly, it is not only bad Samaritans who have enjoyed a legal shield from responsibility, but also sites that have nothing to do with speech, that traffic in dangerous goods, like armslist.com. And the costs are significant. This overbroad interpretation allows bad-Samaritan sites, reckless and irresponsible sites, to inflict costs on people's lives. I'm going to take the case of online harassment, because I've been studying it for the last 10 years. The costs are significant to women and minorities.
The online harassment often hosted on these sites is costly to people's central life opportunities. When a Google search of your name turns up rape threats, your nude photo posted without your consent, your home address because you've been doxxed, and lies and defamation, it's hard to get a job and it's hard to keep a job. And victims are driven offline in the face of online assaults. They are terrorized; they often change their names, and they move. So in many respects the free speech calculus is not necessarily a win for free speech, as we are seeing diverse viewpoints and individuals being chased offline. Now, the market, I think, is ultimately not going to solve this problem. Many of these businesses make money off of online advertising and the salacious content that attracts eyeballs, so I don't think we can rely on the market itself to solve this problem. That leaves, of course, legal reform. The question is how we should do it. We have to keep Section 230; it has tremendous upside. But we should return it to its original purpose, which was to condition the shield on being a Good Samaritan, on engaging in what Benjamin Wittes and I have called reasonable content moderation practices. There are other ways to do it, and in my testimony I sketch some solutions. But we've got to do something, because doing nothing has costs. It says to the victims of online abuse that their speech and their equality are less important than the business profits of some of these platforms.

The chair recognizes Dr. McSherry for five minutes.

As legal director for the Electronic Frontier Foundation, I want to thank the chairs, ranking members, and members of the committee for the opportunity to share our thoughts with you today on this important topic. For nearly 30 years, EFF has represented the interests of technology users, both in court cases and in broader policy debates, to ensure that law and technology support our civil liberties.
Like everyone in this room, we are well aware that online speech is not always pretty. Sometimes it's extremely ugly and causes serious harm. We all want an internet where we are free to meet, create, organize, share, debate and learn. We want to have control over our online experience and to feel empowered by the tools we use. We want our elections free from manipulation, and for women and marginalized communities to be able to speak openly about their experiences. But chipping away at the legal foundations of the internet in order to pressure platforms to better police the internet is not the way to accomplish those goals. Section 230 made it possible for all kinds of voices to get their message out to the whole world without having to acquire a broadcast license, own a newspaper, or learn how to code. The law has thereby helped remove much of the gatekeeping that once stifled social change and perpetuated power imbalances. And that's because it doesn't just protect tech giants; it protects regular people. If you've forwarded an email, a news article, a picture or a piece of political criticism, you've done so with the protection of Section 230. If you maintain an online forum for a neighborhood group, you've done so with the protection of Section 230. If you use Wikipedia to figure out where George Washington was born, you've benefited from Section 230. And if you are viewing online videos documenting events in real time in northern Syria, you're benefiting from Section 230. Intermediaries, whether social media platforms, news sites or email forwarders, are not protected by Section 230 just for their own benefit; they are protected so they can be available to all of us. There's another practical reason to resist the impulse to amend the law to pressure platforms to moderate user content: simply put, they're bad at it. As EFF and many others have shown, platforms regularly take down all kinds of valuable content, partly because it is often difficult to draw clear lines between lawful and unlawful speech at scale.
Those mistakes often silence the voices of already marginalized people. Moreover, increased liability risk will inevitably lead to over-censorship. It's a lot easier and cheaper to take something down than to pay lawyers to fight over it, particularly if you're a smaller business or a nonprofit. Automation is not a magical solution; context matters, and very often when you're talking about speech, the robots are pretty bad at nuance. For example, in December 2018 the blogging platform Tumblr announced a ban on adult content. In an attempt to explain the policy, it identified several types of content that would be unacceptable under the new rule. Shortly after, Tumblr's own filtering technology flagged those very images as unacceptable. The last reason: new legal burdens are likely to stifle competition. Facebook and Google can afford to throw millions at moderation, automation and litigation, but smaller competitors or would-be competitors don't have that kind of budget. So in essence we would have opened the door to a few companies and then slammed that door shut for everyone else. The free and open internet has never been fully free or open, and the internet can amplify the worst of us as well as the best. But the internet still represents and embodies an extraordinary idea: that anyone with a computing device can connect with the world, tell their stories, organize, educate and learn. Section 230 helps make that idea a reality, and it's worth protecting. Thank you, and I look forward to your questions. Thank you. Ms. Peters, you are recognized for five minutes. Chairs, members of the subcommittee, it is an honor to be here today to discuss one of the premier security threats of our time, one that Congress is well positioned to solve. I am the executive director of the Alliance to Counter Crime Online. Our team is made up of academics, security experts, NGOs and citizen investigators who have come together to eradicate serious organized crime and terror activity online.
I want to thank you for your interest in our research and for asking me to join this panel of witnesses to testify. I also hope the committee will hear testimony from the US Trade Representative, because we think getting CDA 230 language out of America's trade agreements is critical to our national security. I have a long history of tracking organized crime and terrorism. I was a war reporter, and I wrote a book about the Taliban and the drug trade that got me recruited by US military leaders to support our intelligence community. I've mapped terror networks for Special Operations Command, the DEA and CENTCOM. In 2014 I received government funding to map wildlife supply chains, and that's when my team discovered that the largest retail markets for endangered species are actually located on social media platforms like Facebook. Founding the Alliance to Counter Crime Online, which looks at crime more broadly than wildlife, has taught me the incredible range and scale of illicit activity happening online. It is far worse than I ever imagined, and we can and must get it under control. Under the original intent of CDA 230, there was supposed to be a shared responsibility among tech platforms, law enforcement and organizations like ACCO, but tech firms are failing to uphold their end of the bargain; through broad interpretations by the courts, they have safe harbor. Deflecting scrutiny, they try to convince you that most illegal activity is confined to the dark web. That's not the case. Surface web platforms provide the same anonymity and payment systems, and much greater reach. We're talking about illicit groups ranging from Mexican drug cartels to Chinese triads that have weaponized social media platforms. I'm talking about US publicly listed social media platforms being used to move a wide range of illicit goods.
Now we're in the midst of a public health crisis, the opioid epidemic, which is claiming the lives of tens of thousands of Americans each year, but Facebook, the world's largest social media company, only began tracking drug postings last year, and within six months the firm identified 1.5 million posts selling drugs. That's what they admitted to removing. To put that in perspective, that's 100 times more postings than the notorious dark web site the Silk Road ever carried. Study after study by ACCO members and others has shown widespread use of Google, Twitter, Facebook, Reddit and YouTube to market and sell fentanyl, oxycodone and other addictive substances to US consumers in direct violation of federal law. Every major internet platform has a drug problem. Why? Because there is no law that holds tech firms responsible, even when a child dies buying drugs on an internet platform. Tech firms play an active role in spreading harm. Their algorithms, originally well-intentioned and designed to connect friends, also help criminals and terror groups connect to a global audience. ISIS and other terror groups use social media to recruit, fundraise and spread their propaganda. The ACCO alliance includes an incredible team of Syrian archaeologists tracking the trafficking of thousands of artifacts plundered from sites and sold, in many cases, by ISIS supporters. It is a war crime. We are tracking groups on Instagram, Google and Facebook where endangered species are sold, items ranging from rhino horn and elephant ivory to chimpanzees and cheetahs. The size of these markets is threatening species with extinction. I could continue to sit here and horrify you all morning: illegal dogfighting, live videos of children being sexually abused, human remains, counterfeit goods. It's all just a few clicks away. The tech industry routinely claims that modifying CDA 230 is a threat to freedom of speech. But CDA 230 is a law about liability, not freedom of speech.
Please try to imagine another industry in this country that has ever enjoyed such an incredible subsidy from Congress: total immunity, no matter what harm their product brings to consumers. Firms could have implemented internal controls to prevent illicit activity from occurring, but it was cheaper and easier to scale while looking the other way. They were given this incredible freedom, and they have no one to blame but themselves for squandering it. We want reforms to the law that strip immunity from firms for hosting terror and crime content, regulation requiring firms to report crime and terror activity to law enforcement, and appropriations so law enforcement can contend with this data. If it's illegal in real life, it ought to be illegal to host it online. It is imperative that we reform CDA 230 to make the internet a safer place for all. The gentlelady yields back. Ms. Oyama, you are recognized for five minutes. Chairmen, ranking members, distinguished members of the committee, thank you for the opportunity to appear before you today. I appreciate your leadership on these issues and welcome the opportunity to discuss Google's work in these areas. My name is Katie Oyama and I am the head of IP policy at Google, where I advise the company on public policy frameworks for the management and moderation of online content of all kinds. At Google, our mission is to organize the world's information and make it universally accessible and useful. Our services and many others are positive forces for creativity, learning and access to information. This creativity and innovation continues to yield enormous economic benefits to the United States.
However, like all means of communication that came before it, the internet has been used for both the best and worst of purposes. This is why, in addition to respecting local law, we have robust policies, procedures and community guidelines that govern what activity is permissible on our platforms, and we update them regularly to meet the needs of our users and society. My testimony today will focus on three areas: the history of 230 and how it has helped the internet grow, how 230 contributes to our efforts to take down harmful content, and Google's policies across our products. Section 230 has created a robust internet ecosystem where commerce, innovation and free expression thrive, while also enabling providers to take aggressive steps to fight online abuse. Digital platforms connect millions of consumers with legitimate content across the internet, facilitating $29 trillion in online commerce each year. Addressing illegal content is a shared responsibility, and our ability to take action on content is underpinned by 230. The law not only clarifies when services can be held liable for third-party content; it provides the legal certainty necessary for services to take swift action against harmful content. Section 230's Good Samaritan provision was introduced to incentivize self-monitoring and facilitate content moderation. It does nothing to alter platform liability for violations of federal criminal law, which is expressly exempted from the scope of the CDA. Over the years the importance of section 230 has grown, and it is critical in ensuring economic growth. A study found that over the next decade 230 will contribute an additional 4.25 million jobs and $440 billion in growth to the economy. Investors in the startup ecosystem have said weakening the online safe harbor would have a recessionary impact on investment. Internationally, 230 is a differentiator for the US; China, Russia and others take a different approach, stifling innovation and censoring speech online, sometimes including speech that is critical of political leaders.
Perhaps the best way to understand the importance of 230 is to imagine what might happen if it weren't in place. Without 230, search engines, political blogs and review sites of all kinds would either not be able to moderate content at all, or they would over-block, either way harming consumers and businesses that rely on their services every day. Without 230, platforms could be sued for decisions around the removal of content such as spam or mature content. Because of 230, we can and do enforce the policies that ensure our platforms are safe, useful and vibrant. For each product we have a specific set of rules and guidelines suitable for the type of platform and the risk of harm. These include clear content policies and community guidelines, flagging mechanisms to report content that violates them, and increasingly effective machine learning that can facilitate removal of content at scale, often before a single human user has ever been able to access it. In the three-month period from April to June 2019, we removed over 9 million videos from our platform for violating our guidelines; 87 percent of this content was flagged by machines first rather than by humans, and of the content detected by machines, 81 percent was never viewed by a single user. We now have 10,000 people across Google working on content moderation and have invested hundreds of millions of dollars in these efforts. In my written testimony I go into further detail about our policies and procedures for tackling problematic content on Search, Google Ads and YouTube. We are committed to being responsible actors and part of the solution. Google will continue to invest in the people and technology needed to meet this challenge, and we look forward to continued collaboration with the committee as it examines these issues. Thank you for your time, and I look forward to taking your questions. Dr. Farid, you have five minutes.
Members of both subcommittees, thank you for the opportunity to speak today. Technology and the internet, as you've heard, have had a remarkable impact on our lives and society. Many educational, entertaining and inspiring things have emerged from the past two decades of innovation. At the same time, many horrific things have emerged: a massive proliferation of child sexual abuse material; the radicalization of international terrorists; the distribution of deadly drugs; disinformation campaigns designed to sow civil unrest and disrupt democratic elections; the proliferation of deadly conspiracy theories; the routine harassment of women and underrepresented groups in the form of threats of sexual violence and revenge and nonconsensual pornography; small- and large-scale fraud; and failures to protect our personal and sensitive data. How, in 20 short years, did we go from the promise of the internet to democratize access to knowledge and make the world more understanding and enlightened, to this litany of horrors? A combination of naivete, ideology and growth at all costs has led the technology titans to fail to install safeguards on their services. The problems they face today are not new. As early as 2003 it was well known that the internet was a haven for child predators. Despite early warnings, the technology sector dragged their feet through the mid-2000s and did not respond to the known problems of the time, nor did they put in place the proper safeguards to contend with what should have been the anticipated problems we face today. In defense of the technology sector, they are contending with an unprecedented amount of data: 500 hours of video uploaded to YouTube every minute, 1 billion daily uploads to Facebook, 500 million tweets per day. On the other hand, these same companies have had over a decade to get their houses in order and have failed to do so, and at the same time they have managed to profit handsomely by harnessing the scale and volume of the data that's uploaded to their services every day.
These services don't seem to have trouble dealing with unwanted material when it serves their interests. They remove copyright infringement, and they effectively remove legal adult pornography, because otherwise their services would be littered with pornography, driving away advertisers. During his 2018 congressional testimony, Mr. Zuckerberg invoked artificial intelligence as the savior for content moderation, we are told within 5 to 10 years. Putting aside that it is not clear what we should do in the intervening decade or so, this claim is almost certainly overly optimistic. For example, earlier this year Facebook's chief technology officer showcased Facebook's latest AI technology for discriminating images of broccoli from images of marijuana. Despite all the latest advances in AI and pattern recognition, the system is only able to perform this task with an average accuracy of 91 percent. This means that approximately one in 10 times the system is wrong. At a scale of 1 billion uploads a day, this technology cannot possibly automatically moderate content. And this discrimination task is much easier than the task of identifying the broad class of child exploitation, extremism and disinformation material. The promise of AI is just that, a promise, and we cannot wait a decade or more with the hope that AI will improve by nine orders of magnitude, when it might be able to contend with content moderation. To complicate things even more, earlier this year Mr. Zuckerberg announced that Facebook is implementing end-to-end encryption on its services, preventing anyone, including the government and Facebook, from seeing the contents of any communications. Implementing end-to-end encryption will make it more difficult to contend with the litany of abuses enumerated at the opening of my remarks. We can and must do better when it comes to contending with the most violent, dangerous and hateful content online.
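The scale argument in this testimony can be made concrete with a quick back-of-the-envelope calculation. The 91 percent accuracy figure and the 1 billion daily uploads are taken from the testimony; the rest is illustrative arithmetic, not a claim about any platform's real error rate:

```python
# Back-of-the-envelope: what a 91%-accurate classifier means at platform scale.
# Both input figures come from the testimony; the arithmetic is illustrative.
daily_uploads = 1_000_000_000   # ~1 billion uploads per day
accuracy = 0.91                 # broccoli-vs-marijuana demo accuracy

errors_per_day = daily_uploads * (1 - accuracy)
print(f"Misclassified uploads per day: {errors_per_day:,.0f}")  # prints 90,000,000
```

Even a seemingly high accuracy leaves on the order of 90 million misclassifications per day at this volume, which is the gap the "nine orders of magnitude" remark points at.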
I reject the naysayers who argue it is too difficult from a policy or technological perspective, and those who say reasonable and responsible content moderation will lead to the stifling of an open exchange of ideas. I look forward to taking your questions. Thank you, Dr. Farid. We have concluded our openings, so we're going to move to member questions. Each member will have five minutes to ask questions of our witnesses, and I will start by recognizing myself for five minutes. I have to say, as I said at the beginning of my remarks, this is a very complex issue, and I think we've all heard the problems. What we need to hear is solutions. Let me just start by asking all of you, just by a show of hands: who thinks that online platforms could do a better job of moderating the content on their websites? So that's unanimous. I agree, and I think it's important to note that we all recognize content moderation online is lacking in a number of ways and that we all need to address this issue better. If not you, who are the platforms and the experts in this technology, then you put it on our shoulders, and you may see a lot of legislation that you don't like very much, with a lot of unintended consequences for the internet. I would say to all of you: you need to do a better job. You need to have an industry getting together and discussing better ways to do this. The idea that you can buy drugs online and we can't stop it, to most Americans hearing that, they don't understand why that's possible, why it wouldn't be easy to identify people trying to sell illegal things online and take those sites down. Child abuse, it's very troubling. On the other hand, I don't think anybody on this panel is talking about eliminating section 230. So the question is, what is the solution between not eliminating 230, because of the effects that would have on the whole internet, and making sure that we do a better job of policing this?
Mr. Huffman, a lot of people know Reddit, but it's a relatively small company when you place it against some of the giants. You host many communities and you rely on your volunteers to moderate discussions, but I know that you have shut down a number of controversial subreddits that have spread deepfakes, violent content and dangerous conspiracy theories. What would Reddit look like if you were legally liable for the content your users posted, or for your company's decisions to moderate user content in communities? What Reddit would look like is we would be forced to go to one of two extremes. In one version we would stop looking. We would go back to the pre-230 era, which means if we don't know, we are not liable. I'm sure that is not what you intend, and it's not what we want. It would not be aligned with our mission of bringing community and belonging to everybody in the world. The other extreme would be to remove or prohibit any content that could be remotely problematic. And since Reddit is a platform where 100 percent of our content is created by our users, that would fundamentally undermine the way Reddit works. It's hard to give you an honest answer of what Reddit would look like, as I'm not sure Reddit as we know it could exist in a world where we had to remove all user-generated content. Dr. McSherry, you talked about the risk to free speech if 230 were to be repealed or altered, but what other tools could Congress use to incentivize online platforms to moderate dangerous content and encourage a healthier online ecosystem? What would your recommendation be, short of eliminating 230? A number of the problems that we have talked about today, which I think everyone agrees are very serious, and I want to underscore that, are actually often addressed by existing laws that target the conduct itself. For example, in the Armslist case you had a situation where the gun sale that was so controversial was actually perfectly legal under Wisconsin law.
Similarly, many of the problems that we have talked about today are already addressed by federal criminal law. Those laws already exist, and section 230 is not a barrier, because of course there's a carve-out for federal criminal law. So I would urge this committee to look carefully at the laws that target the actual behavior we are concerned about, and perhaps start there. Ms. Peters, you did a good job of horrifying us with your testimony. What solution do you offer, short of repealing 230? I don't propose repealing 230, and I think we want to continue to encourage innovation in this country; it is a core driver of our economy. But I do believe that CDA 230 should be revised so that if something is illegal in real life, it is illegal to host it online. I don't think that is an unfair burden for tech firms; certainly some of the wealthiest firms in our country should be able to take that on. I have a small business. We have to run checks to make sure, when we do business with foreigners, that we are not doing business with somebody on a terror blacklist. Is it so difficult for companies like Google and Reddit to make sure they're not hosting an illegal pharmacy? I see my time has expired, but I thank you, and I think we got the gist of your answer. The chairman now yields to my ranking member for five minutes. Again, thanks to our witnesses. A recent New York Times article outlined the horrendous nature of child abuse online and how it has grown exponentially over the last decade. My understanding is tech companies are only legally required to report images of child abuse when they discover them, and that does not require them to actively look. I understand you make voluntary efforts to look for this content. How can we encourage platforms to better enforce their terms of service, or proactively use the protection provided by subsection (c)(2) of 230 for good faith efforts, to create accountability within platforms?
Thank you for the question, and particularly for focusing on the importance of subsection (c)(2) in incentivizing platforms to moderate content. I can say for Google that we think transparency is important, so we publish our guidelines. We publish our policies. We publish on YouTube a quarterly transparency report where we show, across the different categories of content, the volume of content we've been removing, and we also allow users to appeal: if their content is stricken and they think that was a mistake, they have the ability to appeal and to track what is happening with the appeal. We understand this piece of transparency is critical to user trust and to discussions with policymakers on these important topics. Ms. Citron, a number of defendants have claimed section 230 immunity in the courts, some of which are tech platforms that may not host user content at all. Was section 230 intended to capture those platforms? If platforms are solely responsible for the content, there's no user-generated content; they're creating the content, so the question is whether they would be covered by the legal shield of 230. I'm asking, is that the question? No, they would be responsible for content they created and developed, so section 230's legal shield would not apply. Mr. Farid, are there tools available, like PhotoDNA or copyright filters, to flag the sale of illegal drugs online? If the idea is that platforms should be incentivized to scan their platforms and take down blatantly illegal content, shouldn't the words or other indicators associated with opioids be searchable through an automated process? The short answer is yes. There are two ways of doing content moderation. Once material has been identified, typically by a human moderator, whether that's child abuse, illegal drugs, terrorism-related material or copyright infringement, it can be fingerprinted and then stopped from future upload and distribution. That technology has been well understood and deployed for over a decade.
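The fingerprint-and-block approach described here can be sketched in a few lines. Real systems use robust perceptual hashes (such as PhotoDNA) that survive re-encoding and cropping; the exact SHA-256 hash below is a simplified stand-in, and all names in the sketch are illustrative, not any platform's actual API:

```python
import hashlib

# Fingerprints of content already identified by human moderators (illustrative).
blocked_fingerprints: set[str] = set()

def fingerprint(data: bytes) -> str:
    # Stand-in for a perceptual hash such as PhotoDNA; an exact cryptographic
    # hash only catches byte-identical re-uploads.
    return hashlib.sha256(data).hexdigest()

def flag_known_bad(data: bytes) -> None:
    # Called once a human moderator has identified the material.
    blocked_fingerprints.add(fingerprint(data))

def allow_upload(data: bytes) -> bool:
    # Block re-uploads of previously identified material.
    return fingerprint(data) not in blocked_fingerprints

flag_known_bad(b"previously identified illegal image bytes")
assert not allow_upload(b"previously identified illegal image bytes")
assert allow_upload(b"unrelated holiday photo bytes")
```

This is the first form of moderation the testimony describes: cheap once a fingerprint exists, but useless against the "day zero" problem of content no one has identified yet.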
But it has not been deployed uniformly across platforms, and not nearly aggressively enough. That's one form of content moderation that works today. The second form is what I call the day zero problem: finding the Christchurch video on upload. That is difficult, and it still requires law enforcement, journalists or the platforms themselves to find it; but once that content has been identified, it can be removed from future uploads. And I'll point out that today you can go onto Google and type "buy fentanyl online" and it will show you, on the first page, illegal pharmacies where you can click and purchase fentanyl. That is not a difficult find. We're not talking about things buried on page 20; it is on the first page, and in my opinion there's no excuse for that. Let me follow up, because you said enforcement is anemic on some of these platforms. Last year in this room we passed over 60 pieces of legislation dealing with the drug crisis we have in this country, fentanyl being one of them, and you mentioned you can type in fentanyl and find it. Again, what we're trying to do is make sure we don't have the 72,000 deaths that we had in this country a year ago, with over 43,000 associated with fentanyl. How do we tell the platforms that we've got to enforce this, because we don't want this stuff flowing in from China? How do we do this? This is what the conversation is. We don't repeal 230, but we make it a responsibility, not a right. If your platform can be weaponized in the way we've seen across the board, from the litany of things I had in my remarks, surely something is not working. If I can find this on page 1, and not just me, but my colleagues at the table and investigative journalists, we know this content is there, and we have to ask: if a reasonable person can find this content, surely Google can find it as well. And then what is the responsibility? You said earlier you should enforce your terms of service. If we don't want to talk about 230, let's talk about terms of service.
The terms of service of most major platforms are good; it's just that they don't do much to enforce them in a transparent way. My time has expired and I yield back. The chair now recognizes Ms. Schakowsky for five minutes. Thank you, Mr. Chairman. Ms. Oyama, you described in the comments you presented to us what would happen without 230. I want to see if there are any hands that would go up for abandoning 230. Has anybody said that? That is not the issue. This is a sensible conversation about how to make it better. Mr. Huffman, I want to thank you; we had a productive meeting yesterday where you explained to me what your organization does and how it is unique. But you also said in your testimony that section 230 is a uniquely American law, and yet when we talked yesterday you thought it was a good idea to put it into a trade agreement with Mexico and Canada. If it's a uniquely American law, let me just say that I think trying to fit it into the regulatory structure of other countries at this time is inappropriate. And I would like to quote from a letter that both Chairman Pallone and Ranking Member Walden wrote some time ago to Mr. Lighthizer, which said: we find it inappropriate for the United States to export language mirroring section 230 while such serious policy discussions are ongoing. And that's what's happening right now; we're having a serious policy discussion. What I want to do is figure out what we really want to amend or change in some way. Three of you have talked about the need for changes, so let me start with what you want to see in 230. I'd like to bring the statute back to its original purpose, which was to apply to Good Samaritans engaged in responsible and reasonable content moderation practices. We have the language to change the statute: we would condition the shield so that we're not going to treat a provider or user of an interactive computer service
that engages in reasonable content moderation practices as the publisher or speaker. So it would keep the immunity. Let me suggest, if there's language, I think we'd like to see suggestions. Ms. Peters, you pretty much scared us as to what is happening; how can we make 230 responsive to those concerns? We would love to share some proposed language with you about revising 230 to protect better against organized crime and terror activity on the platforms. One of the things I'm concerned about is that when tech firms detect illicit activity, or it gets flagged to them by users, their response is to delete it and forget about it. What I'm concerned about is two things. Number one, that essentially destroys critical evidence of a crime. It helps criminals cover their tracks, as opposed to the situation we have in the financial industry and even aspects of the transport industry: if they know that illicit activity is going on, they have to share it with law enforcement, and do so within a certain timeframe. I want to see the content removed, but I don't want to see it deleted, and I think that is an important distinction. I'd like to see a world where the big tech firms work collaboratively with civil society and with law enforcement to root out some of these problems. I'm going to cut you off because my time is running out, and I want to get to the doctor with the same question, so I welcome concrete suggestions. I agree with my colleague Professor Citron. I think 230 should be a privilege, not a right. We should be worried about the small startups: if we start regulating now, the ecosystem will become even more monopolistic, so we have to think about carve-outs so that small platforms can compete, given that today's giants did not have to deal with that regulatory pressure. And the last thing I will say is that the rules have to be clear, consistent and transparent. Thank you, I yield back. The chair recognizes Mrs. McMorris Rodgers for five minutes.
Section 230 was intended to provide online platforms with a shield from liability as well as a sword: good faith efforts to filter, block or otherwise address offensive content online. Professor Citron, do you believe companies are using the sword enough, and if not, why do you think that is? The dominant platforms, and I've been working with Facebook and Twitter for about eight years, so I would say the dominant platforms that are focused on this at this point, are engaging in what I would describe at a broad level as fairly reasonable content moderation practices. I think they could do far better on transparency: when they say they forbid certain speech, what do they mean by that? What's the harm they want to avoid? They could be more transparent about the processes they use when they make decisions, to have more accountability. But what really worries me are the renegade sites: those whose business is abuse and incitement, with no moderation; dating apps that have no ability to ban persons or track IP addresses. And sometimes it's the biggest of providers, not the small ones, who know they have illegality happening on their platforms and do nothing about it. Why are they doing that? Because of section 230 immunity. The dating app Grindr comes to mind, hosting impersonations of someone's ex, where the impersonator was using Grindr to send thousands of men to his home. Grindr heard 50 times from the individual being targeted and did nothing about it; when they responded, after getting a lawsuit, their response was that their ability doesn't allow them to track IP addresses, yet Grindr is fairly dominant in the space. When the person went to Scruff, a smaller dating site where the impersonator was also posing as him and sending men to his home, Scruff could track down the IP address and take care of it. So the notion of smaller versus large doesn't hold; by my lights, there are good, responsible practices and irresponsible, harmful practices. Thank you for that.
Mr. Huffman and Ms. Oyama, your policies prohibit illegal content or activities on your platforms. Regarding your terms of service, how do you monitor content on your platform to ensure that it does not violate your policies? Maybe I'll start with Mr. Huffman. In my opening statement I described the three layers of moderation that we have. First is our company moderation and our team; this is the group that both writes the policies and enforces them. Primarily the way they work in enforcing these policies is at scale: looking for aberrational behavior, looking for problematic sites or words. We participate in cross-industry sharing, which allows us to find images, for example of child exploitation, that are shared industrywide, or fingerprints thereof. Next are our community moderators; these are users. And following that, the users themselves. Those groups participate together in removing content inappropriate for their community and in violation of our policies. Our policies are not very long, but one of the points is no illegal content, so no regulated goods: no drugs, no guns, nothing of that sort. You're saying that when you find it, you get it off the platform? 230 doesn't provide us liability protection there, so we are not in the business of committing crimes or helping people commit crimes; that would be problematic for our business. We do our best to make sure it's not on the platform. Ms. Oyama, would you address that, and just what you are doing if you find illegal content? Across YouTube we have clear content policies. We publish those online, and we have videos that give more examples and specific ways to understand them.
We are able to detect violations at scale: of the 9 million videos that we removed from YouTube in the last quarter, 87 percent were detected first by machines. Automation is one very important way, and the second way is human reviewers. We have community flagging, where any user that sees problematic content can flag it and follow what happens with that complaint. We have human reviewers, and we are transparent in explaining that. When it comes to criminal activity on the internet, CDA 230 has a complete carveout for federal criminal law. In the case of Grindr, we have policies against harassment, but in the Grindr case there was criminal activity; my understanding is there is a defendant in that case, and there is a criminal case for harassment and stalking proceeding against him. In certain cases, opioids again, a controlled substance under criminal law, there is a provision covering sales of controlled substances on the internet. In cases like that, where there is a law enforcement role, through the correct legal process we would work with law enforcement to provide information under due process or subpoena. The gentleman is recognized for five minutes. I really want to thank this panel. I am a former constitutional lawyer, so I am always interested in the intersection between criminality and free speech. In particular, Professor Citron, I was reading your written testimony, which you discussed with Ms. Schakowsky, on how Section 230 should be revised to continue to provide First Amendment protections but also return the statute to its original purpose, which is to have companies act more responsibly, not less. I want to talk during my line of questioning about online harassment, because this is a real issue that has only increased. The Anti-Defamation League reported that 24 percent of women and 63 percent of LGBT individuals have experienced online harassment because of their gender or sexual orientation. This is compared to only 14 percent of men.
And 37 percent of all Americans of any background have experienced online harassment, which includes sexual harassment, stalking, physical threats, or sustained harassment. So I want to ask you, Professor, to talk to me, very briefly, about how Section 230 facilitates illegal activity. Do you think it undermines the value of those laws, and if so, how? Let me say that in cases involving harassment, of course there is a perpetrator and the platform that enables it, and most of the time the perpetrators are not pursued by law enforcement. I have explored the fact that law enforcement does not understand the abuse; they don't know how to investigate it. In the case of Grindr, there were 10 protective orders that were violated, and New York has done nothing about it. So it is not true that we can always find a perpetrator, especially in cases of stalking, harassment, and threats; we see a severe under-enforcement of law, particularly when it comes to gendered harms. The site can be ordered to block the person from communicating with the other. Yet even then, Section 230 lets platforms ignore requests to take down that material. They have. I see you nodding your head, Professor. They do, and they can, especially if those protective orders are coming from the criminal law. I wanted to ask you, Dr. McSherry: sexual harassment continues to be a significant problem on Twitter and on other social media platforms as well. I know Section 230 is a critical tool that facilitates content moderation, but as we have heard in the testimony, a lot of the platforms are not being aggressive enough in enforcing their terms and conditions. So what I want to ask you is: what can we do to encourage platforms to be more aggressive in protecting consumers and addressing issues like harassment? I imagine this hearing will encourage many of them. Hearings. I understand that. I actually think that many of the platforms are pretty aggressive already in their policies.
I agree with what many have said here today, which is that it would be nice if they would start by clearly enforcing their actual terms of service. We share a concern about this: often they are enforced very inconsistently, which can be very challenging for users. The concern I have, if we institute what I think is one proposal, which is that whenever you get a notice you have some duty to investigate, is that it could actually backfire for marginalized communities. Because one of the things that also happens is, if you want to silence someone online, one thing you might do is flood the service provider with complaints about them, and then they end up being the ones who are silenced rather than the other way around. Dr. Farid, what is your view? I see two issues there. With content moderation, your risk is over-moderating or under-moderating. I would argue we are way under-moderating. We look at where we fall down, make mistakes, and take down things that we should not have, and I weigh that against 45 million pieces of content just last year of child abuse material, and terrorism, and drugs. The weights are imbalanced. You have to note the imbalance: we are going to make mistakes, and we are making far more mistakes on the allowing of content. Thank you, Mr. Chairman, and thank you for holding this very important hearing. I have been in information technology for most of my adult life. Social responsibility has been an issue that I have talked about a lot. The absence of heavy-handed government regulation, I think, is what has allowed the internet and the social media platforms to grow like they have. I hate to sound cliche, but there is that old line from the Jurassic Park movie: sometimes we are so focused on what we can do that we don't think about what we should do. I think that is where we find ourselves, with some of this anyway.
As some of our witnesses have noted, the accessibility of a global audience through internet platforms is being used for illegal and illicit purposes, by terrorist organizations and even for the sale of opioids, which continues to severely impact communities across the nation, particularly in rural areas like the one I live in in southeastern Ohio. However, these platforms also provide an essential tool for legitimate communications and the free, safe, and open exchange of ideas, which has become a vital component of modern society and today's global economy. I appreciate hearing from all of our witnesses as our subcommittees examine whether Section 230 of the Communications Act is empowering internet platforms to effectively self-regulate under this light-touch framework. So, Mr. Hoffman, in your testimony you discussed the ability of not only employees but also users to self-regulate and remove content that goes against Reddit's rules and community standards. Do you think other social media platforms, for example Facebook or YouTube, have been able to successfully implement similar self-regulating functions? If not, what makes Reddit unique in its ability to self-regulate? Thank you, Congressman. I am only familiar with the other platforms to the extent that you are, which is to say that I am not an expert. I do know they are not sitting on their hands; they are making progress. Reddit's model is unique in the industry, in that we believe the only thing that scales with users is users. So when we are talking about user-generated content, we share some of this burden with those people. In the same way that in our society here in the United States there are many unwritten rules about what is acceptable to say or not, the same is true on our platform, and by allowing and empowering our users and communities to enforce those unwritten rules, it creates a healthier overall ecosystem.
In your testimony, you discussed the difficulty of determining which content is allowed on platforms, including balancing respect for diverse viewpoints and giving a platform to marginalized voices. How does a system like Reddit's, with upvotes and downvotes, impact the visibility of diverse viewpoints, and how do likes and dislikes on YouTube impact a video's visibility? Thank you for the question. As you have seen, users can give a thumbs-up or thumbs-down. It is one of many signals; it certainly would not be determinative of whether a video is recommended on YouTube. We really appreciate your point about responsible content moderation. I did want to make the point, on harassment and bullying, that we removed 35,000 videos from YouTube just in the last quarter. We can do this because of CDA 230: whoever's content is removed may be upset, and there could otherwise be claims against the service provider for defamation or breach of contract. Service providers large and small are able to have these policies and implement procedures to identify bad content and take it down, and that is because of the provisions of CDA 230. I have some other questions that I am going to submit for the record, so I want to stay within my time; you will require me to stay within my time. In the absence of regulations, as I mentioned in my opening remarks, you have to take social responsibility to a much higher bar. I would suggest to the entire industry of internet and social media platforms: you had better get serious about this self-regulation, or you are going to force Congress to do something it might not want to have to do. With that I yield back. The gentleman yields back. Thank you very much, Mr. Chairman, and thank you to the witnesses for being here today. Last week the Senate Intel Committee released a bipartisan report on Russia's use of social media. It shows that they used social media platforms to try to influence the outcome of the 2016 election.
Given Section 230, what role can it play in ensuring the platforms won't be used that way again? It is critically important in allowing services like ours to protect citizens and users against foreign interference in elections. It is a critical issue, especially with elections coming up. In the 2016 election, partially due to the measures we had been able to take, we found only two accounts associated with this activity across Google's systems, and they spent less than $5,000 in ads. We continue to be extremely vigilant: we publish a political ads transparency report, and we require that election advertising is disclosed and shown in a library. Do you feel that has been effective? We can always do more, but on this issue we are extremely focused and working with campaigns. Mr. Hoffman. In 2016, we found that the same fake news and misinformation submitted to other platforms was submitted to ours. The difference is that on Reddit it was largely rejected by the community, by the users, long before it even came to our attention. One thing our community is good at, for better or for worse, is being skeptical, rejecting all sorts of things, and questioning things. Between then and now, we have become dramatically better at detecting groups of accounts that are working in a coordinated manner, and we collaborate with law enforcement. So with everything we have learned in the past and can see going forward, I think we are in a pretty good position coming into the 2020 election. Dr. Farid, in your written testimony you discuss disinformation campaigns designed to disrupt elections, and you mention there is more the platforms could be doing about moderating content online. What more should they be doing about this issue now? One example, if you like: we saw the fake video of Speaker Pelosi make the rounds, and the responses were interesting. Facebook said: we know it's fake, but we are not in the business of mediating the truth. That was not a technological problem; that was a policy problem.
That video was not satire or comedy; it was meant to discredit the Speaker. So I think, fundamentally, we have to relook at the rules. In fact, if you look at Facebook's rules, they say you cannot post things that are misleading or fraudulent. This was a clear case where the technology worked and the policies unambiguously, simply failed; it was a policy failure. To YouTube's credit, they actually took it down, and to Twitter's discredit, they didn't even respond to the issue. So in some cases it is a technological issue, but more often than not we are simply not enforcing the rules that are already in place. So that's a decision they made. Okay. What do you think of what he just said? There are two aspects to this. First, specifically on the mechanics: we have a policy against impersonation. A video like that can be used to mislead people or spread misinformation, and it also raises questions about the rest of the things we see and hear, which is an important discussion. So the context around whether a video like that stays up or comes down on Reddit is really important. These are difficult decisions. I will observe that we are entering a new era: we can now manipulate video, which historically we have only been able to do with photographs, with Photoshop. So I do think that not only do the platforms have responsibility, we as a society have to understand that the provenance of material, for example which publication it comes from, is critically important, because there will come a time, no matter what any of my tech peers say, when we will not be able to detect the fakes. On the specific content that you mentioned: on YouTube, we do have a policy against deceptive practices, and we did remove it. But there is ongoing work that needs to be done to better identify deepfakes, which in certain contexts, sometimes political contexts, can severely undermine democracy. We opened up a dataset where we are working with researchers to develop technology to better detect when media has been manipulated. There is a lot more to say, but I know how this goes; I will yield back.
The Chair recognizes Mr. Kinzinger. Thank you for being here today; we very much appreciate it. It's interesting, following that last line of questions: one of the great things about democracy is our ability to have free speech and share opinions, but that can also be a real threat. I thank the chairman for yielding. It is safe to say that not every member of Congress has a plan for what to do about Section 230 of the Communications Decency Act, but I think we all agree that this hearing is warranted and that we need a discussion about the origins and intent of the section, and whether the companies that enjoy these liability protections are operating in the manner intended. I will state that I generally appreciate the efforts certain platforms have made over the years to remove and block unlawful content. I will also state that it is clearly not enough, and that the status quo is unacceptable. It has been frustrating for me that in recent years my image, and variations of my name, have been used by criminals to defraud people on social media, and this goes back ten years. We are into the 50s to hundreds of these, counting just the ones we know about. These scams are increasingly pervasive, and I not only brought this up in the hearing with Mark Zuckerberg last year, I also wrote him again this summer to continue to press him to act more quickly to protect his users. Sources indicate that in 2018, people reported hundreds of millions of dollars lost to online scammers, including $143 million from romance scams. Given what so many people have lost, it has become more and more important for platforms to verify user authenticity. So, to both Mr. Hoffman and Ms. Oyama: what do your platforms do to verify the authenticity of user accounts? Thank you for the question. Let me parse my answer in two. The first part is on the scam itself. My understanding is you are probably referring to scams that target veterans in particular. We have a number of veterans communities on Reddit.
There, veterans share their experiences, and, like all of our communities, they create their own rules. These communities have adopted rules that prohibit fundraising in general, because the community and its members know that they can be targeted by this sort of scam in particular. That is the sort of nuance that we think is really important, and it highlights our community model: a non-veteran might not have had that same intuition. Now, in terms of what we know about our users: Reddit is no different from our peers in that we don't require people to share their real-world identity with us. We do know where they register from, what IP they use, and maybe their email address, but we don't force them to reveal their full name or their gender. This is important because on Reddit people discuss sensitive topics, in these very same veterans communities, or, for example, drug addiction communities, or communities for parents who are struggling with being a parent. These are not things somebody would go on to Facebook, for example, and say: hey, I don't like my kid. I don't mean to cut you off, but we need to move on. Very sorry that that happened to you, Congressman. On YouTube we have a policy against impersonation. If there were ever a channel impersonating you, or a user saw one, there is a form they can submit, with, for example, their government ID, and that would result in the channel being struck. Things differ across the products: Search is an index of the web, trying to get relevant information to users every single day, and on Search we suppressed 19 billion links that are spam and could be scams. On the ads side, we defend users with a risk engine that can actually kick out fraudulent accounts before they enter. Thank you. I am not upset about the sites that say I am the worst congressman ever; that's understandable, I guess, for some people.
But we do have, again in my case, as an example, multiple cases, including a woman from India who lost her entire life savings because she thought we had been dating for a year, not to mention all the money she gave to this perpetrator, and all of these other stories. I think one of the biggest and most important things is that people need to be aware: if you have been dating somebody for a year and have never authenticated who they are, it's probably not real. Ms. Peters, what is the risk of people not being able to trust identities online? There are multiple risks there. I come back to a key issue for us: if activity is illicit, the site should be required to hand over data to law enforcement and to work with them proactively. We heard a lot today from the gentleman from Reddit about their efforts at better moderation, yet some of our members were able to go online and type in a search for "buy fentanyl online," and it came up with many results: "buy fentanyl online," "buy fentanyl cheap without prescription." Those are fairly simple search terms; we are not talking about a super high bar to get rid of that on your platform. Is it too hard to have such a search automatically redirected to a site that advises counseling for drug abuse? I am not trying to be the thought police; we are trying to protect people from organized crime and terror activity. I yield back, but I have a bunch more questions I will submit. Thank you. For the record, I don't think he's the worst member of Congress. [laughter] I don't even think you are near the very bottom. You are not a bad guy. The Chair recognizes Ms. Castor for five minutes. Thank you, Chairman Doyle, for organizing this hearing, and thank you to all of our witnesses for being here today. I would like to talk about the issue of 230 in the context of a horrendous tragedy in Wisconsin a few years ago, where a man walked into a salon where his wife was working and shot her dead in front of their daughter, and killed two others in that salon.
And then he killed himself. This is the type of horrific tragedy that is all too common in America today. You mentioned it, and I think you misspoke a little bit, because you said that it was all legal, but it wasn't: two days before the shooting, a temporary restraining order had been issued against that man. He went online shopping on Armslist.com, and the next day he committed those murders. What happened is that Armslist knows they have domestic abusers shopping there, they have felons, they have terrorists, all shopping for firearms, yet they are allowed to proceed. Earlier this year, the Wisconsin Supreme Court ruled that Armslist is immune. Even though they know they are facilitating illegal conduct, the court said that Armslist is immune because of Section 230. It basically held that it did not matter that Armslist actually knew, or even intended, that its website would facilitate illegal firearms sales to dangerous people; Section 230 still granted immunity. Ms. Peters, you highlighted, when we were talking about child sexual abuse content and illegal drug sales, that this has gone way too far. Professor Citron, I appreciate that you have proposed some solutions. You highlighted a safe harbor: companies using best efforts to moderate content would have some protection. But how would this work in reality? Would it be left up to the courts and endless litigation over liability? That kind of speaks to the need for very clear standards coming out of Congress, I think. Yes. Thank you so much for the question. How would we do this? It would be in the courts. It would come at an initial motion to dismiss: for the company being sued, the question would be, are your content moderation practices reasonable at large, not with regard to any one piece of content or activity. And it is true that it would then be a mechanism in federal court by which companies explain what constitutes reasonableness.
I think we can come up right now with some basic thresholds for what we consider reasonable content moderation practices, or technological due process: accountability, having a process, clarity about what it is you prohibit. But it is going to have to be case by case, context by context, because what is reasonable in response to a deepfake is going to be different from the kind of advice I would give Facebook, Twitter, and others about what constitutes a threat and how one figures that out. Thinking about the testimony, there are certain things it would be in the public interest, I believe, to make explicit, so that they don't wind up as an issue of fact in a lawsuit. What do you think, Dr. Farid? Should illegal content online really be a debatable question? You are right; I am not a lawyer, I am a mathematician by training, but I completely agree with you. For example, over the years we saw this with the technology we were deploying: technology companies wanted everything bottled up in the gray area. The conversations were all about the gray areas while we were trying to remove child abuse material, or weapons, or imagery of an 18-year-old that is not sexually explicit. My answer is: there are complicated questions, but there is also clear-cut bad behavior. There is also an issue with the number of moderators being hired to go through this content. The publication called The Verge had a horrendous story about Facebook moderators, and it caught my attention because one of the sites is in Tampa, Florida, my district. I am going to submit follow-up questions about moderators and some standards for that practice, and I encourage you to weigh in. Thank you, Mr. Chairman. In my 23 years of being a member, I have never had a chance to address the same question to two different panels on the same day, so it's kind of an interesting morning. Upstairs we were talking about underage vaping and the marketing of those products.
So I was curious: in the opening statements here, someone, and I apologize, someone mentioned two cases. One was dismissed because the platform really did nothing, and one platform that tried to be the good actor got slammed. I don't know about slammed, but I see a couple of nods. Can you address that first? You are shaking your head the most enthusiastically. Because those are the two cases that effectively gave rise to Section 230: the CompuServe and Prodigy decisions. That is what animated its authors in writing it; we have got to do something about this. One case basically says if you do nothing, you are not going to be punished for it, but if you try and you moderate, that actually heightens your responsibility. No good deed goes unpunished. Yes, right. That's why I am here today, in many respects. So let me tie this into what's going on upstairs. If someone uses a platform to encourage underage vaping, with unknown nicotine content, and the site then decides to clean it up, because of the way the law is written right now, does this good deed, and we would mostly agree that this is a good deed, go punished? No, no. That is why we have Section 230. They are encouraged, so long as they are acting in good faith; under the section they can remove it, and they are Good Samaritans. Right. That is a benefit of it. My fear, okay: in the debate that we heard earlier in the opening comments from my colleagues, over the USMCA, part of the concern is that removing the protections of 230 would mean falling back to a regime in which a good-deed actor could get punished. Is that correct? We need to keep the 230 language out of the trade agreements. It is currently an issue of great debate here in the United States, and it is unwise to lock it into a trade agreement and make it impossible, or at least harder, to change. For the record, I am for the agreement passing as soon as possible, unencumbered.
I am not a proponent of trying to delay this process; I am just trying to work through this debate. And also, upstairs, those of us who believe in legal products that have been approved by the FDA are concerned about a black-market operation that would use platforms to sell illicitly to underage kids. That is how I would tie these two hearings together, and again I think it's pretty interesting. At the Facebook hearing a couple of years ago, I referred to a book called The Future Computed, which talked about the ability of industry to set standards. I do think that industry can do this; we do it across the board, whether it is engineering or heating and air cooling equipment: industries just come together, the good actors, and say here are our standards. The fear is that if this sector doesn't do that, the heavy hand of government will, which I think would cause a lot more problems. Dr. Farid, you are shaking your head. I have been saying to the industry: you have to do better. If you don't, somebody is going to do it for you; so do it on your own terms. I agree; we are not the experts. Part of that book talks about fairness, privacy, transparency, liability, accountability. I would encourage those of you who are listening to help move in that direction on your own before we do it for you. I see my time has expired. The gentleman yields back; the Chair recognizes the chairman for five minutes. This is very interesting testimony, and jarring in some ways. Ms. Peters, your testimony was particularly jarring. Have you seen authentic offers of weapons of mass destruction online? I have not. We certainly have members of our alliance who are tracking weapons activity. I think what was more concerning to me, in a way, was the number of illegal groups, groups designated like al Qaeda, that maintain webpages with links to Twitter and Facebook pages, and fundraising campaigns.
[inaudible conversation] there are monday platforms that allow for different groups and is inside those groups, at the epicenter of Illicit Activity. So it is hard for us to get aside of those. We actually run undercover operations to get inside of them. Mr. You talked about Tech Companies between the motivation and the amount of time online on the platforms on one hand on the other hand, content moderation. We do about that briefly. Weve been talking a lot about 230 thats important conversation but there is another judge avoid your and there is nothing. This interlining Business Model of Silicon Valley today its not to sell a product. You know the product. In some ways that is where a lot of the tension is coming from because of the metrics we use how monday users, and how long do they stand the platforms. You can see why that is fundamentally a tension with moving users and removing content. So the Business Model is also an issue in the way we deal with privacy of user data, is also an issue here because if the Business Model is monetizing your data, then i need to feed you information. There is a reason why we call it the rabbit hole effect on youtube. Theres a reason why if you start watching certain types of videos of children or conspiracies or extremism, you are and more and more of that content down the rabbit hole. Sinners real tension there and it is the bottom line. Its not just ideological. Were talking about the underlying problems. Thank you. I think monday of these issues that we are discussing today, both of it is harassment, extremism, it is important to remember the positive and productive potential for the internet. On youtube, we have seen it gets better and we have seen counter messaging and we have a program called creators to change who are able to create really compelling content for youth. Thing is just im going to remember that 230 was born out of this committee. Longstanding policy and is relevant to Foreign Policy as well. 
And we would support its inclusion in trade agreements. Free markets in digital services are responsible for the trade surplus the United States has in digital services, and it is critically important for us to be able to moderate content and to push back against the censorship of other, more repressive regimes abroad. It is hard to restrain yourselves to brief answers, but clearly some of these companies could be doing more within the current legal framework to address problematic content. For each of you: what can you do with today's tools for tomorrow's content? For us, the biggest challenge is evolving our policies to meet new challenges. We have revised our policies a dozen times and will continue to do so into the future. For example, two recent changes for us were expanding our harassment policy and banning deepfake pornography. Undoubtedly there will be new challenges in the future, and we need to be able to stay nimble; 230 actually gives us the space to adapt to these sorts of new challenges. Being nimble, ensuring that we respond to changing threats: the landscape is going to change, and we cannot write a checklist right now. I would encourage companies to not only have policies, but to be clear about them and be accountable. Just quickly: the issue for me with a reasonableness standard, as a litigator, is that it is terrifying. It means, as a practical matter, especially for small businesses, a lot of litigation as courts try to figure out what counts as reasonable. Two points. One of the crucial things I think we need, if we want better moderation practices and we want users not to be treated just as products, is to incentivize alternative business models, and to make sure we clear space for competition, so that when a given site is behaving badly, such as Grindr, people have other places to go with other practices, and another site is encouraged to develop and evolve. Market forces can work, and we need to let them work. Ms. Brooks. This is a very important hearing; thank you.
Dr. Farid, actually, let me set the record first: the reason I am asking these questions is that I am a former U.S. Attorney, and I was very involved in internet crime. We did a lot of work from 2001 to 2007, and you are right, deepfake pornography was not a term at that time, or even two years ago. We certainly know that law enforcement has been challenged for decades now in dealing with child pornography over the internet. And yet I believe we have to continue to do more to protect children, and to protect kids all around the globe. A tool using a sort of digital DNA, PhotoDNA, was developed a long time ago to detect illegal online child imagery, yet it means nothing if the platforms don't do anything about it. So we have been dealing with this for decades; this is not new, and we now have new tools. Is it a matter of tools, or of effort? How is it that this is still happening? I have to say that this is a source of incredible frustration. PhotoDNA was something that I helped develop. For an industry that prides itself on rapid and aggressive development, there have been no new tools in the last decade that go beyond PhotoDNA. That is pathetic. That is truly pathetic when we are talking about this kind of material. How does an industry that prides itself on innovation say we are going to use ten-year-old technology to combat some of the most gut-wrenching, heartbreaking content online? It is completely inexcusable. This is not a technological limitation; we are simply not putting the effort into developing and deploying the tools. Let me just share that having watched some of these videos, it is something you never want to see, and you can never get it out of your mind. I agree. I am curious, and I wanted Ms. Oyama to respond: how is it that we are still at this place? Thank you for the question. I will say, for Google, that is not true at all; we have never stopped working on and prioritizing this. We can always do better, but we are constantly adopting new technologies.
We initiated one of the first ones, which is called CSAI Match. It enables us to create digital fingerprints of the imagery, prevent it from ever being uploaded to YouTube, and also share it with others in the industry and with NGOs, and it resulted in a sevenfold increase in the speed with which that content is able to be identified. So this is going to continue to be a priority, and I just want to be clear that, from the very top of our company, we are committed to being a safe and secure place for parents and children. We will not stop working on this issue. I am very pleased to hear that there have been advances and that you are sharing them; that is critically important. However, an Indiana State Police captain who actually testified before Energy and Commerce recently told me that one of the issues law enforcement runs into when working with internet companies is an attitude that he calls minimally compliant. He said that internet companies will frequently not preserve content that can be used for investigations once law enforcement makes the companies aware of the concerning material, and that they will automatically flag that content to law enforcement for review without actually checking to see if it is truly objectionable or not. Do any of you have thoughts specifically on his comment? He has been an expert in this area. Do you have thoughts on how we balance this critical law enforcement need? Because they are saving children all around the globe. Ms. Peters: Without restricting companies' immunity from hosting concerning content, I just feel that if companies started getting fined, or faced some sort of punitive damages, every time there is illicit content, we would see a lot less illicit content very quickly. What is illegal in real life should be illegal when posted online. It is a very simple approach that I think we could apply industry-wide.
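The workflow the witness describes with CSAI Match, fingerprinting known illegal imagery once, blocking re-uploads before they are ever published, and sharing the fingerprint list with industry partners, amounts to a membership check against a shared hash set at upload time. A minimal sketch follows; the names (`KNOWN_HASHES`, `screen_upload`) are illustrative, and SHA-256 stands in for the perceptual fingerprints that real systems such as PhotoDNA or CSAI Match use.

```python
import hashlib

# Illustrative shared blocklist of fingerprints of known illegal
# imagery, as might be distributed among companies and NGOs.
KNOWN_HASHES = set()

def fingerprint(image_bytes: bytes) -> str:
    # Simplification: real systems use a perceptual hash that is
    # robust to re-encoding, not an exact cryptographic hash.
    return hashlib.sha256(image_bytes).hexdigest()

def register_known_content(image_bytes: bytes) -> None:
    """Add verified illegal material to the shared fingerprint list."""
    KNOWN_HASHES.add(fingerprint(image_bytes))

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload should be blocked before it is
    ever published, which is the behavior the witness describes."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

Because the check happens at upload time, a match prevents the content from ever appearing, and sharing `KNOWN_HASHES` means a fingerprint registered by one company protects every participating platform.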
I have a question in particular because I asked Mark Zuckerberg this relative to terrorism, ISIS, and recruitment, and now we need to be even more concerned about ISIS. I understand that you have teams of people who take content down. How many people are on your teams? Dedicated to removing content at that scale, about 20 percent of our company. About a hundred people. We have more than 10,000 people working on content moderation. But how many people on the team actually remove content? Announcer: You can watch this entire hearing on c-span.org. Type "internet and consumer protection" in the video library search box on the homepage. You will also find other hearings and conferences on the topic.
