Transcript: Senate Judiciary Committee Hearing on Children's Online Privacy (C-SPAN)

Senator Graham: Good morning. We will wait a minute to start. I think we will learn a lot about social media sites and children and what we can do to help. He is on his way. I will give an opening statement. I will be visiting the border; we have a bipartisan group. We are not going there to blame anybody, at least I am not. I am trying to find a solution. I look forward to learning from our friends in Texas about what we can do. Thank you very much. Just go ahead and get started. Can you raise your right hand? Do you swear the testimony you are about to give is the truth, the whole truth, and nothing but the truth? OK. OK, welcome, Mr. Stone, the 14th Judicial Circuit solicitor in South Carolina. We are proud of your work, and welcome to Washington.
Mr. Stone: Thank you for having me. Mr. Chairman, thank you. I understand Ranking Member Feinstein will not be able to be here, but I would like to thank her and the members of this committee for the invitation to speak with you today. I am the solicitor for the 14th Circuit, a circuit made up of five counties in the low country of South Carolina, as Senator Graham knows. I am also the incoming president of the National District Attorneys Association. For the purpose of this hearing, I also carry the title of parent, and I think that is relevant, because on this particular issue we need to understand that the world we are living in is substantially different than it used to be. Think about it: parents teach their children every day not to talk to strangers, not to open locked doors for people they do not know, and certainly not to get into a car or van with someone they don't know. We take precautions. We have our doors and windows locked, security systems. Despite all of that, strangers are talking to our children every day. They are doing it through the internet, by use of the children's cell phones, often unknowingly, but by invitation from our own children.
The Pew Foundation did a study recently, I think in 2018, that stated 95% of teenagers today have access to the internet through their cell phones, and 45% of them are on it almost constantly. I don't think any of us who either have teenagers or know teenagers would doubt those statistics. If you look at it from that perspective, the question is: what is the risk? What is the risk of these children having this type of access, and of people having access to our children behind locked doors? I think our friends from the National Center for Missing and Exploited Children would tell you that the 18.4 million cyber tips they received in 2018 indicate the risk to our children. Of those 18.4 million, after the studies were conducted, 98% of the offenders were strangers, so strangers are talking to our children. If you look at State Department statistics, they will also tell you that tens of thousands of children are being exploited online. If you look further at State Department statistics concerning human trafficking, you will learn that half are children. What these human traffickers are doing, whether it is labor or sex trafficking, is using the internet to entice, coerce, and in many ways even arrange the buying and selling of our children. The internet and the cyber world we live in is no longer a bricks-and-mortar world. It is a situation we have to take into consideration. I am the solicitor for a five-county circuit in South Carolina.
Of those five counties, one is Beaufort County, home of Hilton Head, the Heritage golf tournament, and, at this time of year, what seems like constant tourism. It is probably the second wealthiest county in the state of South Carolina. The average poverty rate in South Carolina is 16.6%, and I have three counties below that percentage. One of my counties is Allendale County. Unless you are Senator Graham, you may never have heard of it, but it suffers over 30% poverty. The demographics of my circuit, spread over roughly 3,700 square miles, are a microcosm of South Carolina and of this country. What I see in the poor as well as the wealthy areas of my circuit are the same issues when it comes to cybercrime.
Briefly, I would like to tell you what I see. My office is probably medium-sized by national standards. I spend about 14% of my overall personnel budget on intelligence, and part of that is our computer and cell phone experts, who not only download from these computers and cell phones but also study them and try to help us uncover these crimes and help law enforcement throughout our circuit make these types of cases. What have I seen? I have seen professionally made music videos produced by and for gang members to recruit our children. I have seen prisoners who operate prostitution rings from inside the prison, having already been convicted. I see not only drug dealers and gang members but other criminals with all of their information inside their cell phones; in the past it would have been a notebook, but today it is their cell phone.
Because of what we see, we need your help in many ways, because law enforcement and the solicitors and district attorneys around this country face a challenge. When it comes to these notebooks, the cell phones, and the information in them, we need help with the encryption issue. When it comes to prisons in particular, I would like to thank Chairman Graham and Senator Kennedy, also on this committee, for sponsoring the bill that would allow our prisons to block cell service. We think of the purpose behind the prison as isolating the people behind it, but if they have access to a cell phone inside the prison, we are not keeping our citizens, community, or children safe from them.
Let me also say that we have to look at this from the perspective of what type of world we are dealing with today as opposed to when I first became a prosecutor in 1989. The biggest change I have seen over the past 10 years is the use of technology by criminals. We see it in every aspect of every case we have. Going back to the encryption issue, which I think is particularly relevant: I started my career as a narcotics prosecutor. If a police officer executing a lawful search warrant went into someone's house and found drugs, that would help with the case. If they found guns used to protect the product, that would also help. Scales, that type of thing. But in order to destroy the enterprise, they needed to find the notebook, the collection of documents that told them who the major buyers and sellers were. That is how you could actually destroy the network. Today, that is not on a notepad like this. It is on their cell phones. We have experts, and yet there are some cell companies that refuse to assist law enforcement in enforcing court orders.
If we are going to do those investigations and destroy these human trafficking enterprises and gangs, we are going to need help with that aspect of it.
Senator Graham: Thank you.
Ms. Campbell: Thank you, members of the committee, for inviting me. For over 30 years I have been a professor at Georgetown Law, and I have represented nonprofit organizations, such as the Center for Digital Democracy, in efforts to improve the media environment for our children. I have also written articles about that subject. Many of the problems families struggle with today, such as how to protect their children's privacy, how to prevent exposure to inappropriate content, and how to limit the amount of time their children spend on digital devices, are the direct result of two things. First, the business models of the dominant tech companies. They design their systems not to protect or nurture children but to attract a large number of users, including children, and to keep them online as long as possible, so they can maximize revenue by collecting valuable data about the users and delivering targeted marketing to them. Second, as I explained in my written testimony, the government has failed to adopt sufficient safeguards for children and has not effectively enforced the ones that exist. In particular, the Federal Trade Commission has failed to enforce the Children's Online Privacy Protection Act. As a result, the big tech companies, including Google, YouTube, Facebook, and Amazon, have felt empowered to ignore the safeguards.
COPPA was adopted in 1998, when there was no YouTube, social media, smart phones, smart speakers, or toys connected to the internet, and it has not kept up with these new developments. The underlying principle of the law is that parents can protect their children if they are given adequate notice and can give verifiable consent before data is collected from their children. Practice has proven this is no longer tenable. Another problem is that the safeguards only come into play in certain situations: either for a website or online service directed at children, or where the operator has knowledge of children using specific services. As a result, many times when children are online, they are not getting any of these protections. Another limitation is that the law does not provide safeguards once a child turns 13, and teenagers, because of their stage of development, are less able to evaluate risk and are susceptible to peer pressure, so they take a lot of risks. Passing the Do Not Track Kids Act of 2019, introduced by two senators, would be a good first step to update it. But even in the absence of new legislation, the FTC can and should act to better protect children's privacy. Since 2001, when the law took effect, the FTC has brought only 29 actions to enforce COPPA. Since 2012, I have been involved in filing 14 requests asking them to investigate violations, yet the FTC has not acted publicly in response to any of them. In addition to bringing law enforcement actions, the FTC should ensure that the self-regulatory COPPA safe harbor programs that Congress intended to help are actually doing their job.
Given the large number of websites and online services used by children, it is not easy for the FTC to pursue all violations, so I suggest they use their authority under Section 5 to hold platforms responsible when they characterize apps or content as appropriate for children when they are not, and conversely, when they say their services are not appropriate for children and yet they know a lot of children use them anyhow. To give you one example: in a recent complaint we filed in December 2018 on behalf of the organizations I mentioned, we asked the FTC to investigate whether the Google Play store was engaging in deceptive practices by labeling apps in its Family section as family-friendly when in fact they did not meet Google's criteria. Google's criteria required compliance with privacy laws and prohibited inappropriate content, and what we found through our investigation was that Google was not enforcing them. Another complaint I will briefly mention involves YouTube and the fact that YouTube clearly knows it is the most popular destination for children online, yet it pretends there are no children there and does not give notice to parents or meet the requirements. I urge members to put pressure on the Federal Trade Commission to enforce the laws on the books, and also on companies to act more responsibly. A number of members have written letters to the FTC, and I also urge Congress to look at what kind of legislation might be helpful in addressing not only these shortfalls but the new issues coming up with facial recognition, artificial intelligence, and other new technologies. Thank you, and I welcome questions.
Mr. McKenna: Chairman Graham, members of the committee, thank you for allowing us the opportunity to speak today. I cannot think of a more noble cause for all of us to be spending our time on than the safety and protection of our children. I began Protect Young Eyes five years ago because the internet is complex and even diligent parents are overwhelmed by the digital choices their children face. Through hundreds of presentations and dozens of articles examining digital trends, we have witnessed the wonderful potential and the troubling, pervasive darkness that exists in the pockets of millions of young people today. I am certain that in the course of today's discussion we will hear difficult stories. I wish they were uncommon. In March 2019, CNN reported that Instagram was the leading social media platform for child grooming by sexual predators. Our test accounts quickly discovered that young people, particularly young girls, can be hunted like prey. We started an Instagram account with two photos and tried to mimic the behavior of an average teen girl: we posted two selfies with a few hashtags, searched a few hashtags, and liked photos. Within a week, dozens of men sent us images of their genitals, telling us they were horny and sending us hardcore pornography through direct messages, even after we told all of them we were only 12. They were relentless. A recent poll of 2,000 teens found that nearly 75% of them had received pornographic direct messages from strangers on Instagram, even if they had a private account. But you will not find any warnings in the App Store description for Instagram that mention anything about sexual predators, direct message risks, sex trafficking, or hardcore pornography. Instagram's defaults are not set for child safety or data privacy, even though Instagram is rated 12+ by Apple and 13+ by Google.
At Protect Young Eyes, we have seen the predatory, symbiotic relationship that exists between Instagram and Snapchat, both used by 75% of teens today. Many Instagram predators quickly shuttle kids over to Snapchat, where evidence disappears. For example, one relentless pedophile on Instagram, who calls himself "daddy," invites young girls to join him on Snapchat with content like "hit me up to be daddy's naughty girl." Snapchat is where explicit content lives. The app allows porn performers to make thousands of dollars daily through their premium accounts. Snapchat is where harmless-looking filters can encourage children to share their location on the app, giving it to 13-year-olds even though they tout age gating. It is where kids have access to the Discover news section, where they are taught how to engage in risky sexual behavior, such as group hookups, anal sex or torture, drugs, and how to hide activity from parents using incognito mode. You will not find anything in Snapchat's app description that warns parents about premium accounts, predator risks, or the Discover news. Snapchat is rated 12+ by Apple and 13+ by Google.
Tech companies will tell us that ratings are already in place. However, they are not uniform, they lack transparency, and there is little accountability for inaccurate ratings. For example, Twitter was rated 4+ for a decade, even though hardcore porn and prostitution were everywhere. It still breaks the no-porn policy while claiming only mild content in its description. Netflix is rated 4+ in Apple but Teen in Google Play. Most social media apps are allowed in Google, even though COPPA requires 13+. Companies tell us protecting kids is a parent's responsibility, but this position ignores the ubiquitous nature of the internet. Even if I do everything right to protect my own 14-year-old daughter, with almost 90% of teenagers owning a smart device, a simple ride on the bus or a visit to a friend's house can expose her to life-altering content. Never have we had something in history that could change a life's trajectory so quickly.
Tech companies tell us safeguards are already in place, but it takes a lot to set up parental controls on an iPhone. We need to simplify this process to help more parents protect their children. For example, better age-based defaults would easily protect millions of kids. The addition of easy controls over school time and bedtime could shut off distracting apps during these critical times. Tech companies will tell us that regulating apps is too big a job and cannot be done; however, video games successfully do it, and the ESRB shows us a path that reasonable individuals could apply to apps. In fact, it was Mark Zuckerberg of Facebook who recently said, "I believe we need a more active role for governments and regulators," and we agree. Our hope is to create safer digital places for young people. Two simple solutions could change everything: create a uniform rating system, and enact better defaults. Let's fix this for the kids. Thank you, and I look forward to answering your questions.
Mr. Clark: Thank you, Mr. Chairman. Good morning, and members of the committee, it is an honor and privilege to be here. I have the honor and privilege of serving as the president and CEO of the National Center for Missing and Exploited Children. I am pleased to be here today to provide our perspective on the growing problem of child sexual exploitation online and the battle children face on the internet.
We were created in 1984 by child advocates as a private organization to help find missing children, reduce child sexual exploitation, and prevent child victimization. Our program to combat online sexual exploitation is the CyberTipline, a tool for the public and electronic service providers to report abusive images to us. Technically, this content is legally defined as child pornography, but we use the term child sexual abuse imagery to make clear what this content truly is: not pornography, but the rape and sexual assault of children captured in images and in video. Since we began the CyberTipline over 21 years ago, reports have exploded. Last year alone, we received over 18 million reports of international and domestic online child sexual abuse. The number of reports is daunting, and so are new trends, including the following. Between 2017 and 2018, reported video files increased 541%. A broader range of online platforms is being used to store and trade child sexual abuse images, including chat, video, and messaging apps, photo sharing platforms, social media, dating sites, gaming platforms, and email systems. If someone can share content online, we know that platform can be used to sexually exploit a child. Reports relating to enticement and sextortion are increasing. We are seeing reports with graphic and violent sexual images of young children, including infants, and reports of on-demand sexual abuse known as live streaming. Virtually all reports come from the open web, not the dark web, but we know that dark web sites are havens for child sexual abuse imagery.
Unique initiatives are needed. Law enforcement cannot arrest their way out of this. There must always be a way to rapidly detect, report, and remove such images, and this cannot be done in a vacuum. Multiple layered partnerships are essential to protect children online. We have built strong voluntary partnerships with technology companies in support of our mission. These initiatives are crucial because, while there is no legal requirement for companies to proactively screen for child sexual abuse, many companies have business reasons to do so. In support of this, we offer voluntary initiatives that proactively take a digital fingerprint of an image, which can then be scanned against for identical or similar images. Many online companies use hashes of images to screen for and remove these images from their services. We facilitate voluntary hash sharing by hosting a platform for technology companies to share over 1.2 million hashes of abusive images, and even hosting a second platform for nonprofits, sharing over 1.7 million hashes with those companies. We are also devoting critical resources to modernize the CyberTipline. We need to ensure that, as more complex technology is used to exploit children, we are using the most advanced technology, including artificial intelligence and machine learning, to protect children from abuse online. We also provide a broad range of additional programs to support children who have been victims of online sexual abuse and to provide data-driven prevention and education programs relating to such abuse. To conclude, the problems of child sexual exploitation are becoming more complex and challenging. We are fortunate to be joined in our mission by this committee, our nonprofit partners, and the U.S. technology companies.
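
The hash-sharing approach Mr. Clark describes, taking a digital fingerprint of an image and scanning new content against a shared list of known hashes, can be illustrated with a minimal sketch. This is a hypothetical example in Python assuming a simple exact-match SHA-256 list; production systems such as PhotoDNA use perceptual hashing so that visually similar, not just identical, images also match, and the file and directory names here (shared_hashes.txt, uploads/) are placeholders, not any provider's actual interface.

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Compute a SHA-256 digest (a simple 'digital fingerprint') of a file's bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def load_known_hashes(hash_list: Path) -> set:
    """Load a shared list of known abusive-image hashes, one hex digest per line."""
    return {line.strip().lower() for line in hash_list.read_text().splitlines() if line.strip()}

def screen_uploads(upload_dir: Path, known_hashes: set) -> list:
    """Return uploaded files whose fingerprint matches the shared list, so they can be removed and reported."""
    return [p for p in sorted(upload_dir.iterdir())
            if p.is_file() and fingerprint(p) in known_hashes]

if __name__ == "__main__":
    known = load_known_hashes(Path("shared_hashes.txt"))  # hypothetical hash-sharing export
    for match in screen_uploads(Path("uploads"), known):  # hypothetical upload directory
        print(f"Match found, queue for removal and reporting: {match}")
```

In a real service the comparison would run at upload time, and matches would be queued for removal and for a report to the CyberTipline, consistent with the voluntary screening described in the testimony.
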
It is essential to form broader partnerships to support our work. As society reflects on the continuing role of the internet in our daily lives, we must make sure that technology tools such as hashing remain viable and effective to combat the spread of online child sexual exploitation. Thank you for this opportunity to appear before you, and I look forward to continuing to work with this committee and to answering any questions you have. Thank you.
Senator Graham: Thank you.
Mr. Balkam: Good morning, Chairman Graham, Senator Blumenthal, and members of the committee. I am Stephen Balkam, founder and CEO of the Family Online Safety Institute, or FOSI, as we call it. One, we need to create a culture of responsibility online. Two, we need to develop resiliency in our children. And three, we must move from protection to empowerment, from blocking to monitoring, and from restrictions to responsibility. FOSI is an international nonprofit organization working to make the online world safer for kids and families. We achieve this by identifying and promoting the best practices, tools, and methods in the field of online safety that are also respectful of free speech. I first testified before this committee 24 years ago as part of a hearing on cyber porn and children. The technology has changed immensely over the past two decades, but the challenges remain. We must keep children safe on the internet while ensuring they can access the amazing opportunities for learning, creativity, and fun available to them. Our experience has shown that by engaging all parts of the technology ecosystem, including governments, industry, law enforcement, parents, and teachers, we are able to empower and instill a sense of resilience in our children, and in this way ensure their online experiences remain safe and productive.
The internet enhances the educational and social lives of children. The use of this media allows them to gain knowledge in a variety of new ways. Children are able to create and share their own content and express their ideas and experiences on a worldwide platform. While the many benefits of the online world are undeniable, the accompanying risks and challenges cannot be discounted. At FOSI, we do not seek to diminish these risks but rather to focus on the constructive ways they can be managed against the benefits and opportunities of being online. That is why we believe that creating a culture of responsibility is the most effective framework to ensure that children are protected in the digital world. Hearings like this one today are an important part of the oversight and support role that government needs to play in order to create a safer internet for all. Additionally, we have been pleased to work with the first lady on her Be Best initiative. As a membership organization, we push industry to go further and consider online safety throughout the development of new products and services. We support effective oversight of industry self-regulation, believing that this allows for maximum innovation and the development of creative solutions. We are grateful for the hard work of law enforcement, including those sitting here with me today, and we have a long history of working together. It is imperative that they are fully resourced and given the tools and training to combat rising cybercrime. Parents must be engaged and knowledgeable about what their children are doing on the internet. We know that caregivers are worried about the amount of time children spend on devices, as well as the content they are accessing.
We seek to address these concerns by helping parents confidently navigate the online world with their children. This involves exploring the internet alongside their kids and employing the many parental control tools provided by companies. Training must be provided to all teachers, since they are often one of the first places children and parents turn for help and information on digital challenges. And most importantly, children must be taught to be good digital citizens: to know the rights and responsibilities that come with being online, to understand the consequences of sharing personal information, and to be empowered to make the right decisions when they see upsetting content or inappropriate behavior. The internet cannot be made 100% safe. We must foster resiliency in our children and give them the tools, moral and technical, to deal with what they will inevitably encounter on and offline. The steps that we take to protect a seven-year-old will not be the same for a 17-year-old, although they are both minors. That is why we must move from protection to empowerment, from blocking to monitoring, from restrictions to responsibility. Let me conclude by urging you not to underestimate the benefits and opportunities that enrich the lives of so many American children. Their access to information, the freedom to express themselves, and the tools that will equip them to compete and thrive in the connected world must be protected. It is only through working together that we can ensure we acknowledge the risks, mitigate the real harms, and allow all children to reap the rewards of this digital world. Thank you for your time. I look forward to your questions.
Senator Graham: Senator Blumenthal, you can make your opening statement.
Senator Blumenthal: Thank you, Senator Graham, for holding this hearing. It is supremely important that we face this question of protecting children online. I would like to thank our witnesses, especially NCMEC, for being at the forefront of this cause from the start. I can remember well when Connecticut and other states established their child predator, sex offender registries, and then I argued in the United States Supreme Court to defend the constitutionality of those laws. They were upheld, and your support was invaluable, so thank you. If anything, the task of protecting children has become all the more difficult since then. It was hard then, and it is nearly impossible now without the partnership of the data companies. Parents have a primary responsibility, but they need a more proactive and preventive role from the big tech companies. All too often, these companies have been inadequate and abysmally slow in responding. Last month, researchers found that YouTube's recommendation system was promoting videos that sexualized minors. Parents could post innocent videos of their kids, and YouTube's algorithms were shepherding the wrong people to them. I am disappointed, frankly, by YouTube's unwillingness to take the basic steps that Senator Blackburn and I asked them to take to fix their platform. We have yet to receive an adequate response from them, and I will be asking questions about it going forward. A YouTube celebrity, Austin Jones, was sentenced to 10 years in prison for coercing and deceiving at least six underage girls into sending him sexually explicit videos. YouTube's initial response was to demonetize his videos, saying the channel did not violate the terms of service. Only after significant negative press did they finally do the right thing.
That epitomizes the reactive nature of the tech companies in addressing this problem, or failing to address it. Let me say in conclusion: as disappointed as I am in the failure of the companies to meet their own terms of service and to act more promptly, the failure of enforcement is in many ways even more disturbing, even shocking. What you just heard from Professor Campbell reflects, I think, my experience and others' with the FTC's lapses in enforcement. If there is no deterrence, no potential for swift and sure punishment, these companies have a lot less incentive to enforce their own terms of service, not to mention the morals and standards that the law provides. We need to set a national standard for tech companies and how they protect children. We need to know that if a user is reported for abuse, social media sites will act immediately. We need to know that authorities and organizations like NCMEC will be provided all the evidence they need when children are threatened, and the resources that are required. And we need to know that private and sensitive information about our kids will be kept safe and secure. I want to thank all of you for your very impressive work, and thank you for this opportunity to ask questions.
Senator Graham: Mr. Stone, in what percentage of the cases where you are trying to exploit a phone or social media device, and you have a warrant, do companies fail to help you?
Mr. Stone: I do not know that I could give you the exact percentage.
Senator Graham: Is it a big problem?
Mr. Stone: It is a big problem. There are some phones that our investigators and analysts recognize they cannot get into. Whether they have a search warrant or not, they recognize they will not be able to get past the encryption codes, so they do not make the effort.
Senator Graham: If you had a group of people going to a schoolhouse and exposing themselves at a bus stop, it would be front page news all over the world, and we would put every resource possible into finding those people, right?
Mr. Stone: Yes, sir.
Senator Graham: This happens every minute of every day.
Mr. Stone: It does.
Senator Graham: What can we do to have that same intensity to track down the guy who is going to the school and doing inappropriate things to our children on the internet? What is missing?
Mr. Stone: First of all, I think what you are seeing throughout this country is prosecutions at the state and federal level that are targeting these areas. Part of it has to be the intelligence and investigation aspect of it and trying to stay focused.
Senator Graham: Mr. McKenna, you set up a site to mimic a teenage girl, right?
Mr. McKenna: Correct.
Senator Graham: What happened to the people who sent pictures of themselves?
Mr. McKenna: Absolutely nothing. Their messages showed up in the direct message box of the Instagram account, and that is where they sit. We can report those accounts and then ask Instagram, through the reporting process, to do their own investigation and potentially take those accounts down, but in our experience, because you can set up a new Instagram account within seconds, all that does is create a blip in their behavior.
Senator Graham: How do we get to the bottom of that?
Mr. McKenna: We would say that is a feature issue within Instagram. We have identified four primary features within Instagram that create ripe opportunity for exploitation: through direct messaging, even with private accounts, and through hashtags, because Instagram really is a mini search engine where there are no filters.
Senator Graham: So for the social media companies, it is within their power?
Mr. McKenna: It is within their power. Their features create these opportunities.
Senator Graham: Tell me about the 18.4 million number.
Mr. Clark: Those are cyber tip reports, Mr. Chairman, that come to us mostly from the electronic service providers, or large tech companies. About 98% come from the tech companies that report to us. The rest are from citizens who report such content to us. Interestingly, when I took over as CEO of the center, we were at about 4 million or so cyber tip reports annually. That was three and a half years ago. It rose to 18.4 million last year. That is due in large part to greater reporting by the electronic service providers.
Senator Graham: What happens to these people, for lack of a better word, that are reported?
Mr. Clark: We send those reports on. It is a global problem.
Senator Graham: Do they get prosecuted?
Mr. Clark: They do. We work with the Internet Crimes Against Children task forces in the United States, and we send those reports around the globe. About 93% have an international connection, which I think exemplifies the global nature of this problem. When those reports are sent to those particular entities, because we are not a law enforcement organization, we trust in their goodwill to investigate and prosecute those individuals.
Senator Graham: Is it creating a sense of deterrence?
Mr. Clark: I believe it is, in the sense that we have better reporting, and I think it is always good that you are getting more of that content identified. As I said in my initial testimony, it is a problem that we cannot really arrest our way out of. The volume alone is staggering.
Senator Graham: Ms. Campbell, what would be your message to the committee about the FTC and how they should up their game?
Ms. Campbell: [no audio] The FTC really needs to enforce the provisions of COPPA.
Senator Graham: On a scale of one to 10, how good a job are they doing?
Ms. Campbell: I would say about one half.
Senator Graham: There is a lot of potential to get better.
Ms. Campbell: There is.
Senator Graham: Do you all recommend that we call in the tech companies and have them sit where you are sitting?
Ms. Campbell: Yes, I do. I think they bear a lot of that responsibility.
Senator Graham: We will do that. Senator Blumenthal?
Senator Blumenthal: Thank you. I agree completely that we ought to be hearing from the tech companies, because I think on that scale of one to 10, one half is probably generous, given the experience that you and others have outlined today. Would others agree that the tech companies can and should do more to protect children? Mr. Stone?
Mr. Stone: I do, and I would like to make a comment about what you previously said concerning YouTube. The dark web gets the majority of the attention when it comes to criminal issues. On the other hand, I believe the more mainstream internet is actually just as dangerous.
The videos I described earlier, the professionally made music videos that show gang members with guns and drugs, show them literally belittling law enforcement. Those recruiting tools, trying to get our teenagers and young people to join their gangs, are not being posted on the dark web. They are posted on YouTube. We watch them in our office. From that perspective, I think the more mainstream internet is something we really need to be more concerned with right now.
Senator Blumenthal: Would you agree with me, and let me ask the other members: we could pass all the laws we want, and I am a supporter of the Markey and Hawley provisions, and I will be a cosponsor as soon as I am able, if we have another Republican join us, but the best laws in the world are dead letters if they are not enforced. So would the members of the panel agree the FTC should be doing a better job, as Professor Campbell has stated? Is there anyone who disagrees? I would suggest we should have the FTC testify as well, on an occasion separate from the tech companies. Professor Campbell, you indicated there have been 29 actions to enforce COPPA, essentially all settled inadequately with minimal penalties, and the FTC has so far failed to respond to 14 requests for action. You outlined two. Would you be able to provide the committee with the other 12?
Ms. Campbell: Of course.
Senator Blumenthal: In order to deter this kind of misconduct on the internet, let me ask Mr. Stone and Mr. Clark: are the tech companies providing you with all the forensic data and support that you need when you are investigating sexual exploitation?
Mr. Stone: Until we get to the issue of encryption, yes, sir, I think we are getting a good bit of that information. I prosecuted a case in May in which one 15-year-old shot and killed another 15-year-old. The threat the shooter made to the victim was over Snapchat. The problem with Snapchat is that it is instantaneous; it goes away. We were not able to get that material. The shooter also, when he was looking for his .40 caliber gun to gun down the other teenager, posted that on Facebook, and we were able to pull all of that Facebook material through a search warrant.
Senator Blumenthal: You mentioned the Department of Defense-sponsored Memex project to crawl the dark web as an example of the technical tools the government could provide for local and state law enforcement.
Mr. Stone: Thank you for saying that. The Manhattan district attorney used the program and testified to Congress concerning it. That is correct. It crawls the dark web; it is an advanced search engine, but most prosecutors don't have the resources to have that advanced search engine. In today's time, especially when we deal with human trafficking, I think we are seeing only a small percentage of the actual crime, and I think something like that advanced search engine is essential for us to determine how big a problem we are dealing with.
Senator Blumenthal: That is an example of how the federal government could be more of an assistance to you.
Mr. Stone: Yes, sir.
Senator Blumenthal: How about forensic evidence from the big tech companies, is it forthcoming?
Mr. Clark: There is great room for improvement. We have seen, thankfully, some improvement with that. For example, Microsoft's PhotoDNA is helping us do a better job of tracking images online through the hashing process.
We also have been receiving, through the CyberTipline, a lot more notifications from the electronic service providers and tech companies that are reporting the images. Last year, I believe it was around 67,000 images that NCMEC also found and discovered; we asked the big companies to take them down, and they did. So we are doing better at that level of collaboration, but there is always room for improvement.
Senator Blumenthal: And they have been mostly reactive, not proactive, correct?
Mr. Clark: In essence, yes. Reactive is probably not the role we want to see, although there is such a volume of imagery out there around the globe, some of it not even coming from the United States but from other places around the world, that it is very difficult to get all of those tech companies in line to help on that problem. Right now, we have about 1,500 of them working with us.
Senator Blumenthal: Thank you.
Senator Graham: Senator Lee?
Senator Lee: Thank you for holding this hearing, and thank you to each of you for being here. This is an important topic. It reminds me of a time I was riding in the car with my then-teenage son a few years ago and a song came on the radio. I listened to the words, concluded they were bad, told him it was terrible, and turned down the volume. And John said, without missing a beat, "It is not bad if you don't think about it." This is one of those things. We would prefer not to have to think about it, but it is not bad only if you don't think about it. This is a crisis affecting our children, and it is not an abstract or a rare one. A 2018 research study concluded that 45% of America's teenagers today are online almost constantly, while another 44% are online several times a day. That is 89% of teenagers overall. This is far from an abstract issue. Many of those people come into contact with things that harm them. Sometimes the results are severe. According to a Thorn study conducted in 2018, 55% of human trafficking survivors who were trafficked in 2015 first encountered their trafficker online or via text message. Those are the extreme examples. Many others who are not trafficked end up being harmed by things they are exposed to online. The Apple App Store and the Google Play Store account for about 95% of app purchases in the United States. To their credit, both Apple and Google use an age rating system and have some child-focused rules to prevent children from being exposed to things they should not be. Yet some of the most popular apps sold through those stores, including YouTube, Snapchat, and Instagram, can and often do, in fact constantly do, provide sexually explicit content to children. It is not just when a child is looking for it; it happens when they enter childlike search terms that have nothing to do with sexual content. Mr. McKenna, I would like to talk to you for a minute. What should app stores be doing differently with their rating measures to help prevent this kind of thing from happening, to prevent providing this kind of sexually explicit content to children?
Mr. McKenna: Thank you, Senator Lee. First and foremost, we simply want transparency and additional information for parents. In no other context where we know young people are spending a lot of time do we allow this much inaccuracy in the information we give parents to protect their kids.
I'm envisioning an analog example: a toy in a toy aisle that has some of the features of the things we see on apps used by teens, and then telling parents, well, it sits in with all the other things your kids love, but we want you to avoid looking at that one place. We would never allow that to happen. So when it comes to the app stores, we want there to be unification. You referred to Apple and Google, but they use different processes, as we explained in my written testimony, to evaluate and score those apps, so there is confusion when you look at one or the other. We would also argue that what you get is a generic set of descriptions that really do not tell you what is actually going on inside the app, so we want transparency about what the true interactions are. If you were to look at the App Store description in Apple for Instagram, TikTok, and Snapchat, three of the top social media platforms for kids, you would find the same description. In other words, there is nothing unique to tell us about predatory risks or anything else specific. Contrast that with the ESRB: if you look at some of the top games the ESRB has rated, there are volumes of information telling you what goes on, what happens to certain body parts, blood, gore, and you walk away from that description as a parent with a clear view of what I am going to be putting in front of my children. I still have the option, but I have been informed. We want that same unification. Make it transparent and provide accountability, so that when players in the app store are not getting information to parents as they should, there are some ramifications; otherwise, behavior will not change.
Senator Lee: Well said. Expecting that a child behave as a good internet citizen falls on deaf ears, especially when someone is encouraging a child to use an app that at times acts as a strip club. Mr. Chairman, I ask unanimous consent that we submit into the record the testimony of John Hopkins, the director of child exploitation.
Senator Graham: Senator Ernst?
Senator Ernst: Thank you, and thank you to the panel today. This is a hard topic to talk about openly, but I think it is vitally important. The internet has changed the way we do business so much, and it is so easy to access information. In some ways that is wonderful; in other ways, I think about where we are today. I remember as a kid, and probably many of you do too, at school we talked about stranger danger and not approaching a man if he has candy or puppies. Now that hypothetical van has gone away, and we have those who want to exploit our children using the internet as their van or puppies or kitties. So we are talking about different apps, devices, and how we can protect our children. Are there ways that we can focus on the actual devices that our children are using? Is there a way to block certain content and so forth? Should we be going further with that? Mr. McKenna, could you talk about that?
Mr. McKenna: Sure. I believe there are simple things that could be done almost instantly. Whenever we set up a device for our young people, whether an iPhone or Android device, that device almost always knows the age of the primary user, because when you set up an Apple ID, you put in a birthday. When you set up a Google account, you put in a birthday. So we would simply submit as an idea: in setting up the device, what are the simple defaults attached to an age? That removes the barriers.
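
The age-based defaults Mr. McKenna suggests here, where the birthday entered at account setup drives protective settings, can be sketched in a few lines. This is a hypothetical illustration in Python; the setting names and age thresholds are assumptions made for the example, not any vendor's actual configuration interface.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DeviceDefaults:
    explicit_content_filter: bool     # filter explicit content in search and media
    location_sharing: bool            # allow apps to share the device's location
    stranger_direct_messages: bool    # allow direct messages from unknown accounts
    installs_require_approval: bool   # require a parent's approval for new apps

def age_in_years(birthday: date, today: Optional[date] = None) -> int:
    """Whole-year age computed from the birthday entered when the account was created."""
    today = today or date.today()
    before_birthday = (today.month, today.day) < (birthday.month, birthday.day)
    return today.year - birthday.year - int(before_birthday)

def defaults_for_age(age: int) -> DeviceDefaults:
    """Protective settings are on by default for minors; an adult account starts open."""
    if age < 13:
        return DeviceDefaults(True, False, False, True)
    if age < 18:
        return DeviceDefaults(True, False, False, False)  # teens keep content and contact protections
    return DeviceDefaults(False, True, True, False)

# Example: a birthday that makes the primary user 12 yields locked-down defaults at setup.
print(defaults_for_age(age_in_years(date(2007, 5, 1), today=date(2019, 7, 9))))
```

The design choice is the one argued for in the testimony: protection is the default posture, and loosening the settings, rather than finding them, is what requires a deliberate step by a parent or an adult user.
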
I speak to so many parents. I had a conversation on the airplane ride over last night with a professor, who was relating the difficulty, even with her high level of education, of hunting for certain settings in a phone. So many of these things could be done by default, not only from a protection standpoint but from a privacy standpoint. What if the posture of our digital places were one of protection and privacy instead of exploitation? Instead, what I have now is that I have to search for privacy. I have to search for parental controls in order to protect children. We would simply argue that the default posture should be the other way around: if I want to choose to allow my children to have access to things, that should be what I hunt for in the settings.
Senator Ernst: I think that is just smart, since we are able to set up our devices, to understand as a parent that there are ways to do that. But it does need to be easier. For heaven's sakes, I have a hard time figuring out my own device. This is very interesting to me. Could the FTC do this by rule, or whatever the appropriate agency is?
Ms. Campbell: The FTC doesn't have rulemaking authority, except that I think they could probably do something; they just haven't really used that authority much.
Senator Ernst: I'm just thinking about how you can make this a reality. Absolutely, we can focus on the apps, but I think they could work in conjunction with our devices as well. And if we really want these tech companies to step up, I think that needs to be part of the equation as well. I know that my time is expiring, so I will leave it there. I know we've got some other questions, but I do think this is an area we could delve a little deeper into. There need to be creative solutions coming from all angles, not only for apps and the different programs that exist out there on the internet, but also ways that we can use our devices as a screening mechanism to protect our children, too. Thank you very much to our panelists. Thank you, Mr. Chair.
Senator Kennedy: Mr. McKenna, if you were buying a phone today for your 14-year-old, and you wanted to prevent her from being exploited sexually, or exposed to things that she shouldn't see, what would you do?
Mr. McKenna: My daughter does currently have a phone. She has an Android device that is used in certain very public places. We've decided as a family that location matters; where kids use phones matters.
Senator Kennedy: I'm not asking about your daughter, particularly, but in general. What I'm getting at is, what can a parent do today to try to prevent their child from seeing this stuff?
Mr. McKenna: Well, frankly, if they allow their children to use social media, there are very few parental controls that would prevent them from seeing these things. So the only way to prevent them from seeing the things we experience on social media is to not allow them to use social media, which creates this sort of binary decision that parents are confronted with: either we allow it or we don't, and if we allow it, we accept a certain amount of risk.
Senator Kennedy: Social media can be filtered, can it not?
Mr. McKenna: Social media, in its current iteration, no. I cannot filter content on Instagram if I were to search for hashtags. The same is true for Snapchat.
Senator Kennedy: Let me put it another way. I'm interested in solutions. I think we all understand the problem. Could Apple, for example, design a program that a parent could opt into, where the instruction to Apple would be: design a program to filter all information that my daughter or son may see that could be sexually exploitive?
Maybe filter all pictures or written references to human genitalia. Can that be done?
Mr. McKenna: To be honest, Senator, I'm not sure.
Senator Kennedy: Well, isn't that the short way home here? Sorry, go ahead.
Mr. Balkam: Senator, all of the major operating systems and all of the cell phone operators have parental controls. What's very interesting is that at the first White House summit that the first lady held only last year, she called upon the tech companies to make...
Senator Kennedy: But first, can it be done?
Mr. Balkam: Yes.
Senator Kennedy: So could we write legislation or promulgate a rule that says, here is the thing that a reasonable parent would do to protect his or her child from seeing this stuff, and we do that in conjunction with somebody who has the obvious expertise, and you filter everything? I don't know how to do it. I can't write software. Maybe it is to prohibit any pictures of human genitalia or prohibit any references to sexual activity. I don't know. The kids aren't going to like it, but that's not who we're trying to please here. What can be done? I think Senator Ernst is right. You can fight with the social media companies and their lobbyists and their campaign contributions. I mean, we've been talking for what, two years, about a privacy bill? Haven't seen one. OK? Don't know if we'll ever see one. That's not a criticism of anybody; it's just the way this place is. We need a microwave, not a crockpot, here. And I just don't see why Apple can't just write a program. Are they going to miss some things? Sure. But if they're in good faith, write me a program, put it into my phone that's going to filter all of this stuff, do it in good faith, and we look over their shoulders. Can we do that?
Mr. Balkam: Just to give you an example, Verizon has something called Smart Family that will do what you're asking, and they have created it. AT&T has their own and T-Mobile has their own, without a law having required it.
Senator Kennedy: I don't know, I'm not a lawyer. Can it be done?
Mr. Balkam: It is being done.
Senator Kennedy: Can we require that, on every operating device? If we did require that to be put in every operating device, would that solve the problem, substantially?
Mr. Balkam: I don't know how you would craft a law, and there are already parental controls in the devices. Our biggest challenge is to educate parents that these exist; we can't legislate that.
Senator Kennedy: Let me ask one other quick question of the professor. Professor, you think there are some people at the FTC who are doing a lousy job on this, don't you?
Ms. Campbell: Yes.
Senator Kennedy: Do you know who they are?
Ms. Campbell: The Bureau of Consumer Protection.
Senator Kennedy: Would you support legislation to make it easier to fire them?
Ms. Campbell: I don't know that that's the solution, Senator.
Senator Kennedy: You don't think somebody ought to be fired when they're doing a lousy job?
Ms. Campbell: Well, perhaps, but I think we need to look at the reasons why they're doing a lousy job.
Senator Kennedy: How about because they're lazy, or not interested, or they're not very good employees?
Ms. Campbell: Perhaps.
Senator Blackburn: Thank you to each of you for being here. We've been doing privacy and data security work; I have a privacy bill you can sign on to. It's called the BROWSER Act. I thank the chairman for this hearing, and I thank Senator Blumenthal for working with me on this issue. And to each of you, I thank you for your attention to this. I have to tell you, as I watched and read what was happening with Jeffrey Epstein this weekend, that arrest, and how the authorities found hundreds of nude photos of underage girls in his home, I think his case serves as a warning that predators really do lurk. And Mr. Clark, this is partly to your point and Mr. McKenna's point.
They lurk in every corner of society, whether it's the rich and powerful or the poor and depraved. And the protections that have existed in the physical world many times are not being transferred to the virtual space. You have each spoken to that. And as you said, these predators no longer lurk in AOL chat rooms. They are right there on all of these apps that our precious children are using, whether it is Snapchat or Instagram or Facebook or YouTube or GroupMe or any of those. Unfortunately, we do not have regulation, and many of us have focused on this issue of the safety of children online for years. I'm glad the Justice Department and General Barr are working on this. They've got the Safe Neighborhoods program, and I think that Deputy AG Stacy Harris, who is doing work with the National Strategy for Child Exploitation, is really focused on these tech platforms; having a collaborative effort is going to be important. I had reviewed a case from a couple of years ago, the Soledad case from California. When ICE agents apprehended him, he was caught using Snapchat to try to coerce a 13-year-old boy into sending him sexually explicit video. The parents called the police, and when ICE agents got involved, they found five more victims, ages 12 to 15. The children lived in Tennessee, Texas, Georgia, Illinois, and California, and each victim was coerced using a social media application. That is how this sick, vile human being was going after these children. And I will tell you, as a mom, as a grandmom, it is really so disheartening to me to see children used as a profit center on these social media platforms. A few years ago, the EFF filed a complaint with the FTC requesting an investigation into Google's surveillance of schoolchildren on Chromebooks. And to the professor's point about the lack of follow-up and due diligence being done by the FTC, this is one of those areas where they go in, they scoop the data, they track, they follow, and they've got that virtual you, if you will, of that child, because they have all of that identifying information, that child's virtual presence. I have talked through much of my time. Mr. McKenna, I will just say this: you mentioned Mr. Zuckerberg. He has fought us on privacy for years. Finally, he is changing his tune. You've talked about some of these dangers. What can apps like Instagram and Snapchat do, specifically, to embed into these applications? They can manipulate these algorithms if they so choose. They are choosing not to do it. So if you were building that algorithm, what would you embed?
Mr. McKenna: We want children to learn how to use technology and to use these apps. We would just argue that parents need a choice as to what features their children do and don't have access to inside these apps. So if I were creating a picture app, I would allow children to truly have a private account with a curated set of contacts, which wouldn't allow people who are not a part of that curated list...
Senator Blackburn: Like the Apple Watch is doing now.
Mr. McKenna: That's correct. I would eliminate the ability for strangers to contact...
Senator Blackburn: Let me move on. You mentioned the video game ratings, and I was in the House when we handled that process. I think Mortal Kombat was the big game.
Mr. McKenna: It was.
Senator Blackburn: What I would like for you to do is to submit, in writing, for the record, how you would approach this ratings system, because I agree with you.
The ratings are totally wrong on these apps, and therefore parents do not have the information to make an informed decision, nor do they realize what it is when you're in a Snap and you get into that Discover section or that newsfeed. And I've said this: yesterday, when I was looking through some of these things, I said this is almost what we went through with the video games and Mortal Kombat. One thing leads you to another and another, and it is deeper and further down the chain of what is acceptable or even expected by parents. So I thank each of you for your diligence and your work in protecting children, and I look forward to continuing the conversation. I yield back, Mr. Chairman.
Senator Hawley: Thanks to each of you for being here. I think we can agree that exploiting children online is one of the worst dangers and one of the worst social threats that we face. But let's talk about whether or not being callous toward children's safety is actually part of the business model of some of these companies. My concern is that it is, and I think you don't have to look any further than the reports about YouTube, the New York Times report about the algorithm collecting and funneling videos of children to pedophiles and, of course, as we discussed this morning, allowing those pedophiles to contact the children. This report was sickening, but I think what was even more sickening was YouTube's refusal to do anything about it. YouTube admitted they could do something about it. They could stop auto-recommending these videos of minors to pedophiles, but they chose not to do so. Why not? Because their model is that 70% of their business, 70% of their traffic, comes from these auto-recommended videos. In other words, there is ad revenue that would be lost if they actually took some enforcement action here, took steps to stop this exploitation of children. My first question is, does anybody think that one of the world's largest, most powerful companies should be privileging their ad revenue over children's safety? That's why I introduced a bill that would require YouTube to do what it can do, what it absolutely has the power to do, which is to stop recommending videos that feature children unless those videos fall within very carefully tailored safety exceptions. It is common sense legislation. I hope this committee will hold a hearing on it and act to actually protect our children online. Let me just ask you, Mr. McKenna, do you agree that for some of these companies, aspects of their business model actually conflict with protecting the safety of children?
Mr. McKenna: Thank you for the question. I absolutely agree that a business model based on reach and engagement is one that absolutely conflicts with protection.
Senator Hawley: What else can we do about that? And tell us more about how the design of some of these companies and the models of their platforms pull against the goal of safety for children online.
Mr. McKenna: Well, consider the simple premise of Snapchat, which is one based on images that disappear: they have a model that wants young people to use an app that, neurologically, they are not ready to use. Their entire platform is built on something that we would argue teens are just not ready to engage in. Snapchat could be doing so much more in terms of the type of friend referrals that happen and in terms of what happens in Discover news. They know that Discover, which is where third parties put articles, is where engagement happens. They want more people looking at that content, and that just feeds that cycle of revenue.
Is it your sense that Snapchat and other similar platforms are engaging in practices, as it relates to their child users, that are exploitative?

Absolutely. In no other place where we have significant numbers of children spending a significant amount of time do we allow so few controls to govern those environments. I mean, 16 million teens are using Snapchat. Even if the app were to say it is for adults only, I believe they have a duty-of-care responsibility to protect the population of teens that they know are using their app.

Professor Campbell, you are nodding your head. Do you agree with that, and do you want to add something on this point?

Yes. I would add, to give you an example of how a company could do a better job of structuring its products: YouTube actually has a product intended for children called YouTube Kids, and it's got some good policies. The problem, again, is that they are not really enforcing those policies. There's a lot of content even on YouTube Kids that is inappropriate, and we've complained to the FTC about this. One of the reasons is that they designed the product so that all programming on YouTube Kids first has to be on YouTube. So all of these kids' channels are on YouTube, and so kids are watching them on YouTube instead of YouTube Kids and getting exposed to all the advertising, and, of course, Google is benefiting from those ad revenues. If they had designed YouTube Kids differently, so that it was a protected space for children, that would have been a good thing. But that's not what they chose to do.

Let me ask you briefly. You mentioned the need for the FTC to get serious about enforcement, and updates to COPPA, the first in 20 years, that would provide sweeping new privacy protections for children. It would also apply to any company that has reason to know that a child is a user, even if they don't have actual knowledge; constructive knowledge, which is hugely significant. You mentioned earlier that the FTC may be able to use some rulemaking authority under COPPA to require new default privacy settings for devices or on platforms. Should we be thinking about an update including language to this effect? What would you counsel us to do to make this update as tough as possible?

Well, one problem is what we call dark patterns, which is when a company designs its product to steer users toward doing something that is not necessarily in the user's interest but does benefit the company. That is what Mr. McKenna was talking about with making it difficult to find privacy settings, not setting defaults to protect privacy. Those are part of the dark-pattern problem. They can make it very easy for you to do what they want you to do, and very difficult, even if you make a lot of effort, to change those privacy settings or protect yourself.

Great. Thank you very much. Thank you for your important work on this topic. Senator Tillis.

Thank you all for being here. Could you talk a little bit about the initiative with the first lady? What does a successful end state look like?

She chose cyberbullying and online safety as her main task to work on as first lady. She's also included opioid abuse and the general well-being of children. We've been very impressed with her approach.

What does that look like in terms of the impact it would have on the technology providers?

Well, I think it's raised the subject and the topic to the front. Look, at the highest levels, there is concern and a challenge to technology companies to step up and to deal with this issue.
Mr. McKenna, when you were exchanging questions and answers with Senator Ernst, you brought up the concept of trying to work out a way to understand who is likely to be using a phone. I'm a technology person, and I'm trying to get my head around the numbers, because there are some 300 million cell phones in use in the United States, and that number is only going to grow. Many of those are in the hands of adults who may leave them lying around, and they're not password protected. I'm trying to figure out practically how we could do that. There's also, I think in the last census, an estimated 89 percent of all households that have a personal computer. We haven't even talked about that as one of the other portals for all this horrible activity that we've also got to sweep into this. There are so many ways, and with the internet of things, who knows what's next in terms of being able to drive content to you and possibly have a child respond to it? Has there been any sort of collaboration between groups like yours and the tech industry to sit down and figure out the short- and long-term strategy for doing this? I follow the issue of texting while driving a lot. I got criticized when I was in the North Carolina state house for apparently denying somebody their constitutional right to endanger my drive while they're on their phone going down the road. I understand what you all are here for. I think if we really take a look at how we are going to make the new internet of things a safer environment for youths and young adults and adults, we've got to really, instead of looking at any one area as the solution to the problem, look at this holistically, or we're only going to scratch the surface of what I consider to be a public health and public safety crisis. So if we go back to the idea, I immediately think, when you go and set up an Apple ID, you could lie and say you're 18 or 16 or 14 or 25. So who's responsible for determining the honesty of the user of that platform? I mean, let's say you're a 14-year-old who lied because your parents gave you access to the credit card or whatever you needed to open up an Apple ID account. You lied and said you were 20. Who's responsible?

The parents in that case, no question.

I'm glad you answered the question that way. We also have to recognize that we can't only weave the tech providers into this discussion, because they are constantly changing, and if all of a sudden we put big tech on notice, then little tech is going to come in and provide a different solution. You're giving your child a potentially lethal device, and at some point in time we have to have a discussion about the parents' responsibility for making that a safe decision, or taking the device away from them. Here's a concept: at 9:00, they don't get the phone. We have to have the discussion with the parents about understanding their routers, understanding that you could prohibit access to all of these sites if you choose to, if your kids are connected to a hotspot at home. And then we have to come up with a balanced solution that absolutely mandates that technology providers get a handle on the content and the potential lethality of that content. But we have to get the parents, and those who provide the technology to the young people, into the equation.

Absolutely agree. And in doing so, make it as minimally complex as possible.

Right. And then you can actually place more responsibility on the people who ultimately have to take care of their children.
If you opt out of this, if you allow them to go into a gaming environment that potentially leads to a dangerous outcome, then the parents ultimately have to be held accountable for that.

Senator, if I may, it's why we created the Good Digital Parenting initiative. We distilled all of our experience and expertise into something called the Seven Steps to Good Digital Parenting, and key to this is talking with your kids, sitting down and setting parental controls, and using an online family safety contract with your children. Critically, what's missing in all of this also is parents being good digital role models themselves: putting that phone down at dinner, not texting while driving, giving your kids eye contact when they come to you for it. Parents are an absolutely critical part of this, and they're the least well served in terms of information and support.

There was a news report about a month ago, it may have been Google or YouTube. They have operations set up where they have people, it's a horrible job, I can't imagine doing it, where they are reviewing all this content. Do we have any idea how much of the malign content is being interdicted by the current processes versus what's getting through? I think that's another thing we need to do: to see to what extent we could scale what they're already doing to cast a wider net. Those are the sorts of things we have to talk about if we're going to come up with good, sustainable policy that lets us benefit from all the great things the internet has given us, but also addresses the public safety issues that you're all here for today. That sort of information, if you have any, can give me an idea. I know we're doing poorly, but what efforts are out there that, if we simply scaled them, could reduce a lot of the problems?

Senator, I may not be able to give you exact numbers on what we're missing, but I will tell you this. Our attorney general's office in South Carolina prosecutes about 250 of these offenders every year. But what's more disturbing, through the internet crimes against children task force, what's more disturbing, I think, to me, looking at those gross numbers, is that while those indictments are being brought, their investigators are working through about 85 new cases every year, and in those 85 new cases they are seeing over 1,086,000 files of child pornography that they're having to go through. So the numbers that we're talking about are massive.

Senator Blumenthal.

Thank you. A couple of brief questions. We haven't really talked about data mining and the threat to children's security from the extraordinary, exponentially exploding means of gathering information about children. COPPA was never designed for toys connected to the internet, toys with cameras, toys that often have their own security vulnerabilities. I'm part of a working group that is examining proposals to assure protections for privacy, and one of our models is the California Consumer Privacy Act, which requires businesses to get opt-in consent to sell the information of children under the age of 16. I believe that the rest of the country ought to follow California's example and impose the strictest protection possible. Does everyone on this panel agree that the standards for consent and strict privacy protections under federal rules should be expanded to cover children and minors under 16?

Yes, sir.

Mr. McKenna?

Mr. Clark?

I'm glad that everyone agrees.
Professor Campbell, you suggested that all child-directed content should be placed on a separate platform where targeted advertising, commercial data collection, links to other sites, comments, and autoplay are prohibited. And I agree that that kind of separate platform would help protect children. Would legislation be required for it, if it were imposed rather than done voluntarily?

I think it's possible that is something the Federal Trade Commission could negotiate with YouTube, if they are investigating them and trying to settle the case, as part of the settlement. Otherwise, I don't think it could require that under existing law.

Would that approach have applications beyond YouTube?

That's the problem with consent decrees: they are only binding on the parties to the consent decrees. So it would probably require legislation to allow the FTC to impose that kind of separate platform on a variety of other sites. Yes.

And let me ask finally, and let me ask you, Professor Campbell, initially: if each of you were to design heightened safeguards for children and minors to protect against this kind of data mining, what additional steps would you take?

We are seeing this happening a lot, for example, with the Amazon speaker for kids. It is designed specifically for kids, but they are saving the voice recordings of children, and the transcriptions of those voice recordings, in many cases indefinitely, and there's no way to even delete all of them, even when the parent tries, except to take the child out of the service, and a lot of times you want your child to be able to use these services. I think we really need to focus on data minimization and security, because the data that's available to companies can easily become available, through data brokers and other sources, to predators and lots of other people.

And data minimization is another requirement of the California law, and of the Children's Online Privacy Protection Act.

Exactly.

Any other suggestions from other members of the panel? Thanks again. Senator Cruz.

Thank you for testifying on this very important topic. Mr. McKenna, an article last month in the New York Times detailed how YouTube's recommendation algorithms can lead users from adult pornography to videos of girls as young as five or six years old wearing bathing suits. Do you know if YouTube has changed their algorithms regarding child exploitation, and if not, what can and should YouTube do to stop this?

Senator Cruz, thank you. On that very important topic, I am not aware that YouTube has changed their algorithm. I'm guessing that if they had done that, it would have been newsworthy and parents everywhere would have been celebrating, and I am not aware of that news. I believe that YouTube, with their vast know-how, has the ability, in the same way that they can recommend videos, to choose not to recommend videos, and that would simply be a change that they would have to choose to make.

Wired reported that sexual predators use the comment section of YouTube videos to help other predators find videos. For example, they would announce time stamps in videos that would be of particular interest to sexual predators. YouTube thankfully disabled commenting on many of the videos of children under 13. Has this prevented sexual predators from using YouTube to spread this information, or is it simply moving the problem elsewhere?

I believe it's a step in the right direction. I'm happy they've made this change. I believe it should mitigate some of the activity.
But based on our experience, these individuals are extremely creative, and the comments sections, irrespective of platform, whether it's YouTube or Instagram or other platforms that allow comments, are all breeding grounds for this sort of networking of pedophiles. So if it's not going to be on YouTube, it will simply move to another platform, and we've been witness to that.

As you know, a little over a year ago we passed, and the president signed into law, the Stop Enabling Sex Traffickers Act, which amended Section 230 of the Communications Decency Act to strip immunity from online platforms that knowingly assist, facilitate, or support sex trafficking. How successful has SESTA been, and what else can Congress do?

Good question. I think SESTA largely has helped, with the takedown of backpage.com, for example, and certain ads that were attached to the kinds of networks Mr. McKenna mentioned, networks that were used to entice kids into sex for sale. I would also say that it seems like when one disappears, another one appears. That's a global issue, as we've been saying, across the United States and around the world. We are still seeing an increase in the number of chat sites, video sites, and opportunities for exploitation to continue in the digital world.

Last year, a Texas woman sued Facebook for providing human traffickers with a way to stalk, exploit, recruit, groom, and extort children into the sex trade. In your judgment, are Facebook and the other social media sites doing enough to combat exploitation of their platforms by sex traffickers?

Once again, I would say there is room for improvement here. On the plus side, what we've seen is cooperation from the tech companies, participating in things like cyber tip roundtables that bring companies together so that we can have a collaborative approach. If there's ever been a cause for what I call a true public-private partnership, this is it. Of all the tech companies that handle reporting to NCMEC, about 1,500 of them, many of them still can do more.

What I was trying to formulate in my head to say was that it doesn't feel as though anything happens until there's a huge news report or there is some pressure. There is always this reactive response to these sorts of issues from these large companies. They are brilliant organizations. I am confident that they have the know-how and the ability to, within weeks, take care of many of the problems that we have spoken about here at this hearing. But it's not until they are pushed, or there is pressure, or there is reputational damage or something, that they seem to move, and even then, in the case of YouTube, they are still unwilling to do some very simple things that we think would protect children. That's my greatest concern: the continual resistance and the reactive approach to problems that, to us, feel so obvious.

I would certainly be interested, and I know the committee would be, in any of the witnesses' recommendations for additional steps that need to be taken, because this has to be continuously and vigorously addressed.

I just wanted to give an example using YouTube. Why are so many under-13-year-olds on YouTube when the terms of service prohibit them from using YouTube? I think, again, if children were using a different site, you wouldn't have as much of a problem with predators. It is true that it's a global problem, because you will see the same kids in an outdoor pool in their bathing suits promoting toys. They are being paid by product companies to promote these toys, and they are on all of the different platforms as well. It is a problem that needs to be looked at across platforms.
I want to thank Senator Cruz for mentioning SESTA and also Section 230, which in some ways is the elephant in the room. The immunity, which is so broad and uniquely enjoyed by many of the tech companies, is part of the reason that they are failing to do more. You mentioned the potential duty of care, a standard of care, which is enforceable only if there is some accountability, and right now the biggest obstacle to accountability is Section 230. I appreciate Senator Cruz mentioning SESTA, and we thank you again for your help on it. I think it's an example of the bipartisan action that I hope will emerge from this very excellent hearing, and I thank you, Mr. Chairman.

Thank you. Things would change tomorrow if you could get sued. The problem with getting sued over providing content is that if you are responsible for everything that flows through your business in terms of content, you probably drive at least the new people out of business and maybe put the old people in jeopardy of folding. What I would like to do is make sure that you are earning this liability protection, that under Section 230 you are a neutral platform, and that for you to have that liability protection when it comes to exploiting children, you are earning it, you come up with these business practices. It seems to me we need to come up with a set of best business practices to protect children. If you have a default, we think that's the best thing for the country. What I want to do is look at the best business practices when it comes to protecting children from exploitation on the internet, and if you meet those best practices, you are okay. If you don't, you're going to get sued. It seems to me that will do more good than anything else. The solutions come from the people in the private sector, as long as they are legitimate, and people like you will tell us if they are legitimate. We need to find out what does work, and we will certify at the government level whether or not it's genuine. And if you pass that sort of government audit, then you're good to go. If you don't, then you are exposed to lawsuits from moms and dads and everybody else. So that's where I'd like to take this discussion, because 230 is the elephant in the room. What I would like to do is leverage that elephant in the room, that you can have these liability protections, because we want to save the industry and preserve it for competition, but you have to earn it. And what have I learned from this hearing? You've got a lot of liability protections, but you are not doing a whole lot to earn them. Thank you very much for your testimony.