
Transcripts For CSPAN Digital Photo Theft 20141109



this is one hour 10 minutes. >> welcome, everybody. i am the executive director of the internet caucus advisory committee. hopefully we will get you out of here in 60 minutes or so. this topic is on your program, which you have in front of you. you have the information on the speakers and their twitter accounts. you can contact them on twitter or any other way you would like. we are hosting this event as the congressional internet caucus advisory committee in conjunction with the internet caucus. on the house side, bob goodlatte. we are in their debt for hosting and supporting this program. they don't agree on every issue. frankly, not on a lot of issues. but we are thrilled that they agreed that the internet should be a place where we can debate these issues with expert speakers like we have today. i want to thank them. our moderator today is a reporter with politico, a cyber security reporter. she has covered this over the past couple of years, and she is well situated to moderate our panel today. her twitter account information is on the program as well. take it away. >> thank you, and thank you to the net caucus for having me here today. it is a very interesting topic, which we will be diving into headfirst. just to introduce our panel: from here on down, we have marianne franks, an associate professor of law at the university of miami. to her left is the director of the free expression project, then a columnist at yahoo! tech, and a contributor. all these fine people have a lot of expertise in this topic from a lot of different angles, which is where i wanted to start off today. one of the most interesting things about this hack of the celeb photos is that it raised different issues for a lot of different people. as a cyber security reporter, i covered it in terms of the password security on the cloud and what the technical aspects of that hack might have been, but it raises a number of conversations, from misogyny on the internet to what actually is the nature of the crime that occurred -- whether you look at it from the perspective of a sex crime, a hacking crime, or a first amendment issue -- and we will touch on all those different takeaways today. how i would like to begin is if each of the panelists could go down the line and say what for them were the one or two big takeaways from this hack and what is the most important thing to look at about this. >> i think as good a place to start as any is with jennifer lawrence calling what happened to her a sex crime. a lot of people were taken aback by that characterization, because as a matter of law that is not true. what it highlights is an invitation to rethink what we think a crime is, what we think a sex crime in particular is, and to think about ways we can recognize it as being such. i think it is interesting to hear from such a high-profile victim of this particular behavior that her own sense of what happened -- a violation of sexual autonomy, humiliation, and exposure -- was such that she would call this a sex crime. the perspective that i would take on this is to consider why we criminalize certain types of behavior. when do we start drawing the line between bad behavior and behavior that we think needs a response from criminal law?
i would invite us to think about that in terms of why we think the criminal law is important -- not as a narrow focus, but as a social expression, a condemnation of certain types of harms that are so serious that one of the only ways we can express it as a community is to say this should be against the law -- and to think about the particular nature of what happened to lawrence and other victims in terms of the daily suffering and humiliation they have to experience, that they can never get back, that there is no way to undo what has been done, that the harm in these cases really is irreversible and ongoing. what i hope we can do to frame the conversation, by looking at it from the perspective of the victim, is think about why we might care about the fact that such sexual humiliation has become an entertainment industry, what our responsibility as a society is, and, if people are concerned about having a free, equal, and open internet, what we should be doing in response to that. >> thank you, and i thought it was interesting how professor franks was talking about social expressions of condemnation of this behavior, because that to me was one of the major differences i saw in the response around this most recent exposure of celebrity photos, compared to how the nonconsensual disclosure of these images has been treated over the years. five years ago, or several years ago, when many of us seated at this table first started talking about this issue, it was difficult to get people engaged on the question. there was not a public conversation about how this is being used as a way to go after women, to harass them, to silence them. to see that shift in the public conversation -- there is much more willingness for major media outlets and for people engaging on social media to be talking about the other side of the story, to say, no, people should not be going and following these links; the information might be out there on the internet, but we do not have to see it, and to treat what has happened to the people whose photos have been exposed as a real harm that has happened to them. i think it is a good thing that we are having much more of that conversation happen in public, in society, to appreciate the real harm that is happening to women when they are targeted in this way. of course, the concern that i see, coming from a first amendment and open internet background, is wanting to see, if there are proposals on how to take a stronger response to this, that whatever those proposals are, they are not so broadly crafted that they end up pulling in a lot of protected expression as well. it is very difficult to craft a law that makes a crime of disclosing information in a way that only reaches a bad or malicious disclosure of information and does not also sweep in a lot of real and vitally important speech. i hope one of the things we can focus on today is looking at the existing laws that identify the kinds of harms that happen here -- whether it is a person trying to inflict distress on another person, a person launching a campaign of harassment against someone, whether the federal computer fraud and abuse act can cover the hacking aspect of this. there are ways we have addressed the harms that can come from this kind of behavior in existing law that do not entail focusing specifically on the speech aspect of it.
>> my first reaction was there was this "it is a bunch of celebrities in trouble" response, which is unhelpful and stupid, because i'm sure plenty of people in this room have pictures they would not want on the entire internet, with some of those people on facebook, and that is a better way to look at it, because calling it celebgate was one of the stupider -gate words to put around. i spent a lot of time looking at exactly how apple's security is set up. if you want to keep your information safe, are there tools available, and do they help? in apple's case, they had a weak implementation of two-step verification. even if you had turned it on, it did not protect icloud backups. and the way icloud works -- i have an ipad at home, and i am not clear on what is getting backed up and how to control it. it is an opaque system. you have this case where these people did not think they were putting their pictures on the internet, and it is not always clear in a lot of cloud services where your data went. there was a story in "the washington post" earlier this week about a cryptographer from johns hopkins -- he thought this was only on his computer. this is a guy who is paid to know this stuff. legally speaking, we have laws against unauthorized access. it does cover this sort of thing. we also need to recognize that not everyone is going to go through the effort of two-step verification, but it should be there, it should work, and you should know what it is protecting and what it is not. >> i guess i have less to say about the specifics of the lawrence incident. one of the other things that is within the purview of this panel is a related, broader question about what, i guess, professor franks called the epidemic of sexual sites on the internet, photo sites, those kinds of things, which is a serious social issue. my thoughts turn, with emma's, to the first amendment, first of all. as she said, even if we think this is harmful, crafting a prohibition that would survive first amendment scrutiny with respect to much of this material would be quite difficult -- probably not impossible, but difficult -- and would require some care to make sure it does not sweep in a good deal of protected material. i got involved in this because i had a student who was working on a project on copyright, on possible copyright remedies around these revenge porn sites -- can you take down photographs based on copyright claims -- and i spent half an hour, 45 minutes poking around at those sites about a year or so ago. there's a good deal of material on there that is clearly protected speech. there's some material that may not be. drawing that line would be challenging, to my thinking. that is one thought i had. in the discussion about these issues -- and there has been a good deal of discussion, in the legal academy at least, about what to do about this, what kinds of remedies we can provide -- the conversation and debate has often moved quickly to the question of website operator liability for hosting these photographs. there are existing tort remedies -- we can get into this more in the rest of the session -- that may provide relief to people who have been harmed, against the individual uploaders of the private photographs that are being posted. but section 230 of the communications act has been construed to protect the website operator from being drawn into that tort liability, because it immunizes the operator against a broad range of tort liability, including this.
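[editor's note: the two-step verification discussed above generally works by having the service and the user's device share a secret and independently derive short-lived codes from it. a minimal sketch of the standard time-based one-time-password algorithm (rfc 6238) follows. this illustrates the general scheme, not apple's own implementation, whose details are not public; the secret shown is a made-up example.]

    import base64
    import hashlib
    import hmac
    import struct
    import time

    def totp(secret_b32, interval=30, digits=6):
        # Decode the shared secret (base32 is the usual exchange format).
        key = base64.b32decode(secret_b32, casefold=True)
        # Count whole 30-second intervals since the Unix epoch.
        counter = int(time.time()) // interval
        digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
        # Dynamic truncation (RFC 4226): the last nibble picks a 4-byte window.
        offset = digest[-1] & 0x0F
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Server and device compute the same 6-digit code from the shared secret,
    # so possession of the device can be checked without re-sending the password.
    print(totp("JBSWY3DPEHPK3PXP"))  # hypothetical secret, not a real account

[the panel's point stands regardless of the algorithm: a second factor only helps if the service applies it to every path into the account, including backups.]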
so much of this discussion has come around to people arguing about whether 230 should be repealed completely or modified to allow actions against website operators. it is a very important internet law problem, because many issues have this feature where the intermediaries -- the website operators, typically -- are helping to spread this harmful information, and yet federal law immunizes them against liability. i hope we can get into those 230 issues. >> great. as we can all see, there is quite a bit at play here. and it is difficult in this case because we are sort of going from a very specific instance: jennifer lawrence had private photos in what she believed to be a private place, that place was gotten into, and her photos got on the internet. that is a lot different from a revenge porn situation, where the photo was initially given with consent and then shared after the relationship went south. that is very different from someone hacking into a private computer, where you may not have even meant to store something on the web. it is different if someone gets the password because you used your dog's name as a password versus a sophisticated phishing or malware attack. so there are a lot of different pieces that raise these issues. generally speaking, for people who feel their private images and private data in the digital world have been exposed to the internet, what can they do now under the law to get relief, even though they may never be able to get the material back? >> i think it is important to focus on the fact that jennifer lawrence's situation, and the other celebrities', is different from these other types of contexts, but it is also important not to make too much of the differences among these instances. if we look at it from a more traditional privacy perspective, it should not be a difficult intuition that when people disclose intimate information to one party, they do not expect it will be given to another party. whether that is disclosing it to your partner or to the cloud, the obvious way to look at it is: don't we have some sense of contextual integrity, some sense of privacy? when you tell your doctor about your symptoms, you expect your doctor will not tell anybody else about your symptoms or share the pictures of your medical exam. you have plenty of situations where we can think about it not related to the charged issue of women's naked bodies -- think about all the ways we expect our information to be kept confidential within a relationship even though we have given it voluntarily. when we consider it that way, it is useful to think about what we do in other contexts. do we protect people's credit card information, social security information, home addresses? do we protect companies' trade secrets? these are all ways in which we might be disclosing information to trusted parties, but we have criminal penalties when people step outside of those particular contexts, and it is useful to think about why we should or should not apply those remedies here. it is true we can come up with ways for victims to pursue copyright remedies, or to pursue things like infliction of emotional distress, but most people can see that it is difficult to talk after the fact about any kind of remedy. copyright may work for jennifer lawrence. it will not work for the vast majority of victims, who have no recourse. the takedown process takes some time, and you need to have a scary lawyer backing you up. many private citizens do not have that clout.
it is not an effective solution for the vast majority of victims. we also need to think about the fact that these are not all going to be situations of relationships gone sour; in actual, ongoing violent domestic relationships, this material is used to trap people in those relationships, used as extortion, or used to keep them from exiting those relationships. there is a very big category of devastating intimate material that is getting out there. the idea that any kind of lawsuit or copyright remedy will be responsive to that particular harm is, i think, naïve or abstract, given what actual victim experiences have been. it is also important, as we are trying to think about the adequacy of legal remedies, to think about first amendment values. in that thinking, consider how much of an effect these types of harms are having on women's speech. how many women are now afraid of ever being intimate with anyone, or of having their webcam hacked, or of having a hidden camera somewhere recording them having sex? how many women are afraid to commit themselves to careers because they are afraid this is what will happen to them, that this is the punishment? the law's best response is that we should clean up the mess afterwards, and many victims will not have enough money and time. that is really something we will have to take seriously: how much this is affecting women, how the threat of this type of behavior is used as a way to shut women up and drive them offline -- as a free-speech matter, as a section 230 matter, in the interest of fostering an open internet. >> if i may disrupt the order a little bit so we mix this up: david, could you go in depth a little more into some of the ways the law as it stands has tried to grapple with these issues, and the ways people have looked at adapting laws that were crafted long before the internet was what it is today? >> yes. there is the hacking side of the problem, and the computer fraud and abuse act is one avenue -- this may well have been, in this specific instance, a violation of the computer fraud and abuse act. accessing a protected computer without authorization gives rise to both civil and criminal liability under the federal criminal code, and it could be applicable -- obviously, i'm not giving anybody legal advice or taking a position on whether or not it is, but that is certainly one avenue. on the tort side, as several people have mentioned, there are a number of state law remedies for some of this behavior, such as intentional infliction of emotional distress, recognized in one form or another for outrageous conduct, outrageous activity. in response to what marianne said, i completely agree that to think of copyright as a solution to this problem is naïve and not very sensible, but let me just say that the copyright act is one place in the federal code where aggrieved parties can quickly arrange, without a lawsuit, to have materials taken down from the internet. there is a notice and takedown procedure in the copyright act, which is a very powerful thing. this probably covers a small subset of the problem, but i think it is not a trivial subset, where at least there is a remedy that is useful in terms of removing material that, for one reason or another, people believe they have a claim on. the notice and takedown procedure is one most websites operate almost automatically. you send them a message and follow the procedures the statute more or less gives you, and if the websites want to keep their immunity from copyright claims, they must take the material down.
generally speaking, millions of times a day, this operates to actually remove the material from the site. one very quick comment i want to make about the notion, again: does the law have to wait until something bad happens before providing a remedy? i think in this context, the answer may well be yes most of the time, because a lot of what we are talking about falls into the category of protected speech, and there is a very serious problem with prior restraint -- with a doctrine that says you cannot put it up in the first place. that would avoid much of the harm, but it raises even more serious first amendment problems than ex post regulation of this, which raises its own problems. i think that has to be thought about more carefully. >> if you could talk a little bit -- we mentioned the computer fraud and abuse act and hacking. i'm reminded of several years ago, when sarah palin's e-mail account was also hacked, in a similar instance. >> with hers, the password was not guessed; it was the password recovery questions. hopefully, we will all be in a position to make that mistake at some point. >> the point being that we put a lot of information on the internet and do not always think about the levels of security. what are the levels of security for information on the internet, and how does the law protect those right now? >> the cfaa -- a lot of people's complaint is that it is so overbroad. if you read the text of it, it basically says that if a computer is on the internet and you use it in a way that was not specifically authorized by the people who own or control it, you can be charged under this, which means it can criminalize a lot of the basic security research that is needed to solve the problems we are talking about right now. if a webpage is coughing up data because you entered the right input, that could be a cfaa crime, even though you have to do that to prove to the owner of the page that they have a problem. i would say in this case the problem is not that the existing laws do not protect us, but that they also put in a bunch of other stuff that criminalizes activity that the people in white hats need to do to stop the people wearing the black hats. >> certainly, as well, because in some cases it can be hard for prosecutors to figure out which cases to bring. >> how many tax dollars did we waste on prosecuting aaron swartz, who was -- you could say he was being not very nice with m.i.t.'s i.t. systems. this was a fellow who put a laptop in a closet to download academic research to make it broadly available -- taxpayer-funded research, i believe. it was a cfaa prosecution, i believe, and he committed suicide. >> now that we have some sense of the lay of the land, let's move into what can be done to change the laws and address the issues. you mentioned a little bit that when laws are being crafted, it's very important to understand what you are sweeping in unintentionally. at some point, i'm sure you could weigh in as well. there has been a lot of effort, not necessarily at the federal level but at the state level, to try to figure out ways to write laws that criminalize some of this behavior, with different levels of success. if you could give a rundown on what has been tried and where the pitfalls have come up. >> sure.
a professor at the university of maryland school of law has been working very hard to figure out if there is a way to craft model legislation that would allow going after only the identified criminal activities that you want to target with this sort of law and not reach a whole host of other speech. there are key categories you have to think about. what kind of content is covered? the content we are talking about is generally content that is protected under the first amendment when it is created. a person taking a photo of themselves or of a partner of theirs -- the nude image of a person is constitutionally protected speech, and there is certainly no crime involved in the creation of the image at the outset. so you are trying to define a set of covered content: is it sexually explicit imagery, imagery that reveals different types of nudity, or sexual activity without nudity? there's a lot of debate over what exactly the nature of this content is, and it is difficult to define, because there is a fair range of photos we could all think of of ourselves that, if exposed to others, we would see as a harassing sort of effort. so you are trying to define the category of content that would be covered so it's not so broad as to include things like a photo of a woman breast-feeding, or some other kind of nudity that you might very well be able to capture taking photos in public places, and really trying to focus it in on images that are in this sphere of intimate exchange that professor franks is talking about. there is also the question of who is potentially liable under these bills, which is a big one. it seems clear that you want to be looking at the person who uploads the photo without the consent of the person depicted in the photo, but there's also a question of how these laws are drafted: do they reach the website where the photo is uploaded? do they include any person who looks at the photo, who may or may not know that the photo was uploaded without consent? just to give a couple of examples, there is a virginia state statute that was passed -- i think it went into effect this summer -- and the first prosecution under that law is under way. it is a relatively narrow law that includes a requirement that there is an intent to coerce, harass, or intimidate a person by displaying their image, and it tries to define what the content of this sort of image would be. it is an attempt to draft a fairly narrow law, and i do not know if it has been challenged yet under the first amendment by any group. on the other hand, the state of arizona also passed a law -- again, a nude photo law, trying to restrict the ability of people to share nude photos of other people without their consent -- but it would basically make the display or publication or sale of nude images without the consent of the person depicted a felony. that was sort of it; that was the law. there were no exceptions for newsworthiness, no real acknowledgment that if somebody poses for a photo for an art exhibit and has clearly given consent to the person taking the photograph for it to be included in the exhibit, then if someone else hosts that exhibit online, they have not gotten consent directly from the model depicted -- it is implied.
it comes with the process of being a model in an art exhibit. but under the letter of the law as it currently stands in arizona, a website that is just hosting stills from the gallery show could be in violation of the law. it is done with the best intentions of wanting to get the consent of people depicted in photos before those photos are shared, but not really done with a view to just how much sharing of images happens in a way that does not violate an initial consent but also does not involve direct, explicit consent. this is getting a little bit into the weeds of the law here, but these are the kinds of things we have to think through if we are asking whether it is possible to craft something that really is very narrowly tailored and anticipates all of these unintended consequences. >> if you could pick up on that and talk about what some of the efforts have been to change law at a state level, and whether any of that could be translated to the federal level. >> this is a difficult task, of course, because legislative drafting is always difficult. i'm sure everyone in this room knows that. you could start out with the best of intentions and end up with something that is not that great. that is certainly true. the organization for which i serve as the vice president, the cyber civil rights initiative, has published a guide for legislators, trying to make clear what elements we think really are constitutionally sound and protective for victims, and what the pitfalls are that we think legislators should avoid. many of the points made here we have been making for quite some time. there needs to be a narrow definition of what is considered sexually explicit material. we need to be clear about who it is that is responsible for this type of criminal conduct. we need to have certain exceptions, including exceptions for the public interest, which is a pretty broad exception but can include things like law enforcement or newsworthiness. but there are a couple of things on which we might diverge. as much as i agree that arizona's law has problems -- and it has made the news recently because the aclu is now challenging it -- we can look at what some of those problems are. there is no public interest exception in the arizona law. that was a mistake, and one they are probably going to fix. but as for the rest of it, it is not at all clear that there would be as many problems as the aclu and many others are trying to make it out to be. the exceptions include exceptions for images that were disclosed in public or commercial settings, so really, anything we are talking about as a model issue, a photography exhibit -- none of that will be a problem. moreover, the idea that you have to get consent from every single person every time is also not true, because the law applies when you knew or should have known that the image in question was posted without consent, which is a pretty good standard to consider when we start thinking about revenge porn sites. there, you have a pretty good sense that these are nonconsensual images, and that's exactly the type of behavior we are discussing. as to the question of who should be responsible: as many of you probably know, because of section 230, which allows for a lot of immunity for online intermediaries, as far as state criminal law goes, 230 will always trump, so none of these state criminal laws actually pose any threat to 230 immunity. state law cannot preempt it.
that is obviously not going to be true if a federal criminal law gets passed. as many of you know, section 230 is not absolute. it does not apply to copyright. it does not apply to electronic communications privacy law. it also does not apply to violations of federal criminal law, which is why google, facebook, and twitter all have to care about child pornography laws -- section 230 does not write them a blank check for that. i think we can all agree that is probably a good thing. what i want to emphasize here is that while it is true that we have to care about unintended consequences, about sweeping in too much speech, we always have to worry about that. that is true with every single law. there is no such thing as a law that does not sweep in something that we are probably not going to like sweeping in. the question has always been, not just in the first amendment context but in criminal law generally: on balance, are we accomplishing more good with this law than bad? to have a kind of response that says any time you suggest to someone that they might not be able to disclose whatever they want to disclose, that means disaster for us as a democracy or for the internet -- that has not proven to be true in many contexts. one example we have discussed already is notice and takedown, which has been going on for some time. many people were convinced when it was passed that it would wreck the internet. it looks like the internet is doing ok, even in light of the fact that it is a very powerful tool to get people to stop saying certain things and expressing themselves. the same thing is true of child pornography laws. the same thing is true of gambling laws, and, frankly, the same thing, i'll say again, is true of trade secrets, identity theft, voyeurism -- all kinds of situations in which we have for quite some time accepted the fact that disclosures of lawful information can be criminalized. if we think about the identity theft context, none of us wants to be criminalized for having a social security number or a credit card number, and we're not, but if someone takes that information and uses it in an unauthorized way, we do say that is criminal, and the same thing happens with trade secrets. this is not novel. maybe the only thing that is novel is that we are now dealing with a type of conduct primarily directed at women and trying to treat it the same as we would treat other types of sensitive information, and perhaps we are resistant as a society to giving women those same rights, but that maybe should not be the way we approach this. we really need to think about what we count as privacy, about the social value of saying you cannot actually disclose certain information -- unless we want to live in a world where there are no identity theft protections, no medical record protections, no confidentiality protections at all. in other words, we already live in a world in which we restrict speech all the time. the question really is when it is worth it, on balance, to restrict that speech or not. some people will say that is not what the first amendment does, but effectively it is what it does. there are plenty of situations not only where the supreme court has said on balance we have to consider these types of harms and consequences, but also many situations where people do not even bring up first amendment questions. how many people really think that spam is a first amendment issue?
how many of you think that -- other than david, thinking that spam is a first amendment issue is kind of a rare thing. it's a question, generally speaking, about criminal law, about copyright, about our law generally: do we think that what is going to happen, the people we are able to protect and the types of values we are able to support, is more important than the few bad things that might happen otherwise? that said, i do not want to trivialize or underestimate the fact that we do need to think as much as we can about unintended consequences. but let's remind ourselves that no law can ever accommodate every single unintended consequence. there will always be some measure in which we will be depriving people of some measure of their liberty, because that's the way laws work, and unfortunately, most of the time, when we have to pass new laws, it's because our society has come up with horrific ways to hurt other people. we cannot just say we're going to let that happen because we are all full up on laws and we do not want any new ones. again, it's a question of trying to figure out how we traditionally treat privacy, confidentiality, and intimacy, and why we are holding off on doing that here. >> just a very quick response. obviously, this is a contentious difference of opinion that will not be resolved in a 50-minute program, but just to focus on what was said about looking for ways to craft a law that does more benefit than harm -- i would think, and this is a view supported by lots of authority, that that is precisely what the first amendment does not ask. it does not say to weigh the benefits against the harm. it has a higher threshold in cases involving the suppression of speech. merely showing that the good outweighs the bad is not enough; the first amendment puts a thumb on the scales of that determination, and i think it does make it more difficult. it is not simply enough to say that this is preventing harm when the harm is speech-related. we require more. we require more precision in the drafting of those statutes, to do everything possible to ensure that protected speech is not swept in. if it is an economic crime, we do not have to be that precise. marianne is of course right -- no law is perfectly precise and gets 100% of the bad guys and 0% of everybody else -- but in the first amendment context, we require efforts to at least move in that direction, and that, i think, would be difficult in this context. not impossible, but very difficult. >> section 230 came up again. i don't know if you want to pick up that conversation, as well as how it applies here. >> just to say that section 230 is one of congress' great legislative achievements -- i am prepared to say that, and i think it is, in large measure. you know, we are all in congress-bashing mode all the time, other than those who are sitting in this room, i suppose, but section 230 was of critical importance in helping the internet become the internet. in 1996, you could not have imagined this -- facebook, tumblr, twitter, you name it. the explosion of user-generated content was unthinkable. there is an active debate, of course, about whether broad immunity for intermediaries is a good thing or a bad thing.
i guess one thing to consider as part of that debate is that tweaking that law a little bit -- an exception for this, an exception for that, an additional exception for something else -- probably makes it go away in rapid order. the immunity would disappear. there are a lot of claimants who would like to see an exception to section 230 -- lots of people: people who have been defamed, people whose privacy has been invaded, people who have been scammed, people who have been defrauded, all sorts of things -- who would like to see an exception for their harm, as it were, carved out of 230. they have a good argument. why can we not just make sure there is a remedy in this case? i think once congress goes down that road and starts carving out exceptions, the floodgates will open, and 230 will largely disappear. i think that would be a dreadful thing for the internet. >> our well-connected friends in the entertainment industry have suggested all kinds of tweaks to the dmca that would pose all kinds of liability issues for websites. yes, they have tried, and it has not worked. i'm a little more interested in how we can use laws already on the books, laws prosecutors can already go to court with, to make life as painful and expensive as possible for the people who went after these icloud users and other like-minded creeps. >> i just wanted to build out a little bit on the section 230 point, to give an example of why those of us who are such staunch defenders of that law see the role it really plays. imagine we had a law aimed at a website operator who knows or should have known that a photo was shared without consent. imagine i run my own photo hosting website. i created what i hope will be the next instagram. i have got something way better than filters for photos. i don't know what it is, but i'm running my own site. a law on the books says i can be taken to court if somebody claims that i knew or should have known that a photo uploaded to my site was an image of another person shared without consent. currently, i can immediately get out of any lawsuit that somebody tries to bring. thousands of photos are uploaded to my site a day. it's doing very well, and somebody says, "a photo of me is on your site, and you should have known that i did not consent to it" -- under 230, i do not even get dragged into the court case. it is very clear i cannot be held liable for this, and i can go back to my business of running a photo hosting website. if the law changed and there was this question of whether i should have known that this photo was shared without consent, then we are in a case where i, as the operator of the website, have to go to court. i have two or three employees for my business, and now i have got to hire a lawyer. i'm operating on thin margins, and now i have legal fees, because i have got to go and defend myself even though there's no way i could have known.
even if we are talking about good-faith operators who really had no knowledge, and could not reasonably be said to have had reason to know that these kinds of photos were on their site, they will still have to go to court and defend that, and that is one of the real burdens this kind of liability framework would put on operators. not even thinking about the giant internet platforms that deal with millions of pieces of content a day and what knowledge standard they would have for tens of millions of photos hosted on their websites -- even just thinking about small companies, two or three-person operations, it will be vastly more complicated. >> the flipside is that copyright lawsuits are not something people without lawyers on retainer normally pursue. i guess it is a larger issue that we have made the law something that people who can afford to hire lawyers can be good at. >> to respond to that specifically: we do want to balance the harms. it may be true that the supreme court has gotten into the mode of saying we do not do balancing tests, but concerns about overly broad laws must be real concerns. i am basically saying exactly that: there are harms out there that can be addressed by this law, and you cannot simply say there could be all these things that might happen. there could. that's true, but they have to be real harms, and they have to be weighed against the legitimacy of any statute. if we are looking at the case of a poor site owner, it's true, there's no reason to say that will not create issues. of course it will, but we also know that there are actual current harms -- thousands of people who are actually being affected by this, whose lives are literally being ruined. that's a real harm as well. to say that we are not sure what will happen to these particular site operators -- it is a concern, but it cannot be the only concern. as far as section 230 and people waving their hands saying that they want carveouts as well -- it already has carveouts. it has already been made clear that there are areas where we are going to say it does not apply. that has always been a matter of interpretation, always a question of who we are going to say gets protection and who does not, of which interests are valuable enough. we have to think about why that is the case. and as long as we are talking about section 230, one final note: the goals of section 230, written in the statute itself, include ensuring vigorous enforcement of federal criminal laws to deter and punish trafficking in obscenity, stalking, and harassment by means of computer. that is in section 230 itself, and that's what a lot of people seem to forget. it's not just about letting intermediaries do whatever they want. there are certain values and goals embedded in the statute, and we would do well to ask if they are being served today. >> i would love to keep asking questions, and i will, but i want to offer a chance to the audience, if anyone has any pressing thoughts they would like to address to the panel. right here in the front. >> what is the flaw in the current law? there is defamation, intimidation -- whenever we have a high-profile case, there's a desire from people who think they can solve the problem, or from people who were injured, to want a new law specifically for that issue. >> why do we need a new law in this case? do you want to start? >> our cyber civil rights work did not start with any high-profile case. we started two years ago, when average people were being affected by this. this has been happening to private citizens for years.
this is not some high-profile case we are responding to out of a sense that now that it has happened to someone famous, we care. we care about this because the experience of victims has been that none of these laws work. the image is out there, and it was not necessarily put there by someone trying to harass -- many of them are not. just to give you a concrete example, just a couple of weeks ago there was the case of highway patrol officers who had arrested women for drunk driving and taken their phones, taking naked pictures off those phones and sharing them with each other. the women did not even find out about it. and ask victims how many times they get turned away by lawyers -- there's no money there, no reason to pursue this. for all these reasons, we are responding to an issue where thousands of victims have come forward and said that they cannot get any relief from the law. i am not going to second-guess the victims, because they are the ones experiencing this firsthand. >> and if law enforcement is telling women, "it's your own fault this happened to you," they are wrong. i know you agree with that, but that is not only morally wrong and a faulty understanding of what it means to take your own photo or share photos in an intimate setting; it also probably indicates that they do not understand the laws that do exist. >> they might not be too effective in enforcing any law. >> right. by no means -- i hope no one gets the impression -- i do not know anyone who thinks criminal laws are the silver bullet for this problem. there is no silver bullet. we're asking tech companies to rethink their internal policies, asking people to engage in educational programs to inform people about the practice, and we are engaging with law enforcement and others because we want them to understand the stakes. by no means is it a silver bullet, but much like in the 1960's and 1970's, when domestic violence was not considered a crime, when sexual assault was largely not considered a crime, especially if it was your husband who assaulted you, there is a social and legal importance to recognizing that this is a harm that should be addressed by the law, at least in theory. >> i was going to point out that we have a range of laws on the books that might be useful in different cases. whether it is a privacy tort -- invasion of privacy, public disclosure of private facts -- going after it from a hacking angle, a copyright remedy, or intentional infliction of emotional distress, there's a mosaic of laws out there that are the way society has expressed that it is wrong to intentionally cause emotional distress to another person, and there are laws against that sort of thing. that is not going to mean that every single instance of this kind of exposure of a private photo is covered. there are going to be gaps, where a case does not fit every single element of a current law. but if we try to craft a new crime that is defined expansively enough to cover every single instance of an exposed photo, we are absolutely going to sweep in other kinds of content, other kinds of expression, and that law is not going to survive first amendment scrutiny. that's the challenge we are facing. there's no silver bullet. >> many have said that facebook has gotten better at dealing with this. twitter has not done it yet.
it's kind of insane for a social network that has been around since 2007. i hope they are taking it more seriously, because they can do a lot. twitter is not the public internet. they have their own rules. they are allowed to change them to make it easier for people who are being harassed, or people who see others being harassed, to call out the offenders, and they have not done enough. >> going back to our case example for the day, a lot of these images were circulating for a long time on websites like 4chan and sort of blew up when they hit reddit, which had a thread that really made some of these images go viral before it was shut down. it raises a question about sites that are built for users to use the way they want to: it is difficult to say that a website based on the idea of people having open forums to share and discuss what they would like also needs to be responsible for making judgment calls about when something crosses the line. so what are some of the difficulties with that? perhaps you might want to jump in here. >> someone said a while back that the biggest problem with the internet is the people on it. >> to that point, though, i'm glad you brought up twitter as an example, to dovetail with the point about intentional emotional distress. that's why i think we need to rethink the emphasis on emotional distress. when people are engaging in these activities -- all these different types, and not just in some of the cases but in what is actually a pretty large number of these cases -- why are people doing this? because they think it is funny. because they think it is entertainment. not to cause emotional distress. so why do we hold that that is the one thing we would penalize? why is this person any better if he is doing it for profit, if he is getting ad revenue? why would we say it's totally fine to do it for that, just do not hurt her feelings? that seems like an odd emphasis to make, given the public nature of this humiliation. >> don't you need to have, as part of a prohibition -- maybe i misunderstand, but don't you have to have some reference to an improper purpose? you're not saying, i don't think, like the arizona statute, which is clearly unconstitutional, that you cannot post a picture of someone without their clothes on -- >> that's not what the statute says. >> and it is not what you are saying. the difference is you, too, have to focus on the improper purpose. >> if by purpose you mean causing distress, no, it does not have to be focused on that. the motive for why someone does something, the motive for why someone spies on you in your bedroom -- why would that matter? whether they do it because they think you are funny-looking or because they think you are arousing should not matter. we look at different categories, and we are back to the theme of jennifer lawrence calling this a sex crime. think about the way sex crimes tend to be worded. we think of sexual assault in terms of consent. there are certain forms -- and again, this is true of our identity information and of our other forms of privacy. do we only criminalize disclosures of medical records when you intend to distress me with them? that is not even part of the statute at all. the same thing when it comes to social security numbers. "i just thought it would be funny to put your social security number out there. i did not mean to hurt your feelings." no one cares. when it comes to certain types of intimate information, the motive for why someone is doing it should not be the point.
it's the lack of consent to do so. i think that is something that is becoming clearer to us as a society: that we have serious, deep problems with our understanding of sexual consent in this country. we can see this in how many sexual assaults are committed every year, but also in that we seem to take it as a given that a woman, especially, has consented to the use of her body for sexual entertainment or enjoyment. i think it's time we started to rethink that. >> any other questions? >> i believe this was referred to as a hacking, but wasn't this really an instance of phishing rather than hacking? >> social engineering is still hacking. it's the easier kind, often. >> it involves folks being fooled into giving up their passwords, not necessarily a hacking of icloud. >> it's unclear exactly what went down. apple has come out and said that their systems were not hacked, which is to say that apple writ large was not hacked. they did not rule out that individuals, through sophisticated techniques -- whether it was social engineering, whether it was phishing -- were able to get passwords for individual accounts. >> two things about apple: they are not generous with specifics about their products. everyone who has looked at things like touch id on the iphone thinks that is great. their cloud security -- they have had issues with it. >> to that question, does that matter? is that a significant distinction? we talked about how there's a difference between this case, where you have perhaps a violation of the cfaa -- where does the distinction of how the image was gotten come into play? >> i think back to what marianne was saying: if you think consent is the fulcrum or the line, then whether you have hacked into someone's account, or have a photograph that was sent to you, or have access to that account gotten in a perfectly reasonable way -- all of those would matter and would have to be evaluated. i mean, there is a much larger debate here about the role of consent with respect to information, generally, on the internet. there is an active international debate over the so-called right to be forgotten in europe -- there are various european laws where, if you no longer consent to having information that has been published about you out there, you can sort of withdraw the consent and have the website delete information that may be floating around about you. again, not to beat a dead horse, but there's a familiar landscape here, in a sense, for the first amendment debate, which has sort of been in the background of this consent, privacy, and information debate for many years. what will it do to the free flow of information if you have to show that you have consent in some form for passing on a piece of information? this is related to the idea, i think, that people should own the information about them -- that it should be a property interest, even -- and that therefore people have to come to them if they want to publicize various things about them. that has serious, very difficult issues that we could call free speech or first amendment issues, because it is very difficult to evaluate whether consent has been given in many circumstances, and to figure out how you demonstrate consent, and it would have a very serious impact on the sorts of things -- you know, can you tell people you saw me at this event? i will use this as an example about owning information: i saw this thing in the rayburn office building. if i own that information, you cannot share it. it is an extreme example.
nobody suggests we should have such a law, but that is the issue, i think, with respect to balancing the free flow of information on the one hand against reasonable requests for a showing of consent with respect to some information on the other. >> i think that is right, and it's one of the reasons why i am actually optimistic about this particular type of material, because there seems to be a fairly easy way to fix this. if you want to disclose somebody's image, ask them to sign a form, and then you can disclose away. make it easy. we have something like that when it comes to modeling releases. we have it when it comes to medical records. if you want to submit this information and you think it is consensual -- because that is the only principled stance to take -- make sure you have documented evidence. we can fix this. this is not nearly as hard as things like the right to be forgotten or general questions about what people can say about you. it's very specific and can be resolved through paperwork. i saw one more hand right here. >> given that there are already takedown regimes for child pornography and these other categories, would it be that much more burdensome to require search engines, facebook, and other tech companies to also take down revenge pornography? do you think it would possibly impede the goals of small tech businesses? >> there are a couple of things you really have to keep in mind when you are talking about some kind of takedown regime. first and foremost, what a notice and takedown regime does is give a person a mechanism to tell, say, a website host to take down someone else's content, to take down something that was uploaded by another person. this is a mechanism that has been helpful in taking down infringing copies of movies and songs, but at its heart, it is giving a person the ability to say, "take down what that other person has uploaded." the potential for abuse of these systems is very high. when you look at something like the copyright takedown system, there are a number of safeguards built into it, based on what you have to include in a notice. you have to identify yourself, including contact information. you have to attest that you are the legitimate owner of the copyright. the person who uploaded the content originally has the ability to push back and say, "no, this is actually my content," or "i have been making a fair use of this copyrighted work," or what have you. the site puts the information back up online and leaves them to fight it out in court. it's not as simple a mechanism as giving someone an easy form to fill out, after which the information comes down and you are set. a lot needs to go into figuring out how to construct this takedown mechanism so that it is not so vulnerable to someone using it to say, "i don't like what that person said, so i'm going to file a takedown request and abuse the system." one of the real challenges we have to think about when we talk about questions around nude images is that there is a sensitivity and a privacy interest that the person depicted in the image might have. if it is your photo that has been posted without your consent online, you will want to file a takedown request and get it taken down. if you have to identify yourself in that request, that could cause privacy concerns. and then there is the case of someone trying to abuse the system.
if you have uploaded your photo under a pseudonym, and you are happy with the photo being out there but do not want it connected with your real name, and someone else is trying to abuse the system to get the content taken down, your ability to respond requires you to disclose who you are. there are some issues there, thinking about the vast range of nude imagery that is available on the internet. some of it is this kind of nonconsensual posting, but a lot of it is uploaded anonymously or pseudonymously. we really have to take that into account. >> just to follow up a little bit, i think a notice and takedown regime is worth exploring at this point. the existing notice and takedown schemes, particularly the copyright one, which is the one in section 512, put the burden on the aggrieved party. the law does not say google or facebook or twitter has to go find the stuff and take it down; they have to respond to the copyright owner's identification of the infringing material, which i think is very important and has been very contentious. they are working that out in the courts, but they have more or less come to the resolution that it is the obligation of the aggrieved party to find the material -- and it's not a trivial obligation -- and send the notice in, at which point the process kicks in. there are all sorts of protections. you do have to be careful about allowing the system to be abused; if it is too easy to submit a takedown notice, people will use it for purposes that have nothing to do with the harms they are trying to protect against. but with all of those, i think, the devil is in the details. the copyright takedown regime, if one wanted to go in that direction for these sorts of problems, would be worth looking at carefully. hundreds of millions of copyright-infringing files are taken down weekly. in that sense, it has removed an enormous amount of material. i know the copyright industry has gone crazy about it because they have to go find the material -- they don't like that -- but on the other hand, it has done the job pretty well. it has provided a process at scale, and scale is always important when you are talking about the internet. whatever we're talking about, we are talking about millions of it, and that scale has allowed the automation of takedown while still protecting the people who have uploaded material and giving them an avenue to say "i did not post it," or "it's not infringing," or whatever their argument might be. that would be worth looking at carefully, to see how it could be modeled to work on this problem. it might be a useful avenue of approach. >> you mentioned search engines. we should be careful about going too far. someone like youtube can run content id, but they also have a known universe of copyrighted material, which they get from the entertainment industry, to match uploads against. there's no such thing when it comes to people's private photos, so trying to do a general search and match will not work the same way. webmail sites can do automated screening against child porn because there's no consensual anything there -- that's why those images are illegal -- and there is a hash database assembled by the national center for missing and exploited children, so they can compare the hash of an image, a mathematical shortcut for it, to what is in that database. doing that for the broader universe of private photos -- that's not going to work.
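[editor's note: a minimal sketch of the hash-matching approach described above, assuming a hypothetical reference set of known hashes. real screening systems use perceptual hashes such as photodna, which survive resizing and re-encoding; the cryptographic hash used here for simplicity matches only byte-identical files -- which is part of why, as the panelist says, this approach cannot generalize to private photos that have no reference database.]

    import hashlib

    # Hypothetical stand-in for a reference database of hashes of known-illegal
    # images; the placeholder value below is just the SHA-256 of an empty file.
    KNOWN_HASHES = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def file_hash(path):
        # Stream the file in chunks so large images are not loaded into memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                h.update(chunk)
        return h.hexdigest()

    def is_known_image(path):
        # Screening an upload is then a single set-membership lookup.
        return file_hash(path) in KNOWN_HASHES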
>> just one clarification -- child pornography is not quite that easy, because it's not always easy to tell whether the person depicted is in fact a minor, so they do have to engage in some judgment calls and some investigation -- not necessarily more onerous than figuring out whether or not a picture was consensual. unfortunately, we have broken our promise to keep it to about 60 minutes, but we were close. this was a wonderful conversation that i'm sure could continue for hours, but we appreciate all of you coming. thanks very much. [applause] [captions copyright national cable satellite corp. 2014] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy. visit ncicap.org]
