Pretty crazy idea. Facebook CEO Mark Zuckerberg will testify. And I'm responsible for what happens here.

Narrator: Can Facebook be fixed? In light of recent revelations that the company may have covered up Russian interference in the 2016 election... The problem is too big, because Facebook is too big.

Narrator: Tonight on Frontline, part two of "The Facebook Dilemma."

Frontline is made possible by contributions to your PBS station from viewers like you. Thank you. And by the Corporation for Public Broadcasting. Major support is provided by the John D. and Catherine T. MacArthur Foundation, committed to building a more just, verdant and peaceful world. More information is available at macfound.org. The Ford Foundation, working with visionaries on the front lines of social change worldwide, at fordfoundation.org. Additional support is provided by the Abrams Foundation, committed to excellence in journalism; the Park Foundation, dedicated to heightening public awareness of critical issues; the John and Helen Glessner Family Trust, supporting trustworthy journalism that informs and inspires; and by the Frontline Journalism Fund, with major support from Jon and Jo Ann Hagler. Corporate support is provided by: The zip code you're born into can determine your future, your school, your job, your dreams, your problems. At the Y, our goal is to create opportunities no matter who you are or where you're from. The Y, for a better us.

I accept your nomination for president of the United States. I humbly accept your nomination for the presidency of the United States.

Hey, everyone. We are live from my backyard, where I am smoking a brisket and some ribs and getting ready for the presidential debate tonight.

Some of the questions for tonight's debate will be formed by conversations happening on Facebook. 39% of people get their election news and decision-making material from Facebook. Facebook getting over a billion political campaign posts.

I love this, all the comments that are coming in. It's, like, I'm sitting here, smoking these meats and, um, and just hanging out with 85,000 people who are hanging out with me in my backyard.

Make no mistake, everything you care about, everything I care about and I've worked for, is at stake. I will beat Hillary Clinton, Crooked Hillary, I will beat her so badly, so badly. I hope that all of you get out and vote. This is going to be an important one.

Tonight's broadcast will also include Facebook, which has become a gathering place for political conversation. (cheers and applause) Thank you. Thank you. Facebook is really the new town hall. Better conversations happen on Facebook. Poke for a vote. Poke for a vote. U.S.A.! U.S.A.! Hillary! Hillary! Facebook is the ultimate growth stock. Facebook is utterly dominating this new, mobile, digital economy. Have you been measuring political conversation on Facebook, things like the most likes, interactions, shares? Hillary Clinton has evaded justice. I thank you for giving me the opportunity to, in my view, clarify. 2016 is the social election. Facebook getting over a billion political campaign posts.

Narrator: 2016 began as a banner year for Mark Zuckerberg. His company had become one of the most popular and profitable in the world, despite an emerging dilemma: that as it was connecting billions, it was inflaming divisions.

People were really forming these tribal identities on Facebook, where you will see people getting into big fights.
Narrator: We've been investigating warning signs that existed as Facebook grew, and interviewing those inside the company who were there at the time.

We saw a lot of our numbers growing like crazy, as did the rest of the media and the news world in particular. And so, as a product designer, when you see your products being used more, you're happy. It's where we're seeing conversation happening about the election, the candidates, the issues.

Narrator: Amid all this political activity on Facebook, no one used the platform more successfully than Donald Trump's digital media director, Brad Parscale.

I asked Facebook, I want to spend $100 million on your platform, send me a manual. They say, we don't have a manual. I say, well, send me a human manual, then.

James Jacoby: And what does the manual provide?

You have a manual for your car. If you didn't have that for your car, there might be things you would never learn how to use in your car, right? I spent $100 million on a platform, the most in history; it made sense for them to be there to help us make sure how we spent it right and did it right.

With custom audiences, you can get your ads to people you already know who are on Facebook.

Narrator: What Facebook's representatives showed them was how to harness its powerful advertising tools to find and target new and receptive audiences.

Now I'll target my ad to friends of people who like my page.

What I recognized was the simple process of marketing. I needed to find the right people and the right places to show them the right message. What microtargeting allows you to do is say, well, these are the people that are most likely to show up to vote, and these are the right audiences we need to show up. The numbers were showing in the consumer side that people were spending more and more hours of their day consuming Facebook content, so if you have any best place to show your content, it would be there. It was a place where their eyes were. That's where they were reading their local newspaper and doing things. And so we could get our message injected inside that stream. And that was a stream which was controlling the eyeballs of most places that we needed to win.

Narrator: It wasn't just politics. By this time, Facebook was also dominating the news business.

62% of Americans say they get their news from social media sites like Facebook. More than a dozen developers have worked with us to build social news apps, all with the goal of helping you discover and read more news.

Narrator: Facebook's massive audience enticed media organizations to publish straight into the company's news feed, making it one of the most important distributors of news in the world.

I'm personally really excited about this. I think that it has the potential to not only rethink the way that we all read news, but to rethink a lot of the way that the whole news industry works.

Narrator: But unlike traditional media companies, Facebook did not see itself as responsible for ensuring the accuracy of the news and information on its site.

The responsibilities that they should have taken on are what used to be called editing. And editors had certain responsibilities for what was going to show up on the first page versus the last page, the relative importance of things that don't relate purely to money and don't relate purely to popularity. So they took over the role of editing without ever taking on the responsibilities of editing.

Narrator: Instead, Facebook's editor was its algorithm, designed to feed users whatever was most engaging to them.
Inside Facebook, they didn't see that as a problem.

Jacoby: Was there a realization inside Facebook as to what the responsibilities would be of becoming the main distributor of news?

I don't think there was a lot of thinking about that, that idea. I don't think there was any, any thought that news content in particular had, had more value or had more need for protection than any of the other pieces of content on Facebook.

Narrator: Andrew Anker was in charge of Facebook's news products team, and is one of eight former Facebook insiders who agreed to talk on camera about their experiences.

I was surprised by a lot of things when I joined Facebook. And as someone who grew up in the media world, I expected there to be more of a sense of how people interact with media and how important media can be to certain people's information diet.

(applause) We have a video from Davida from Napoli. No. (laughter) You know, we're a technology company. We're not a media company.

The fact that so many big, well-known news brands really pushed into Facebook pretty aggressively legitimized it as a place to get, kind of, information. And I think that also strangely created the opportunity for people who weren't legitimate, as well. Because if the legitimate players are there, and you're not legitimate, all you need to do is set up a website and then share links to it, and your stuff on Facebook is going to look similar enough that you've just gotten a huge leg up.

Hillary Clinton is the most corrupt person...

Narrator: But as the 2016 campaign heated up...

And I'll tell you, some of what I heard coming from my opponent...

Narrator: ...reporter Craig Silverman was sounding alarms that Facebook's news feed was spreading misinformation, what he called fake news.

Fake news just seemed like the right term to use. And I was trying to get people to pay attention. I was trying to get journalists to pay attention. I was trying to also get Facebook and other companies like Twitter to pay attention to this, as well.

Narrator: Silverman traced misinformation back to some unusual places.

We started to see this small cluster of websites being run, the vast majority, from one town in Macedonia. How popular is it? About 200 people, maybe. 200 people? Yeah. Are making fake news websites? Yes. Most of them didn't really care about who won the election. They weren't in this for politics. If you put ads on these completely fake websites, and you've got a lot of traffic from Facebook, that was a good way to make money. There are some people who made, like, 200K or something like that. 200,000 euros? Yeah, yeah, yeah. I remember one guy, I think he was 15 or 16 years old, telling me, you know, Americans want to read about Trump, so I'm writing Trump stuff. Trump earned them money.

We saw Macedonians publishing Hillary Clinton being indicted, the pope endorsing Trump, Hillary Clinton selling weapons to ISIS, getting close to or above a million shares, likes, comments. That's an insane amount of engagement. It's more, for example, than when the New York Times had a scoop about Donald Trump's tax returns. How is it that a kid in Macedonia can get an article that gets more engagement than a scoop from the New York Times on Facebook?

Jacoby: A headline during the campaign was "Pope endorses Trump," which was not true, but it went viral on Facebook. Was it known within Facebook that that had gone viral?

Um, I'm sure it was. I didn't necessarily know how viral it had gotten, and I certainly didn't believe that anybody believed it.
Jacoby: But would that have been a red flag inside the company, that something that's patently false was being propagated to millions of people on the platform?

I think if you asked the question that way, it would have been. But I think when you asked then the next question, which is the harder and the more important question, which is, so what do you do about it?, you then very quickly get into issues of not only free speech, but to what degree is it anybody's responsibility, as a technology platform or as a distributor, to start to decide when you've gone over the line between something that is clearly false from something that may or may not be perceived by everybody to be clearly false and potentially can do damage?

Jacoby: Over the course of the 2016 election, there was a lot of news about misinformation. I mean, there was, famously, the "pope endorses Trump." Do you remember that?

Absolutely. I, I wasn't working on these issues at the time, but, but absolutely I, I do remember it.

Narrator: Tessa Lyons was chief of staff to Facebook's number two, Sheryl Sandberg, and is now in charge of fighting misinformation. She is one of five current officials Facebook put forward to answer questions.

Jacoby: Was there any kind of sense of, like, oh, my goodness, Facebook is getting polluted with misinformation, someone should do something about this?

There certainly was, and there were people who were thinking about it. What I don't think there was a real awareness of, internally or externally, was the scope of the problem and the, the right course of action.

Jacoby: How could it be surprising that, if you're becoming the world's information source, that there may be a problem with misinformation?

There was certainly awareness that there could be problems related to news, or quality of news. And I think we all recognized afterwards that of all of the threats that we were considering, we focused a lot on threats that weren't misinformation and underinvested in this one.

Narrator: But there was another problem that was going unattended on Facebook beyond misinformation.

One of the big factors that emerged in the election was what started to be called hyperpartisan Facebook pages. These were Facebook pages that kind of lived and died by really ginning up that partisanship: we're right, they're wrong. But not even just that, it was also, they're terrible people, and we're the best. And the Facebook pages were getting tremendous engagement. A million migrants are coming over the wall, and they're going to, like, rape your children, you know? That stuff is doing well. And the stuff that was true would get far less shares.

The development of these hyperpartisan sites I think turned the informational commons into this trash fire. And there's some kind of parable in that for the broader effects of Facebook. That the very things that divide us most cause the most engagement. (barking, laughing) Which means they go to the top of the news feed, which means the most people see them.

Narrator: This worried an early Facebook investor who was once close to Zuckerberg.

I am an analyst by training and profession. And so, my job is to watch and interpret. At this point, I have a series of different examples that suggest to me that there is something wrong, systemically, with the Facebook algorithms and business model. In effect, polarization was the key to the model.
This idea of appealing to people's lower-level emotions, things like fear and anger, to create greater engagement, and in the context of Facebook, more time on site, more sharing, and, therefore, more advertising value. I found that incredibly disturbing.

Narrator: Ten days before the election, McNamee wrote Zuckerberg and Sandberg about his concerns.

I mean, what I was really trying to do was to help Mark and Sheryl get this thing right. And their responses were more or less what I expected, which is to say that what I had seen were isolated problems, and that they had addressed each and every one of them. I thought Facebook could stand up and say, we're going to reassess our priorities. We're going to reassess the metrics on which we run the company to try to take into account the fact that our impact is so much greater now than it used to be. And that as Facebook, as a company with, you know, billions of users, we have influence on how the whole social fabric works that no one's had before.

(cheers and applause) I've just received a call from Secretary Clinton. Clinton has called Trump to concede the election. The Clinton campaign is... really a somber mood here. The crowd here at Trump campaign headquarters...

Narrator: Trump's targeted ads on Facebook paid off...

Did things like Facebook help one of the nastiest elections ever?

Narrator: ...leading to complaints that Facebook helped tilt the election.

Facebook elected Donald Trump, that's basically...

Narrator: ...which the Trump campaign dismissed as anger over the results.

There has been mounting criticism of Facebook...

No one ever complained about Facebook for a single day until Donald Trump was president. The only reason anyone's upset about this is that Donald Trump is president and used a system that was all built by liberals. When I got on TV and told everybody after my interview of what we did at Facebook, it exploded. The funny thing is, the Obama campaign used it, then went on TV and newspapers, and they put it on the front of magazines, and the left and the media called them geniuses for doing that.

Accusations that phony news stories helped Donald Trump win the presidency...

Narrator: Trump's victory put Facebook on the spot.

Facebook even promoted fake news into its trending section.

Narrator: And two days after the election, at a tech conference in northern California, Zuckerberg spoke publicly about it for the first time.

Well, you know, one of the things post-election, you've been getting a lot of pushback from people who feel that you didn't filter out enough fake stories, right? You know, I've seen some of the stories that you're talking about, around this election. There is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way they did is because they saw some fake news. You know, personally, I think the, the idea that, you know, fake news on Facebook, of which, you know, it's, it's a very small amount of, of, of the content, influenced the election in any way, I think, is a pretty crazy idea, right?

If I had been sitting there in an interview, I would have said, you're lying. When he said, we had no impact on the election, that... I remember reading that and being furious. I was, like, are you kidding me? Like, stop it. Like, you cannot say that and not be lying. Of course they had an impact, it's obvious. They were the most important distribution, news distribution. There are so many statistics about that. Like, I don't know how you could possibly make that claim in public and with such a cavalier attitude.
That infuriated me. And I texted everybody there, saying, you're kidding me.

Jacoby: Is he not recognizing the importance of his platform in our democracy at that point in time?

Yes, I think he didn't understand what he had built, or didn't care to understand or wasn't paying attention, and doesn't. They really do want to pretend, as they're getting on their private planes, as they're getting... going to their beautiful homes, as they're collecting billions of dollars, they never want to acknowledge their power. They're powerful, and they have... they don't...

Thank you so much for being here. Thank you, guys.

I think it was very easy for all of us sitting in Menlo Park to not necessarily understand how valuable Facebook had become. I don't think any of us, Mark included, appreciated how much of an effect we might have had. And I don't even know today, two years later, or almost two years later, that we really understand how much of a true effect we had. But I think more importantly, we all didn't have the information to be saying things like that at the time. My guess is, is that Mark now realizes that there was a lot more to the story than, than he or any of us could have imagined at that point.

Narrator: Barely two months later, in Washington, an even more serious situation was developing. Intelligence agencies were investigating Russian interference in the election, and whether social media had played a role.

Classical propaganda, disinformation, fake news. Does that continue? Yes. In my view, we only scratched the surface. I say we, those that assembled the Intelligence Community assessment that we published on the 6th of January 2017, meaning NSA, C.I.A., FBI, and my office. But I will tell you, frankly, that I didn't appreciate the full magnitude of it until well after.

Narrator: Amid growing scrutiny...

All right.

Narrator: ...Zuckerberg set out on a cross-country trip he publicized by streaming on Facebook.

So I've been going around to different states for my personal challenge for the year to see how different communities are working across the country.

Narrator: But while he was on the road, the news was getting worse.

The U.S. intelligence community officially is blaming Russian President... Russian President Vladimir Putin ordered an influence campaign aimed at the presidential election.

Narrator: Zuckerberg's chief of security, Alex Stamos, had been asked to see what he could find on Facebook's servers.

We kicked off a big look into the fake news phenomenon, specifically what component of that might have a Russian part in its origin.

Narrator: They traced disinformation to what appeared to be Russian government-linked sources.

Jacoby: So, what was it like bringing that news to others in the company, and up to Mark and Sheryl, for instance?

You know, we had a big responsibility in the security team to educate the right people about what had happened without being kind of overly dramatic. It's kind of hard as a security person to balance that, right? Like, everything seems like an emergency to you. But in this case it really was, right? This really was a situation in which we saw the tip of this iceberg, and we knew there was some kind of iceberg beneath it.

Narrator: Stamos expanded his investigation to look at how the Russian operation may have also used Facebook's targeted advertising system.

So what we did is, we then decided we're going to look at all advertising and see if we can find any strange patterns that might link them to Russian activity. So we enlisted huge parts of the company.
We kind of dragooned everybody into one big, unified team. So you have people in a war room working 70-, 80-hour weeks, billions of dollars of ads, hundreds of millions of pieces of content, and by kind of a painstaking process of going through thousands and thousands of false positives, eventually found this large cluster that we were able to link to the Internet Research Agency of St. Petersburg.

Narrator: It was one of the same groups that had been using Facebook to spread disinformation in Ukraine three years earlier. This time, using fake accounts, Russian operatives had paid around $100,000 to run ads that promoted political messages and enticed people to join fake Facebook groups.

What the Internet Research Agency wants to do is, they want to create the appearance of legitimate social movements. So they would create, for example, a pro-immigration group and an anti-immigration group. And both of those groups would be almost caricatures of what those two sides think of each other. And their goal of running ads were to find populations of people who are open to those kinds of messages, to get them into those groups, and then to deliver content on a regular basis to drive them apart. Really what the Russians are trying to do is to find these fault lines in U.S. society and amplify them, and to make Americans not trust each other.

Narrator: In September 2017, nearly a year after the election, Zuckerberg announced on Facebook what the company had found.

We are actively working with the U.S. government on its ongoing investigations into Russian interference. We've been investigating this for many months now, and for a while, we had found no evidence of fake accounts linked to Russian... linked to Russia running ads. When we recently uncovered this activity, we provided that information to the special counsel. We also briefed Congress. And this morning, I directed our team to provide the ads we've found to Congress, as well.

We do know that Facebook-related posts touched about 150 million Americans, that were posts that originated either through Russian fake accounts or through paid advertising from the Russians. But the paid advertising was really a relatively small piece of the overall problem. A much bigger problem was the ability for someone to say they were James in Washington, D.C., but it was actually Boris in St. Petersburg creating a fake persona that would generate followers, and then they would seed it with the fake information and the false news and the political content.

One account was set up to try to rally the Muslim community in, in Texas. Another was an attempt to kind of rally the right wing in Texas. They created an event. "White power!" "Stop the hate! Stop the fear!" A protest, with both sides protesting against each other. (yelling) ...at a mosque in Houston, in 2016. This is America, we have the right to speak out. (yelling) But for the good work of the Houston police, you could have had the kind of horrible activity take place then and there that I saw unfortunately take place in Charlottesville in my state last year. So the real human consequences of some of these... of some of this abuse, we've been very lucky that it hasn't actually cost people's lives.

Narrator: Facebook also found that the Russians had used the site to orchestrate a pro-Trump rally outside of a Cheesecake Factory in Florida and to promote an anti-Trump protest in New York City just after the election.

Hey, hey, ho, ho, Donald has got to go.

We are under threat, and I need to defend the country that I love.
We are right in the middle of the protest.

Narrator: The details of Facebook's internal investigation set off alarm bells in Washington.

We're such a ripe target for that sort of thing, and the Russians know that. So the Russians exploited that divisiveness, that polarization, because they had, they had messages for everybody. You know, Black Lives Matter, white supremacists, gun control advocates, gun control opponents, it didn't matter, they had messages for everybody.

Jacoby: Did you think that was a pretty sophisticated campaign?

It was. And I believe the Russians did a lot to get people out to vote that wouldn't have and helped the appeal for... of Donald Trump.

Jacoby: And the role that social media played in that was what?

It was huge. I mean, it's really quite both ingenious and evil to, to attack a democratic society in that manner.

Jacoby: But there were warning signs along the way in the trajectory of the company. The company's been dealing with the negative side effects of its product for years, right?

When you have two billion people on a communication platform, there's an infinite number of potentially bad things that could happen. The tough part is trying to decide where you're going to put your focus.

Narrator: But by 2017, Facebook was being accused of not focusing on other serious issues in developing, fragile democracies where the company had expanded its business. Countries like the Philippines, where almost all internet users are on Facebook, and problems had been mounting.

In a year, I probably met with more than 50 different officials, high-ranking officials, including Mark Zuckerberg. I wanted them to know what we were seeing, I wanted them to tell me what they thought about it, and I wanted them to fix it.

Narrator: Maria Ressa, who runs a prominent news website, says she had been warning Facebook since 2016 that President Rodrigo Duterte was using a network of paid followers and fake accounts to spread lies about his policies and attack his critics.

The U.N. has branded his war a crime under international law.

Narrator: ...especially critics of his brutal war on drugs, which has taken an estimated 12,000 lives.

Human Rights Watch has called it government-sanctioned butchery.

President Duterte was targeting anyone who questioned the drug war, anyone who questioned the alleged extrajudicial killings. Anyone on Facebook who questioned that would get brutally bashed. We're protected by the Constitution; we've been stripped of those protections online.

Narrator: Ressa herself would eventually come under attack.

There were attacks on the way I look, the way I sounded, that I should be raped, that I should be killed. We gave it a name: patriotic trolling. Online, state-sponsored hate that is meant to silence, meant to intimidate. So this is an information ecosystem that just turns democracy upside-down.

Jacoby: And where lies are prevalent.

Where lies are truth.

Narrator: She traced the disinformation to a network of 26 fake accounts and reported it to Facebook at a meeting in Singapore in August of 2016.

Jacoby: What were you asking them to do?

Exactly what every news group does, which is, take control and be responsible for what you create.

Jacoby: Were you given an explanation as to why they weren't acting?

No. No. I think Facebook walked into the Philippines, and they were focused on growth. What they didn't realize is that countries like the Philippines... (chanting) ...countries where institutions are weak, where corruption is rampant, these countries don't have the safeguards.
And what happens when you bring everyone onto a platform and do not exercise any kind of rules, right? If you don't implement those rules beforehand, you're going to create chaos.

Jacoby: There's a problem in the Philippines, we've heard about from people on the ground there, that Facebook has been to some degree weaponized by the Duterte regime there. What are you doing to, to stem this problem in the Philippines?

One thing we're trying to do, any time that we think there might be a connection between violence on the ground and online speech, the first thing for us to do is actually understand the landscape.

Narrator: Monika Bickert is Facebook's head of global policy and worked for the Justice Department in Southeast Asia.

There's a fundamental question, which is, what should our role be, and as we are identifying misinformation, should we be telling people what we're finding, should we be removing that content, should we be downranking that content? And we now have a team that is focused on how to deal with exactly that sort of situation.

Narrator: In April 2018, Facebook created a news verification program and hired Ressa's organization as one of its fact-checkers, though she says the problems are ongoing. The company ultimately took down the accounts Ressa identified and went on to remove dozens more.

I think what is happening is that this company is way in over its head in terms of its responsibilities. It's way in over its head in terms of what power it holds. The idea isn't that it's just like you magically add Facebook and horrible things happen, but you have Facebook as this effective gasoline to simmering fires.

(shouting)

Narrator: Elsewhere in the region...

Buddhists are inciting hatred and violence against Muslims through social media.

Narrator: ...Facebook was also being used to fan ethnic tensions, with even more dire consequences.

Violence between Buddhists and Muslims is continuing. Misinformation, disinformation, rumors, extremist propaganda, all kinds of bad content.

Narrator: For several years, David Madden, a tech entrepreneur living in Myanmar, as well as journalists and activists, had been warning Facebook that the Muslim minority there was being targeted with hate speech.

(speaking local language)

You would see the use of memes, of images, things that were degrading and dehumanizing, targeting the Muslim community.

(speaking local language)

Narrator: The warning signs had been present as far back as 2014, when a fake news story spread on Facebook.

Reports, later proved to be false, that some Muslim men had raped a Buddhist woman, were shared on Facebook. An angry mob of about 400 surrounded the Sun teashop, shouting and throwing bricks and stones.

Narrator: Two people died in the incident.

One Buddhist and one Muslim were killed in riots today.

I was really concerned that the seriousness of this was not understood. And so I made a presentation at Facebook headquarters in May of 2015. I was pretty explicit about the state of the problem. I drew the analogy with what had happened in Rwanda, where radios had played a really key role in the execution of its genocide. And so I said, Facebook runs the risk of being in Myanmar what radios were in Rwanda. That this platform could be used to foment hate and to incite violence.

Jacoby: What was the reaction to that at Facebook?

I got an email shortly after that meeting to say that what had been discussed at that meeting had been shared internally and apparently taken very seriously.

Narrator: The violence intensified.
Massive waves of violence that displaced over 150,000 people.

Narrator: And in early 2017, Madden and other local activists had another meeting with Facebook.

The objective of this meeting was, was really to be crystal-clear about just how bad the problem was, and that the processes that they had in place to try to identify and pull down problematic content, they just weren't working. And we were deeply concerned that something even worse was going to happen imminently. It was a sobering meeting. I think... I think the main response from Facebook was, we'll need to go away and dig into this and come back with something substantive. The thing was, it never came.

And how do you know that?

We can look at the evidence on the ground.

What we've seen here tells us a story of ethnic cleansing, of driving Muslims out of Myanmar.

Narrator: The United Nations would call the violence in Myanmar a genocide, and found social media, and Facebook in particular, had played a significant role.

The ultranationalist Buddhists have their own Facebooks and are really inciting a lot of violence and hatred against ethnic minorities. Facebook has now turned into a beast, rather than what it was originally intended to be used for.

Jacoby: I'm curious what it's like when the U.N. comes out with a report that says that Facebook played a significant role in a genocide, running content policy at Facebook.

Well, this would be important to me even if I didn't work at Facebook, given my background. My background is as a federal prosecutor, and I worked specifically in Asia and specifically on violent crimes against people in Asia. So something like that really hits home to me.

Jacoby: Facebook was warned as early as 2015 about the potential for a really dangerous situation in Myanmar. What went wrong there? Why was it so slow?

We met with civil society organizations in Myanmar far before 2015. This is an area where we've been focused. I think what we've learned over time is, it's important for us to build the right technical tools that can help us find some of this content and also work with organizations on the ground in a real-time fashion. We are in the process of building those relationships around the world on a much deeper level, so that we can stay ahead of any kind of situation like that.

Narrator: Throughout 2018, Facebook says it's taken down problematic accounts in Myanmar, hired more language experts, and improved its policies.

Jacoby: Should there be any liability or any legal accountability for a company like Facebook when something so disastrous goes wrong on your platform?

There's all sorts of accountability. But probably the group that holds us the most accountable are the people using the service. If it's not a safe place for them to come and communicate, they are not going to use it.

We are working here in Menlo Park, in Palo Alto, California. To the extent that some of these issues and problems manifest in other countries around the world, we didn't have sufficient information and a pulse on what was happening in Southeast Asia.

Narrator: Naomi Gleit is Facebook's second-longest-serving employee.

And so one change that we've made, along with hiring so many more people, is that a lot of these people are based internationally and can give us that insight that we may not get from being here at headquarters.

Jacoby: I'm trying to understand, you know, the choices that are made. Do you regret choices going backward, decisions that were made about not taking into account risks or not measuring risks?
Yeah, I definitely think we regret not having 20,000 people working on safety and security back in the day, yes. So I regret that we were too slow, that it wasn't our priority.

Jacoby: But were those things even considered at the time? To kind of amp up safety and security, but there was some reason not to, or...

Not really. I mean, we had a safety and security team. I think we just thought it was sufficient. I just... it's not that we were, like, well, we could do so much more here, and decided not to. I think we... we just didn't. Again, we were just a bit idealistic.

Facebook has created this platform that in many countries, not just Myanmar, has become the dominant information platform, and it has an outsized influence in lots of countries. That comes with a lot of responsibility.

Using social media, rumors of alleged Muslim wrongdoing spread fast.

Many of those countries are wrestling with some pretty big challenges. Tensions between groups within countries. And we have seen this explode into what Mark Zuckerberg would call real-world harm, what others would just call violence or death, in many other markets. We're seeing it right now in India.

Calloo became a victim of India's fake news.

We've seen examples of this in places like Sri Lanka.

To keep the violence from spreading, Sri Lanka also shut down Facebook.

The Myanmar example should be sounding an alarm at the highest level of the company, that this requires a comprehensive strategy.

Narrator: But it would be far from Myanmar, and a very different kind of problem, that would cause an international uproar over Facebook.

Cambridge Analytica and its mining of data on millions of Americans for political purposes. Cambridge is alleged to have used all this data from tens of millions of Facebook users. (in Spanish) Cambridge Analytica scandal, Facebook. (reporters speaking different languages)

Narrator: It was a scandal over how Facebook failed to protect users' data, exposed by a whistleblower named Christopher Wylie.

Christopher Wylie, he was able to come forward and say, I can prove this.

Narrator: He said that Facebook knew that a political consulting firm he'd worked for, Cambridge Analytica, had been using the personal data of more than 50 million users to try to influence voters.

At Cambridge Analytica, we are creating the future of political campaigning.

This is a company that specializes and would advertise itself as specializing in rumor campaigns.

Political campaigns have changed. Seeding the internet with misinformation. Putting the right message in front of the right person at the right moment. And that's the power of data. You can literally figure out who are the people who are most susceptible. Data about personality, so you know exactly who to target.

Narrator: The firm gained access to the data from a third party, without Facebook's permission.

The overwhelming majority of people who had their data collected did not know. When data leaves Facebook's servers, there is no way for Facebook to track that data to know how that data is being used or to find out how many copies there are.

Narrator: Facebook eventually changed its data sharing policies and ordered Cambridge Analytica to delete the data.

We know that Facebook had known about this...

Narrator: After Wylie came forward, they banned the firm from their site, and announced they were ending another controversial practice: working directly with companies known as data brokers.
But the uproar was so intense that in April 2018, Mark Zuckerberg was finally called before Congress, in what would become a reckoning over Facebook's conduct, its business model, and its impact on democracy.

We welcome everyone to today's hearing on Facebook's social media privacy and the use and abuse of data. I now turn to you, so proceed, sir.

We face a number of important issues around privacy, safety, and democracy. And you will rightfully have some hard questions for me to answer. Facebook is an idealistic and optimistic company. And as Facebook has grown, people everywhere have gotten a powerful new tool for making their voices heard and for building communities and businesses. But it's clear now that we didn't do enough to prevent these tools from being used for harm, as well. And that goes for fake news, for foreign interference in elections, and hate speech, as well as developers and data privacy. We didn't take a broad enough view of our responsibility, and that was a big mistake. And it was my mistake. And I'm sorry.

If, like me, you're following this stuff, you see years and years and years of people begging and pleading with the company, saying, please pay attention to this, at every channel people could find. And basically being ignored. We hear you, you're concerned, we apologize, of course we have a responsibility, we'll do better. And the public record here is that they are a combination of unable and unwilling to grasp and deal with this complexity.

You may decide, or Facebook may decide, it needs to police a whole bunch of speech, that I think America might be better off not having policed by one company that has a really big and powerful platform.

Senator, I think that this is a really hard question. And I think it's one of the reasons why we struggle with it.

These are very, very powerful corporations. They do not have any kind of traditional democratic accountability. And while I personally know a lot of people making these decisions, if we set the norms that these companies need to decide what, who does and does not have a voice online, eventually that is going to go to a very dark place.

When companies become big and powerful, there is an instinct to either regulate or break up, right? I think we're finding ourselves now in a position where people feel like something should be done. There's a lot of questions about what should be done, but there's no question that something should be done.

You don't think you have a monopoly? It certainly doesn't feel like that to me. Okay.

You know, there's a lot of problems here, there, but all of these problems get worse when one company has too much power, too much information, over too many people.

Narrator: After years of unchecked growth, the talk now is increasingly about how to rein in Facebook. Already, in Europe, there's a new internet privacy law aimed at companies like Facebook. Inside the company, the people we spoke to insisted that Facebook is still a force for good.

Jacoby: Has there ever been a minute where you've questioned the mission? You know, internally? Whether anyone has taken a second to step back and say, all right, has this blinded us in some way? Have you had a moment like that?

I still continue to firmly believe in the mission, but in terms of stepping back, in terms of reflecting, absolutely. But that isn't on the mission. The reflection is really about, how can we do a better job of minimizing bad experiences on Facebook?

Jacoby: Why wasn't that part of the metric earlier? In terms of, how do you minimize the harm?
You know, it's possible that we could have done more sooner, and we haven't been as fast as we needed to be.

Narrator: That line was repeated by all the current officials Facebook put forward to answer questions.

We've been too slow to act on... I think we were too slow. We didn't see it fast enough. We were too slow. Mark has said this, that we have been slow.

One of my greatest regrets in running the company is that we were slow in identifying the Russian information operations in 2016. And we're going to take a number of measures, from building and deploying new A.I. tools that take down fake news, to growing our security team to more than 20,000 people.

The goal here is to deep-dive on the market nuances there.

Narrator: The company says it's now investing resources and talent to tackle a range of problems, from the spread of hate speech to election interference.

Even if we can't do fact-checking, if we can do more work around the programmatic aspect of it...

Narrator: This is part of the team tackling the spread of misinformation around the world, led by Tessa Lyons.

The elections integrity team has a framework for how they're thinking about secondary languages in each country. And I feel like from the misinformation side, we've mostly prioritized primary languages.

Narrator: It's a problem the company admits it is a long way from solving.

The next thing is about the Arabic fact-checking project. I think the main blocker here is potentially getting a fact-checker that can cover an entire region.

You know, I came into this job asking myself, how long is it going to take us to solve this? And the answer is, this isn't a problem that you solve. It's a problem that you contain.

Awesome. Next, segue into upcoming launches.

Narrator: In advance of the 2018 midterms, Facebook mobilized an election team to monitor false news stories and delete fake accounts that may have been trying to influence voters. Nathaniel Gleicher runs the team.

There are going to be actors that are going to try to manipulate that public debate. How do we figure out what are the techniques they're using and how do we make it much harder?

Jacoby: Is there going to be real-time monitoring on election day of what's going on on Facebook, and how are you going to actually find things that may sow distrust in the election?

Absolutely, we're going to have a team on election day focused on that problem, and one thing that's useful here is, we've already done this in other elections.

Jacoby: And you're confident you can do that here?

I think that... yes, I'm confident that we can do this here.

Narrator: Gleicher says his team continues to find foreign actors using the platform to spread disinformation.

Iran was revealed to be a new player in worldwide disinformation campaigns, and on top of this...

Narrator: And in October of 2018, federal prosecutors announced they'd found evidence that Russian operatives had been trying to interfere in the U.S. midterm election.

Jacoby: What is the standard that the public should hold Facebook to, in terms of solving some of these seemingly enormous problems?

I think the standard, the responsibility, what I'm focused on, is amplifying good and minimizing the bad. And we need to be transparent about what we're doing on both sides, and, you know, I think this is an ongoing discussion.

Jacoby: What's an ongoing discussion?

How we're doing on minimizing the bad.

Jacoby: But we're dealing with such consequential issues, right? We're talking about integrity of our elections, we're talking about...

Absolutely.

Jacoby: ...
...in some cases, playing a role in a genocide. An ongoing conversation means what, exactly, about that? About a standard for success here?

I think, you know, this is the number-one priority for the company. Mark has been out there, Sheryl is out there, you're talking to me and a bunch of the other leaders. That's what we mean by having an ongoing conversation. This is something that we need to... as you said, this is serious, this is consequential. We take this extremely... like, we understand this responsibility, and it's not going away tomorrow.

Jacoby: Do you think Facebook has earned the trust to be able to say, trust us, we've got this?

I'm not going to answer that, I'm sorry. That's just... I mean, that, everybody can make that decision for themselves.

Jacoby: But what... do you trust them?

I trust the people who I worked with. I think there are some good people who are working on this. That doesn't mean I don't think we should pass laws to back that up.

It has not been a good week for Facebook... social media giant...

Narrator: For Facebook, the problems have been multiplying.

Massive setback for Facebook, the social media giant. A massive cyber attack affecting nearly 50 million Facebook users. Facebook continues to crack down on fake political ads and news.

Narrator: But Mark Zuckerberg's quest to connect and change the world continues.

Hey, everyone! Hey. Welcome to F8. This has been an intense year. I can't believe we're only four months in.

After all these scandals, Facebook's profits just still going up, right? So they don't really have a huge incentive to change the core problem, which is their business model.

We are announcing a new set of features coming soon.

They're not going to do it as long as they're doing so well financially and there's no regulatory oversight. And consumer backlash doesn't really work, because I can't leave Facebook; all my friends and family around the world are there. You might not like the company, you might not like its privacy policies, you might not like the way its algorithm works, you might not like its business model, but what are you going to do?

Now, there's no guarantee that we get this right. This is hard stuff. We will make mistakes, and they will have consequences, and we will need to fix them.

Narrator: As he has since the beginning, he sees Facebook, his invention, not as part of the problem, but the solution.

So if you believe, like I do, that giving people a voice is important, that building relationships is important, that creating a sense of community is important, and that doing the hard work of trying to bring the world closer together is important, then I say this: we will keep building.

(cheers and applause)

(music: "Hold, hold on, hold on to me, 'cause I'm a little unsteady")

What's the situation there? How do you explain that? Are you ready for this world that we are facing today?

Go to pbs.org/frontline for the latest developments in the Facebook story. Then check out the new Frontline Transparency Project, and see key quotes from the film in context. "This really was a situation in which we saw the tip of this iceberg, and we knew there was some kind of iceberg beneath it." "This isn't a problem that you solve, it's a problem that you contain." "What I'm focused on is amplifying good, and minimizing the bad."

Connect to the Frontline community on Facebook, Twitter, or pbs.org/frontline.

Frontline is made possible by contributions to your PBS station from viewers like you. Thank you. And by the Corporation for Public Broadcasting. Major support is provided by the John D. and Catherine T.
MacArthur Foundation, committed to building a more just, verdant and peaceful world. More information is available at macfound.org. The Ford Foundation, working with visionaries on the front lines of social change worldwide, at fordfoundation.org. Additional support is provided by the Abrams Foundation, committed to excellence in journalism; the Park Foundation, dedicated to heightening public awareness of critical issues; the John and Helen Glessner Family Trust, supporting trustworthy journalism that informs and inspires; and by the Frontline Journalism Fund, with major support from Jon and Jo Ann Hagler. And by the Y, for a better us.

Captioned by Media Access Group at WGBH. access.wgbh.org

For more on this and other Frontline programs, visit our website at pbs.org/frontline. To order Frontline's "The Facebook Dilemma" on DVD, visit ShopPBS or call 1-800-PLAY-PBS. This program is also available on Amazon Prime Video.

You're watching PBS.
