Transcript: C-SPAN2, The Knowledge Illusion, August 6, 2017




thank you very much. we're delighted to have you at this event, the authors at mit series. tonight we're featuring steven sloman, a cognitive scientist who studies how people think. he's been a close friend of mine for many years and also the editor of the journal cognition. and he's joined in conversation today with drazen prelec, professor of economics at mit's sloan school. he also holds an appointment in the department of brain and cognitive sciences. before we begin, i ask that you please silence your phone if you haven't already. i should also introduce myself. i'm the director of the mit press, my name is amy brand. it's a pleasure to welcome you. we just started doing this series of events at the mit bookstore, and they've been incredibly successful. we do events about every two weeks, so the next one coming up is on may 23rd, the book dream chasers: immigration and the american backlash. so tonight's event will last for approximately, you know, 45-50 minutes before there's a book signing, so we'll start off talking about the book, then drazen will have questions for him, and then we'll have a conversation with the audience before the book signing. i think i covered everything i was supposed to say. and after the presentation, at the signing, books will be available at a 20% discount. >> maybe i'll buy one. [laughter] okay. i don't think i need this mic. do i need this mic? >> no, you don't. drazen needs it. >> thank you so much, amy. thank you all for coming out. let me start by telling you a little anecdote from the book. so in the '40s, atomic physicists were still trying to perfect the atomic bomb. and eight of them, eight atomic physicists -- these are people who know atomic physics better than anybody -- they were developing the bomb. so eight of them coalesced in this room in order to run this experiment. and it's an experiment that the famous physicist richard feynman calls tickling the dragon's tail. and it involved taking two hemispheres of beryllium that had plutonium cores and bringing them closer and closer together, and then neutrons would start shuffling back and forth between the hemispheres, and it was really dangerous because you could have a lot of radioactivity. so the main physicist running this experiment, a guy named louis, was keeping these hemispheres of beryllium separated by a flathead screwdriver. so what happened? the screwdriver slipped, the hemispheres came together, radiation filled the room, and louis was dead within nine days. and the other physicists, they died young, probably from the effects of this radioactivity. so the question, and the question posed by the book, is how can such smart people be so stupid? so the first claim made by the book is that people are relatively ignorant. and that's a fact in the sense that, you know, 25% of americans don't know that the earth revolves around the sun, and 50% don't know that penicillin kills bacteria and not viruses. there's just -- radio talk show hosts make fun of this fact all the time. many people can't name the vice president of the united states. but the real point of the book is that people think they understand things better than they do, right? so the main form of evidence for this originated in the lab of a great psychologist named frank keil at yale, and what he did was ask people to think about simple, everyday objects like zippers, ball-point pens, toilets.
and he asked people how well they understood these things, and people felt they had a sense of understanding, right? on a seven-point scale, they would say four or five. and then he'd say, okay, how do they work? what people discovered, for the most part, was that they had nothing to say. so when he again asked them how well they understood them, the ratings were lower. the act of trying to explain punctured their illusion of understanding such that they lost some of their hubris. and what i've done with some colleagues, todd rogers at harvard and craig fox at ucla and my co-author, phil fernbach, is to take this paradigm and run it in the context of political policy. so we take political policies like should there be unilateral sanctions on iran, cap and trade policies for carbon emissions, and we ask people how well they understand them, and then we say how well does it work? sorry, we say how does it work, right? explain how this thing works. and people just don't know. and so when we again ask them what their sense of understanding is, it's lower. we puncture their illusion. not only do we puncture their illusion of understanding, but we also puncture their confidence, their confidence in their attitude. they become less extreme. we reduce polarization in the group simply by asking for an explanation. so this is something that probably is true for only certain kinds of political issues; namely, those that rest on a consequentialist foundation, those where it's the causal mechanism that really counts, right? like, for a cap and trade policy what really matters is how it affects people's willingness to put carbon in the air. there are other issues like abortion or assisted suicide which really aren't about the consequences so much as basic values. and i don't think this -- and, in fact, we have some data suggesting the same thing -- would happen for those kinds of issues. so the next thing we do in the book is try to explain why this is the case, why is it that people live in this illusion of understanding, or what keil calls an illusion of explanatory depth. and the answer we offer is that it's because we confuse what we know with what other people know. so there are other people who know how ball-point pens work and, therefore, we think we do. there are other people who know how cap and trade policies work and, therefore, we think we do. we fail to distinguish the knowledge that's in our head from the knowledge that's in other people's heads. in other words, the claim is we live in a community of knowledge, right? and so we're built -- thought itself is a kind of collaborative process. it's a process which involves a team and does not solely go on inside the skull. so let me quickly describe an experiment that i ran with an undergrad at brown which tries to make this point directly. we told people about a scientific phenomenon, something we made up. so, for instance, we told them about a system of helium rain, which my real scientist friends tell me is not actually possible. we had made it up, and we said scientists discovered this thing. they haven't explained it yet, they don't understand how it works, but they've discovered it. how well do you understand how it works? [laughter] and not surprisingly, people said i don't understand at all; on a 1-7 scale, they chose 1, essentially. another group, same thing, scientists have discovered this system of helium rain, they understand how it works, they fully explained it. how well do you understand how it works? and now people say 2, right?
so it's not like they feel they fully understand helium rain, but there's a little bump in understanding that's attributable to the fact that other people understand. they themselves -- we gave them no information about it, right? all we said is other people understand, and now they understand. so think of this in the context of, say, to pick a random example, the last election, right? imagine everyone around you thinks they understand why hillary is crooked, or, you know, let's be fair, maybe everyone around you thinks they understand why diversity is effective, right? merely the fact that everybody else understands -- if understanding is contagious in the way we're suggesting -- is going to give you a sense of understanding. and if everybody's sense of understanding is a result of everybody else, everybody around them having a sense of understanding, then we can have a lot of confidence and belief based on nothing, right? confidence that's essentially a house of cards. so finally, the last thing we do in the book is draw out the implications for a number of things: science literacy, our understanding of intelligence, decision making, technology and something, one or two other things that i'm not remembering at the moment. and that's what the book's about. >> very good. thank you, steven. so i just want to say as a reader of the book and as someone who has worked on some related things, i really found much that was new and led me to think about different problems and some familiar things in a different way. now, if you are not in this area, it is great fun to read as well. >> thank you. >> now, maybe we can put you on the couch here as an author of the book. i think it's always interesting, especially when someone writes a book that builds on scientific research that goes through the editorial process and so on, to think about the deeper motivations for it. [laughter] so i'm going to start with one innocent question and then have a follow-up question also. but the innocent question is: is there a reformist impulse in the sense that the knowledge illusion is a bad thing, or are you not -- because it's not so obvious that it is a bad thing. one could say that the knowledge illusion shows that we are a communitarian species and we rely on others, and if you strip that away, we'll become locked into our own individual selves. so is it a bad thing that there is a knowledge illusion? >> thank god that was the question, because i was sure you were going to ask me about my childhood. [laughter] no, i think you're absolutely right. for the most part, when it comes to ball-point pens and toilets and zippers, it's not a bad thing at all that there's this illusion. and, in fact, in many other domains of life, in our spiritual lives, for instance, there's no problem with the fact that we think together, that we think as a team, we think collaboratively. indeed, it's a great way to solve problems. in fact, it sort of relieves us of a kind of burden, right? we don't have to understand everything. if we accept this fact about ourselves, our own ignorance, then life is easier. there's less pressure on us. in fact, we got an e-mail from this one guy who said that he had suffered his whole life with some kind of mental disorder, he didn't elaborate what it was, and that it was such a relief for him to learn that it was okay to be ignorant, that everybody's ignorant. so in that regard, i agree completely.
i just think that there are certain domains in which there are ill effects. and, you know, i think populism in politics is something that can cause a lot of damage. i'm worried about the state of the world right now. and i think it has something to do with the fact that we're living with this set of beliefs, this ideology that doesn't really have a firm ground. you know, there are other ill effects too. i think that teams could probably work better together if the individuals in the teams had more respect for the point of view of others. so both on the large scale and the small, you know, i do think there are prescriptions that could be derived from these insights. but overall, i agree with you completely. there's not a problem, there's not an inherent problem with it. >> so i have a related question, and it has to do with the general topic of collective knowledge, which has been in the popular imagination and in the scientific imagination the last, i would say, 10, 15 years. and my question is motivated by: is there a particular political preference? and i'll raise it with an example that's not in your book, but with a discussion of crowd wisdom and prediction markets. it turns out that prediction markets are partly a new instrument for aggregating information, but also much of the interest there -- at least knowing the individuals who are pushing this -- comes from a kind of libertarian, anti-elitist agenda where you don't want to trust credentialed experts, you want everybody in a democratic way to contribute and influence the outcome. now, in your book it seems that one of the themes is that we claim too much knowledge on our own part, and that our contribution is smaller than we think. and you mentioned the case of scientific and artistic achievements where people don't realize how much they stand on the shoulders of giants. so that seems to have a perspective that it's not libertarian, it's rather communitarian, or that we should understand to what extent we're really a very small part of the whole. is that a motive or a theme in the background? >> it's certainly a theme, and it's not in the background, it's in the foreground. it's not a motive, it's a fact. [laughter] so, yes. no, that's definitely an implication of the book, and it's an implication i would be willing to defend. in the domain of prediction markets, i actually didn't -- i mean, i'm certainly aware of prediction markets. we talk about them a little bit in the book. i wasn't aware that they came out of a libertarian perspective. it seems to me that much of the value of prediction markets is that they're a means of sussing out the expertise in a community, right? that is, the people who are willing to pay the most, to risk the most on their prediction, presumably are the people who know the most, and the people who have the most knowledge to bring to bear will have the most accurate predictions. and i've always thought that's the reason prediction markets are successful. it's actually kind of surprising, because there's other evidence showing that the people who are most confident actually often know the least, not the most. so david dunning has done a lot of work showing this, simply, you know, asking people what their views are on various issues and then measuring their knowledge about the background for those issues and finding the people who do the worst on his general knowledge tests are the ones who are most confident about their opinions on the issue.
there was this great experiment we mention in the book in which a bunch of researchers asked a group to locate ukraine on a map. and most people were way off. some people got it pretty close. they also asked these people how confident they were that the u.s. should intervene in the war in crimea, right? the people who were most off in locating ukraine on the map were the ones that were most confident that the u.s. should be in crimea. [laughter] i mean, that's an example of the converse, right? how knowledge actually leads to lack of confidence, the sort of lack of hubris. but in the case of prediction markets, i've always assumed it was true expertise that the prediction market was picking out. >> i'm just keeping track of the time. no, i was thinking that the idea in prediction markets is that you would not need a ph.d. or credentials or an m.d., actually, to vote on what should be done with a patient and so on. i don't want to take -- maybe just one more quick question, which goes more to the research. it's known, for example, that people confuse their own confidence with how much they think some information is generally shared. but you have results that it really matters whether you believe someone else knows something, that that is somehow sucked into a sense that you know it as well. but it also has to be accessible, is that right? the knowledge. >> right. >> it can't be sealed off. >> so in this contagious understanding experiment, the helium rain that i described earlier, we had a condition in which the researchers worked for darpa, for the defense intelligence agency. and in one condition, they understood the phenomenon, but it was secret. so people had no access to it, right? and the question was, would knowing the knowledge was out there but also knowing that you, the judge, had no access to that knowledge, would that also increase your sense of understanding? and the answer is it didn't, right? so knowing that others -- or believing that others understand something increases your sense of understanding but only if you can access that knowledge. and, in fact, it doesn't even have to be other people. it can be machines, right? so there's evidence that if you've been searching google for answers to questions, then you feel like you're a better question answerer. >> so i think we should give the audience a chance now to ask questions. good. yeah. sure. >> you actually do need the mic because they're taping. >> i'd like to know, you're kind of like promoting your book, so i'm wondering why i should buy your book. in other words, what am i going to benefit from reading your book? because you told me certain things, but i already knew them. so what -- [laughter] i'm being serious. >> well, if you already knew them, then don't buy the book. [laughter] you already know what we have to offer. i don't know what to say. >> for another person that didn't know -- do you hope -- >> well, i thought, so there are two lessons, two main lessons. there are lots of lessons, but there are two main lessons that i think the book draws. one is most people have an inflated sense of their own understanding. no one in this room, i'm sure, but outside this room. and second, that the reason for this is because we should think about thought as a communal enterprise, not something that's going on inside the head. now, i'm glad you already know that, and it's true that it's not a completely novel idea by any means.
but it's also antithetical to what a lot of scientists assume, right? so i'm a cognitive scientist, and i can tell you that cognitive scientists, for the most part, assume that thought does go on inside the head, and that's what we study, and that's how we talk about it. [inaudible conversations] >> hi. i'm going to be the annoying person who asks what it means to say that we understand things. all right, everybody in this room probably agrees that, you know, dna is made up of four bases, double helix, that there are 46 chromosomes for humans and so on. probably almost nobody in this room has actually replicated the experiments that cause us to know these things. so what does it mean to say we know them rather than that we just have this illusion that we know them? >> so it seems like you're reiterating my point, right? that we feel, we feel we understand these things and, actually, if we took a poll, my guess is fewer people in this room know those things than you think -- >> [inaudible] >> well, okay, so maybe i'm wrong. but even mit can be surprising. >> [inaudible] haven't replicated other experiments. >> no, so exactly. so the point of -- one point of communal knowledge is that we depend on work done by others, right? that most of the things we do as lay people and as scientists depend entirely on knowledge that sits in other people's heads. and that seems to be the point you're making, and that's exactly the point of the book, right? so most of the methodologies that we use are methodologies that have been, you know, developed and demonstrated effective elsewhere. right, we don't replicate them. if i use someone else's theorem, rarely do i go out and reprove the theorem. i depend on what other people know. so, yeah, i couldn't agree more. as far as defining knowledge, that's a separate issue that maybe we can talk about later. >> more questions. >> hi. >> hi. [laughter] >> so i was thinking about what you were saying, and i was wondering, do you have any thoughts as to the fact that different groups of people may have different sorts of assumed knowledge from each other? so i'm thinking about people, for instance, in academia and how you might get different fields, and people may be sort of siloed. and even if they're approaching a similar problem, they may, you know, approach it in a different way. i'm thinking of this book that i read recently called inventology, where the author talked about some really interesting inventors. and among other things, she talks about how some of the people who come up with really good solutions to things may come from a totally different background and maybe a different knowledge base from people in fields that traditionally solve those problems. so i guess i'm just thinking, you know, if you have any thoughts as to how we might sort of leverage this ability that we have to think collaboratively but also to avoid some of the pitfalls that that can, you know, produce. >> so i think what you're pointing out so eloquently is that there is a division of cognitive labor, that we all are experts in only one or two very narrow areas, and that to accomplish anything we need expertise that's distributed across the range of possible fields.
you know, look, i think society is structured to a large degree in accordance with what i'm talking about, in accordance with the fact that there's a community of knowledge that exists because we each have our own narrow area of expertise, and to accomplish anything of substance -- in fact, to accomplish most things of little substance -- we have to divide up cognitive labor, right? to build a ball-point pen, you need someone who's a materials scientist, and you need someone who knows about fluid dynamics. what you're pointing out is society has taken that into account, and the reason we're able to build iphones and send people to the moon and have wonderful bookstores is precisely because we take advantage of that distribution of expertise. so in a sense, we're already doing it for the most part. you also asked what lessons there are about how we can do this better. and, you know, the little bit i would say about that is we could have a little less hubris about our expertise in areas beyond our own narrow field, right? we should better appreciate exactly what you're saying, that other people do all kinds of useful things and have all kinds of useful knowledge that we don't have direct access to, that the diversity of perspective is a good thing. >> so my question is -- >> [inaudible] >> do you hear me? oh, yeah. now. so i think about this knowledge as a bubble, and this book is basically breaking a bubble for me, it's an awareness that it's there. and let's say i'm a leader and i have a team, and i want them to understand this or be aware of this. i want you to tell me or give me a suggestion of how i can tell them this is important for taking very big decisions, you know? as you say, there are things that we need to know that this is important, but there are others that people should know in a team that this is there, and maybe we should read more or -- >> should buy my book. [laughter] >> you know, i'm not going to give them a test and tell them, hey, you know less than you think you know. how can i convey this to a team in a way that doesn't require them to read your book or give them a test? >> yeah. no, that's a great question. and the truth is i'm not going to offer a satisfying answer. i don't think anybody has a satisfying answer to offer. the one thing i will say is that getting people to focus on mechanism can be really enlightening, right? so the primary experiment of the paper, the one in which we demonstrate the illusion of explanatory depth, or with which frank keil demonstrated the illusion, simply works by asking people to explain. i think the best way to persuade someone is to have them persuade themselves. and the way to do that in this case is to ask people for an account. now, you're right, that can be kind of threatening. and, indeed, we have -- we've run experiments where we burst people's illusion of understanding by asking them to explain, and then we see if now they want to learn more. have we reduced their hubris, and now they'll be open to new information? and the answer is no, right? once you make people feel ignorant, they really don't want to talk to you anymore. [laughter] so it's a fine line, right, between getting people to see what they can't do on one hand and stroking them on the other. and that's an art. >> i have actually two questions that are related. on the one hand, i would like to know whether you have anything to contribute to understanding why fake news is currently spreading so successfully.
i mean, you need to have trust in what you believe or what you don't believe, and apparently there are many people believing these news stories. and on the other hand, i would like to know whether you think that the phenomenon that you describe is changing, whether -- i mean, there is a general perception that there is polarization, political polarization in the country, and does this correspond with an increase in confidence in your opinions, and does it increase with ignorance in some way? >> so the basic idea that we live in a community of knowledge and that the mind is built for collaboration i take to be basic facts about humanity that have been true since, you know, the beginning of time, since we were hunters and gatherers. what's changing, obviously, is our modes of interaction. and it seems to me that america especially hasn't faced the dynamics that it's facing now. you know, the answer i'm going to give you is kind of self-evident, i think. it's because of the internet, and the internet has changed everything, right? and it's not only that we now live in these bubbles that have firmer walls because we're cut off from the people who live next door who might have a different political perspective and instead are reaching out to people in serbia or bosnia who have the same political perspective. and that in turn is made worse by the fact that much of our news is individualized, delivered to us by facebook and google, so we're seeing only what we want to see, which tends to be stuff we already agree with, right? so our bubbles are getting firmer and firmer for that reason. but at the same time, the old style leadership is disappearing. so, you know, we know people don't go to church as much. and so they don't hear this common voice that delivers the same perspective on the news regardless of the members' political persuasion, right? and there have been a lot of people who have analyzed changes in demographics suggesting that intellectuals are moving more to the city, and so there's less intellectual leadership outside the city, and as a result, people are appealing more to their bubbles in order to get their perspectives both validated and to just acquire them in the first place. you know, one way to characterize the point of the book is that rather than thinking of people as rational processors of information, we should think about people as channeling their communities. so what this implies about fake news is, i mean, there's good news and there's bad news about fake news. the bad news is that it is the means by which people acquire their beliefs, to a large degree. if we just channel our community and our community is telling us all this stuff that's not true, it's going to make a big difference to what we believe. the good news is that i'm not sure it matters so much, right? in the sense that whether the news is real or fake, people are going to believe what their community tells them to believe anyway. and so it's not obvious to me that fake news is having a huge effect on the distribution of belief in society. yeah, so there's a lot to be said about that. >> so you seem very oriented toward thinking about how american adults think about things, so i want to ask you about two different populations, non-americans and non-adults, particularly people from very different cultures.
do you really think bushmen in the kalahari would show this? do you think little children would show similar illusions? >> great question, and the data aren't in, and, you know, i'll follow the data. i have no reason to think that they wouldn't. i mean, i certainly have no reason to think that bushmen in the kalahari live, you know, more inside their own individual minds than anybody else, right? i assume they're also collaborators, they're also team players, and that, i think, is the source of the illusion. so my prediction would be they would, indeed, show the illusion. kids too. it's harder to show with those populations, and that's important work to be done. >> i would like to ask you about reasoning, and in particular the difference between individual and group reasoning. a recent theory of reasoning by -- [inaudible] suggests that reasoning is something that developed evolutionarily -- [inaudible] and one of the predictions of that theory is that group reasoning should therefore be somehow more powerful or more effective in finding the truth or some kind of optimal solution to a problem. but on the other hand, there are other findings from psychology that suggest that when people try to persuade each other, there are -- [inaudible] for example, the backfire effect. you try to convince someone, and you end up mobilizing their cognitive -- [inaudible] which leaves them with -- [inaudible] so what would be your call? is group reasoning somehow more powerful than individual reasoning? >> so i haven't yet read the book you're referring to, although it's on my bedside table. clearly, i've got to answer yes to your final question. i think that it's got to be more powerful because that is the nature of reasoning. and i think thinking about what goes on inside the head as a conversation with somebody else, right, is probably the best way to think about that form of deliberation. so i should say i'm a two-system theorist, right? and i do believe that we have strong intuitions that we can think of as forms of reasoning, right? so we will see patterns and make predictions about the future and explain things by, essentially, a sophisticated pattern completion process that is intuitive in the sense that we see what the outcome is although we don't see the process that delivers that outcome. and i would distinguish that from deliberation or deliberative reasoning, right? and i love this word deliberation precisely because it plays on two different senses of deliberation. so on one hand deliberation you can think of as a cognitive process, right, the process of thought, the sequential process that goes on inside the head. but deliberation is also something that happens between people, right? juries deliberate to come to a conclusion. and, you know, we do that, but i agree with mercier and sperber that we do that more effectively at a group level. the other assumption of their argument, that the mind evolved to argue, i have more trouble with, right? i think the mind's evolved to do all kinds of things. i think that storytelling is at least as important if not more important than argumentation, right? so there are all kinds of formal systems that we learn culturally, that we pass on and discuss and develop culturally. but i'm not sure that thinking about reasoning as a process of argumentation is, in the end, going to lead us in the best direction. >> i just wanted to ask a question because i think it follows up on this.
it occurred to me that there's some similarity between your example, which is show me how a zipper works, and the socratic method, which is an old trick of asking a person questions and showing them that they don't really understand what they mean when they're using some terms. that's one comment, i don't know if you want to comment on that. the other is more a question of rhetoric, that a standard device in persuasion is to show the other person that you can recreate their argument, you can recapitulate their reasons and then disagree. does that have any connection with this? >> so if you want to put me in the same camp as socrates, i'm perfectly happy with that. again, i don't want to say there's anything fundamentally new here, right? i mean, a lot of these ideas have a long history. you know, when keil originally did this work on the illusion of explanatory depth, he showed that people suffer from the illusion with regard to their understanding of how things work, right? mechanism, causal model. but they don't show it with respect to narratives or scripts or, you know, facts like state capitals. so it's a very specific kind of illusion, or it was in his hands. but the one domain in which his student showed it also obtained is in logical justification. so people do apparently have the sense that they can justify their beliefs to a greater extent than they can. [inaudible conversations] >> i think matt has a question. [inaudible conversations] >> there we go. >> so how do people know what competence you're asking about when you ask whether they understand helium rain? if my uncle says, do you understand helium rain, i could maybe muddle through it. but if a student of mine asks do you understand helium rain, no go. i won't go there. and if my broker asks do you understand helium rain, it's a different context. so when you -- it's just an experimental question. when you ask these questions, what do you think these people are thinking? >> so when we run the experiment, we give a whole sheet of instructions on what we mean by understanding, right? so we provide the scale and we say one means you're completely unfamiliar with it and seven means you're the world's expert and you could answer all possible questions. and so we do try to define understanding. i take your point to be that in an experimental context, people can understand what you're saying in lots of ways, and the word understanding is vague, and they could understand it, you know, to mean many different things. and that's true. i don't think that actually affects our conclusion, right? so in this contagious understanding experiment what we show is that simply saying other people understand causes people to put a higher number down on the scale. so what exactly does that number mean? i don't know exactly. your point is well taken. but i can tell you this, it's higher than the number that people put down when the scientists don't understand, right? so whatever understanding means, when other people have it, people have a greater sense of it. >> i'm listening to all of your talk, and i'm so compelled to ask or at least bring up the topic of the argument over climate change in the political arena. i know there's climate change, i know that we have to go to alternative energy, but if you were to ask me, you know, to explain, i don't think i could either. so why should i be so, you know, proud of myself?
my opponents will say, well, you don't know what you're talking about, but how would you respond given what you know? >> well, i would respond in the same way i did to this gentleman earlier, that we all rely on expertise, and really the question is which experts are we going to believe. and in the case of climate change, i'm with you, you know? i believe, you know, upwards of 90% of scientists who say we should worry about it. but if someone says, oh, it's a conspiracy and these scientists only believe in climate change because it's the only way they get published or get grant money, you know, the truth is all i can do is say, you know, that's not the culture of science that i know. but it's a problem, right? there is some uncertainty that comes from the fact that we have to trust our experts. but look, we have to trust our experts. it's all we can do, right? i can't fix my toilet by myself. i have to trust the expert. you know, i can't eat potato chips if i don't trust the expert who makes them. and i think it's a much, it's a hard problem. it's a very hard problem, but it's a hell of a lot easier than trying to explain how everything works to every individual. >> i was wondering if you could comment on the people that are most wrong most often, whether there was any consistency across those people, and maybe the extent to which academics might be part of this group, in that they're individuals that are subject matter experts and, therefore, might believe their subject matter expertise in one area makes them experts in a wider variety of areas. >> so i can't speak to that last issue because i haven't studied it and i haven't heard of any studies of it. i can tell you that the people who really suffer from the knowledge illusion are people who are not reflective. right? so some of you may be familiar with this cognitive reflection test, this simple three-item test developed by shane frederick at yale, and essentially it asks whether people tell you what's on their minds immediately or whether they verify before telling you what's on their mind. so this is not from the test, but it's a nice example that gives you the flavor of it, you know? again, many of you may have heard of this. if i ask you how many animals of each kind did moses put on the ark, some people say two and other people say moses didn't put any animals on the ark, it was noah that did. and the difference is that generating the response is non-reflective. you just -- two came to mind, and two of course comes to mind, right? and other people verify it before uttering it. no, that's not right, because it wasn't moses at all. so people who suffer from the knowledge illusion tend to be the nonreflective type, right? and, in fact, if you test people who are reflective, you don't see any evidence for it. presumably, that's because they explain before they respond. how well do you understand how ball-point pens work -- people who are reflective think and try to explain how the pens work before they put down their response the first time, right? so are scientists more reflective? on average, yes. but you'd be surprised how many nonreflective scientists there are, right? so shane ran this cognitive reflection test on, i hate to say it, mit students as well as harvard students and all kinds of other -- >> mit, we're the best.
>> we're the best, but it's still the case that well over half turned out not to be reflective. >> including some in this room. [laughter] >> that's the best i can do. >> so in the financial markets, as we keep listening to the experts, at some level all of them are standing next to the cliff, and there's nobody standing on the other side. so in the big short movie and book, the only people who stood out were ten guys, five guys, a hundred guys, but they were the irrational thinkers. could this be rational thinking -- [inaudible] because they did not listen to the entire herd? so shouldn't irrationality or being skeptical be a part of learning? >> absolutely. it should be a part of it. i mean, look, i'd never argue that we shouldn't develop critical reasoning skills. you know, i bought into that part of the judgment and decision making program a long time ago, and i've tried to learn statistics, and i try not to commit various cognitive fallacies that i know people commit. yes, one more is following the herd, and we shouldn't necessarily do it. yeah, we can reap great advantage by not doing it, as your examples demonstrate. but don't mistake not following the herd with always being right, right? so if the herd is wrong, then you're brilliant if you don't follow the herd, but often the herd is right. and, you know, i wouldn't necessarily call someone irrational just because they don't follow the herd. you know, if the herd is the community of scientists who are trying to figure something out, they do okay most of the time, at least in most domains. financial markets are different and special that way, right? you know, and i'll let those who understand financial markets better than me speak to this. but clearly, following the herd can earn you a lot of money, at least for a while. >> so we've run out of time, but thank you very much. i want to thank our speaker, thank you. [applause] we have a short book signing. some people reserved books ahead of the event, and you're also welcome to buy them there. steven will be signing books here, so thanks again. >> thanks, amy. thanks, all of you, for coming. [inaudible conversations] >> booktv recently visited capitol hill to ask members of congress what they're reading this summer. >> what i'm reading this summer, and it might take me all summer, is this commentary on romans, on the book of
