
How a neuroscientist is using the scientific method to take on a public crisis in critical thinking

In his latest award-winning book, Weaponized Lies, McGill professor emeritus Daniel J. Levitin pushes readers to cut through information overload to identify bad stats and faulty arguments.


Daniel J. Levitin has built a reputation as a popular science writer and public scholar through his ability to distill neuroscience and psychology into straightforward and often wry terms for a wide audience in bestselling books like This Is Your Brain on Music and The Organized Mind. His latest book, Weaponized Lies: How to Think Critically in the Post-Truth Era, is a kind of proof-of-concept that builds on that reputation, showing readers how to convey complicated information simply without misrepresenting it. In this book, Dr. Levitin identifies a crisis of critical thinking, one which he hopes to remedy by encouraging readers to use the scientific method to identify distortions of fact and confront misinformation head-on.

In April, Weaponized Lies won the $30,000 National Business Book Award. University Affairs spoke with Dr. Levitin, the James McGill Professor Emeritus of Neuroscience and Music at McGill University and a distinguished faculty fellow at the Haas School of Business at the University of California, Berkeley, about the book.

University Affairs: Who would you say this book is for?

Daniel Levitin: I say with all of my books the audience is about the same. It’s a broad range of people, and I try to write the books with different layers, so that people with different backgrounds and interests can find something that they will like, or that they’ll respond to. I’m writing simultaneously for high school students and college students and professionals, and hoping that in each group the message and information will be able to reach them and they’ll be able to find some meaning in it.

What I really want is to have a public conversation about critical thinking, a public policy conversation about critical thinking, and get this kind of stuff into the curriculum starting at Grade 9. The current generation and the generation after them are going to be running the world.

UA: Is there anything in our culture now that made you feel like this issue warranted such an in-depth look outside of academia?

DL: [Laughs] That’s a loaded question. Well, I’ve been noticing that we now have an entire generation of students who’ve grown up with the internet. The internet didn’t come into being until I was in my 30s. I knew about the ARPANET when I was in my 20s, and I had been programming computers since about 1969. But we now have an entire generation of students in their late teens and early 20s who have only known a world with Google. And they are very savvy about getting the answers to their questions very quickly, but not so savvy, as a cohort, about determining whether the information they’re getting is true or not. I’ve been seeing this manifest itself each year when I stand in front of a new class of students. I realize it’s getting worse and worse. They think that if they’ve found it on the internet, it must be true. And many Americans, and some Canadians now, think that there’s a gatekeeper to the internet and that the information wouldn’t be there if it wasn’t true.

So many people have the instinct to ask questions and the instinct comes from the right place, but what they lack is the follow-through because they don’t have the tools, experience or practice. This manifests itself most vividly with conspiracy theories, like the 9/11 conspiracy folks. Sure, I think the instinct that maybe the government is putting one over on you comes from the right place [to trust authority but verify the information it puts out]. But, I’m trying to look at the whole picture. A handful of unexplained anomalies shouldn’t cause you to tank a well-formed theory built on thousands of verified observations.

Statistics are a big part of the book and a big part of my message, because most of us aren’t trained to think statistically and to think, “It’s not a perfect world, the evidence isn’t all going to be on one side or another. I’m going to have to place my bets where the evidence is strongest.”

UA: When the book was released in Canada last year, it was originally titled A Field Guide to Lies: Critical Thinking in the Information Age. What warranted the title change and updated preface with the March release of the paperback edition? Did you update anything else in the book?

DL: Sometimes in the publishing industry when a paperback comes out, you add a new preface that incorporates any thoughts you’ve had since finishing the manuscript, which could’ve been written many months or even a year prior. Sometimes they change the title, and in this case, the title came out of an op-ed I wrote in December.

After Field Guide came out in September, the lie – people call it fake news, I call it a lie – began circulating that Hillary Clinton was running a child sex-slave ring out of a pizza parlour. I was just so aggravated that the story got so much play, and that people seemed to believe it.

UA: Right, and it had a really awful conclusion: somebody showed up to that pizza parlour with a gun.

DL: With an automatic weapon! Yes, and discharged it there. So I wrote an op-ed about how we have to fight this tendency to forward and redistribute things that might be false, because they can have consequences, and I called the lie a “weaponized lie.” Somebody had actually taken up a weapon in response to this lie. My publisher liked the phrase and thought it might make a good title for the paperback.

UA: One of the examples that stands out in the book involves research fraud in the scientific community. It’d be easy to say that you personally have familiarity with and skill in critical thinking because you’re a scientist, and you use the scientific method every day. But many scientists also have that knowledge and experience, and still commit fraud. Can you think of a particular time when the importance of critical thinking and upholding the truth were instilled in you?

DL: Well, I started out [as a student] at MIT and we had an honour code. I remember the day we arrived for orientation, Jerome Wiesner, who was the president of MIT, convened all of the freshmen on a big lawn in front of his office facing the Charles River on a warm, muggy New England day. One of the things he instilled in us then was that we were now part of the MIT community. He talked about important people who’d attended MIT, and said that we were now ambassadors and representatives of MIT by virtue of being enrolled there. And he explained the honour code – its purpose and its importance. I later transferred to Stanford University, which has a different honour code but still one that they take very seriously, and [both] highlighted for me the importance of being in a community; that scientific fraud is an antisocial act.

I think the people who engage in scientific fraud are trying to increase their profile or standing or wealth, or whatever. It’s usually for very selfish reasons. But I encoded it as antisocial. I don’t want to lie and cheat in terms of my own self-image, and I also don’t want to commit an antisocial act. And I don’t want to commit an act that will reflect badly on my community. Whether that community is McGill professors, MIT and Stanford alums, scientists, academic psychologists or popular science writers, I value the communities that I’m a member of and I don’t want to do anything to harm them.

UA: Those feelings of kinship or honour toward your community would also make you want to grow that community, which explains why you’d want to see the next generation of people sharing those same values.

DL: Well, yeah. In the classes I taught at McGill all those years, we always had an ethics section. We would read about the cases of [disgraced Harvard researcher] Marc Hauser and [Dutch social psychologist] Diederik Stapel, psychological scientists who were found guilty of fraud. We read about Andrew Wakefield, the discredited British doctor who started this whole measles vaccine-causes-autism controversy. And we talked about research ethics and reporting requirements and subject confidentiality and all of that stuff. You’re right, my role as a professor is in part trying to expand and enhance the community that I’m in, and wanting to train students to do the right thing.

This interview was condensed and edited for clarity.

Weaponized Lies: How to Think Critically in the Post-Truth Era was released in paperback by Penguin Canada in March. It was originally published in September 2016 as A Field Guide to Lies: Critical Thinking in the Information Age. In addition to the 2017 National Business Book Award, it won the Mavis Gallant Prize for Non-Fiction and was shortlisted for the Donner Prize.
