One key marker of the ethical seriousness of a society is the extent to which it tries to overcome its various forms of systematic error, ignorance, and delusion. That is why the emergence of an organization like CFAR is a cause for celebration. Below is my interview with its president and co-founder.
1) What initially drew you to skepticism and the study of reason? Was it a book or magazine? A mentor? Perhaps encounters with flagrant expressions of irrationality?
I think my parents were instrumental in getting me on this path (I made two YouTube videos about their parenting techniques, in honor of the recent Father's Day and Mother's Day). In brief, they’re both intellectually curious, thoughtful people who love learning, and who genuinely think about things, instead of parroting ossified opinions, or holding their ground just for the sake of it.
I’ve had my share of frustrating encounters with irrationality, of course, but I try not to dwell on them too much – you run the risk of becoming bitter if you let yourself seethe too much about how “Someone is wrong on the Internet!” And I also try to remind myself that I’ve been wrong – and overconfidently so – about plenty of things in the past, and probably still am. Which helps me not be angry about other people’s apparent wrongness, at least if they’re not being too trollish about it!
2) In What Intelligence Tests Miss, Keith Stanovich wrote, "The lavish attention devoted to intelligence (raising it, praising it, worrying when it is low, etc.) seems wasteful in light of the fact that we choose to virtually ignore another set of mental skills with just as much social consequence--rational thinking mindware and procedures." He goes on, "I simply do not think that society has weighted the consequences of its failure to focus on irrationality as a real social problem. These skills and dispositions profoundly affect the world in which we live. Because of inadequately developed rational thinking abilities--because of the processing biases and mindware problems discussed in this book--physicians choose less effective medical treatments; people fail to accurately assess risks in their environment; information is misused in legal proceedings; millions of dollars are spent on unneeded projects by government and private industry; parents fail to vaccinate their children;" etc., etc. How would you characterize the scale of the problem of irrationality in the world? How serious do you think it is?
Quite serious. In addition to Stanovich’s examples, I’d say irrationality plays a pivotal role in our voting choices as individuals (often against our own interests), and our national policies (almost no economists think our current immigration policy makes any sense, and does anyone really think the TSA is doing anything to justify its massive cost at this point?). It’s a major part of why so many of us are in debt, why governments enter into wars, and why we judge the actions of other races and countries more harshly and unsympathetically than our own. Why? Because we seek out evidence that supports what we want to believe, we don’t like changing our minds, we trust people who seem confident rather than people who express less-than-full certainty, and more.
3) Could you please talk about the origin and first steps of CFAR? What were some of the biggest challenges you faced early on?
We founded CFAR about 18 months ago to develop, test out, and train people in rationality techniques, mental habits that would combat the kinds of biases that affect our own lives and our society. The time just seemed right for an organization like us: cognitive science had been booming, with Danny Kahneman having won the Nobel Prize for his work on cognitive biases, and rationality was just really entering the public awareness, with popular books like Predictably Irrational and Thinking, Fast and Slow.
Everything certainly seemed daunting at first. We really didn’t feel like we knew how to teach rationality, and I didn’t feel like we should run any workshops before our classes were more polished and well-tested. My co-founder Anna, however, advocated a “Try things!” policy and urged us to jump in and announce some pilot workshops anyway, on the logic that you learn so much more about how to do things if you’re not overly worried about being perfect off the bat.
I’m glad she did. Even though our first workshops were somewhat chaotic, we’re so much better now than we were then – at running a workshop smoothly, at explaining things, at keeping people engaged, making habits stick, and so on. And our growing alumni network has been invaluable to our development too – they suggest ideas for techniques we could test out, they give us useful suggestions and plenty of feedback about how they use the techniques and how they work.
4) CFAR conducts workshops to improve thinking and decision-making. What are the most important lessons you've learned from these workshops? Acknowledging the fact that CFAR is a relatively new organization, do you have any preliminary research yet on how effective the workshops are at inducing long-term changes in individuals?
We ran a randomized controlled trial this past year on one of our workshops – we interviewed a group of 50 people and had all of them fill out a survey about their current life situation (as well as send the survey to two friends or family members to fill out about them). Then we randomly admitted 25 of the 50 to our workshop, and sent the others off with a “Sorry, welcome to our control group!”
We’re following up with both groups this summer to see if there are any significant differences. But I should note that the purpose of this study was really just to suggest promising hypotheses about the effects of rationality, rather than to prove anything definitively. With only 25 people per group, an effect would have to be huge to reach statistical significance. We’ll be running more studies as we go.
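The point about sample size can be made concrete with a standard power calculation. The sketch below (my illustration, not part of the interview) uses the usual normal approximation for a two-group comparison to estimate the smallest standardized effect size (Cohen's d) detectable with 25 subjects per arm at the conventional 5% significance level and 80% power:

```python
from statistics import NormalDist

def min_detectable_d(n_per_group, alpha=0.05, power=0.8):
    """Smallest standardized effect size (Cohen's d) that a two-group
    comparison with n_per_group subjects per arm can reliably detect,
    using the normal approximation to the two-sample test."""
    z = NormalDist().inv_cdf
    return (z(1 - alpha / 2) + z(power)) * (2 / n_per_group) ** 0.5

print(round(min_detectable_d(25), 2))  # -> 0.79
```

A d of about 0.8 is "large" by Cohen's conventional benchmarks, which is exactly the interviewee's point: only a huge effect of the workshop could show up as statistically significant in a study this size.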
So far, the cognitive science literature has mainly concentrated on demonstrating the existence of biases, rather than testing debiasing techniques. So the pre-existing evidence is sparse. But a few techniques have repeatedly proven to work. One is calibration training, in which you learn to adjust your confidence in your beliefs so that when you feel 90% confident about something, you’re actually 90% likely to be right – rather than the default most of us start at, in which a feeling of 90% confidence usually corresponds to only 50-60% accuracy. Another is the practice of prompting yourself to consider at least one alternative hypothesis. It sounds simple and obvious, and it is, but we don’t do it enough – and it’s low-hanging fruit.
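Calibration is easy to measure: group your past predictions by stated confidence and compare each group's actual accuracy against that confidence. The sketch below is my own illustration (the data is hypothetical, not from CFAR), showing the overconfidence gap described above:

```python
from collections import defaultdict

def calibration_table(predictions):
    """predictions: list of (stated_confidence, was_correct) pairs.
    Groups predictions by stated confidence and reports the actual
    accuracy in each group -- the gap between the two is what
    calibration training tries to close."""
    buckets = defaultdict(list)
    for conf, correct in predictions:
        buckets[conf].append(correct)
    return {conf: sum(v) / len(v) for conf, v in sorted(buckets.items())}

# Hypothetical record of an overconfident forecaster: 10 predictions
# made at 90% confidence, 10 made at 60% confidence.
data = ([(0.9, True)] * 6 + [(0.9, False)] * 4
        + [(0.6, True)] * 5 + [(0.6, False)] * 5)
print(calibration_table(data))  # -> {0.6: 0.5, 0.9: 0.6}
```

Here the forecaster's 90%-confidence claims are right only 60% of the time, mirroring the 50-60% figure mentioned in the answer; a well-calibrated forecaster's table would match stated confidence to observed accuracy.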
5) What kinds of social and institutional changes would you like to see that would improve society's overall rational performance?
If I were queen of the universe? Well, of course CFAR would have the budget to run rationality workshops for an unlimited number of teachers, students, scientists, activists, politicians, and more. But also: Applied rationality would be a part of school curricula – and so would basic statistics. How can school boards believe that an additional year of literature class, on top of the ones students have already taken, matters more to their becoming responsible members of society than learning about variance and p-values?
Fact-checking politicians’ claims (and TV pundits’ claims, for that matter) would be a much bigger deal, and public figures would be shamed at least as much for committing willful deceit as they currently are for sexting.
6) How would you respond to the following hypothetical criticism? "Irrationality is a big problem, but humans are many-sided, and much of who we are is underdeveloped. We could all improve our creativity, our compassion, our ability to hold attention in the present moment, and to listen. The cultivation of imagination is utterly ignored in most educational settings. Howard Gardner has spoken about how we don't address the 'synthesizing mind' in education. Levels of psychological health can always be improved. And so forth. Do you see any tension between CFAR's focus on rationality and the need to address the fullness of our humanity?"
No, honestly, all of those things sound good too! But of course they're not mutually exclusive with each other, and certainly not with rationality improvement either. I think they're synergistic. For example, improving your ability to feel compassion makes it much easier to consider an issue from someone else's point of view, an important piece of rationality. Increasing your capacity for attention makes it easier to notice when you're falling prey to confirmation bias. And learning about rationalization makes you able to notice, for example, "Hey, I'm telling myself that these people don't deserve any help, but it feels like I'm rationalizing my desire to ignore them" - which makes you able to feel compassion.
Thank you, Julia.
Within organized skepticism, atheism, and humanism, innovations with potentially far-reaching influence don't happen that often. When they do, we should make every effort to ensure their financial viability and maximum impact. Thus, please consider becoming a one-time or ongoing supporter of CFAR.