The Best Books on Critical Thinking | Five Books Recommendations

It’s been a little over a year since you explained to us what critical thinking is all about and recommended books on the subject. That interview is one of our most popular, probably because we all feel the need to do more of it. Since hundreds of people continue to read your recommendations every day, could you update us on the books that have appeared since we first spoke?

Two recent books, both published this year, that I’d like to add to my list are Calling Bullshit by Carl Bergstrom and Jevin West, and How to Make the World Add Up by Tim Harford.


Calling Bullshit started as a course at the University of Washington. It is a book, a manual really, written with the conviction that bullshit, particularly the kind circulating on the internet, is damaging democracy, and that misinformation and disinformation can have very serious consequences. Liars don’t care about the truth. But the truth is important, and this book shows why. It focuses on examples from science and medicine, but also has a broader scope. It’s a lively read. It covers not only verbal bullshit but also statistical bullshit (particularly in relation to big data) and causality, and it has a chapter on data visualizations that distract from the content being discussed or present the data in a misleading way. Like all good books on critical thinking, this one includes a discussion of the psychology of being misled by misleading contributions to public debate.

In How to Make the World Add Up, Tim Harford gives us ten rules for thinking better about numbers, along with a golden rule (“be curious”). Anyone who has listened to his long-running radio series More or Less will know how brilliant Tim is at explaining claims based on numbers: while reading it, I was blown away by Tim’s reassuring, skeptical, reasonable, funny and patient voice. He draws on a rich and fascinating range of examples to teach us (gently) how not to be fooled by statistics and ill-founded claims. There is some overlap with Calling Bullshit, but the two books complement each other. Together they provide excellent training in how not to be fooled by data-driven claims.

[End of update]

___________________________

We are here to talk about critical thinking. Before we discuss your book recommendations, I wonder if you would first explain: what exactly is critical thinking and when should we use it?

There’s a whole bunch of things that go under the label of “critical thinking.” There is what we could call formal logic, the most extreme case of abstraction. For example, take the syllogism: if all men are mortal, and Socrates is a man, you can deduce from that structure of argument that Socrates is mortal. You can put anything in the spaces for ‘men’, ‘Socrates’ and ‘mortal’, and whatever you put in, the argument structure is still valid: if the premises are true, the conclusion must be true. That kind of logic, which can be represented using letters and signs instead of words, has its place. Formal logic is almost a mathematical subject (some would say it is one).
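The point about the argument structure can even be checked mechanically. The following sketch (my own illustration, not from the interview) brute-forces every possible choice of the two predicates and the individual over a small universe, and confirms that no substitution makes the premises true and the conclusion false:

```python
from itertools import product

def barbara_is_valid(universe_size=4):
    """Check the form 'All A are B; x is an A; therefore x is a B'
    over every assignment of sets A, B and individual x in a small
    universe. Returns False if any counterexample exists."""
    universe = range(universe_size)
    # Every subset of the universe, encoded as a membership bit-vector.
    subsets = list(product([False, True], repeat=universe_size))
    for bits_a in subsets:
        for bits_b in subsets:
            A = {i for i in universe if bits_a[i]}
            B = {i for i in universe if bits_b[i]}
            for x in universe:
                premises = A <= B and x in A   # "All A are B" and "x is an A"
                conclusion = x in B            # "x is a B"
                if premises and not conclusion:
                    return False               # true premises, false conclusion
    return True

print(barbara_is_valid())  # True: the form is valid under every substitution
```

Exhaustively checking a toy universe is, of course, no substitute for a proof, but it illustrates what validity means: it is a property of the form, not of any particular subject matter.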

But that’s just one element of critical thinking. Critical thinking is broader, though it encompasses formal logic. In recent years, it has become very common to include a discussion of cognitive biases: the psychological errors we make when reasoning, and the tendencies we have to think in certain patterns that don’t give us good, reliable results. That’s another strand: focusing on cognitive biases is part of what is sometimes called “informal logic,” the kinds of reasoning errors people make that can be described as fallacious. They are not always, strictly speaking, logical fallacies. Some of them are simply psychological tendencies that give us unreliable results.

The gambler’s fallacy is famous: someone rolling a fair die has rolled it three times without getting a six, and then imagines that, by some kind of law of averages, on the fourth roll they are more likely to get a six, because they haven’t had one yet. That’s just a bad piece of reasoning, because every time you roll the die the odds are the same: a one in six chance of getting a six. There is no cumulative effect, and a die has no memory. But we have this tendency, or gamblers often do, to think that somehow the world will balance things out and give you a win if you’ve had a series of losses. That’s the kind of informal reasoning error a lot of us make, and there are plenty of examples like it.
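The claim that the die has no memory is easy to check by simulation. This is my own sketch, with illustrative numbers: roll four dice many times, keep only the runs whose first three rolls contain no six, and estimate the chance that the fourth roll is a six.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

trials = next_is_six = 0
for _ in range(200_000):
    rolls = [random.randint(1, 6) for _ in range(4)]
    if all(r != 6 for r in rolls[:3]):      # three rolls, still no six
        trials += 1
        next_is_six += rolls[3] == 6        # was the fourth roll a six?

# Conditioning on the losing streak changes nothing: the estimate
# stays close to 1/6 ≈ 0.167, not higher.
print(round(next_is_six / trials, 3))
```

If the gambler’s intuition were right, the printed frequency would be pushed above 1/6 by the three preceding non-sixes; it isn’t.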

I wrote a little book called Thinking from A to Z, which tried to name and explain a whole series of moves and errors in thinking. I included logic, some cognitive biases, some rhetorical moves, and also (for example) the theme of pseudo-profundity, whereby people make seemingly profound claims that are actually shallow. The classic trick is to state an apparent paradox, saying, for example, “knowledge is just a kind of ignorance,” or “virtue is only achieved through vice.” Really, that’s just a rhetorical device, and once you see it, you can generate any number of such ‘profundities’. I suppose that falls under rhetoric, the art of persuasion: persuading people that you are a deeper thinker than you are. Good reasoning is not necessarily the best way to persuade someone of something, and there are many devious tricks that people use in discussion to win others over to a particular position. The critical thinker is someone who recognizes the moves, can analyze the arguments, and can call them out.

So, in answer to your question: critical thinking is not just pure logic. It is a cluster of things. But its goal is to be clear about what is being argued and what follows from the evidence and arguments provided, and to spot any cognitive biases or rhetorical moves that might lead us astray.

Many of the terms you define and illustrate in Thinking from A to Z, such as “straw man” arguments and “weasel words,” have gone mainstream. I see them all over Twitter. Do you think our increased familiarity with debate, thanks to platforms like Twitter, has improved people’s critical thinking or worsened it?

I think improving your critical thinking can be quite difficult. But one way to do it is to have memorable labels that describe the kind of move someone is making, or the kind of reasoning error, or the kind of persuasive technique they’re using.

For example, you can step back in a particular case and see that someone is using a “weak analogy.” Once you’re familiar with that notion, it’s a term you can use to draw attention to a comparison between two things that aren’t really alike in the ways someone implies they are. The next move for a critical thinker would be to point out where the analogy fails, and thus show how poorly it supports the conclusion offered. Or, to use the example of weasel words: once you know that concept, it’s easier to spot them and talk about them.

Social media, particularly Twitter, is quite combative. People often look for critical angles on things others have said, and space is limited. I suspect these labels are probably used there as a form of shorthand. As long as they are used accurately, that can be a good thing. But remember that responding to someone’s argument with “that’s a fallacy,” without really explaining what kind of fallacy it’s supposed to be, is itself a form of dismissive rhetoric.

There are also now a plethora of online resources where people can look up definitions of critical thinking terms. When I first wrote Thinking from A to Z, there weren’t the same resources available. I wrote it in A-to-Z form partly just as a fun device that allows for a lot of cross-referencing, but partly because I wanted to draw attention to the names of things. Naming the moves is important.

“People seem to like the idea of sharing irrelevant characteristics (maybe a birthday or a hometown) with someone famous. But so what?”

The process of writing the book greatly improved my critical thinking, because I had to think more precisely about the meaning of particular terms and find examples of them that were not ambiguous. That was the hardest part: finding clear examples of the different moves to illustrate them. I coined some of the names myself. There’s one called the “Van Gogh fallacy,” which is the pattern of thought when people say, “Well, Van Gogh had red hair, he was a bit crazy, he was left-handed, he was born on March 30, and, what do you know, I share all those things” (which I do) “and therefore I must be a great genius too.”


That’s an obviously wrong way of thinking, but it’s very common. I was originally going to call it the “Mick Jagger fallacy,” because I went to the same elementary school as Mick Jagger (although not at the same time). People seem to like the idea of sharing irrelevant characteristics (maybe a birthday or a hometown) with someone famous. But so what? It doesn’t mean you’re going to be Mick Jagger, just because you went to the same elementary school. I ultimately called it the Van Gogh fallacy, and it’s quite amusing to see that it now has some currency online and elsewhere. People use it as if it were an established term, which I suppose it is now.

I love that. Well, another title dealing with psychological biases is the first critical thinking book you want to talk about, Daniel Kahneman’s Thinking, Fast and Slow. Why did you choose this one?

This is an international bestseller by Daniel Kahneman, a Nobel Prize-winning behavioral economist, though primarily a psychologist. He developed the research with Amos Tversky, who sadly died young; otherwise, I think it would have been a co-written book. It is a brilliant book summarizing their psychological research on the cognitive biases, the unreliable patterns of thought, that we are all prone to.

There is a lot of detail in the book. It sums up a lifetime of research; two lifetimes, really. But Kahneman is very clear about the way he describes thought patterns, using “System One” and “System Two.” System One is the quick, intuitive, emotional response to situations, where we jump to a conclusion very quickly. You just know: 2 + 2 is 4. You don’t think about it.

System Two is more analytical, conscious, slower, methodical, deliberative: a more logical process, which consumes much more energy. We stop and think. How would you answer 27 × 17? You would have to think hard and do a calculation, using System Two thinking. The problem is that we over-rely on System One, this almost instinctive response to situations, and we often get poor answers as a result. That’s the framework within which much of his analysis is set.

I chose this book because it’s a good read and one you can come back to, but also because it’s written by a very important researcher in the field. So you have the authority of the person who did the actual psychological research. And he has some great descriptions of the phenomena he investigates. Anchoring, for example. Do you know about anchoring?

I think so. Is that when you provide an initial number that shapes subsequent responses? Maybe you’d better explain it.

That’s more or less it. If you present someone with an arbitrary number, psychologically, most people seem prone to move in the direction of that number when you then ask them a question. For example, there is an experiment with judges. They were asked, offhand, what would be a good sentence for a particular crime, say theft. Perhaps they would say six months for a persistent thief.

But if you prime a judge with an anchor number, by asking, “Should the sentence for theft be more than nine months?”, they are more likely to say, on average, that the sentence should be eight months than they otherwise would have been. And if you ask, “Should it be punished with a sentence of more than three months?”, they are more likely to go down to around five months than they otherwise would.


So the way you phrase a question, by introducing these numbers, produces an anchoring effect: it influences people’s thinking towards that number. If you ask people whether Gandhi was over 114 years old when he died, they will give you a higher estimate of his age than if you had simply asked, “How old was Gandhi when he died?”

I’ve heard of this in the context of charitable giving: asking whether people will donate, say, £20 a month yields a higher average pledge than asking for £1 a month.

People also use this anchoring technique to sell wine on a list. If there is a higher-priced wine at £75, somehow people are more attracted to one costing £40 than they otherwise would have been. If the £40 bottle had been the most expensive on the list, they wouldn’t have been drawn to it; but seeing the higher price, they seem to be attracted to a higher number. This phenomenon occurs in many areas.

And there are so many things that Kahneman covers. There is the sunk cost fallacy, the tendency we have, once we have given our energy, money or time to a project, to be very reluctant to stop, even when it is irrational to continue. You see this a lot in descriptions of withdrawing from wars. We say, “We have given the lives of all those people, all that money; surely we are not going to stop this campaign now,” when stopping could be the most rational option. All the money already sunk into it doesn’t mean that throwing more in that direction will pay off. We seem to have a fear of future regret that outweighs everything else and dominates our thinking.

What Kahneman emphasizes is that System One thinking produces overconfidence based on what is often a misjudgment of a situation. We are all subject to these cognitive biases, and they are extremely difficult to eliminate. Kahneman is a deeply pessimistic thinker in some respects; he recognizes that even after years of studying these phenomena he cannot eliminate them from his own thinking. I once interviewed him for a podcast and said, “Surely if you teach people critical thinking, they can get better at removing some of these biases.” He was not optimistic about that. I am much more optimistic than he is. I don’t know whether he had empirical evidence to back it up, on whether studying critical thinking can improve your thinking skills, but it surprised me how pessimistic he was.

Interesting.

Unlike some of the other authors we are going to discuss . . .

Returning to Kahneman for a moment: you mentioned that he won a Nobel Prize, not for his research in psychology per se, but for his influence on the field of economics. His and Tversky’s groundbreaking work on the irrationality of human thought and behavior forms the backbone of that new field.

There’s been a significant tendency in economics to talk about an idealized subject making rational decisions in their own interest, which didn’t take into account the kinds of cognitive biases we’ve been discussing. The discipline of behavioral economics, which is now firmly established, is a kind of antidote to that: it takes into account the behavior patterns real people exhibit, rather than idealized individuals making rational assessments of how best to satisfy their desires. That’s probably a caricature of economics, but it’s the gist.

Let’s look at Hans Rosling’s book next, Factfulness. What does it tell us about critical thinking?

Rosling was a Swedish statistician and physician who, among other things, gave some very popular TED talks. His book Factfulness, which was published posthumously (his son and daughter-in-law completed it), is very optimistic, with a completely different tone from Kahneman’s, but it similarly focuses on the ways people make mistakes.

We make mistakes, classically, by being too pessimistic about things that are improving in the world. In one of Rosling’s examples, he asks what percentage of the world’s population lives on less than $2 a day. People almost always overestimate the number, and also misjudge the direction in which things are moving and the speed at which they are changing. In fact, in 1966 half of the world’s population was in extreme poverty by that measure, but by 2017 it was only 9%, so there has been a dramatic reduction in global poverty. Most people don’t realize this because they don’t focus on the facts, and are possibly still influenced by what they learned about the situation in the 1960s.

If you ask people what percentage of the world’s children are vaccinated against common diseases, they almost always underestimate it. The correct answer is a very high proportion, something like 80%. Ask people what the life expectancy of a child born today is, averaged across the world, and again they get it wrong. It is now over 70, another surprisingly high number. What Rosling did as a statistician was take a hard look at what the world is actually like.

“Pessimists tend not to notice changes for the better”

People assume that the present is like the past, so when they have once learned something about the state of world poverty or global health, they often forget to take a second reading and see the direction in which things are moving and the speed at which they are changing. That is the message of this book.


It’s an interesting book; it is very challenging. Maybe it is too optimistic. But it has this startling effect on readers of challenging widely accepted assumptions, as Steven Pinker’s The Better Angels of Our Nature has done. It is a plea to look at the empirical data, and not simply assume that you know how things stand now; pessimists tend not to notice changes for the better. In many ways, although clearly not in relation to global warming and climate catastrophe, the statistics are actually very good for humanity.

That’s reassuring.

So this is numerical and statistical critical thinking. It’s a bit different from the more verbal critical thinking I’ve been involved with. I’m really interested in having my assumptions challenged, and Factfulness is a very readable book. It’s lively and uplifting.

Going back to what you said earlier about formal logic, statistics is another dense topic that needs specialized training. It has a lot in common with critical thinking, but many people find it very difficult; it is often counterintuitive.

One of the big problems for an ordinary reader of this type of book is that we are not equipped to judge the reliability of its sources, and therefore the reliability of the conclusions it draws. I think we have to take it on trust and authority, and hope that, given the intellectual division of labor, other statisticians will look at his work and see whether he really was justified in drawing the conclusions he did. He made these claims publicly over a long period and responded to criticism.

But you are right that there is a problem here. I think most people can equip themselves with critical thinking tools that work in everyday life. They can learn something about cognitive biases; they can learn about reasoning and rhetoric; and I think we can put ourselves, as members of a democracy, in a position to think critically about the evidence and arguments put before us, politically and in the press. That should be open to all intelligent people, I think. It’s not a particularly onerous task to equip yourself with the basic tools of clear thinking.

But statistics requires a kind of numerical dexterity, a comfort in working with numbers, and it’s hard for some people to reach a level where they can think critically about statistics. Still, it’s interesting to watch how it’s done, and that’s what I think this book invites you to do: to watch someone thinking critically about statistics, across a whole range of measures.

Absolutely. Next I wanted to talk about Black Box Thinking by Five Books alumnus Matthew Syed.

Yes, quite a different book. Matthew Syed is famous as a former international table tennis player but, probably unknown to most people, he also has a first-class degree in philosophy, politics and economics (PPE) from Oxford.

This book is really interesting. It is an invitation to think differently about failure. The title, Black Box Thinking, comes from the black boxes fitted as standard on all airliners so that, if an accident occurs, there is a recording of the flight data and a recording of the audio communications as the plane goes down. When there is a crash, rescuers always try to retrieve these two black boxes. The data is then analyzed, the causes of the accident are dissected and examined, and the information is shared within the aviation industry and beyond.

“This is a model of an industry where, when there is a failure, it is treated as a very significant learning experience”

Obviously, everyone wants to avoid aviation disasters, because they are so costly in terms of loss of human life and they undermine confidence throughout the industry. There is almost always some kind of technical or human error that can be identified, and everyone can learn from particular accidents. This is a model of an industry where, when there is a failure, it is treated as a very significant learning experience, with the result that air travel has become a very safe form of transportation.

This is in contrast to some other areas of human endeavor, such as, unfortunately, much of healthcare, where information about failure is often not widely shared. This can be for a number of reasons. There may be fear of litigation: if a surgeon does something unorthodox or makes a mistake, and someone doesn’t survive the operation as a result, the details of exactly what happened on the operating table will generally not be widely shared, because there is great fear of legal comeback.

Hierarchical aspects of the medical profession may also play a role here. Higher-ranking people in the profession can keep a closed book and not share their mistakes with others, because it could be detrimental to their careers for people to know about them. There has been, historically at least, a tendency for medical malpractice and medical error to be kept very quiet, hidden, difficult to investigate.

“An empirical hypothesis can never be completely confirmed, but it can be refuted by finding a single piece of evidence against it”


What Matthew Syed argues is that we should take a different attitude to failure and see it as the aviation industry does. He is particularly interested in this being done within healthcare, but also more broadly. It’s an idea that comes in part from his reading of the philosopher Karl Popper, who described how science progresses not by proving the truth of theories, but by trying to disprove them. You can never completely confirm an empirical hypothesis, but you can refute one by finding a single piece of evidence against it. So, in a sense, the failure of hypotheses is the way science progresses: conjecture followed by refutation, not hypothesis followed by confirmation.
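The asymmetry Popper identified can be put in a few lines of code. This is my own illustration, using the standard “black swan” example often used to explain falsification (it is not from Syed’s book): no finite run of confirming observations proves a universal claim, but a single counterexample refutes it.

```python
def refuted(hypothesis, observations):
    """A universal hypothesis ('all observations satisfy this predicate')
    is refuted as soon as one observation fails it."""
    return any(not hypothesis(obs) for obs in observations)

# Universal claim: "all swans are white".
all_swans_white = lambda swan: swan == "white"

# Ten thousand white swans leave the hypothesis standing, but unproven.
print(refuted(all_swans_white, ["white"] * 10_000))               # False

# One black swan is enough to refute it outright.
print(refuted(all_swans_white, ["white"] * 10_000 + ["black"]))   # True
```

The asymmetry is built into the logic: `refuted` can return `True` after a single observation, but no number of `False` results can ever turn into a proof.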

As Syed argues, we make progress in all sorts of areas by making mistakes. He was an excellent table tennis player, and he knows that every mistake he made was a learning experience, at least potentially: an opportunity to improve. I think you would find the same attitude among musicians, or in any area where practitioners are very attentive to the mistakes they make and to how those failures can teach them in a way that allows them to leap forward. The book has a wide range of examples, many from industry, of how different ways of thinking about failure can improve both the process and the outcome of particular practices.

When we raise children to be successful and put the emphasis on avoiding failure, we may not be helping them develop. Syed’s argument is that we should make failure a more positive experience, instead of treating it as something scary and always to be avoided. If someone trying to achieve success thinks, “I have to do it by racking up one success after another,” maybe that’s the wrong mindset for success at the highest levels. Maybe you need to think, “OK, I’m going to make some mistakes. How can I learn from them, how can I share them, and how can other people learn from them too?”

That’s interesting. In fact, just yesterday I was discussing a book by Atul Gawande, the New York surgeon and writer, called The Checklist Manifesto. In it, Gawande also argues that we should take a success of aviation (in his case, the checklists pilots go through before takeoff) and apply it to other fields like medicine. A system like this is meant to get rid of human error, and I suppose that’s what critical thinking is also trying to do: get rid of the gremlins in the machine.

Well, it’s also about recognizing that when you make a mistake, it can have disastrous consequences, and that you don’t eliminate errors by simply pretending they didn’t happen. With the Chernobyl disaster, for example, there was an initial unwillingness to accept publicly that a disaster had occurred, combined with a fear of being seen to be in the wrong. There is this tendency to think that everything is going well (a kind of cognitive bias towards optimism) and a fear of being held responsible for the error; but there is also an unwillingness to see that in certain areas, admitting failure and sharing the knowledge that errors have occurred is the best way to minimize failures in the future.

Very Beckettian. “Fail again. Fail better.”

I suppose so. But that is a kind of pessimism: the idea that you will never achieve anything. Whereas I think Matthew Syed is a very optimistic person who believes that things can actually be a lot better, and that the way they will get better is by thinking critically about how we achieve things, about the best route to success: not following established practices that hide failure, but seeing failure as a likely condition of success, not just a prelude to more failure. Although, in a way, the Popperian line is that progress is a process of failing better, so maybe you’re right.


Absolutely. Well, shall we turn to Rolf Dobelli’s 2013 book, The Art of Thinking Clearly?

Yes. This is a fairly light book compared to the others. It is really a summary of 99 moves in thought: some psychological, some logical, some social. What I like is that it uses many examples. Each of the 99 entries is pretty short, and it’s the kind of book you can really dip into. You might find it too hard to digest read cover to cover, but it’s a book to come back to.

I included it because it suggests that you can improve your critical thinking by having labels for things and recognizing the moves, but also by having memorable examples through which you can learn. This is an unassuming book. Dobelli does not claim to be an original thinker; the book is a summary of other people’s thoughts. What he has done is bring together a lot of different things in one place.

Just to give you an idea of the book: it has a three-page chapter on the paradox of choice called “less is more,” about the very simple idea that if you present someone with too many options, instead of freeing them up, improving their life and making them happier, you make them waste a lot of time, and may even destroy the quality of their life.

“If you present someone with too many options, they will waste a lot of time”

I saw an example of this the other day at the supermarket. I passed a friend who was standing in front of about 20 different types of coffee. The type he usually buys was not available, and he was simply frozen, unable to decide between all the other brands in front of him. If there had only been one or two, he would have quickly gone for one of them.

Dobelli is here summarizing the work of the psychologist Barry Schwartz, who concluded that, in general, a wider selection leads people to make worse decisions for themselves. We think that when we go out into the world, what we need is more options, because that will allow us to do what we want to do, acquire the right product, or whatever. But the greater number of options may actually lead us to make poorer decisions than if we had had less to choose from.

Now, that’s the descriptive part. But at the end of this short chapter, he asks what you can do about it in practice. His answer is that you should think carefully about what you want before looking at what is on offer. Write down the things you think you want and stick to them. Don’t get carried away by other options, and don’t get caught up in some kind of irrational perfectionism. This is not profound advice, but it is thought-provoking, and that’s typical of the book.

You can browse these entries and take them or leave them. It is a kind of self-help manual.

Oh, I love that. A critical thinking self-help book.

It’s really in that self-help genre, and it’s very well done. He gets in and out in a couple of pages for each of these moves. I wouldn’t expect it to be on a philosophy reading list or anything like that, but it has been an international bestseller. It’s a clever book, and I think it’s definitely worth dipping into and coming back to. The author doesn’t pretend it’s the greatest or most original book in the world; it’s just a book that will help you think clearly. That’s the point.

It is also optimistic, unlike Kahneman. Dobelli is not saying that you are trapped in all these biases and there is nothing you can do about them; he is saying that there is a sense in which you can do something about them. That may just be another cognitive bias, wishful thinking, but I’m inclined to think that reflecting on things can change the way we behave. It can be hard, but reflecting on what you’re doing is, I think, the first step to thinking more clearly.

Absolutely. Let’s move on to the final title, Tom Chatfield’s Critical Thinking: Your Guide to Effective Argument, Successful Analysis and Independent Study. We had Tom on Five Books many moons ago to discuss books about computer games. This is quite different. What makes it so good?

Well, this is a different kind of book. I was trying to think of someone reading this interview who wants to improve their thinking. Of the books I’ve discussed, the ones obviously aimed at that are Black Box Thinking, the Dobelli book, and Tom Chatfield’s Critical Thinking; the others are more descriptive or academic. But this book contrasts quite a bit with Dobelli’s. The Art of Thinking Clearly is a very short, punchy book, while Tom’s is longer and more of a textbook: it includes exercises, has summaries in the margins, and is printed in textbook format. But that shouldn’t put off the general reader, because I think it’s the kind of thing you can work through and immerse yourself in.

It is clearly written and accessible, but it is also designed to be used in courses. Chatfield teaches a point, then asks you to test yourself to see whether you’ve learned the moves he describes. It is very broad: it includes material on cognitive biases as well as the more logical moves and arguments. Its goal is not just to help you think better and structure arguments better, but also to write better. It is the kind of book you would expect a good university to present to all incoming students, across a wide range of courses. But I include it here mainly as a recommendation for the self-taught. If you want to learn to think better, here is a course in book form; you can work through it on your own.

Fantastic.

It’s also a contrast to the other books, so that’s part of my reason for including it: to have a variety of books on this list.

Definitely. I think readers of Five Books, almost by definition, tend towards self-education, so this is a perfect book to recommend. And finally, to close: do you think critical thinking is something more people should make an effort to learn? I suppose the lack of it could help explain the rise of post-truth politics.

It is actually quite difficult to teach critical thinking in isolation. In the philosophy department of the Open University, when I worked there writing and designing course materials, we ultimately decided to teach critical thinking as it emerged in the teaching of other content: stepping back from time to time to observe the moves of thought being made by philosophers, and the critical thinking moves a good student might make in response to them. Pedagogically, that often works much better than trying to teach critical thinking as a separate topic in isolation.

This approach can also work in scientific areas. A friend of mine has taught a successful university course on critical thinking for zoologists, looking at correlation and cause, the particular types of rhetoric used in articles and experiments, and so on, all of it prompted by real examples from zoology. If you have a topic, and you have examples of people reasoning about it, and you can step back from it, I think this approach can work very well.

But in answer to your question: I believe that having some basic critical thinking skills is a prerequisite for being a good citizen in a democracy. If you are too easily swayed by rhetoric, weak at analyzing arguments and the ways people use evidence, and prone to all sorts of biases you are not aware of, how can you participate politically? So yes, we can all improve our critical thinking skills, and I think that’s one aspect of living the examined life that Socrates was so interested in, and something we should all pursue.

