Tuesday 1 July 2008

How wrong can a book be?

I've enjoyed reading lots of non-fiction books in the last few years. I'm hoping to start condensing the ideas from them into this blog, to see if I can piece it all together into something coherent.

But whenever I finish a 300-odd page book in which the author is setting out their theories and viewpoints, I get a few niggling doubts. The more convincing the book seems, the more I wonder if I have been deceived in some way.

When you read a non-fiction book, the author has your complete attention for the whole time that you read it, without any opposing viewpoints getting a look-in, unless the author chooses to represent them. So compared to a conversation, a live debate or a peer-reviewed scientific paper, there is a lot more scope for persuasion tricks.

So this has led me to wonder: how wrong can a book be? Is it possible to write a 300-page book about some set of ideas that seems convincing but is actually completely wrong?

I think the answer is probably "Yes, it's possible for an apparently convincing book to be totally 100% wrong".

Imagine how easy it would be to walk into a bookshop and pick two books that passionately argue for completely opposite positions. If either of them were close to the real truth, then the other would have to be completely wrong. Or they might both be partially wrong, in which case their combined wrongness would still add up to something approaching 100%. Is that right? Or did I just pull off a persuasion trick?

Types of Persuasion Trick

So what are the 'persuasion tricks' to look out for in these kinds of books? Here are some of the main ones that I've noticed:
(When I write about books on this blog, I'll try to highlight examples.)

1. Confirmation bias

I think the number one persuasion trick is the set of biases in our own minds. Confirmation bias - the tendency to search for or interpret information in a way that confirms one's preconceptions - is often quite strong, and will be at work in the author's mind when they write the book, and often in the reader's mind when they read it too.

Naturally, lots of people have noticed this bias over the years, so it has been given many different labels. Timothy Leary's concept of Reality Tunnels can be a useful way to envision Confirmation bias - the image of a tunnel is quite apt. 'Myside bias', 'Belief preservation' and 'Selective thinking' are other terms used for roughly the same phenomenon.

Also, one of my favorite Wittgenstein quotes touches on this: "Nothing is so difficult as not deceiving oneself."

For other biases, and the experimental evidence that demonstrates them, see Wikipedia's List of cognitive biases. Information bias, anyone?

2. Selective evidence

This is similar to Confirmation bias in a way, but is more of a conscious technique of persuasion. When reviewing previous studies of a subject, an author will often be tempted to skip the ones that pose problems, or find a reason to write them off.

When a book I'm reading tells me that a major study was later shown to be full of holes by someone or other, I get a little suspicious. True, scientific papers and studies do sometimes get pulled apart for good reason, but on the other hand, even very well-regarded studies have plenty of critics with axes to grind.

3. Misquoting scientific papers

I think this is common, but it's slightly harder to track down. A book might quote an obscure scientific paper and say that it demonstrates support for idea 'Z'. But how do you know unless you actually go and check? Was the paper about idea 'Z', or was it actually all about 'Y', with 'Z' only getting a tiny, prospective mention near the end?

4. Emotional appeals

By appealing to the reader's emotions (outrage, a sense of injustice, etc.) a book can persuade you to leave the path of rational critical thinking and go on a scramble through the hedges and shrubs of knee-jerk reactions.

How Thinking Goes Wrong

Another related web page that I'd like to mention here is How Thinking Goes Wrong, by Michael Shermer. It's actually a chapter from his book Why People Believe Weird Things. It lists 25 ways that people can make mistakes in their thinking. It's relevant to this post because it basically describes 25 ways that superficially convincing ideas can be mistaken.

It's well worth a read. I'll list his 25 categories here to give you an idea:

- Problems in Scientific Thinking
1. Theory Influences Observations
2. The Observer Changes the Observed
3. Equipment Constructs Results

- Problems in Pseudoscientific Thinking
4. Anecdotes Do Not Make a Science
5. Scientific Language Does Not Make a Science
6. Bold Statements Do Not Make Claims True
7. Heresy Does Not Equal Correctness
8. Burden of Proof
9. Rumors Do Not Equal Reality
10. Unexplained Is Not Inexplicable
11. Failures Are Rationalized
12. After-the-Fact Reasoning
13. Coincidence
14. Representativeness

- Logical Problems in Thinking
15. Emotive Words and False Analogies
16. Ad Ignorantiam
17. Ad Hominem and Tu Quoque
18. Hasty Generalization
19. Overreliance on Authorities
20. Either-Or
21. Circular Reasoning
22. Reductio ad Absurdum and the Slippery Slope

- Psychological Problems in Thinking
23. Effort Inadequacies and the Need for Certainty, Control, and Simplicity
24. Problem-Solving Inadequacies
25. Ideological Immunity, or the Planck Problem

Getting into the ring

I'll end this post with another Wittgenstein quote:
A philosopher who is not taking part in discussions is like a boxer who never goes into the ring.
If someone reads a convincing book but doesn't then go searching for critical reviews or responses to it, it's a similar situation. The book's ideas will sit smugly in their head, appearing to have seen off all challengers, without really facing any.
