Book review: Value-Focused Thinking: A Path to Creative Decisionmaking, by Ralph L. Keeney.
This book argues for focusing on values (goals/objectives) when making decisions, as opposed to the more usual alternative-focused decisionmaking.
The basic idea seems good. Alternative-focused thinking draws our attention away from our values and discourages us from creatively generating new possibilities to choose from. It tends to have us frame decisions as responses to problems, which leads us to associate decisions with undesirable emotions, when we could view decisions as opportunities.
A good deal of the book describes examples of good decisionmaking, but those examples rarely provide insight into how to avoid common mistakes or how to do unusually well.
Occasionally the book switches to some dull math, without clear explanations of what benefit the rigor provides.
The book also includes good descriptions of how to measure the things that matter, but How to Measure Anything by Douglas Hubbard does that much better.
Book review: The Motivation Hacker, by Nick Winter.
This is a productivity book that might improve some people's motivation.
It provides an entertaining summary (with clear examples) of how to use tools such as precommitment to accomplish an absurd number of goals.
But it mostly fails at explaining how to feel enthusiastic about doing so.
The section on Goal Picking Exercises exemplifies the problems I have with the book. The most realistic-sounding exercise had me rank a bunch of goals by how much each goal excites me, times the probability of success, divided by the time required. I found that variation in the last two terms overwhelmed the excitement term, leaving me with the advice that I should focus on the least exciting goals. (Modest changes to the arbitrary scale of excitement might change that conclusion.)
Which leaves me wondering whether I should focus on goals that I’m likely to achieve soon but which I have trouble caring about, or whether I should focus on longer term goals such as mind uploading (where I might spend years on subgoals which turn out to be mistaken).
The author doesn’t seem to have gotten enough out of his experience to motivate me to imitate the way he picks goals.
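To illustrate why the probability and time terms can swamp the excitement term in that ranking formula, here is a minimal sketch; the goals and all the numbers are made up for illustration:

```python
# The book's goal-picking formula, as described above:
# score = excitement * P(success) / time_required
def score(excitement, p_success, weeks):
    return excitement * p_success / weeks

goals = [
    # (name, excitement on a 1-10 scale, probability of success, time in weeks)
    ("clear email backlog", 2, 0.95, 1),
    ("learn basic Spanish", 6, 0.60, 26),
    ("long-term research goal", 10, 0.05, 500),
]

# Rank goals from highest to lowest score.
for name, e, p, t in sorted(goals, key=lambda g: -score(*g[1:])):
    print(f"{name}: {score(e, p, t):.3f}")
```

Because time spans orders of magnitude while excitement only spans one, the quick, dull goal tops the ranking regardless of how the excitement numbers are tweaked within their 1-10 scale.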
Book review: The Willpower Instinct: How Self-Control Works, Why It Matters, and What You Can Do To Get More of It, by Kelly McGonigal.
This book starts out belaboring ideas that seem obvious to me, but before too long it offers counterintuitive approaches that I ought to try.
The approach that I find hardest to reconcile with my intuition is that self-forgiveness for giving in to temptation increases willpower, while feeling guilt or shame about having failed reduces it. So what seems like an incentive to avoid temptation is likely to reduce our ability to resist that temptation.
Another important but counterintuitive claim is that trying to suppress thoughts about a temptation (e.g. candy) makes it harder to resist the temptation. Whereas accepting that part of my mind wants candy (while remembering that I ought to follow a rule of eating less candy) makes it easier for me to resist the candy.
A careless author could have failed to convince me this is plausible. But McGonigal points out the similarities to trying to follow an instruction to not think of white bears – how could I suppress thoughts of white bears if some part of my mind didn't activate a concept of white bears to monitor my compliance with the instruction? Can I think of candy without attracting the attention of the candy-liking parts of my mind?
As a result of reading the book, I have started paying attention to whether the pleasure I feel when playing computer games lives up to the anticipation I feel when I'm tempted to start one. I haven't been surprised to observe that I sometimes feel no pleasure after starting the game. But it now seems easier to remember those times of pleasureless playing, and I expect that is weakening my anticipation of rewards.
Book review: The Signal and the Noise: Why So Many Predictions Fail – but Some Don't, by Nate Silver.
This is a well-written book about the challenges associated with making predictions. But nearly all the ideas in it were ones I was already familiar with.
I agree with nearly everything the book says. But I’ll mention two small disagreements.
He claims that 0 and 100 percent are probabilities. Many Bayesians dispute that. He has a logically consistent interpretation and doesn’t claim it’s ever sane to believe something with probability 0 or 100 percent, so I’m not sure the difference matters, but rejecting the idea that those can represent probabilities seems at least like a simpler way of avoiding mistakes.
When pointing out the weak correlation between calorie consumption and obesity, he says he doesn’t know of an “obesity skeptics” community that would be comparable to the global warming skeptics. In fact there are people (e.g. Dave Asprey) who deny that excess calories cause obesity (with better tests than the global warming skeptics).
It would make sense to read this book instead of alternatives such as Moneyball and Tetlock’s Expert Political Judgment, but if you’ve been reading books in this area already this one won’t seem important.
Book review: The Righteous Mind: Why Good People Are Divided by Politics and Religion, by Jonathan Haidt.
This book carefully describes the evolutionary origins of human moralizing, explains why tribal attitudes toward morality have both good and bad effects, and shows how people who want to avoid moral hostility can do so.
Parts of the book are arranged to describe the author's transition away from standard delusions: that our moral judgments result from the narratives we use to justify them, and standard delusions about why other people hold alien-sounding ideologies. His description of how his study of psychology led him to overcome those delusions makes it hard for those who agree with him to feel very superior to those who disagree.
He hints at personal benefits from abandoning partisanship (“It felt good to be released from partisan anger.”), so he doesn’t rely on altruistic motives for people to accept his political advice.
One part of the book that surprised me was the comparison between human morality and human taste buds. Some ideologies are influenced a good deal by all 6 types of human moral intuitions. But the ideology that pervades most of academia only respects 3 types (care, liberty, and fairness). That creates a difficult communication gap between academics and cultures that employ others, such as sanctity, in their moral systems, much like people who had only experienced sweet and salty foods would have trouble imagining a desire for sourness in some foods.
He sometimes gives the impression of being more of a moral relativist than I’d like, but a careful reading of the book shows that there are a fair number of contexts in which he believes some moral tastes produce better results than others.
His advice could be interpreted as encouraging us to replace our existing notions of "the enemy" with Manichaeans (those who divide the political world into good and evil). Would his advice polarize societies into Manichaeans and non-Manichaeans? Maybe, but at least the non-Manichaeans would have a decent understanding of why Manichaeans disagreed with them.
The book also includes arguments that group selection played an important role in human evolution, and that an increase in cooperation (group-mindedness, somewhat like the cooperation among bees) had to evolve before language could become valuable enough to evolve. This is an interesting but speculative alternative to the common belief that language was the key development that differentiated humans from other apes.
Book review: Thinking, Fast and Slow, by Daniel Kahneman.
This book is an excellent introduction to the heuristics and biases literature, but only small parts of it will seem new to those who are familiar with the subject.
While the book mostly focuses on conditions where slow, logical thinking can do better than fast, intuitive thinking, I find it impressive that he was careful to consider the views of those who advocate intuitive thinking, and that he collaborated with a leading advocate of intuition to resolve many of their apparent disagreements (mainly by clarifying when each kind of thinking is likely to work well).
His style shows that he has applied some of the lessons of the research in his field to his own writing, such as by giving clear examples. (“Subjects’ unwillingness to deduce the particular from the general was matched only by their willingness to infer the general from the particular”).
He sounds mildly overconfident (and believes mild overconfidence can be ok), but occasionally provides examples of his own irrationality.
He has good advice for investors (e.g. reduce loss aversion via “broad framing” – think of a single loss as part of a large class of results that are on average profitable), and appropriate disdain for investment advisers. But he goes overboard when he treats the stock market as unpredictable. The stock market has some real regularities that could be exploited. Most investors fail to find them because they see many more regularities than are real, are overconfident about their ability to distinguish the real ones, and because it’s hard to distinguish valuable feedback (which often takes many years to get) from misleading feedback.
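Kahneman's broad-framing advice can be illustrated with his classic coin-flip gamble (win $200 or lose $100 with equal probability). The sketch below is mine, not the book's; the loss-aversion factor of 2 and the bundle size of 100 are illustrative assumptions:

```python
import random

random.seed(0)  # reproducible illustration

def gamble():
    # 50/50 flip: win $200 or lose $100 (the dollar amounts are illustrative).
    return 200 if random.random() < 0.5 else -100

# Narrow framing: judged alone, with losses weighted ~2x (a typical
# loss-aversion factor), a single flip feels like a wash.
narrow_value = 0.5 * 200 - 0.5 * 2 * 100
print(f"felt value of one flip under narrow framing: {narrow_value}")

# Broad framing: judge a bundle of 100 flips as one decision.
# Estimate how often such a bundle ends in a net loss.
trials = 10_000
losing_bundles = sum(
    1 for _ in range(trials)
    if sum(gamble() for _ in range(100)) < 0
)
print(f"bundles of 100 flips ending in a net loss: {losing_bundles / trials:.2%}")
```

The bundle almost never ends in a net loss, so framing many small gambles as one large class of results makes the aggregate look clearly attractive even to a loss-averse decisionmaker.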
I wish I could find an equally good book about overuse of logical analysis when I want the speed of intuition (e.g. "analysis paralysis").
Book review: Influence: The Psychology of Persuasion by Robert B. Cialdini.
This book gives clear descriptions of six strategies that salesmen use to influence customers, and provides advice on how we can somewhat reduce our vulnerability to being exploited by them. It is one of the best books for laymen about heuristics and biases.
It explains why the simplest quick fixes would produce more problems than they solve: there are good reasons why we use the heuristics that create opportunities for people to exploit us.
The author’s willingness to admit that he has been exploited by these strategies makes it harder for readers to dismiss the risks as something only fools fall for.