Wednesday 29 October 2014

The Willpower Muscle

I read a good book called Willpower: Rediscovering the Greatest Human Strength by Roy Baumeister (written with John Tierney). It has a fascinating discussion of willpower as a muscle: one that fatigues with use and can be strengthened with training.

He starts the book with Walter Mischel's famous marshmallow experiment at Stanford.  Children were left in a room with a marshmallow on the table and told they could eat it right away or, if they waited until the experimenter returned, get two marshmallows.  Some children ate the marshmallow immediately; others tried to hold out by distracting themselves.  The children were followed for many years after the original experiment, and Mischel found that those who had been able to exercise self-control had much better life outcomes than those who had gratified themselves immediately.

Baumeister conducted similar experiments and found that willpower is a muscle that can be fatigued.  He left people in a room with a plate of cookies and a plate of radishes, then asked them to hold their hands in ice water.  Those who had been told they could eat the cookies held their hands in the water longer than those who had been told they could eat only the radishes.

In a similar experiment, he first depleted the willpower muscle and then gave two groups of people either lemonade with sugar or lemonade with artificial sweetener.  Those given the sugary lemonade worked longer on difficult puzzles after the depletion.

This led to an interesting finding about dieters.  They need to eat food to have the willpower to avoid eating food.  It is a Catch-22.

Baumeister describes how judges were more willing to make the difficult decision, granting parole, after they had eaten lunch or a snack.  When they were tired and hungry, they reverted to the default decision of not granting parole.

He found that decision making itself is fatiguing.  When car dealers asked their clients to make decisions about the features they wanted in their new car, clients who started with the difficult decisions, the ones with many choices, ended up paying for more expensive features than clients who started with the simpler decisions offering few choices.  Baumeister hypothesizes that this is why politicians seem to make such bad choices in their personal lives after work: their willpower is fatigued from a long day of making decisions.

Baumeister also found that sleep is important for restoring willpower: tired people have less willpower than rested people.

The good news is that willpower can be strengthened.  He relates how Korean parents teach their children self-control.  Although East Asians represent only about 4% of the population in North America, they make up roughly 25% of the students in college, and after they graduate they also earn about 25% higher salaries.

Baumeister suggests that self-control can be improved by setting “bright lines” in behaviour, that is, clear, simple, unambiguous rules.  You should plan ahead to avoid impulsive decisions: “if x happens, then I will do y”.  You should never say “never”; instead say “not now, but later”.  You should pre-commit to your resolutions by making your commitments public, or by signing an agreement ahead of time on what you will do (e.g. donate to a charity) if you don’t meet them.  On procrastination, reflecting on what you have done will make you happier, but reflecting on what you still have to do will make you more productive.  Another way to avoid procrastination is to tell yourself that you will not do the desired work; instead you will do nothing at all.  This works because procrastination is not doing nothing, it is doing lower-priority tasks.  Finally, you should make only one New Year’s resolution, because when you make many resolutions there is a good chance that some of them will conflict.

Baumeister suggests that your willpower muscle can be strengthened by conscious habits such as concentrating on standing and sitting with good posture at all times, or taking the time to record everything you eat.  Maintaining good personal hygiene is also useful for improving willpower: he found that students who were fatigued from studying invariably wore dirty socks, while those who took better care of their hygiene did better on their essays and exams.

Baumeister suggests that we should look for symptoms of willpower muscle depletion.  We should pick our battles to avoid wasting energy on trivial things.  We should pre-commit to a to-do list but beware of the planning fallacy.  We should employ positive procrastination with the “do nothing offense”.

How Not To Be Wrong

I read a good mathematics book by Jordan Ellenberg called “How Not to Be Wrong: The Power of Mathematical Thinking”.  The book covered just four topics: linearity, expectation, regression to the mean, and inference.

The book opened with the story of Abraham Wald, a World War II mathematician who was asked to analyze the bullet holes in the fuselages of bomber planes to determine where to put extra armor.  He looked at the bullet holes in the returning planes and told the air force they should put the armor where the bullet holes weren't, because planes hit in those places didn't return.  This was an example of the power of mathematical thinking.

In the linearity section, Ellenberg described the Laffer curve in political economy, starting with a statement made by a right-wing journalist in the United States: why would the United States want to be more like Sweden when Sweden wants to be more like the United States?

The fallacy in this argument is the assumption of linearity.  The economist Arthur Laffer explained his curve on the back of a napkin to Dick Cheney and Donald Rumsfeld at a meeting during the Ford administration.  The Laffer curve relates taxation policy to outcomes across the spectrum from left-leaning countries (high taxes and many government services) to right-leaning countries (low taxes and few government services), and the curve has a hump in the middle.  The United States may be too low on the curve and should move towards the middle, in the direction of Sweden, while Sweden may be too high on the curve and should move towards the middle, in the direction of the United States.  Because the curve has a hump rather than being a straight line, the optimal amount of taxation and government services lies somewhere between the two countries, so both can sensibly move towards each other.
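
To see why the hump matters, here is a minimal sketch in Python using a made-up hump-shaped curve, R(t) = t(1 − t); the function and the two countries' positions on it are illustrative assumptions of mine, not figures from the book:

```python
# A toy "Laffer-style" curve: revenue R(t) = t * (1 - t) for a tax
# rate t in [0, 1].  The peak is at t = 0.5, so a country below the
# peak gains by raising taxes while a country above it gains by
# cutting them -- both movements head toward the middle.

def revenue(t: float) -> float:
    """Stylized revenue as a function of the tax rate t."""
    return t * (1 - t)

usa, sweden = 0.3, 0.7  # hypothetical positions on the curve

assert revenue(usa + 0.1) > revenue(usa)        # USA gains by moving up
assert revenue(sweden - 0.1) > revenue(sweden)  # Sweden gains by moving down
print(revenue(usa), revenue(0.5), revenue(sweden))  # ≈ 0.21, 0.25, 0.21
```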

To explain expectation, Ellenberg describes the expected winnings in lotteries.  He tells of a Stanford-trained statistician who won four major lotteries by knowing the distribution of winning “scratch and win” tickets.  He also describes a syndicate that obtained positive expected winnings by buying every ticket combination in the Virginia lottery when roll-overs pushed the jackpot high enough.  He also talked about the law of very large numbers: with enough observations, almost anything can happen.  With all of the lotteries in the world drawing every week, it is not so unusual for the same set of numbers to come up twice in a row somewhere, at some time.  When this actually happened in Bulgaria, the government conducted an investigation into whether the lottery was fixed.
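
The syndicate's arithmetic is easy to check.  Here is a minimal sketch; the figures are roughly those reported for the 1992 Virginia case (a pick-6-of-44 game, $1 tickets, a jackpot around $27 million after roll-overs), and it ignores the smaller prizes and the risk of splitting the jackpot with another winner:

```python
from math import comb

combinations = comb(44, 6)  # 7,059,052 possible tickets in a 6-of-44 game
cost = combinations * 1.00  # at $1 per ticket, buying every combination
jackpot = 27_000_000        # rolled-over jackpot

# Holding every combination guarantees the winning ticket, so the
# payoff is certain rather than merely expected.
profit = jackpot - cost
print(f"tickets: {combinations:,}  profit: ${profit:,.0f}")  # ≈ $19.9 million
```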

In the inference section, Ellenberg talked about how medical treatments are studied for their effectiveness.  He described hypothesis testing and why there are so many unreplicable studies: false positive results.  One of the items he discussed was mutual fund performance and survivorship bias.  Surviving mutual funds all look good because poorly performing funds are folded into more successful ones, so only data on the survivors is available.
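
A quick simulation shows where those unreplicable studies come from.  This is my own illustration, not an example from the book: run 1,000 studies of treatments that have no real effect, test each at the conventional 5% significance level, and about 50 of them will come out “significant” anyway.

```python
import math
import random

random.seed(0)
studies, n = 1000, 50
false_positives = 0

for _ in range(studies):
    # Both groups are drawn from the SAME distribution: no real effect.
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    # z-test on the difference of means (sigma = 1 is known here)
    z = (sum(a) / n - sum(b) / n) / math.sqrt(2 / n)
    if abs(z) > 1.96:  # "significant" at the 5% level
        false_positives += 1

print(false_positives)  # about 50 of the 1000, i.e. roughly 5%
```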

Finally, Ellenberg talked about regression to the mean.  He described a study from the early twentieth century in which a business analyst studied many successful firms and then, decades later, found that these firms were no longer successful, while firms that had not been very successful in the earlier period sometimes now were.  He published this finding as a remarkable law of business.  The statistician Harold Hotelling reviewed the study and explained that the finding was nothing more than regression to the mean: the success of the firms in either period was probably luck, not business acumen.
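
Hotelling's objection is easy to demonstrate with a simulation.  In this sketch (my own, with made-up numbers), each firm's performance in each period is pure luck, yet the top firms of the first period still look exceptional, and then merely average the next time around:

```python
import random

random.seed(1)
firms = 1000
period1 = [random.gauss(0, 1) for _ in range(firms)]  # luck in period 1
period2 = [random.gauss(0, 1) for _ in range(firms)]  # fresh luck in period 2

# Pick the top 10% of firms by period-1 performance...
top = sorted(range(firms), key=lambda i: period1[i], reverse=True)[:100]

# ...and compare their averages across the two periods.
avg_then = sum(period1[i] for i in top) / len(top)
avg_now = sum(period2[i] for i in top) / len(top)
print(f"top firms, period 1: {avg_then:+.2f}")  # ≈ +1.75, far above average
print(f"same firms, period 2: {avg_now:+.2f}")  # ≈ 0.00, back to the mean
```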

Later, after reading “How Not to Be Wrong: The Power of Mathematical Thinking”, I heard an interview with the author.  He said he had written a book proposal with thirteen sections, but after four sections he had already written 300 pages, so he asked for permission to stop there and save the remaining topics for future books.  I look forward to reading volumes II, III, etc.


Thursday 5 June 2014

Delay and Immunity

Nicholas Epley, in his book Mindwise, discusses how to encourage people not to lie.

He notes a clever experiment by Daniel Gilbert (the author of Stumbling on Happiness) showing that people were more willing to admit, in the face of evidence, to having done something wrong when confronted some time after the event than when questioned immediately afterwards.

Epley discusses the investigation into the Deepwater Horizon oil rig explosion, which caused the largest marine oil spill in history.  The rig workers said they feared reprisals for reporting safety concerns; one worker said “the company was always using fear tactics”.  Apparently, not only would workers keep quiet about their concerns, they would fake data in the company’s safety system.  Epley felt strongly that if the company’s executives had not threatened workers with reprisals for reporting safety concerns, the disaster could have been averted.

Epley also describes a University of Michigan hospital that began a medical-error-disclosure program.  The doctors were encouraged to “openly admit their medical mistakes in meetings with patients, explain what led to the mistake, and then offer fair compensation.”  This policy resulted in a reduction in malpractice lawsuits from 39 per year to 17 per year and reduced overall liability costs by 60%.

According to Epley, the original problem was that patients were left to imagine what their doctors were thinking.  The program instead let doctors explain how a mistake happened, and encouraged them to share their experiences so that the same mistake would not be repeated in the future.

Know Thyself

I just finished reading the book Mindwise: How We Understand What Others Think, Believe, Feel, and Want by Nicholas Epley.

He suggests we conduct a simple experiment on ourselves.  He wants us to think of an important task we want to complete in the next few weeks.

Then write down on a piece of scratch paper our most accurate prediction of when (date and time) we are going to complete this task. 

Then write the best-case scenario if everything goes as quickly as possible.

Finally, estimate the worst-case scenario, if everything goes as badly as it possibly could.

Then he bets us that we will not finish even by our worst-case estimate.

One case Epley describes involves students working on their honours theses.  The students’ predictions averaged 27 days in the best case, 34 days in the realistic case, and 49 days in the worst case.

The actual average turned out to be 55 days.

In another experiment, only 45% of the projects were finished by the date at which their owners had been 99% certain they would be done.

Epley believes that the most interesting thing about the planning fallacy is that “despite having so much experience committing it ourselves, we so consistently think that our own mistakes are things of the past rather than the present.”

Tuesday 30 July 2013

Cognitive Dissonance

I recently read the book Mistakes Were Made (But Not by Me) by Carol Tavris and Elliot Aronson. The authors relate the problems of cognitive dissonance, in particular how, even in the face of clear evidence, we often deny that we made a mistake.

They discuss clinical psychology and repressed memories. They discuss police interrogation techniques and false prosecutions. They discuss how cognitive dissonance and self-justification can lead from small problems in a marriage to divorce, or from small acts of dishonesty to major crimes and fraud.

I suggest that project managers, when faced with concrete evidence that their cost and time estimates are wrong, will find ways to protect their egos through self-justification. They will find reasons why this evidence does not apply to them. They will explain how their project will be different.

Furthermore, the authors provide evidence that people with the highest self-esteem are the most likely to deny such evidence. These experts' estimates will not be any better, but they will have much more confidence in them, and they will hold more strongly to their original estimates in the face of evidence proving them wrong.

The authors describe how police investigators are trained using a manual of interrogation techniques designed to obtain a confession from a suspect. The manual provides suggestions on how to determine whether a suspect is lying. However, in controlled experiments, investigators trained with this manual did no better than untrained university students at detecting lies. The trained investigators were, however, much more confident that they had correctly distinguished the liars from those who were telling the truth.

This makes me wonder whether the courses taught by the Project Management Institute using its Project Management Body of Knowledge do something similar: give project managers more confidence in their estimates, but not more accurate estimates.

Deception and Intelligence

I recently read a book called The Folly of Fools: The Logic of Deceit and Self-Deception in Human Life, in which Robert Trivers writes about how and why animals and humans try to deceive each other.

Trivers opens the book with a discussion of animal behaviour. In particular, he mentions the cuckoo. Cuckoos lay their eggs in other birds' nests and thereby avoid the effort of incubating the eggs and feeding the newborns.

Some of these birds have learned to count their eggs. If they find that there are more eggs in their nest than they laid, they abandon the nest and go somewhere else.

So, to counter this, cuckoos have learned that when they lay an egg in another bird's nest, they should push one of the existing eggs out of the nest.  Then the count stays the same.

To counter this, the other birds have learned to look for broken eggs on the ground below their nests.

In this way, both the cuckoo and the other birds are constantly learning different strategies to deceive and counter the deception.

Trivers suggests that this arms race of deception and counter-deception is how intelligence has been formed over time, and that it drives change much faster than genetic evolution alone would suggest.

We have seen in earlier posts that project managers tend to have an optimism bias: they believe their projects will come in on time and on budget.

They may be attempting to deceive the senior decision makers. According to Trivers, the decision makers should be learning from this deception and trying to counter it. 

I have not seen this type of learning taking place.  Senior decision makers do not appear to be attempting to counter project managers' optimism bias.

The only person I have seen who appears to recommend countering this deception is Bent Flyvbjerg.

I recommend Flyvbjerg's article “Over Budget, Over Time, Over and Over Again” and his books Megaprojects and Risk and Decision-Making on Mega-Projects, in which he suggests methods to counter project managers' optimism bias.

Thursday 6 June 2013

On Being Strategically Wrong

I just finished a book by Robert Kurzban called Why Everyone (Else) Is a Hypocrite: Evolution and the Modular Mind. Kurzban is an evolutionary psychologist.

I have read many behavioural economics books recently, such as Thinking, Fast and Slow; Wait: The Art and Science of Delay; and How We Decide. In the last couple of years I have also read books like Predictably Irrational: The Hidden Forces That Shape Our Decisions and its critic's book, The Logic of Life: The Rational Economics of an Irrational World.

All of these books describe the same famous psychological experiments, which try to show that humans are not the rational beings of economic theory, always weighing the costs and benefits of their actions and then acting in their own self-interest. However, the books often draw different conclusions, which can be quite confusing.

Kurzban's book seems to be the exception; it makes the other books look silly. His concept of the modular mind explains how our brains have developed over millions of years of evolution: our minds contain many parts, or “modules”, each with different functions, and sometimes these modules don't communicate very well with each other.

So saying “I think” or “someone is acting in their self-interest” raises the question of who “I” is and what a “self” is.

As Kurzban describes it, most of us have the impression that somewhere inside our heads is a central control, the “brains” directing our thoughts, and that the part of the brain that controls speech speaks for all of our modules.

However, experiments with split-brain patients show that parts of the right side of the brain can become disconnected from the speaking part of the brain on the left side.

Similar experiments with normal people show that many parts of the brain are not connected to the speaking part. Kurzban suggests that this is why we may have strong opinions about subjects like legalizing abortion, recreational drugs, or prostitution without being able to explain the logic behind our opinions.

I highly recommend Kurzban's book.

However, my own field of expertise is optimism bias in project management. Late in my career in the Department of National Defence, after I had completed my PhD dissertation, Cost Estimation and Performance Measurement in Canadian Defence, I remarked to the Director of Costing Services that project managers' cost estimates should not be trusted: they are unrealistically optimistic about their projects and will systematically underestimate the costs and overestimate the benefits. The Director brushed my comments aside and quickly replied, “Project managers have to be optimistic”.

Kurzban has an interesting insight into this optimism bias from an evolutionary point of view. Being unrealistically optimistic should have put people at an evolutionary disadvantage over time: if some people were unrealistic about their chances of survival in risky situations and acted irrationally, evolution would suggest that their genes would have been weeded out.

Kurzban hypothesizes that there might be an evolutionary advantage to optimism, or being “strategically wrong”, in social settings. Namely, it may help in persuading others to go along with your wishes if you truly believe your plans will be successful.

Although part of your brain may know the facts about the likely success of your plans, the part of your brain that wants to persuade people is able to take control of your behaviour. In that way, you can be convincing in your overly optimistic statements about your project without actually lying, in the sense of saying something you don't believe.

Therefore, project managers are not really lying about the future costs of their projects. The part of their brain that controls speech may truly believe what they are saying, and no amount of factual information about the costs of similar projects will convince them that they are being unrealistic. In fact, it is likely that part of their brain already knows the facts; unfortunately, that part is not able to take control of their behaviour.