The secret of success: Do your homework!
Updated 19 May 2012
To err is human, but how often do we get it wrong? More often than we realize, according to new research, and the biggest mistake corporations make is not being prepared.

In September 1999, the Mars Climate Orbiter crashed due to a simple miscalculation. NASA reported that the loss of the $125 million probe was the result of the ground crew's failure to convert English units into metric units. While the Orbiter example may be extreme, human error (miscalculation, systematic bias or "fat-finger" slips) costs businesses billions of dollars a year and is responsible for the creation and mis-marketing of some very forgettable products: think Ford Edsel or New Coke.

Organizations need accurate forecasts and estimates to decide where to invest, how much and how fast. And, while they are becoming more sophisticated in the way they use the increasing mass of statistical data that enters their orbit to improve productivity and create or sell new products, human judgment remains a vital part of this decision-making process.
Big Blunders
It's accepted that humans, even experts, get things wrong at times, but how often do they make really big blunders? More frequently than expected, according to Miguel Lobo, INSEAD assistant professor of decision sciences, and the biggest mistake businesses can make is not acknowledging the fact. "Companies grossly under-invest in the collection of information," Lobo told Knowledge. "They rely too much on a single piece of information and they fail to make contingency plans for when things go horribly wrong."

Take the slump in semiconductor demand during the early 1990s. According to Lobo, IT giant Intel massively under-invested in fabrication capacity. When the boom driven by the dot-com bubble came, the company was caught out. The income lost through not being prepared cost much more than the $1 billion outlay for a new plant. "It was an error in estimation," Lobo says. "An error in not having thought about the consequences of the world not being as you think it will be."
The Bell Curve
For decades scientists have used the bell curve to depict the distribution of uncertainties, and previous research assumed that the size of human errors followed this normal, bell-shaped distribution. "I wondered how often they fell outside this curve," Lobo says. "I had a hunch it was going to be more often than expected. That motivated me to do the research."
Using 17 databases and over 20,000 forecasts drawn from two sources (a panel of MBA graduates and the forecasts of 50 New York economists), Lobo's research 'Human Judgment is Heavily Tailed' found that really big mistakes, outside the normal bell curve range, occurred much more frequently than expected. "What surprised me was how consistent it was across a lot of different tasks. The size of the typical error changed but the frequency was constant."
The economists' forecasts varied enormously and included big systematic errors, while the panel of MBA graduates, asked to estimate quantities such as the number of countries in the United Nations, the value of daily global oil production, and the market capitalization of Google, often gave wildly inaccurate responses. "Shockingly, no matter what we asked, whether it was something they knew a lot about or something outside their field, there were errors," says Lobo.
He then looked at how often errors of extreme size occurred. "We looked at, for example, an error which is so large it should happen only one time in 1,000, an extremely rare event, a complete mis-estimate of a forecast… it turns out those mistakes happen 10 times more often than would be predicted by the standard distribution pattern."
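To make the scale of that effect concrete, here is a minimal sketch, not drawn from Lobo's paper (the article does not say which distribution he fitted), comparing how often a "1-in-1,000" error occurs under a normal model versus a heavy-tailed stand-in, a Student's t distribution with 3 degrees of freedom rescaled to unit variance:

# Illustrative sketch (assumed parameters, not from the article): how often does
# a "1-in-1,000" error occur if judgment errors are heavy-tailed rather than
# normally distributed? A Student's t distribution with 3 degrees of freedom,
# rescaled to unit variance, stands in for the heavy-tailed case.
from scipy import stats

df = 3                                   # degrees of freedom (assumed)
scale = (df / (df - 2)) ** 0.5           # std dev of a t(3) variable, ~1.73

# Error size that a normal model says is exceeded only 1 time in 1,000 (two-sided).
threshold = stats.norm.ppf(1 - 0.0005)   # about 3.29 standard deviations

p_normal = 2 * stats.norm.sf(threshold)            # 0.001 by construction
p_heavy = 2 * stats.t.sf(threshold * scale, df)    # same threshold, heavy tails

print(f"Normal model:       1 in {1 / p_normal:,.0f}")
print(f"Heavy-tailed model: 1 in {1 / p_heavy:,.0f}")
print(f"Ratio: {p_heavy / p_normal:.1f}x more frequent")

With these illustrative parameters, the error the normal model calls a once-in-1,000 event turns up roughly once in a hundred times under the heavy-tailed model, in line with the tenfold gap Lobo describes.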

The Cost of Mistakes
These errors of judgment become costly mistakes when the faulty figures are accepted by senior managers who make decisions based on too little advice.
Either through similarity bias (projecting their own preferences onto others) or over-confidence bias (grossly underestimating their own ignorance), managers are not arming themselves with enough, or the right, information, says Lobo.
“The vast majority of people under-appreciate the importance of collecting advice or opinions from a diverse range of people. Asking more people from different backgrounds gives better value.”
Even when plenty of information is gathered, decision-makers still tend to lock themselves into one piece of information or one person's judgment. "They latch on to a salient piece of information without looking at the whole picture." Making decisions based on too little information can lead to very big marketing errors, Lobo adds, noting the big mistakes made by mobile phone manufacturer Nokia.
"They thought 'We don't know what's going to be popular next, so we'll make phones of every type,' and they still missed the transition to large touch-screen PDAs." Microsoft spent years trying to develop tablets and got it completely wrong when guessing who would use the technology and for what purpose, assuming it would be a niche market.
Meanwhile, Apple, renowned for the inordinate amount of research it conducts and the questions it asks, makes some very innovative decisions that often pay off.

Being Prepared
While the first lesson for avoiding costly mistakes is to broaden the sources of information, the second, and equally important, lesson is to stay aware that things go wrong more often than one expects and to ensure contingency plans are in place.
Many organizations, particularly auditing companies and large mining and pharmaceutical firms, now better understand the extent of human error and ensure they are prepared when an oil strike isn’t made or a drug test produces unexpected results.
"Over the last decade companies have been paying increasing attention to real options in their decision-making: looking not just at the value of a capital investment under a reference forecast, but also taking into account its consequences under different and uncertain economic conditions and market outcomes," says Lobo.
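As a rough illustration of that shift, the following sketch (with invented payoffs and probabilities, not figures from the article) contrasts a decision taken on a single reference forecast with one that weighs a spread of scenarios and a simple contingency option:

# Minimal sketch (invented numbers): valuing an investment under one reference
# forecast versus a probability-weighted spread of scenarios.
scenarios = {
    "boom":      {"prob": 0.25, "payoff": 300},
    "reference": {"prob": 0.50, "payoff": 120},
    "slump":     {"prob": 0.25, "payoff": -200},
}
investment = 100

# Decision based on the single reference forecast alone.
reference_value = scenarios["reference"]["payoff"] - investment
print(f"Reference-forecast value: {reference_value}")

# Decision that weighs all scenarios, including the downside.
expected_value = sum(s["prob"] * s["payoff"] for s in scenarios.values()) - investment
print(f"Probability-weighted value: {expected_value}")

# A contingency plan (e.g. an option to abandon and cap the slump payoff at 0)
# changes the expected value, which is the point of real-options thinking.
capped = {name: max(s["payoff"], 0) for name, s in scenarios.items()}
value_with_option = sum(scenarios[name]["prob"] * p for name, p in capped.items()) - investment
print(f"Value with abandonment option: {value_with_option}")

On these made-up numbers the project looks attractive under the reference forecast alone, turns negative once the slump scenario is weighed in, and becomes worthwhile again only if a fallback plan caps the downside.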
It's better to act early, armed with knowledge and back-up plans, to avoid compounding mistakes.
“It’s about risk management,” he warns. “Looking at all possible outcomes and how to protect yourself. You have to think about different ways people can be wrong, due to random error.”

(Courtesy INSEAD Knowledge, INSEAD Business School, Abu Dhabi)
(knowledge.insead.edu/home.cfm)