LIVING WITH RISK

Can We Take It Seriously?

The more complicated a project gets, the greater the risk that something will go wrong.  And the more interconnected the systems that manage it become, the more likely failure will be catastrophic.  We see this now with oil spilling into the Gulf of Mexico.  We saw it two years ago with the credit crisis.  It happened at NASA.  It’s now a fact of life.

As David Brooks put it in Friday’s New York Times, we are at “the bloody crossroads where complex technical systems meet human psychology.”  Here (in my words) is his summary of our psychological failures.

*     Focused on our own piece of the puzzle, we fail to see the interconnectedness of the whole.

*     We get used to risk, and discount it.

*     We put too much faith in the back-up systems that are supposed to deal with the risks we do pay attention to.

*     Management, preoccupied with the pressure for greater productivity, is not geared to deal with failure.

*     We tend to discount bad news.

*     Groupthink eliminates the thinking that might alert us to existing problems.

For someone who is not a psychologist, he offers a brilliant summary of the interlocking issues we all face in such circumstances.  (See “Drilling for Certainty.”)

But there is one missing element:  the motivation for change.  What is needed to overcome these powerful cognitive, emotional and group tendencies?  Part of the problem is that there are powerful motivations to neglect the warning signs of danger, many of them suggested in Brooks’s list.  To go against the group’s coercive power, for example, means being willing to arouse its dislike, possibly even its derision.  Whistleblowers don’t usually have an easy time of it.  There is, as well, the expense of caution and delay, and the pressure of bosses who want immediate results.

But much of the power of the psychological factors Brooks outlines comes from the fact that they operate unconsciously.  Even when faced with obvious clues, people simply do not want to think thoughts that are inconvenient, difficult, and likely to make them unpopular.  They may not even notice the evidence that something is wrong.  If a convenient scapegoat is not available to take the blame and “solve” the problem the easy way, the usual resolution is to let the information simply slip from our minds.

People can be trained, however, to pay attention to the hints or fleeting signs of unconscious thoughts.  Moreover, they can work together to pick up on unformulated feelings, or hunches that something just doesn’t seem right.  The unwanted information can be reclaimed.

But to do this would require determination – and incentives.  We would need to have a collective understanding of our unconscious tendencies and how they operate in the workplace.  We would also need to put pressure on organizations to make the investment in such reflective processes.

Though extra thought can seem expendable, and we may never know for sure which disasters it has averted, it may be our key to a safer world.