Friday, September 10, 2010
Imagine this horrific leadership scenario: you are trapped in a mine 2,300 feet underground with 32 of your colleagues, with no real knowledge of when you will get out. What do you do to survive? Basic human survival needs, such as food and water, will be lowered to you through a small hole, but the questions you don’t have answered are similarly troubling. What do we do with our time? How do we organize ourselves? How do we keep up hope with no clear solution in sight?
No, this is not some crazy organizational behavior exercise designed to test team cohesion; it is the real situation faced by 33 miners trapped in a Chilean mine. The answer to the question: form a structure, keep to your normal routine, and keep your mind occupied. That is what NASA psychologists suggested when asked to lend their expertise to improve the chances that the miners could keep themselves focused on the mission. See the links here:
Also proving useful are lessons learned from the wars in Iraq and Afghanistan. What can be learned from this effort? Social organization, psychology, and leadership are essential to survival in the most extreme situations.
Thursday, September 2, 2010
Systematic thinking WAS THE PROBLEM in the mortgage crisis . . . ignoring the idiosyncratic warning signs of impending disaster
This blog focuses on applying organizational behavior and leadership concepts to current events. Last year, I focused attention on the financial meltdown of 2008 and its aftermath. It makes sense to carry the financial crisis theme into the first blog of this semester before moving on to other issues. One area of interest for me as a researcher and observer of events is how learning breaks down in organizations and the consequences of these breakdowns for organizational viability.
Evidence that learning is breaking down takes several forms. One form of evidence can be found when an individual in a powerful position recognizes a challenge to conventional wisdom, but writes it off as anecdotal because he cannot find systematic evidence of a trend.
While many write off anecdotes as untrustworthy information, an anecdote often provides an early warning sign of impending disaster.
An example of failing to take anecdotal information seriously occurred prior to the subprime mortgage meltdown, when regulators noticed that homeowners had begun to default on their mortgages but, failing to see a systematic trend, took no action. An interview with former US Federal Reserve Chairman Alan Greenspan on CNBC highlights the problem.
Click here for a link to a section of the interview in the CNBC documentary House of Cards
(Unfortunately, the section where Greenspan discusses how and why he discounted early warning signs as ‘anecdotal’ is only on the DVD, which must be purchased.)
The reason that regulators such as Greenspan ignored the data was a fundamental disbelief in the nature of anecdotal data. Greenspan, by his own admission, ignored early warning signs of the subprime meltdown because the signs were simply stories, one-time events, even willy-nilly. Yet, by the time the meltdown went into full force, subprime loans accounted for as much as 20% of all mortgages in the country. The stories proved to be more than anecdotes; they were in fact representative of a larger situation.
To understand the problem that results from valuing systematic trends over anecdote is to understand why organizations often overlook early warning signs of disaster. Evidence of impending disaster is just as likely to display itself in a story, a personal account, or even sketchy evidence as it is in systematic forms such as statistics or trends.
Systematic evidence of problems often emerges too late in the cycle of disaster to be of much use in warding off the crisis. Some have even referred to reliance on non-systematic data as a bias that limits decision making. See Max Bazerman’s wonderful text on this topic, Judgment in Managerial Decision Making.
But the failure to heed anecdotal or seemingly idiosyncratic data is not as rational as some may lead us to believe. In fact, the failure to heed early warning signs is itself a bias. Ignoring certain data at the expense of other data (what sociologists refer to as cognitive ‘editing’) is a skill, something developed over time, a quality of judgment. In many ways, those in the storm of the financial crisis ignored the behavioral or psychological side; they wrote off the use of judgment as irrational, when in fact they were simply embracing their own form of bias . . . and bias seems to have been pandemic during the financial crisis.
A recent documentary by the group at Frontline, entitled The Warning, exposes some of the oddities of Greenspan’s thinking. For example, did you know that his philosophy of free markets with little or no government regulation, his so-called ‘laissez-faire’ approach, is based on a philosophy put forth by Cold War novelist Ayn Rand? Rand wrote novels such as Atlas Shrugged, a favorite of corporate titans, that endorse only the use of strict rational logic in business and economics.
By his own admission, the principles to which he had clung, and which dominated US economic policy for almost two decades, were not a systematic philosophy describing how the world really works, but rather a system of belief about how things ‘should work’. I am not saying that ‘objectivism’, the philosophy of Ayn Rand, is without merit; every young capitalist, including myself, goes through a phase of admiring the objectivist worldview. But then we must grow up and come to terms with the idea that laissez-faire is really characterized by lazy thinking.
The financial crisis that came to a climax in 2008 revealed several important insights into the psychology of learning from failure. Assumptions about the nature of financial markets and how they work, personal investment in prior success, and the organizational tendency to downplay failure and overemphasize success resulted not simply from systematic bias, but rather from efforts to maintain ‘alleged’ systematic thinking at the expense of good judgment. In the end, learning broke down.
How is a great leader to assess these warning signs and separate the representative anecdote from the non-representative one? The first step is to hold your biases loosely and be open to challenges. Had Greenspan and those like him been wise enough to recognize the limits of ‘objectivism’ and embraced the subjective, we might all be in better shape today.