Applying PDCA: Bill tells the story of Dr. Deming; making 「擦り合わせ」 (suriawase, mutual alignment) transparent (transparency)
Peanut Butter on the Chin
by James O’Toole
9/23/08
What The Lucifer Effect, Philip Zimbardo’s landmark book on a prison experiment at Stanford University, tells us about the dangers of corporate conformity.
At company retreats in Aspen, Colo., over the last five decades, between skiing and spa-ing, corporate executives often attended performances at the Crystal Palace dinner theater. There, more often than not, they would implore the cabaret’s founder, Mead Metcalf, to sing his signature tune, “Peanut Butter on the Chin.”
Metcalf, who recently retired, claims he hated “the stupid song,” but couldn’t resist the shouted requests from a roomful of bigwigs. And it’s clear why they loved the ditty, whose main character is a corporate CEO who, in a rush to get to work, fails to clean his face after a hasty breakfast — and then passes the entire day with a lump of peanut butter on his chin. Of course, no one dares to give the boss a heads-up about the embarrassing blob. When he finally gets home after a busy day and takes his first look in a mirror, he is horrified by what he sees and concludes that he has made a fool of himself in the eyes of his minions. The song’s second stanza finds the CEO back in the office the following day. And, lo and behold, his entire management team sports lumps of peanut butter on the chin!
I thought about this lighthearted ode to corporate conformity while reading a much more serious account of the psychology that causes it: The Lucifer Effect: Understanding How Good People Turn Evil, Philip Zimbardo’s riveting — and chilling — account of the prison experiment he conducted at Stanford University in 1971. Older readers likely have at least heard something about this storied psychology experiment that quickly got out of hand. Young men had been assigned to play the roles of guards and inmates in an ersatz jail in the basement of a campus building, but the participants took their playacting so seriously that the scheduled two-week experiment had to be aborted at midpoint when the student-guards began to psychologically and physically abuse the student-prisoners. Zimbardo’s book is the first detailed, popular account of what happened. The retelling was prompted by the torture of Iraqi detainees by U.S. military grunts at the Abu Ghraib prison in Iraq in 2003, a bizarre and terrible incident that eerily replicated the earlier experiment.
In the book, Zimbardo’s goal is to understand why good people do bad things — to unravel the psychological and social sources of evil. He begins with a nearly 300-page, day-by-day account of all that transpired during the hellish experiment in 1971, and follows it with a review of the real-life horrors that occurred in Nazi concentration camps, the My Lai massacre during the Vietnam War, the cultish mass suicides at Jonestown in 1978, and the more recent genocides in Rwanda and Darfur. He reanalyzes these familiar events in light of two decades of research into the psychology of evil and the emotional causes of the worst manifestations of human behavior. In the process, he turns what we thought we knew about the subject on its head.
For years, experts had asserted that people do bad things because that is the inherent human condition; something in our DNA compels us to give in to the persistent temptation in life to do wrong. Zimbardo believes this assumption has no merit and uses his Stanford experiment and hundreds of subsequent psychological studies to disprove it.
Zimbardo’s prison experiment at Stanford demonstrated that human behavior is determined not by nature but by situational forces and group dynamics — the nurture, as it were, of our jobs and relationships, groups we belong to, and daily interpersonal interactions. Almost all of us can be drawn over to “the dark side,” where good people can end up participating in out-of-character, unspeakable activities, given either a large dose of peer pressure or some arm-twisting (obvious or subtle) by individuals we view as superior. In The Lucifer Effect, Zimbardo shows how easy it is to create situations and systems in which people are driven to do bad things by the nature of what’s around them. He highlights, for instance, the phenomenon known as groupthink, which occurs when all members of an organization become so inward-looking they fail to recognize that the assumptions driving their behavior are false, outmoded, or even self-destructive. But he concludes on a hopeful note: We can just as readily design systems and group behavioral models that lead to positive actions.
Letting Information Flow
Although Zimbardo mentions business organizations with only a passing reference to the WorldCom, Enron, and Arthur Andersen accounting and corporate governance scandals of the past decade, his general conclusions illuminate the source of unethical company behavior more adequately than do most of the published analyses specifically addressing that topic. His observations belie the standard explanation offered by business leaders when people in their organizations are caught misbehaving: Hey, there are a few bad apples in any barrel. Zimbardo argues that, in fact, ethical problems in organizations originate with the “barrel makers” — the leaders who, wittingly or not, create and maintain the systems within which participants are encouraged to do wrong. Hence, instead of companies wasting millions of dollars on ethics courses designed to exhort employees to “be good,” it would be far more effective for managers to make an effort to create corporate cultures that reward people for doing the right thing all of the time.
Zimbardo’s conclusions are important for business leaders not simply because they explain the behavior that leads to costly debacles like the Enron scandal. More immediately, they shed light on the organizational pressures to conform and the reluctance to speak the truth to supervisors and others in power that the CEOs in Aspen found so hilariously familiar in “Peanut Butter on the Chin.” These same forces hamper a company’s capacity to innovate, solve problems, achieve goals, meet challenges, and compete.
Research shows that successful organizations need a free flow of information, much as the heart needs a continuing supply of oxygen-bearing blood. For example, organizational theorists Robert Blake and Jane Mouton documented in one study that the ways in which airplane pilots interacted with their crews determined whether the crew members would provide essential information to the pilots in the midst of an in-air crisis. Stereotypical take-charge “flyboy” pilots who acted immediately on their gut instincts were far more likely to make the wrong decisions in trying to avoid disaster than were the more open and inclusive pilots who, in effect, said to their crews, “We’ve got a problem. How do you read it?” before they made up their minds on a course of corrective action. Tellingly, the crews of the take-charge pilots often possessed the very information that could have averted disaster, yet kept it to themselves. In essence, those silent crew members knew from experience that their leaders were not going to ask for their input, wouldn’t listen even if they volunteered useful information, and, worse, were likely to reprimand them if they dared speak out of turn. It’s a matter of trust, and it is the leaders themselves — and their organizations — who suffer most in untrusting cultures. By not listening to their colleagues, too many leaders shut out sources of potentially useful information. That’s why transparency is simply good management.
Indeed, there is only one effective antidote to organizational opacity and groupthink: creating organizational transparency — a culture of candor in which information flows unimpeded to those who need it when they need it, and in which no one fears the consequences of being forthright and honest with those above him or her in the company.
The potential benefits of transparency are, on the one hand, quite tangible. Edward Lawler, a professor at the University of Southern California Marshall School of Business and founder and director of USC’s Center for Effective Organizations, has found, for example, that posting everyone’s salaries on a company bulletin board or in a database boosts employee morale and increases trust in top management. Equally important, however, are the less obvious gains that transparency offers. When executives gratefully welcome information or suggestions from those down the line — even stories that perhaps they don’t want to hear — organizational perspective is broadened and groupthink is marginalized. In practical terms, the information those at the top need at any given time may be located anywhere in the organization, and that’s why clear channels of communication are a sine qua non of organizational effectiveness.
Yet, despite the apparent value of transparency, few companies can be characterized as transparent. (Indeed, 63 percent of the Midwestern executives I recently surveyed described their companies’ cultures as opaque.) As Zimbardo demonstrates, transparency runs against the grain of human organizational interactions. In all groups, there is a powerful desire to belong. Everybody wants to be liked, to be part of the family. Hence, the pressure to conform in groups is almost irresistible. Nobody wants to be the one to tell the boss that he misused a big word during an impromptu speech or that he is hopelessly mistaken about a set of facts.
At the top of the hierarchy, leaders understandably try to hide their mistakes; they hope to prove that they are smarter than those below them and, hence, don’t need their underlings’ input. At the same time, information is the most precious currency in most organizations. Leaders hoard it and share it only grudgingly; fast-trackers, golden boys, and members of A-teams view information as a perquisite of their positions. Thus, some in the organization are in the know and will always get heard, while others are left out, their ideas squelched to the detriment of the entire organization, including, paradoxically, the insiders themselves.
Corporate leaders who recognize the importance of transparency take practical steps to create cultures of candor. Some practice “open-book management,” as pioneered by SRC Holdings’ CEO Jack Stack, who provides employees with full access to company financial and managerial data. Others, like Kent Thiry, CEO of health-care provider DaVita, systematically collect data and solicit candid feedback from employees, former employees, customers, and suppliers in order to, as Thiry puts it, keep from “messing up.” Still others use Weblogs to give voice to the expertise at the bottom of the organization; reward employees who offer up their honest assessments freely; use formal exercises to challenge the organization’s basic assumptions about its commercial environment and stakeholders; and diversify membership in the C-suite to gain the benefit of multiple perspectives. Above all, these leaders adopt the first rule of information: “When in doubt, let it out.”
James O’Toole, the Bill Daniels Distinguished Professor of Business Ethics at the University of Denver’s Daniels College of Business, is coauthor (with Warren Bennis and Daniel Goleman) of Transparency: How Leaders Create a Culture of Candor (Jossey-Bass, 2008).