orgtheory.net

social significance of non-events

In my science and technology studies (STS) courses we always end the semester discussing science and engineering disasters (and what can be learned from them). This fall we are visiting Three Mile Island (TMI) and meeting with Ed Frederick, a board operator at TMI during the accident in March of 1979 (see Chapter Four of Three Mile Island: A Nuclear Crisis in Historical Perspective). The view of the cooling towers is awe-inspiring, and it only gets better the closer you get to the anti-tank pylons. The mock control room, which runs a training module, is always popular with the students, and so is the site’s exceptional security: occasionally a guy in riot gear with a machine gun appears from nowhere, counts the number of visitors, and then disappears.

Post-visit, on the bus ride back to campus, I loosely steer the discussion toward two questions that I don’t have the slightest answer to.

1. After studying the Tylenol Poisoning Tragedy (see chapter one of Minding the Machines: Preventing Technological Disaster) and many others, we ask: are there circumstances under which a firm might gain, over the long run, from a carefully handled crisis? Students, especially those of a conspiracy-theory bent, go nuts with this one and reformulate my question: are there circumstances under which a firm might gain, over the long run, from a carefully planned and handled crisis?

2. After meeting with Ed Frederick at TMI, we ask: what is the sociological significance of a tragedy averted? Many of our well-rehearsed case studies are of tragedies that have run their course (see, for example, The Challenger Launch Decision); however, tragedies averted, or what might be called “non-events,” seem less obvious in terms of theory.

Written by Nicholas

May 4, 2010 at 3:31 pm

9 Responses


  1. in response to question #1, one way a market leader could benefit from a crisis/scandal is if the crisis inspired new regulation that serves as a barrier to entry.

    gabrielrossman

    May 4, 2010 at 4:47 pm

  2. Wouldn’t learning from averted errors depend on having an organizational culture that is focused on safety and control? Otherwise, don’t people just forget averted disasters?

    fabiorojas

    May 4, 2010 at 6:19 pm

  3. Claus Rerup at the B school at Western Ontario looks a lot at the importance of near failures (and near successes) that he calls the ‘gray zone’. Worth checking out.

    cwalken

    May 4, 2010 at 6:56 pm

  4. I asked Kathleen Sutcliffe your students’ “conspiratorial” question at a conference last summer. She had presented the case of a museum whose roof caved in after a snowstorm. Because the museum just happened to be undergoing a sweeping organizational change process under the leadership of a new high-powered director, it was well positioned to make the best of the crisis. As community support started pouring in, the rebuilding went exceptionally well and the museum went from being a local attraction to being a world-class historical site (or something like that).

    Now, there was no concrete basis for any conspiracy theory in that case, but the question was: what, in these studies, counts against the following “implication for practice”:

    “if you want major organizational changes to succeed, set a big strategy process in motion and then plan and execute a crisis”

    The only answer we could come up with is that it’s too risky: a crisis is, statistically, not good for an organization often enough. But is that empirically true? Are there, as you suggest, circumstances that increase the odds? Could those circumstances be identified in advance, or even established?

    That is: there is a difference between noting that a crisis is sometimes good for an organization and identifying “circumstances under which” a crisis is *likely* to be good for an organization. If org theory did the latter, we would be quite close to offering a general theoretical basis for the popular conspiracy theories. After all, in cases where the circumstances obtain, planning a crisis *would* be rational.

    Thomas

    May 4, 2010 at 7:07 pm

  5. Concerning Thomas’ last point:

    There have been similar movements in certain corners of the institutionalist literature (with the goal being a general theory of institutional change): instead of asking what sort of great events (shocks/crises/whatever) might work to dislodge deeply entrenched institutional arrangements, they ask what it is about institutions that makes them susceptible to certain events while being resistant to others. Or, to rephrase Fabio’s comment: whether a crisis is going to be helpful depends not on the crisis, but on the properties of the organization.

    (And for the philosophy geeks, I’d argue that this is basically an example of moving from thinking about causation in terms of regularities among sequences of events to thinking about causation in terms of powers and capacities, or, more generally, of going from empiricism to scientific realism.)

    Asking questions like that seems to me to be exactly what the social sciences should be doing. After all, surely it would be more helpful if we could say not just that (e.g.) the chances of Somalia transforming into a functioning democracy in the near future are rather slim, but also what it is about Somalia that makes it so.

    Mike

    May 5, 2010 at 7:38 am

  6. Yes, Mike, though I would say that there has to be a certain degree of fit between the properties of the organization and the properties of the crisis.

    But there’s a disturbing implication, which was really the point of my comment. If the social sciences can describe the circumstances under which a particular kind of crisis will be good for a particular kind of organization (i.e., under which it will occasion a desired change) then those same social sciences are implicitly saying that one rational change strategy is to first arrange the right circumstances and then produce the crisis.

    Michael Brown seems to be trying this sort of argument out on Obama’s response to the oil spill.

    http://tpmlivewire.talkingpointsmemo.com/2010/05/chris-matthews-to-michael-brown-your-oil-spill-theories-sound-insane-video.php

    If the age-old dictum “every crisis is an opportunity” can be developed into a full-blown theory of institutional change then the idea that crises are produced in order to create opportunities is much less “insane” than Chris Matthews, for example, is suggesting.

    Thomas

    May 5, 2010 at 9:39 am

  7. “The view of the cooling towers is awe-inspiring, which only gets better the closer you get to the anti-tank pylons.”

    My favorite bass fishing spot, when I lived in PA, was in the Susquehanna River at TMI. Something about fishing in view of those towers made it seem a little more daring. And the fish sure grow big there!

    eric

    May 8, 2010 at 1:44 am

  8. If anyone is still reading … to point #1 re: gaining from disasters.

    Columnist Naomi Klein has an idea she calls “disaster capitalism” (The Shock Doctrine, 2007), where she claims that there are those who go to great lengths to take advantage of disasters, possibly even participating in creating them.

    I’m not a conspiracy theorist myself, but the idea may not be as far-fetched as you imagine. Consider how much “defense” work is outsourced to private military contractors, or the work of disaster relief. For these companies to create market demand is not that different an idea from marketers creating a need for $200 sneakers.

    Ken

    May 27, 2010 at 3:45 pm

  9. […] writing about the social significance of non-events for orgtheory.net, I asked the following […]


