orgtheory.net

systemic risks: too big, too complicated or too central?

Duncan Watts had an article in the Boston Globe last week (hattip: Karl Bakeman) looking at how the network structure of the banking industry might have amplified the financial crisis.  Watts comments:

Traditionally, banks and other financial institutions have succeeded by managing risk, not avoiding it. But as the world has become increasingly connected, their task has become exponentially more difficult. To see why, it’s helpful to think about power grids again: engineers can reliably assess the risk that any single power line or generator will fail under some given set of conditions; but once a cascade starts, it’s difficult to know what those conditions will be – because they can change suddenly and dramatically depending on what else happens in the system. Correspondingly, in financial systems, risk managers are able to assess their own institutions’ exposure, but only on the assumption that the rest of the world obeys certain conditions. In a crisis, it is precisely these conditions that change in unpredictable ways.

He suggests that regulators assess a company’s network position and take action to ensure systemic viability:

On a routine basis, regulators could review the largest and most connected firms in each industry, and ask themselves essentially the same question that crisis situations already force them to answer: “Would the sudden failure of this company generate intolerable knock-on effects for the wider economy?” If the answer is “yes,” the firm could be required to downsize, or shed business lines in an orderly manner until regulators are satisfied that it no longer poses a serious systemic risk. Correspondingly, proposed mergers and acquisitions could be reviewed for their potential to create an entity that could not then be permitted to fail.

This is a very interesting idea.  But it also raises a number of intriguing questions worth fleshing out in a little more detail:

1.  First, measurement: How would one measure the network connectedness of companies in a way that could adequately inform policy?  Many a network analyst would give up a limb in return for a government mandate requiring companies to provide network data of this kind.  But which network ties are the most relevant when it comes to robustness?  Cross-holdings?  Joint ventures?  Insurance instruments?  (A toy sketch of what such a review might look like appears at the end of the post.)

2.  Would the act of publicly measuring these network ties change the network structure?  Could simply collecting the data and educating people on what they mean be enough to influence tie formation?   Could doing that achieve robustness more effectively than regulating against being “too central to fail”?  In what ways might doing that pervert the “organic” process of network formation?  What unintended consequences might result?

3.  Lastly, a general question concerning the relationship between networks and institutions.  Is it a problem of a particular network structure which introduces too much systemic vulnerability?  Or is the problem a complex of rules that are too obscure to allow actors to take rational action?

On this last one, contrast Watts’s discussion of the Aisin-Toyota crisis in Chapter 9 of his book with an earlier paper on the topic of robustness (coauthored with Peter Sheridan Dodds and Chuck Sabel).  In the book, Watts discusses the case of a fire at Aisin, a supplier to Toyota, which occupied a bottleneck node in the production system.  The fire threatened to bring production throughout the system to a halt.  But the system quickly adapted because “this… is where all the training kicked in.  After years of experience with the Toyota Production System, all the companies involved possessed a common understanding of how problems should be approached and solved….”

To me, that speaks to the transparency and complexity of the rules by which the system operates more than it does to the particular network structure.  The rules were complex, but so well understood that others could adapt efficiently.  But the paper (and the Globe piece) focus on the network structure.  The theory is (1) that distributed network relationships generate more information at the moment a crisis hits, allowing actors to adjust more quickly, and (2) that a more distributed network can keep a cascade from spreading by diluting the risk associated with any one node.

These interpretations lead in two different directions when it comes to policy (complementary, perhaps, but still different).  If the interpretation is that rules and associated risks were obscure to actors two, three, four (or more) links away in the chain of dependencies, then it seems to me that the policy prescription should be to make the rules governing the system more transparent.  If the interpretation is that too much risk came to be associated with too few nodes, then the policy prescription he offers in the article — influencing the network structure to eliminate risk congestion — makes some sense in that it might slow down the cascade (theoretically giving other actors in the system time to adjust their strategies).  But that leads to one last question: is it sufficient?
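
As a toy illustration of the measurement question (no. 1 above), here is a minimal sketch of what a connectedness review might look like.  It uses Python’s networkx; the firms, ties, and exposure weights are all invented, and which tie type to measure (cross-holdings, joint ventures, insurance instruments) is precisely the open question:

    # Hypothetical exposure network -- every firm and weight is invented.
    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([
        ("BankA", "BankB", 5.0),        # e.g., notional exposure, $bn
        ("BankA", "InsurerC", 8.0),
        ("BankB", "InsurerC", 2.0),
        ("InsurerC", "FundD", 4.0),
        ("FundD", "BankE", 1.0),
    ])

    # Two candidate "connectedness" scores a regulator might review:
    exposure = dict(G.degree(weight="weight"))    # total gross exposure
    betweenness = nx.betweenness_centrality(G)    # bottleneck position

    for firm in sorted(G, key=betweenness.get, reverse=True):
        print(f"{firm}: exposure={exposure[firm]:.1f}, "
              f"betweenness={betweenness[firm]:.2f}")

    # Crude knock-on test: remove the most central firm and see whether
    # the rest of the network fragments.
    hub = max(betweenness, key=betweenness.get)
    H = G.copy()
    H.remove_node(hub)
    print(hub, "removed ->", nx.number_connected_components(H), "components")

None of this settles which ties matter; it only shows that once the data exist, the scoring itself is cheap.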

31 Responses

  1. Is it a problem of a particular network structure which introduces too much systemic vulnerability? Or is the problem a complex of rules that are too obscure to allow actors to take rational action?

    It’s both because the institution makes the organization — they define how the organization can adapt in response to particular stimulus.

    I wish Watts would have said something about time-correlation. Judge Posner’s been blogging about financial regulatory reform and noted last week that the size of the institutions was not the issue so much as the correlation in decisionmaking, which in turn led to correlated losses. Watts is an expert in synchronization.

    …but on the measurement issue… I am pretty much alone in arguing this right now, but I believe that we have to start with a more general hypothesis about human behavior than the rational hypothesis if we are going to make sense of both individual and group dynamics. As far as I can tell, the rational hypothesis has been verified only by empirical data on group behavior — i.e., revealed preferences of large groups of consumers over long periods of time. The flaw in using such data and models drawn from it to predict individual behavior is well-appreciated by readers of this blog, I know.

    One candidate hypothesis to build on in modeling both individual and group behavior is that there is a characteristic time period associated with particular economic activities — i.e., that consumption and production have a characteristic frequency. By aggregating individual frequencies into a frequency distribution, one has an empirically verifiable model of individual and group behavior.

    If such frequency distributions are stationary in time, then one should be able to produce a one-to-one map between revealed preferences for a population and a ranking of frequency of consumption for the same population.
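
    A toy version of what I mean, with all numbers invented: give each individual a characteristic purchase rate for a good, then aggregate those rates into the population-level frequency distribution the hypothesis is about.

        # Sketch of the frequency idea -- purchase rates are invented.
        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical buys-per-month for one good across 1,000 people:
        rates = rng.gamma(shape=4.0, scale=1.5, size=1000)

        # Aggregate individual frequencies into a frequency distribution.
        counts, edges = np.histogram(rates, bins=10)
        for count, lo, hi in zip(counts, edges, edges[1:]):
            print(f"{lo:5.1f}-{hi:5.1f} buys/mo: {count}")

    If that distribution is stationary, ranking goods by mean frequency should line up with the revealed-preference ranking for the same population.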

    Michael F. Martin

    June 29, 2009 at 7:08 pm

  2. Well, Duncan’s article does make the point that a cascading crisis can immediately shift preferences, making them not only obscure (I don’t know what risks someone 3 or 4 links away from me has taken) but also unpredictable. So, I wonder how that fits into the systems theory you’ve been working out…

    Then again, the article is more about influencing behavior in such a way as to keep the system robust than it is about predicting behavior accurately.

    That said, if I understand your frequency suggestion (and I’m not sure I do), it’s basically that the economy is not unlike the Tacoma Narrows Bridge:

    The goal of regulators should be to analyze what amounts to the “vibration frequency” of the system and take action to “tune” it rather than either (1) stiffening its core (to make it so predictable as to be rigid) or (2) making it more supple (as I think would be the implication of Duncan’s article; by making relationships among the actors more robust the bridge — in this metaphor — would not collapse as easily when the vibration gets intense).

    Sean Safford

    June 29, 2009 at 7:27 pm

  3. I agree with Martin. The Savings and Loan crisis showed that many small companies can all fail at once. Meanwhile, Australia and Canada both have highly concentrated banks, but few problems. It’s easier to monitor a few banks, and the relative lessening of competitive pressure reduces the urge to chase every last penny. The quality of company decisions matters, not just the interconnectedness.

    Thorfinn

    June 29, 2009 at 7:43 pm

  4. Sean,

    Nice connection to the Tacoma Narrows disaster. That is an example of resonance — where a system (the bridge) composed of lots of similar parts (the girders) has a “natural frequency” of oscillation. When driven at that frequency, the bridge can collapse. The Cowles Commission in fact published a treatise on this type of analysis in 1941. See here.

    What I’m talking about is actually a bit more subtle than that — although even that would be an improvement on some macro analysis. What I’m talking about is a self-organized state. The Tacoma Narrows Bridge, for example, was driven to collapse by the wind. Markets and complex systems generally collapse as a result of self-initiated forces.

    The Kuramoto model that Duncan Watts, Steve Strogatz, Damian Zanette, and others have studied demonstrates how weakly coupled oscillators can self-organize into stable or unstable equilibria through a process of synchronization or desynchronization.

    There is already empirical data out there that demonstrates that synchronization is happening. So this is not pie in the sky.

    But the way neoclassical economics tests the rational hypothesis is by assuming that preferences are stable and then watching them revealed over some period of time. This methodology assumes away the kinds of dynamics that the Kuramoto Model, for example, describes. You have to start with a different empirical hypothesis about human behavior in order to observe synchronization.

    Another way to put it is that the rational hypothesis has been tested rigorously (e.g., by Varian) against aggregate consumer data over long periods of time. As such, the empirical model of individual behavior it produces is a “mean-field approximation” — a model in which every individual within the aggregate has the same preferences. Mean-field approximations permit some insight into group dynamics — that’s why, for example, they’ve been adopted into law & econ. But they cannot be used to describe the local and temporary dynamics that define group behavior or culture. You need more data about individual behavior to do that. (As readers of this blog know!)
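
    To make “tested rigorously” concrete: Varian-style tests check whether observed price/quantity data are consistent with maximization at all. Here is a minimal sketch of a GARP check, with invented data for two goods over three observations:

        # GARP consistency check -- prices and bundles are invented.
        import numpy as np

        prices  = np.array([[1.0, 2.0], [2.0, 1.0], [1.5, 1.5]])
        bundles = np.array([[3.0, 1.0], [1.0, 3.0], [2.0, 2.0]])

        cost = prices @ bundles.T     # cost[t, s] = p_t . x_s
        own = np.diag(cost)           # p_t . x_t: what was actually spent

        R = own[:, None] >= cost      # x_t directly revealed preferred to x_s
        for k in range(len(prices)):  # Warshall: transitive closure of R
            R = R | (R[:, k][:, None] & R[k, :][None, :])

        strict = own[:, None] > cost  # strict direct revealed preference
        # Violation: x_t revealed preferred to x_s, yet x_s strictly
        # revealed preferred to x_t.
        print("GARP satisfied:", not (R & strict.T).any())

    A test like this says nothing about whether every individual inside the aggregate behaves this way, which is the point above.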

    Michael F. Martin

    June 29, 2009 at 9:34 pm

  5. The Millennium Bridge wobble is what I’m talking about:

    http://www.newswise.com/articles/view/515795/

    Michael F. Martin

    June 29, 2009 at 10:37 pm

  6. @ Michael F. Martin,

    I find your ideas, as written, very hard to follow, and I doubt I’m alone.

    Michael Bishop

    June 30, 2009 at 7:37 pm

  7. @Michael Bishop

    I’m doing the best I can.

    Michael F. Martin

    June 30, 2009 at 8:56 pm

  8. Actually, Mr. Bishop, it would be helpful if you could say what part of what I’m saying is hard to follow — is it the idea that the rational hypothesis has been tested only against group behavior, or the idea that individuals can spontaneously organize into groups without explicit verbal communication?

    Michael F. Martin

    June 30, 2009 at 9:03 pm

  9. It did take me several read-throughs to get a handle on your ideas, Mr. Martin. Partly it’s that you’re working in an area that isn’t taught in Econ 101. But it’s also written for an audience steeped deeply in economic theory, so it takes more effort to grasp than most people are willing to put into reading a blog.

    But, I do find the ideas conceptually interesting and I’m trying to get a handle on how they address the questions I posed.

    If I understand you and Thorfinn, you are saying that the three systemic risks I identified — size, complexity and centrality — are beside the point. The real issue is that lots of little people taking small, seemingly rational actions simultaneously is what destabilizes the system. As in the Millennium Bridge example.

    I’ll get back to the question of whether the structure is irrelevant here. First, though, is the question of why everyone is doing the same thing at the same time.

    In the Millennium Bridge example, the mechanism that produces conformity is some kind of an instinct that people have. Think back to that scene in Dead Poets Society when the boys all end up matching their strides: there is a cognitive bias — an instinct — toward conformity.

    (see 6:34 to 8:48 in this video)

    This is why the Army has a set policy of telling soldiers to break cadence on bridges: you need to consciously overcome that instinct (and the socialization of marching in step) in order to avoid destroying the structure.

    I haven’t read Judge Posner’s thoughts on this, but if that is also his view, I’d be curious what his prescription for it is. How do you get people not to act in ways that would destabilize the system?

    The typical response in terms of the economy is for the President to bring out some highly respected economist types who speak in measured tones about how they are working to bring everything back to normal. So you can go back to walking the way you always have; no need to go nuts. Go back to putting pressure on the system the way it is essentially designed to be pressured.

    I see what Duncan’s trying to do as examining the social-structural qualities that produce conformity. Rather than a cognitive (instinct-based) psychological approach, he’s looking for the structural/sociological drivers. Specifically, how risk dominoes through the structure to destabilize boundedly-rational action.

    He sees the problem as too much centralization or really too little redundancy in the network structure. He wants to create paths that would distribute the load. He wants to build a robust structure.

    But if I follow the logic, you are actually saying that Duncan’s idea of centralization is precisely the opposite of what is happening. The problem is not that the system is too dependent on information and risk flowing through a few nodes–or that there are a few pinch points that, if destabilized, would jeopardize everything. For you, it is that the structure is so dense that observation, comparison, learning and culture produce a common — indeed simultaneous — set of reactions which combine to violently shake the structure.

    Translating into physical imagery, Duncan sees this as a progressive collapse where one piece falls and topples the next. It’s a domino effect. If I were to characterize the structure as Duncan sees it, it would be like a gothic arch: if you remove one stone, the rest fall out, so you want to build redundancies into the structure to keep it standing. You need some flying buttresses.

    You see the structure as a suspension bridge. All of the pieces are linked together in a tensile structure (the bridge deck) so dense that an action on one part of the structure affects the whole structure simultaneously. The problem is not cascading risk as Watts sees it. It is conformity in the reaction to a shock that generates perverse feedback vibrations within the structure itself, as small, seemingly inconsequential moves made simultaneously by hundreds of thousands of actors in the market shake the structure almost to the point of breaking.

    So if I have this right, then I wonder what the structural solution would be?

    The solution of the army is behavioral (get people to walk out of step).

    I don’t see how a behavioral approach is possible in the context of a financial crisis. Is there a structural approach along the lines of what Duncan proposes (i.e., clear enough for policy makers and Mr. Bishop to get it), but that is better suited to the problem as you define it?

    Would you propose “dampers”? If so, what would they look like?

    Sean Safford

    June 30, 2009 at 9:49 pm

  10. Sean,

    I appreciate your taking the time here. I’ve been steeped in the economic literature for too long trying to identify the flawed assumptions that led to systemic failures. I’m relieved that you, at least, seem to understand what I’m talking about.

    The easiest thing to do is to break this down into pieces. Duncan and I both seem to picture the economy in a similar way — i.e., as a network of economic actors. Network theory often treats nodes and links as static, and then proceeds to analyze the network structure in terms of how many links there are per node as you zoom out from local to global by grouping what used to be clusters of nodes and links into single nodes and links at larger scales.

    It is not too difficult to see how some network structures might be more fault tolerant than others. Consider two large networks connected by a single node. If that node goes out, no communication can take place. If I understand him right, Duncan is concerned that the particular network structure that markets tend to spontaneously form into (i.e., scale-free structure) is not fault-resistant. I *think* this is what he had in mind talking about the problems of size, complexity, and centrality.
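
    That single-connector case is easy to check in code (a sketch using Python’s networkx; the graph is invented):

        # Two clusters joined by a single node: a worst case for faults.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([(1, 2), (2, 3), (3, 1),   # network one
                          (4, 5), (5, 6), (6, 4),   # network two
                          (3, 7), (7, 4)])          # node 7 joins them

        # Cut vertices: nodes whose removal disconnects the network.
        print(sorted(nx.articulation_points(G)))    # [3, 4, 7]
        G.remove_node(7)
        print(nx.number_connected_components(G))    # 2 -- no communication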

    I am NOT saying that network structure doesn’t matter. What I am saying is that this kind of static analysis of network structure doesn’t give you the whole picture, and that the dynamics it leaves out are probably more important to understanding what caused and how to prevent another systemic failure.

    A dynamic network can be modeled as a network in which a set of values that oscillate in time is attached to each node, with links defining whether and how the oscillating values can influence one another. It is a very counterintuitive fact that inanimate objects and insects — pendulums, fireflies — will spontaneously synchronize their oscillations under certain limited circumstances.
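
    For readers without the physics, a minimal Kuramoto-style simulation (all parameters invented) shows that counterintuitive fact directly. Each oscillator has its own natural frequency, yet above a critical coupling strength they lock together:

        # Toy Kuramoto model: N coupled oscillators. Parameters invented.
        import numpy as np

        rng = np.random.default_rng(1)
        N, K, dt, steps = 100, 2.0, 0.01, 2000
        omega = rng.normal(0.0, 1.0, N)         # natural frequencies
        theta = rng.uniform(0, 2 * np.pi, N)    # initial phases

        for _ in range(steps):
            # each oscillator is nudged toward the phases of the others
            pull = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
            theta += (omega + K * pull) * dt

        # order parameter r: ~0 means incoherent, 1 means fully in sync
        r = abs(np.exp(1j * theta).mean())
        print(f"r = {r:.2f}")   # well above 0 at K=2; near 0 at K=0.1

    Nothing in the loop tells the oscillators to agree; the coupling alone does it.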

    To see how this applies to financial regulation, consider how each firm is a nexus of cash flows in and out. Each of these cash flows represents a frequency of oscillation in $ per unit time. Increased leverage, decreased capital requirements, and the proliferation of derivatives led to exponentially denser links between firms through the slicing and dicing and trading of these cash flows.

    These dense links provided the preconditions for synchronized buying and selling — i.e., for the very high degree of correlation across asset classes that is now observable in markets. It is the synchronization that is responsible for the increased volatility of the market, not the lack of capital requirements, not the degree of leverage, not the failure of regulatory oversight.

    In the end, I agree with Duncan’s prescriptions. I just wish he would have been more explicit about how the structural characteristics he mentions lead to the peculiar dynamics (of synchronization and correlation) that we have observed in this crisis.

    Judge Posner has been on the right track in pointing out that centralizing regulatory oversight isn’t going to avoid this type of risk, and neither will shrinking banks, since the problem isn’t the size of the banks but the synchronized response to new market conditions. He’s also on the right track in proposing that regulators be rotated through various agencies so that a system-wide view can be developed over time. That might not prevent another crisis, but at least it would give us a better shot.

    At a very high level of generality, what is happening is that high-quality information is now available to all decisionmakers in a market at almost the same moment in time. So long as everybody gets the same information at the same time and has the same way of processing it, the market is going to remain very tightly correlated. What looks like increased volatility may simply be the same price shifts that were always there, but smeared out over long enough times to be lost in the “noise” of buying and selling unrelated to new information.

    Michael F. Martin

    June 30, 2009 at 10:40 pm

  11. As far as solutions go, I don’t know the answer. I guess the first question should be: do we want to put up with correlated drops because they also promote correlated growth? The answer to that must be at least partly yes.

    One can imagine selectively decoupling parts of the market to create redundancies. In a sense, we have done this politically in the United States by separating government officials into different groups with different but overlapping jurisdictions. Maybe we can do that with financial institutions too?

    The tradeoff is between stability and cost-efficiency.

    Michael F. Martin

    June 30, 2009 at 10:48 pm

  12. OK. But I’m getting confused when it comes to the theory of the crisis and the theory of the structure.

    Is the crisis caused by everyone doing the same thing at the same time? Or is it caused by some exogenous shock which then gets amplified by the simultaneous and essentially coordinated reaction?

    Duncan and you are both suggesting that Citibank, Lehman, Bear Stearns, Bank of America, etc. were too important because a moderate shock to them reverberated through the system, creating a cascading domino effect.

    You differ, though, in that Duncan’s point is that actors several steps down the line couldn’t behave rationally. You are saying that the ubiquity of quality information (and not-so-quality information) led everyone to react the same way at the same time: classic run-on-the-bank material.

    Duncan says the thing to do is keep companies from getting so big that they thump the system hard enough to trigger this massive cascade. You say the thing to do is to keep them from getting so big they trigger this massive simultaneous reaction.

    This is where my point in the original post comes in: if the problem is information, not the structure, then I don’t actually see how the structure affects anything. The problem isn’t the structure, it’s that people couldn’t see the risks 2, 3, 4 links away and act accordingly.

    But you are saying that the structure *does* matter, and this is where you go into a discussion of oscillation frequencies. And that, to be honest, is where I’m losing the plot. As you put it, “inanimate objects and insects — pendulums, fireflies — will spontaneously synchronize their oscillations under certain limited circumstances.” But that’s a big black box which any good sociologist would demand be opened to make sense of the rest of what you are writing about. It’s basically what we’re into. How are they coordinated? And as importantly, why does it matter?

    Sean Safford

    June 30, 2009 at 10:52 pm

  13. hmmm… my earlier comment looks a little curt, sorry.

    We could probably spend hours discussing this, and I doubt either of us want to, but I’ll ask a couple questions and make a couple points before dropping the subject.

    I got lost on your first sentence: “It’s both because the institution makes the organization — they define how the organization can adapt in response to particular stimulus.”

    You said: “There is already empirical data out there that demonstrates that synchronization is happening. So this is not pie in the sky.” But you don’t tell us what the data is, what is synchronizing, or how it explains the financial crisis.

    You frequently refer to the “rational hypothesis.” What is that? I have a feeling you are referring to a model, not a hypothesis. If I’m right, my follow-up point is this: All models are false; some are useful because they help us make more accurate predictions. Does your alternative model make more accurate predictions?

    Michael Bishop

    June 30, 2009 at 11:01 pm

  14. Not everybody acted at the same time. Citibank, Lehman, Bear Stearns, Bank of America acted at the same time. Then the next tier. And so on. So there was a cascade, but it resulted because there was almost no bank left with cash at each tier.

    I actually think I agree with you that it’s not the structure, but rather the decisionmaking, but I’m not sure we have the same reasons. I don’t think the structure is an inherently bad thing — the tiered structure is the most cost-efficient if it can be stabilized.

    So why didn’t anybody see this coming? Because everybody was relying on the same price and risk models. Nobody had an eye on the possibility of correlated buying and selling, or cascading failures.

    The point of the physics models is to understand how decisionmaking at banks that were not talking to each other might nonetheless become correlated because of links between their cash flows.

    If one bank demonstrates that it can turn a profit bundling and reselling subprime mortgages, other banks will tend to imitate it so long as there are buyers. Now assume that every bank can produce similar subprime mortgage bundles at about the same rate and cost, and that every bank can see the same market prices at which bundles are being sold. These are equivalent to the conditions under which inanimate oscillators will spontaneously synchronize. The point of the physics models is to warn us that we should not be surprised to see banks acting in perfect unison even without communicating with each other under these conditions. Maybe that doesn’t seem surprising to you. It did to me.
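
    A toy version of that imitation story, with invented numbers, shows why a shared market signal produces simultaneous moves while private information staggers them:

        # Banks adopt a strategy once the profit signal they observe
        # crosses a threshold. All numbers are invented.
        import numpy as np

        rng = np.random.default_rng(2)
        n_banks, steps = 50, 40
        signal = np.linspace(0.0, 1.0, steps)  # slowly rising profitability
        threshold = 0.6

        # Case 1: every bank watches the same public market price.
        public_step = int(np.argmax(signal >= threshold))

        # Case 2: each bank sees the signal through its own private noise.
        noisy = signal[None, :] + rng.normal(0.0, 0.15, (n_banks, steps))
        private_steps = (noisy >= threshold).argmax(axis=1)

        print(f"shared signal: all {n_banks} banks move at step {public_step}")
        print(f"private noise: moves spread over steps "
              f"{private_steps.min()}-{private_steps.max()}")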

    What is the useful prescription? To answer that I think we need more data. One of the problems we have had in understanding and analyzing this crisis is that nobody but the accountants inside each bank has access to the cash-flows that I’m talking about. The SEC does not require cash-flows to be reported more than once a quarter. To see these dynamics, you have to see how every bank’s cash flows changed on a day by day basis. The SEC could ask for that, and maybe they’ve seen it already. But doing stress tests and using models of past price fluctuations are not going to provide any insight.

    Ultimately, I don’t think any small group of regulators will be able to spot systemic risks. As hard as it may be for some people to stomach, I think the answer to these market failures is more market-based responses. To spot the links and synchronization I’m talking about, investors have to see more cash-flow information. So there, that’s one prescription.

    Michael F. Martin

    June 30, 2009 at 11:30 pm

  15. Ok well, interesting.

    First, plenty of sociologists have looked at what we’re talking about as “cascades”… it’s basically mimetic isomorphism. There are also theories about broadcasting information and how this generates conformity.

    But this discussion is raising a very interesting question about the speed with which that process takes hold and what impact that speed has on the stability of the system.

    There’s a long-standing discussion in the institutions literature about exogenous shocks as instigators of institutional change. But this conversation is pointing in a direction which is different from existing theory (which makes it interesting).

    So, thanks.

    Sean Safford

    June 30, 2009 at 11:37 pm

  16. Sean, what you’re saying makes a lot of sense. I think your questions are tough ones, especially number 3. This is not my area of expertise, so let me side step the tough questions and make a practical suggestion. In terms of policy, I think this might be a case where we can more accurately estimate the cost of various regulatory responses than the benefits.

    So let’s start by estimating the costs of various policy proposals: increased information disclosure, increased capital requirements, or, as Watts suggests, limits on the size of investment banks, insurance companies, etc.

    Even if our estimate of the benefits of these proposals has a wide variance (which I believe it inevitably will) we may find that some of the reforms are no-brainers because the costs (which are easier, if not easy, to evaluate) are sufficiently low. I know this leaves many questions unanswered.

    By the way, I’m a grad student at U of C, so maybe I’ll see you around one of these days.

    Michael Bishop

    June 30, 2009 at 11:40 pm

  17. @Michael Bishop

    No sweat. Glad you were willing to say more. Maybe this will be the last time people engage with me on this topic, but I appreciate it nonetheless.

    1. Institutions and organizations: Rules define how people interact; but people add, modify, and remove rules over time.

    2. Empirical data: http://arxiv.org/abs/0903.2099

    There are also lots of places where the increased correlation of market indices over the past decade has been measured.

    3. Rational Hypothesis: Posits that people tend to maximize the fit between available opportunities and preferences. It has been rigorously tested only against data on large populations of consumers over long periods of time. We simply don’t know if it is a correct description of individual behavior.

    I agree with your point about models. The problem with this one is that it has been applied far outside the scope in which it has been tested empirically. In this sense the problem is not so much the rational hypothesis itself as the way it has been tested: with data on group behavior over long periods of time.

    Michael F. Martin

    June 30, 2009 at 11:44 pm

  18. @ Michael F Martin,

    ok, I have a somewhat clearer idea about what you’re saying after reading over your subsequent comments.

    You call synchronization or correlated decisions the cause of problems, but I think they are better thought of as symptoms of problems. This is my concern about a lot of “systems” analysis: it seems to ignore the mechanisms. I’m inclined to believe that the cause of the problem is perverse incentives for many individuals and firms. Admittedly, that is a very vague answer, and the devil is in the details.

    Michael Bishop

    July 1, 2009 at 12:22 am

  19. The devil indeed is in the details. It seems likely that many mechanisms, including (maybe especially) perverse incentives, were at work in this crisis. Maybe I’m too infatuated with sync.

    Michael F. Martin

    July 1, 2009 at 12:40 am

  20. If you could predict them, they would not be called “surprises.”

    It is not an accident that “statism” shares its root with “static.” It seems that the anti-capitalist mentality may finally have accepted the natural existence of cyclical booms and busts. Now the new monster in the closet is “systemic shock,” i.e., events that are completely unexpected.

    The unexpected makes life worth living. Win some; lose some. The alternative is a slow, steady grind that ends at the grave.

    Wondrous inventions are unexpected events that ruin existing markets. A world without “systemic shock” is a lifetime without space stations, computers, organ transplants, contact lenses and hearing aids, anti-oxidant drinks, CAT and MRI and digital x-rays, cellphones, or a thousand other niceties that are nearly invisible because we take them for granted.

    Merchantry has always been interconnected. The Babylonians did it with clay tablets. Modern insurance was born in a London coffee house. I recently read THE MAN WHO FOUND THE MONEY about John Stewart Kennedy and the 19th century railroads. Markets were interconnected then, too.

    The present so-called “crisis” is nothing new. And you do not want to prevent it from happening again.

    Michael E. Marotta

    July 2, 2009 at 1:28 pm

  21. I’m not sure anyone is saying crises are new or even bad. The real question is how a system absorbs the crisis and reacts. Does it reassemble itself? Does it crumble? Do these processes happen quickly or not? Who wins? Who loses? Those are the big questions.

    Safford, Sean

    July 2, 2009 at 2:11 pm

  22. I think it’s worth retaining the usual understanding of crises as bad. But it is reasonable to fear that our attempts to avoid the next crisis could do more harm than good (though I’m not prepared to endorse this).

    @ Michael E. Marotta,
    When people talk about “systemic shocks” they are not talking about events that are merely unexpected. They are talking about events which endanger the functioning of a system that we depend on.

    “The unexpected make life worth living. Win some; lose some. The alternative is a slow, steady grind that ends at the grave.”

    While we must accept that there will always be unexpected events, you seem to be rationalizing, even worshiping, our ignorance. If we actually wanted more unexpected events, then we should quit science and quit planning for the future. That doesn’t sound like too promising a philosophy to me.

    Michael Bishop

    July 2, 2009 at 5:14 pm

  23. I don’t think that crises are bad so long as everybody is playing by the same rules. The people who get my sympathy in crises are the ones who played by the rules and still ended up losing because somebody else bent the rules.

    Michael F. Martin

    July 2, 2009 at 5:21 pm

  24. I think most people think crises are bad, by definition. It doesn’t matter which definitions we assign to which words; the important thing is that we agree to use the same definitions so that we can communicate.

    If you don’t think a particular situation is bad, I think you’d be better off arguing that “it isn’t a true crisis.”

    Michael Bishop

    July 2, 2009 at 5:45 pm

  25. See this blog post on disputing definitions:

    http://lesswrong.com/lw/np/disputing_definitions/

    Michael Bishop

    July 2, 2009 at 5:45 pm

  26. […] we will be able to assess whether or not an actor creates a risk for the network as a whole. Sean Safford on Orgtheory.net asks another good question: is the problem for a network that it is too vulnerable […]

  27. Here’s a post I did a while back with more discussion and links to articles about sync, the Kuramoto Model, the Millennium Bridge, and market price signals.

    http://brokensymmetry.typepad.com/broken_symmetry/2008/07/a-mathematical.html

    Michael F. Martin

    July 7, 2009 at 5:22 pm

  28. coming to this one a little late, but with a slightly different question. has anyone seen any work that tries to actually map some of the complexities in this particular collapse? i am working on a syllabus for an undergrad networks class that i’ll be teaching this fall, and think this would be a great addition, but thus far haven’t come up with a ton. i did see Valdis Krebs’s blog post with some simulated networks for demonstration purposes, but that was about the extent i’ve been able to dig up thus far. any pointers appreciated! thanks.

    shrinkingisaac

    July 9, 2009 at 12:55 am

  29. shrinkingisaac,

    Although there aren’t great diagrams, this paper from Gary Gorton provides the most thorough look at what was going on that I’ve seen as a member of the public:

    Click to access Gorton.08.04.08.pdf

    Michael F. Martin

    July 9, 2009 at 2:41 pm

  30. […] Safford over at orgtheory.net brings up 3 questions that get to the heart of the problem. I’ll mention some ways I think […]

  31. Michael F. Martin

    September 30, 2009 at 3:52 pm

