Earthquakes and enterprise-architecture

What happens when other people take our cautious ‘It depends…’ as an explicit Yes or No? What are the risks that we face as enterprise-architects when others force us to give a definite ‘Yes’ or ‘No’ in relation to something that’s inherently uncertain?

What can we do if those others base their actions and choices on that ‘definitive’ answer that they demanded from us – even though we’d told them that it was inherently uncertain? And what can we do to protect ourselves from the way people want to blame us – yet never themselves – when things turn out the opposite way to what we were forced to ‘predict’?

Like a lot of people in the EA ‘trade’, I’ve been very concerned about the implications of the ‘earthquake trial’ in Italy. To quote the BBC report:

This week six scientists and one government official were sentenced to six years in prison for manslaughter, for making “falsely reassuring” comments before the 2009 L’Aquila earthquake. But was this fair?

Reading somewhat between the lines, to me this sounds like a classic clash of paradigms:

  • the linear-paradigm, which expects and demands that everything can be reduced to simple true/false logic, that everything should be certain, known, predictable, safe
  • the flow/probability paradigm, in which everything is somewhat blurry and uncertain, and can only be described in statistical terms, in a modal logic of probability, possibility and necessity.

Many of the sciences will now only describe their work in terms of probabilities: weather-modelling is one obvious example. The catch is that many people want definite answers, a definite weather-forecast, because they need to base concrete real-world decisions on those answers. It should only take a few moments’ thought to realise that that clash has the potential for a really nasty wicked-problem… dangerous for everyone involved…
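As a minimal sketch of that clash – hypothetical Python, with invented numbers throughout – a probabilistic forecast carries real information, but the demanded yes/no answer throws most of it away:

```python
# Hypothetical sketch of the paradigm-clash: a probabilistic forecast
# versus the 'definite' answer that people demand. All numbers invented.

def forecast_rain() -> float:
    """Flow/probability paradigm: the honest answer is a probability."""
    return 0.30  # 30% chance of rain

def forced_answer(p: float, threshold: float = 0.5) -> bool:
    """Linear paradigm: collapse the probability to a 'definite'
    True/False at an essentially arbitrary threshold."""
    return p >= threshold

p = forecast_rain()
print(f"Probabilistic answer: {p:.0%} chance of rain")
print(f"Forced definite answer: {forced_answer(p)}")  # False - yet it rains 30% of the time
```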

That’s exactly the kind of clash that we live with every day in enterprise-architecture. For example, it’s the key source of others’ frustration with us whenever we reply to their question with the ubiquitous – and, to them, iniquitous – answer of “It depends…”. It’s why some planners fail to accept that a ‘roadmap for change’ will always be provisional – especially so in times as turbulent as ours, in terms of the scale and scope of changes in technology and just about everything else. And it’s why those so-certain-seeming models created within most of our current toolsets are actually darn dangerous for us, because they give a spurious illusion of certainty that does not exist – and cannot exist – in the real world.

To quote the BBC report again, the real key is communication:

This case is not about the scientists’ ability to predict earthquakes – it is about their statements communicating the risk of an earthquake.

This is why communication is such a key feature in enterprise-architecture frameworks such as PEAF and TOGAF. But again the same challenge will arise: how do we explain the nature and reality of risk? How do we explain something that’s inherently uncertain, to people who live in a world of ‘they-shoulds’, and who in some cases refuse to accept even the existence of uncertainty?

The Italian seismologists understood that an earthquake was unlikely but not impossible. In a press conference, however, the message seemed to be that that meant there was nothing to worry about at all. This is the falsely reassuring statement which formed part of the case against them.

Reading the BBC report, to me it sounds like the L’Aquila scientists and officials were pushed so hard for a non-existent certainty that they ended up giving way out of sheer exasperation:

The government official, Bernardo De Bernardinis – deputy chief of Italy’s Civil Protection Department at the time – is reported to have advised worried residents to go home and sip a glass of wine. He even specified what kind: “Absolutely a Montepulciano.”

This turned out to be a classic example of kurtosis – a seemingly low risk that, if it eventuates, will more than wipe out the gains that have been made from ignoring the risk. At L’Aquila, the risk of a serious earthquake was real, though very low; yet the demand for certainty forced a translation from ‘low risk’ to ‘no risk’. So the people went home, and ignored the ongoing minor tremors. Which was not a good choice: the much more serious earthquake did eventuate, all but flattening the town, killing more than 300 people and injuring many thousands more. One of the certain consequences of this kind of shock – especially on this scale – is the search for the scapegoat, for some one (else) to blame: and the scientists who had given the ‘wrong answer’ about the risk were the all-too-obvious targets.
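To put rough numbers on that asymmetry – a back-of-envelope sketch in Python, with invented figures rather than anything from the actual L’Aquila assessments – even a 2% risk can dominate the expected outcome once the loss on eventuation dwarfs the routine gain from ignoring it:

```python
# Back-of-envelope sketch only - all figures invented, not L'Aquila data.
# A 'seemingly low' risk still dominates the expected outcome when the
# loss on eventuation dwarfs the routine gain from ignoring the risk.

p_event = 0.02              # assumed (low) probability of the severe event
gain_from_ignoring = 1.0    # convenience gained by treating it as 'no risk'
loss_if_it_hits = 500.0     # catastrophic loss if the event eventuates

expected = (1 - p_event) * gain_from_ignoring - p_event * loss_if_it_hits
print(f"Expected outcome of 'no risk' behaviour: {expected:+.2f}")
# -> -9.02: the rare catastrophic loss more than wipes out the gains
```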

From an enterprise-architecture perspective, though, notice one of the themes I’ve recently been exploring here: even though it’s quite subtle, there’s a really serious power-issue in play in this case. What’s happened is that the townsfolk not only passed to the experts the (mental etc) work of assessing the risk, but also in effect ‘exported’ the (emotional etc) work of facing the risk. In this sense, ‘export’ is the attempt to offload onto others some form of work that should or can only be done by the self. In both a technical and very literal sense, ‘export’ is an active form of abuse. In reality, the work of facing the risk of earthquake could only be done by the townsfolk, because the risk was, by definition, theirs: and having avoided that work, the supposed ‘natural’ response is to try to blame those to whom the work was ‘exported’ – in other words, to now also ‘export’ the (emotional etc) work of dealing with the consequences of having avoided that work in the first place. That the scientists have now been ‘punished’ for their ‘misleading advice’ merely serves to anchor the delusion that people have a ‘right’ to export their fears to others in this way: in other words, a systemic or structural form of abuse.

Don’t laugh: most current organisations and enterprises are riddled with such forms of abuse – which is why there are so many disastrous power-problems and suchlike in those selfsame organisations and enterprises. Ouch…

Even more to the point, every enterprise-architect is inherently at high personal risk from that type of abuse. It’s an inherent outcome of the type of work that we do: we link across silos and projects that really don’t want to talk with each other, and would much prefer to have someone else take the blame for their own frustrations. Wicked-problems always make things worse in this sense: and, by its nature, enterprise-architecture is wicked – hence, all too easily, ‘the wicked one’ who can be blamed by everyone for everything. On top of that, we’re dealing all the time with inherent-uncertainty, and we’re surrounded by stakeholders who need concrete, actionable answers, and who really don’t want to hear the words “It depends…”. So don’t laugh at the scientists of L’Aquila, or the townsfolk either: it could very easily be you that’s next up for that kind of (mis)treatment – when you might find your stakeholders holding a rather different and more sharpened kind of stake…

Some practical suggestions here:

  • do ensure that your stakeholders understand and acknowledge that a probability always remains uncertain – that it is never certain, or reducible to a simple true/false answer
  • do ensure that your stakeholders understand that opportunity and risk are inherent flipsides of each other – opportunity always implies risk, and risk also always implies opportunity
  • do clarify the nature and scale of each opportunity/risk – preferably with explicit metrics to underpin each assessment
  • do ensure that your stakeholders understand the consequences of each opportunity/risk, in terms of its eventuation or non-eventuation, and of the (probable) implications of each choice for action or non-action
  • do document the risks, and others’ acknowledgement of those risks – a minimal sketch of one way to record this follows this list
  • don’t use terminology that implies any greater level of certainty than is actually the case
  • don’t use ‘hard-edged’ diagrams (ArchiMate, BPMN, UML etc) where there is significant risk of their being interpreted as implying a spuriously high degree of certainty
  • don’t allow others to ‘export’ their fears of uncertainty onto you – especially through systemic channels which afford you no defence against such actions
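As flagged in the list above, here is a minimal sketch of one way to record a risk and its acknowledgement – the field-names and values are hypothetical, not drawn from any particular EA toolset. The point is that the probability is held as a range rather than a false point-certainty, and that acknowledgements travel with the risk itself:

```python
# Hypothetical risk-register entry - field names and values invented
# for illustration, not taken from any specific EA toolset or method.
from dataclasses import dataclass, field

@dataclass
class RiskEntry:
    description: str
    probability_low: float    # keep a range, never a false point-estimate
    probability_high: float
    consequence_if_eventuates: str
    consequence_if_ignored: str
    acknowledged_by: list[str] = field(default_factory=list)  # who signed off

entry = RiskEntry(
    description="Severe earthquake following minor tremor swarm",
    probability_low=0.0,
    probability_high=0.4,     # the honest answer is a range: 'it depends'
    consequence_if_eventuates="Structural collapse; loss of life",
    consequence_if_ignored="Residents remain in vulnerable buildings",
)
entry.acknowledged_by.append("Civil Protection Department")
print(f"{entry.description}: "
      f"{entry.probability_low:.0%}-{entry.probability_high:.0%} "
      f"(acknowledged by: {', '.join(entry.acknowledged_by)})")
```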

You Have Been Warned, perhaps? But in any case, do take care on this: as we head into ever more turbulent times, this kind of wicked-problem is all too likely to get much, much worse. 🙁

[Update, 06 April 2013: Dany Mitzman, on the BBC website – see ‘Why evacuate for an earthquake no-one can feel?’ – reports on the all-too-predictable outcome of the prosecution and conviction of the L’Aquila scientists: hyper-caution on the part of the region’s scientists, and excessive false-alarms, triggering unnecessary evacuations night after night.

The case has produced its own kind of aftershock. … [Civil Protection Department] spokesperson Francesca Maffini says it’s inevitable that scientists are now erring on the side of caution.

“It’s not the verdict itself, it’s the very fact they were put on trial,” she says. “If the risk is between zero and 40%, today they’ll tell us it’s 40, even if they think it’s closer to zero. They’re protecting themselves, which is perfectly understandable.”

Whenever inherent-uncertainty is arbitrarily compressed down to absolute-certainty, we’re going to get problems – such as, in this case, either lethal complacency or relentless alarm. Far wiser to build a sense of perspective and proportion, and accept the uncertainty for what it is.]
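That compression can even be sketched directly from Maffini’s ‘between zero and 40%’ example – the range itself comes from the quote above; everything else here is invented:

```python
# Illustrative sketch - the 0-40% range is from the quote above;
# the rest is invented. Forced to a single 'certain' number, an honest
# range collapses to whichever end the current pressure points at.

risk_low, risk_high = 0.0, 0.40

def forced_point(low: float, high: float, fear_of_blame: bool) -> float:
    """Under demand for one definite number, report an extreme of the
    range: the low end (complacency) or the high end (constant alarm)."""
    return high if fear_of_blame else low

print(forced_point(risk_low, risk_high, fear_of_blame=False))  # 0.0 -> lethal complacency
print(forced_point(risk_low, risk_high, fear_of_blame=True))   # 0.4 -> relentless alarm
```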

2 Comments on “Earthquakes and enterprise-architecture”

  1. David French: That is a good checklist, but we seem to be deluged with risk managers, risk-management consultants and other risky professionals. The trend is to make ‘risk’ someone else’s problem, rather than recognising that risk, like strategy and change, is just part of the business of managing.
    While I accept wholeheartedly the avoidance of the ‘hard-edged’ representation of risk, the alternative seems to be words that are notoriously taken to mean whatever is convenient at the time – or simply weasel-words that sound as though an assessment has been made when it has not.
    Prefacing a project or programme management report with the statement that ‘we are progressing with these risks known but the likelihood of their occurrence unknown’ is probably honest but career-limiting. In the absence of a mandate to quantify likelihood (no doubt engaging valuable project time and money), at the very least establish what will happen when the risk eventuates, and what can be done to mitigate that effect.
