Reframing entropy in business
What is entropy? What – if anything – is its relevance to business? And how does chaos come into the picture?
The physics definition of entropy is straightforward enough: it sits at the heart of the second law of thermodynamics, which describes the way in which energy flows from a hotter region to a colder one, and so on. Entropy goes up as the range of possibility and action goes down. And it’s deemed to be irreversible – ‘the arrow of time’ – with everything fading away over (long) periods of time to the ‘nothingness’ that is the heat-death of the universe. Other common metaphors are the silting-up of a river, or a mechanical clock winding down to its final stop: not hard to see analogues of those in business-processes and the like…
The catch is that it’s not as straightforward as it looks. To quote the Wikipedia entry:
Outside the range of classical thermodynamics, the definition of the entropy of a small local region is no simple matter. … It is often assumed without proof that the instantaneous global entropy of a non-equilibrium system can be found by adding up the simultaneous instantaneous entropies of its constituent small local regions. For a given physical process, the selection of suitable independent local non-equilibrium macroscopic state variables for the construction of a thermodynamic description calls for qualitative physical understanding, rather than being a simply mathematical problem concerned with a uniquely determined thermodynamic description.
In other words, it’s starting to look much like enterprise-architecture, where the qualitative factors (including human-factors) mean that we can’t do all of the work with simple calculations: the whole of the system can be greater or less than the sum of its parts. Hmm…
Anyway, some good points there, but I couldn’t see how to apply it in business: too technical, too much of a metaphor, and potentially too confusing. Useful though it still looked, I shelved the concept of entropy from my enterprise-architecture work for a while.
Yet a few weeks back I was watching a BBC documentary on entropy (by physicist Jim Al-Khalili, I think?). Quoting standard physics, he says that entropy always increases as things move from order to chaos: and to illustrate this, he pushes a ceramic vase off the table, and lets it smash on the floor. Order, he says, is like the intact vase: we can do useful work with it. Chaos is the broken vase: we can’t use it to do the useful work of carrying water any more.
From order, to chaos: an irreversible decline. And yeah, we see that often enough in business too.
But hang on: just wait a minute, willya? ‘Order to chaos’ is the wrong way round – it relies on special-case meanings of both ‘order’ and ‘chaos’ that don’t match up well with anything elsewhere in physics. For example, elsewhere in thermodynamics, a ‘chaotic’ state for matter is a plasma – the highest energy-level, not the lowest. And the lowest energy-difference, the highest entropy-level, is also the most ordered: a stasis, where nothing moves, nothing changes, nothing happens. And yep, we see that all too often in business, don’t we…?
So although the usual description of entropy is a move from order to chaos, in terms of energy the clockwork winds downward from chaos to order. Available-energy will fall, and entropy will rise, as we move downward from the chaotic mountain-stream to the silted-up river-delta; entropy rises as we move from chaos to order.
Or, in a bit more detail (there’s a toy sketch of this sequence just after the list):
- (ultimate start-point is primordiality – ‘the moment before the Big Bang’)
- from chaos (maximum potential, maximum possibility)
- to useful order (maximum exploitable energy in constrained possibility)
- to non-useful order (low exploitable energy and/or misaligned possibility)
- to decrepitude [self-order] (no apparent energy and/or alignment of possibility)
- to stasis (no energy)
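As a toy illustration of the sequence above – nothing more than a sketch, where the 0-to-1 ‘possibility’ score and the stage thresholds are invented for this example rather than derived from physics – we could score a system by how much exploitable possibility it still has, and read off its stage:

```python
# Toy sketch: classify a system's decay-stage from a 0..1 'possibility' score.
# Both the score and the stage thresholds are invented purely for illustration.

def decay_stage(possibility: float) -> str:
    """Map remaining 'possibility' (1.0 = pure chaos, 0.0 = stasis) to a stage."""
    if possibility > 0.8:
        return "chaos (maximum potential, not yet directly usable)"
    if possibility > 0.5:
        return "useful order (exploitable energy within constrained possibility)"
    if possibility > 0.2:
        return "non-useful order (low or misaligned exploitable energy)"
    if possibility > 0.0:
        return "decrepitude (self-order; no apparent energy or alignment)"
    return "stasis (no energy; nothing changes)"

for p in (1.0, 0.7, 0.3, 0.1, 0.0):
    print(f"{p:.1f} -> {decay_stage(p)}")
```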
The reason we usually won’t see any reference to chaos in this more correct sense is that true chaotic contexts are often not directly usable as such: for example, there’s often too much energy or possibility to be exploitable in practice. Instead, it’s simpler for most views to start with ‘useful order’, and to end at the ‘decrepitude’ stage, because the latter looks like (and, at first glance, often is) an unusable mess.
Compare a factory working at full blast with an abandoned brownfield industrial site: the first is ‘useful order’, the second is a seemingly-unusable mess. The latter is self-ordered in a way that we can’t use: hence, in colloquial terms, ‘chaos’ – yet a misuse of the term ‘chaos’ that has some seriously misleading impacts here. Oh well.
Okay, fine, but so what? Does that pretty piece of petty pedantry have any use – any practical use – in business and the like?
Quite a lot, is the real answer: in fact it’s actually right at the core of all change in business. Hence why it kinda matters which way round we put it…
First, the slowdown implied by entropy is a natural and inevitable fact of physics. There is no such thing as an order that can be maintained indefinitely. Over time, even in business, useful order will always decay into non-useful order. We can perhaps delay that decay, but we can’t prevent it – and fighting to prevent it from happening will only make things worse. This is a really important fact of nature that needs to be understood right at the root of every enterprise-architecture.
Entropy applies to every system in business, including business-processes, system-configurations, work-rosters, business-models, facilities, resources. Everything decays and/or goes out of date. Entropy applies just as much to people in business: memory fades, capability fades, people leave, people die. And there’s nothing we can do about that fact: it’s inevitable, and irreversible.
In short, a pretty gloomy picture.
Yet that’s not actually what we experience in practice, is it? Sure, entropy is always there in the background, and we can never really escape it. But there’s also something else going on that gives an odd kind of ‘get-out clause’: even though we can’t ever truly escape entropy, we can sometimes bend it and twist it in some very useful ways – if we’re willing to accept how it actually works. The catch is that the way it works is almost the exact opposite of what most current business-paradigms either want or expect.
Part of this is that entropy is not evenly distributed. Different isotopes and elements have different rates of radioactive decay, different alloys rust at different rates, and so on. The same applies to differences between business-processes, business-facilities or whatever. There’s also varying ‘variety-weather’, where the factors affecting entropy and the like change over time and between different places. These are differences that we can leverage: for example, in some cases we can use something with a slower rate of decay to refresh something that has a faster decay. The ISO-9000 quality-system standard is built around exactly that principle: over time, work-instructions need to change faster than procedures, which change faster than policies, which change faster than enterprise vision – so in that sense, enterprise-vision can act as a stable anchor for an organisation’s quality-system even under conditions of turbulent change.
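As a rough sketch of that layering idea – the ‘half-life’ figures below are invented purely for illustration, not drawn from ISO 9000 itself – we could model each documentation layer with its own rate of going out of date, and see which layers fall due for refresh first while the slow-changing vision layer stays stable enough to act as the anchor:

```python
# Toy sketch of ISO-9000-style layering: lower layers go out of date faster
# than the layers above them, so the slower layers can anchor the refresh of
# the faster ones. The 'half-life' figures are invented for illustration only.

LAYERS = {
    "enterprise vision": 20.0,   # notional 'half-life' in years
    "policies":           5.0,
    "procedures":         2.0,
    "work-instructions":  0.5,
}

def remaining_relevance(half_life_years: float, elapsed_years: float) -> float:
    """Exponential-decay stand-in for 'how much of this layer is still current'."""
    return 0.5 ** (elapsed_years / half_life_years)

elapsed = 3.0  # years since the last full review
for layer, half_life in LAYERS.items():
    r = remaining_relevance(half_life, elapsed)
    flag = "needs refresh" if r < 0.5 else "still serviceable"
    print(f"{layer:20s} relevance={r:4.2f}  {flag}")
```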
Next, a corollary of the inevitability of entropy is that any apparently self-sustaining system is actually receiving energy from somewhere ‘outside’ of the nominal system-boundary. A simple real-world example is the rainfall-cycle: water evaporates from the sea or land-surface to form clouds, from which rain falls, and returns to the sea via streams and rivers – but it relies on energy from the sun to power the evaporation that drives the seemingly counter-entropy ‘upward’ part of the cycle. In business, the refresh of a business-process comes in part from ‘investment’ from outside of the business-process itself.
Another corollary is that potentially-useful order is the outcome of a decay from chaos – and once we start that decay going, entropy demands that it must inevitably continue towards non-useful order, and thence to decrepitude. We can ‘tame’ a mountain stream to give us useful hydroelectric energy, but the fact of extracting the energy leads inevitably to a faster silting-up of the stream.
A side-corollary is that intervening in a system to impose ‘useful order’ on that system inevitably changes the dynamics of the system towards a faster rate of decay. This particularly applies when the context itself is undergoing change: as architect Bert van Lamoen put it, “the most efficient system dies first”. In terms of entropy, the more we try to ‘control’ something, the faster it is likely to decay. Unfortunately, many standard business-paradigms teach managers to ‘take control’ of systems – yet fail to warn them of the inevitable consequences of doing so.
Decay is inevitable. In mechanical systems and any other systems that follow physical laws in a linear fashion, the decay is largely linear: once we ‘tame’ something to create useful order, the decay is already on its way. Yet chaotic-systems, complex-systems and other non-linear systems create ‘loopholes’ that can, in effect, locally reverse the flow of entropy. At a larger scale, non-linear systems still follow the rules – there’s still the same irreversible trend towards decay and stasis – yet there are loops and peaks and troughs where, for brief moments, the effective flow is in the opposite direction.
In effect, living systems leverage those loopholes to counter entropy within their own local context. By definition, it’s a risky tactic: if the living system catches the wrong edge of the curve, it will increase the rate of entropy rather than reverse it. Yet the payoff can be huge: even something approaching immortality, in some cases, at least at the species-level.
Within this, each species learns and remembers how to leverage opportunities for reverse-entropy. How exactly this learning and remembering takes place varies enormously from species to species and context to context – everywhere there’s some different mix of ‘learning by surviving’ (Darwinian) versus ‘acquired learning’ (Lamarckian) – but the actual mechanisms are usually less important than that the learning and adaptation does take place. There’s always some form of purpose to act as a guide for the trade-off between risk and opportunity, even if only as an ‘instinctual’ drive for individual and species survival.
All of this applies to business too: for example, in his book Antifragile, Nassim Taleb describes the implicit ‘species-level’ learning-process in the lifecycles of restaurants in a large city. Learning-by-dying is not so good for an individual restaurant, though: so at that level – the more typical concern for business-strategists, business-architects, enterprise-architects and the like – we need a more explicit learning-process, and usually a more explicit purpose as a counter-entropy guide, too.
An essential corollary from the above is that the more tightly the system is controlled, the fewer opportunities are available for reverse-entropy. Given ‘total control’, often the only option for the system to refresh itself is in a phase-change that’s often experienced as ‘catastrophic collapse’ or ‘revolution’. (It’s notable here that ‘revolution’ literally means ‘going round in circles’, but in essence it’s a cycle of catastrophic-collapse, reverse-entropy refresh and then a reimposed ‘order’, trending once more towards decay and decrepitude. We’ll see that a lot in business too…)
Reverse-entropy refresh is enabled by a return towards ‘chaos’; ‘micro-refresh’ can occur by leveraging small moments of inherent-uncertainty. Another corollary here is that since opportunities for micro-refresh can occur all of the time, but can only be leveraged if access to those opportunities is available, rigid imposition of ‘order’ inherently accelerates the rate of decay.
A lesser form of refresh can be created by going part-way towards ‘chaos’, leveraging different rates of decay between the components of a system versus the system as a whole: loose-coupling of elements enables greater opportunity for micro-refresh of the overall system. This is the fundamental principle behind service-oriented design: the system-components (services) are more stable than the system-as-a-whole (delivery of ‘business process’), hence we can refresh the system by reconfiguring the relationships between services, without changing the services themselves. This is, in effect, a return from non-useful order to useful order, where the only chaos element is in the process of reconfiguring of service-relationships, and perhaps also of specific services, rather than redesign of the entire system.
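A minimal sketch of that service-oriented principle, with invented service names and an invented ‘business process’: the services themselves stay unchanged, and the refresh happens purely by rewiring how they are composed:

```python
# Toy sketch of service-oriented refresh: the services (components) stay stable,
# and the 'business process' is refreshed by reconfiguring how they are wired
# together. Service names and process steps are invented for illustration.
from typing import Callable, List

# Stable services: none of these change during the 'refresh'.
def receive_order(order: dict) -> dict:
    return {**order, "status": "received"}

def check_credit(order: dict) -> dict:
    return {**order, "credit_ok": order.get("amount", 0) < 1000}

def ship(order: dict) -> dict:
    return {**order, "status": "shipped"}

Process = List[Callable[[dict], dict]]

def run(process: Process, order: dict) -> dict:
    for step in process:
        order = step(order)
    return order

# Original configuration of the process...
original: Process = [receive_order, ship]

# ...and a refreshed configuration: same services, new wiring (a credit-check
# inserted), with no service redesigned.
refreshed: Process = [receive_order, check_credit, ship]

print(run(original, {"amount": 500}))
print(run(refreshed, {"amount": 500}))
```

The ‘chaos’ element here is confined to the act of rewiring the process, not to the services themselves.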
There’s probably a lot more that we can do with this, but I’ll stop here for now. Once again, the key idea here is that the decay of entropy flows from chaos to order – not from order to chaos – and that the imposition of ‘order’ is itself the key driver towards decay in the usefulness of systems.
Something to think about, perhaps?
Addendum: when I put out a note on Twitter that I was going to write this post, it kinda triggered a really nice back-and-forth between Eric Stephens (@EricStephens) and Stuart Boardman (@ArtBourbon), which seems worth including here:
- EricStephens: “entropy” is a favorite word of mine when describing dysfunctional architectures
- ArtBourbon: entropy is an unavoidable aspect of our environment. The trick is not to make a dysfunctional response to it. No?
- EricStephens: agree not to have dysfx response. But can “architectural entropy” be minimized with a disciplined #entarch approach?
- ArtBourbon: good question. Maybe we need to redefine discipline in this context. Ashby seems relevant.
- EricStephens: Sorry, not following “ashby”
- EricStephens: IMHO, entropy = neglect, lack of maintenance, forethought. // I admit some misalignment may be part of normal “wear and tear” or environmental/competitive forces
- ArtBourbon: [re ‘ashby’] the law of requisite variety.
- EricStephens: just reading up on requisite variety last night. thx.
- ArtBourbon: Re “wear and tear” yes but dysfunctional responses to entropy are indeed a big problem. So, yes, agreed. // I tend to stick to the concept of entropy in physics – the degree of uncertainty/instability.
- EricStephens: despite my paltry knowledge of physics, that is my view of entropy as well.
- ArtBourbon: So EA can’t manage entropy by creating artificial certainties, rather by creating/enabling ability to change – fast
- EricStephens: exactly: after the drawing and hand waving is done, EAs need to formulate and lead a set of concrete actions…
Another addendum: another colleague pointed me to Frank Buytendijk’s post ‘Aristotle and Enterprise Architecture’, which also mentions entropy in business, though more from the classic view of ‘from order to chaos’. Well worth a read, anyway.
That’s it for now: over to you for comment, as usual?
Tom,
The popular notion of ‘order’ being the opposite of ‘chaos’, and hence of ordering forces being those that decrease entropy, is a bit misleading. What actually decreases entropy is self-organization, which happens after a sufficient degree of complexity.
The second law of thermodynamics explains the “arrow of time” and the natural trend towards chaos, but there is something quite challenging that seems to exist: Life. And hence the motivation to challenge the law. From Maxwell’s demon in 1871 through the work of Leo Szilard, the law has been challenged relentlessly, and it is worth noting what ‘entropy’ means in the context of information theory and cybernetics (but going there would take a lot of time to make the point). The first sentence of my reply probably reveals my preference for emergent self-organisation out of complexity being the actual phenomenon opposing entropy. In the words of Stuart Kauffman, “life exists at the edge of chaos”. And his work showed that there is no need of an external force or of natural selection for self-organisation to emerge. All that is needed is a sufficient number of nodes and connections between them. And even more precise are the findings of Murray Gell-Mann, introducing the concept of ‘effective’ complexity. Both order and chaos have low effective complexity.
So, I would agree with statements such as “the more tightly the system is controlled, the fewer opportunities are available for reverse-entropy” and “A lesser form of refresh can be created by going part-way towards ‘chaos’” and with some others – less so. One source of confusion for me is the following assertion: “potentially-useful order is the outcome of a decay from chaos”. Could you elaborate, please.
Ivo
Ivo, Stuart, Dave (and no doubt others too): a more general note before I go into detail-replies…
I have to apologise that this post is perhaps quite a bit more fragmentary than even my usual ‘exploratory’ posts. I usually prefer to write a post in one go, but for the past two weeks and more I’ve been struggling with a bad cold that doesn’t seem to shift and that’s made it very difficult to think straight for more than half an hour at a time. The result is that I wrote this one in small bits over a period of about a week, hence the continuity is more than a bit broken in places, and there are a few sub-themes still missing. I’ll answer as best I can, but I’ll admit I’m still a bit scrambled – sorry… 🙁
The key theme is this: the usual view of entropy is that it decays always from order to chaos, but there are some useful insights that can arise if we align this better with current physics, suggesting that the decay is actually better understood as from chaos to order. This isn’t an assertion that the chaos-to-order sequence is ‘the truth’, merely that it can be useful to reframe it that way. Useful in enterprise-architectures and the like, anyway.
I hope that makes a bit more sense?
Ivo: I’m a bit confused as to which direction you regard entropy as flowing? As I understand it, entropy will naturally tend to increase over time, whereas your description seems to me to imply that you view it as decreasing. Either I’m misinterpreting you, or one of us has got it the wrong way round: given my present befuddled state I’m presuming it’s me that’s got it wrong, but clarify this for me, if you would?
@Ivo: “The second law of thermodynamics explains the “arrow of time” and the natural trend towards chaos, but there is something quite challenging that seems to exist: Life.”
That’s exactly the point I’m making here in this post: the ‘standard description’ of entropy only makes sense with purely mechanical systems. Living-systems never actually break the rules as such, but they can leverage local variances in such a way as to bend the rules a very long way indeed. We can apply the same ‘rule-bending’ within organisations, process-design and much else as well, using the same kind of tricks that living-systems do. If we try to treat the organisation solely as a mechanical entity to be ‘controlled’, we cut off most of our access to such ‘rule-bending’ tricks.
(Thanks for the links and references – very useful indeed.)
@Ivo: “One source of confusion for me is the following assertion: “potentially-useful order is the outcome of a decay from chaos”. Could you elaborate, please.”
If we view entropy in terms of a decay of possibility, from infinite-possibility (e.g. immediately prior to the Big Bang) to zero-possibility (e.g. stasis of the ‘heat death of the Universe’), then ‘chaos’ represents very-high-possibility, but in practice often too high a range of possibility to be useful. We therefore apply constraints on that possibility in order to make it more useful (e.g. more repeatable). The constraints provide bounds that are usually interpreted or described as ‘order’ – hence ‘scientific law’ and suchlike. The fact of the constraints is, in effect, an increase in entropy, a ‘decay’ from higher-possibility to lower-possibility. Once we’ve started on that path, further and further constraints tend to accrete over time, until the result is eventually too constrained to be useful within the current context – in other words, ‘not-useful order’. Further accretion of constraints eventually leads to a system so constrained as to be apparently not usable at all for the current context – in other words, it’s considered decrepit. To make it useful again, we need to re-think and reframe the possibilities – in other words, break out of the current constraints and assumptions in some way. And to do that, we must in effect reverse the entropy – we ‘go back towards chaos’.
To summarise, the decay of possibility via the accretion of constraints both is, directly, an increase in entropy, and is, indirectly, analogous to it. In most cases we need some constraints in order to make things useful: but the fact of those constraints is also, by definition, a decay of possibility – hence “potentially-useful order is the outcome of a decay from chaos”.
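A toy way to picture that accretion of constraints – the option counts and the ‘context needs’ below are invented for illustration only – is to treat the context as a set of possible states, let each added constraint strike some of them out, and watch the remaining possibility shrink until nothing the context actually needs is still available, in other words ‘decrepitude’:

```python
# Toy sketch: 'possibility' as a set of available options, with each added
# constraint (rule, control, standard...) striking some of them out.
# Option counts and the notion of 'context needs' are invented for illustration.
import random

random.seed(42)
options = set(range(100))                          # 'chaos': a wide space of possibilities
needed = set(random.sample(sorted(options), 10))   # what the current context requires

constraints_applied = 0
while options:
    constraints_applied += 1
    # each new constraint strikes out roughly 20% of the remaining options
    struck = set(random.sample(sorted(options), max(1, len(options) // 5)))
    options -= struck
    usable = options & needed
    if not usable:
        print(f"after {constraints_applied} constraints: {len(options)} options left, "
              f"none match the context -> 'decrepitude'")
        break
    print(f"after {constraints_applied} constraints: {len(options)} options left, "
          f"{len(usable)} still useful")
```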
Tom,
I don’t have a (big) problem with the second law. And since I don’t have a cold, it might be my English that caused the misunderstanding. The statement “what actually decreases entropy is self-organization, which happens after a sufficient degree of complexity” relates to the findings of Kauffman and Gell-Mann (among others) on one side, and to the perception of an order-chaos polarity on the other. This is not surprising. The logistic map, bifurcation, and even the transformation of water from ice (order) through liquid (complexity) to gas (chaos) somehow arrange them along a perceived line, giving the wrong impression that complexity is something more chaotic than order, or more orderly than chaos.
In the organisational context, both order and chaos are dysfunctional in the long run. Order is stable but stiff; chaos can be necessary for innovation, but only if experienced for a very short time.
Tom, I have mixed feelings about this. I can happily agree with your contention that “chaotic” is not a descriptor of a bad system – just a type of system. Trying to impose a conventional sense of order on it is indeed probably the worst thing one can do – creating fictitious certainties. So yes, I think I completely agree with what you’re saying about enterprises.
I have a bit of a problem with how you’re using the concept of entropy. Both your use of the term here and my own in my recent work are somewhat metaphorical. Nonetheless it behoves us to stay as close as we can to the original meaning in physics. If we look at it more from the perspective of statistical mechanics than of thermodynamics, it’s clearer what it actually means. Wikipedia provides an acceptable statement of this “In statistical mechanics, entropy is a measure of the number of ways in which a system may be arranged, often taken to be a measure of “disorder” (the higher the entropy, the higher the disorder)”. So higher entropy does not mean a lower state of energy, just a lower amount of usable energy. High entropy does not imply stasis.
In this sense entropy affecting enterprises (whether internal or external) is not about decay into order but about the increase in variety. Hence the relevance of Ashby’s law. If we meet variety with variety, we can create a new type of order that coexists and can change with the realities of the environment. Much of your own recent writing addresses exactly this.
Am I just being picky? I’ll let someone else answer that.
Stuart
@Stuart: “Am I just being picky? I’ll let someone else answer that.”
I’ll answer that, and say that I don’t think you’re being picky at all. What I do think, though, is that you’re illustrating the confusion that exists around entropy, even in the sciences, with multiple and often-conflicting definitions as to what entropy actually is. All I’ve shown here is that one of those definitions, from astronomy/cosmology, is probably more useful to us (in enterprise-architectures and suchlike) than the usual ‘decay from order to chaos’ view.
I’d also suggest that the Wikipedia definition you quote is not merely misleading, it’s just plain wrong: “In statistical mechanics, entropy is a measure of the number of ways in which a system may be arranged, often taken to be a measure of ‘disorder’ (the higher the entropy, the higher the disorder)”. If we take that definition literally, entropy is actually a measure of the number of ways in which a system may no longer be arranged: increasing entropy represents a decrease in variety. In which case, their definition of ‘disorder’ makes no sense, because the logical extreme of their ‘disorder’ is one in which there are no possibilities for change – otherwise known as extreme order.
‘Order’ is a decrease in variety; ‘useful order’ is a decrease in variety that can be used to deliver ‘controllable’ or ‘predictable’ results; ‘non-useful order’ is where the variety has decayed to the point that the available options for change are not sufficient to match the needs of the context; and so on, with further decay towards stasis. Once the natural decay of variety hits ‘non-useful order’, we have to apply tactics and tricks to reverse-entropy locally in order to make it useful again. One classic pattern for reverse-entropy is ‘catastrophic-collapse and revolution’, but it’s by no means the only one, of course.
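Since Ashby’s law of requisite variety has already come up in the thread above, here’s a toy sketch of that variety-matching view – the disturbances and responses are invented for illustration: ‘useful order’ is where the system’s remaining variety of responses still covers the variety of disturbances it faces, and ‘non-useful order’ is where it no longer does:

```python
# Toy sketch of Ashby-style variety matching: a system stays in 'useful order'
# while its available responses still cover the disturbances its environment
# throws at it. Disturbances and responses are invented for illustration.

environment_variety = {"price_shock", "new_competitor", "regulation", "staff_turnover"}

def classify(available_responses: set) -> str:
    uncovered = environment_variety - available_responses
    if not uncovered:
        return "useful order (requisite variety met)"
    if available_responses & environment_variety:
        return f"non-useful order (cannot absorb: {sorted(uncovered)})"
    return "decrepitude (no relevant responses left)"

# Tighter and tighter 'control' progressively strips responses away:
print(classify({"price_shock", "new_competitor", "regulation", "staff_turnover"}))
print(classify({"price_shock", "regulation"}))
print(classify(set()))
```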
Dunno if that makes any more sense, but it’s all that I can manage at the moment… sorry…
Hi Tom,
Enjoyed the post, but can we reverse entropy and fight inevitability, or are we purposefully changing and, through adaptation, resetting the entropy clock?
It’s a difference in perspective that shifts emphasis from defense of ‘useful order’, to a view that promotes self-organization as a valuable affordance of architecture.
Order leads to controls/structures that accelerate entropy, but order is not the end – utility is.
If we want to maximize utility for varying and changing contexts, we should maximize design of extensible/configurable capabilities – latent capacity – and let those be exploited with minimal imposition of fixed structures. It’s not the absence of order/controls, it’s the use of loosely-coupled constraints.
This is a different worldview from Service Orientation, which promotes parts over the whole – modular reductionism (SOA apps are hard to govern, adapt, version control). In SOA, use of a part is defined by the part owner/designer – this is antithetical to self-organization / adaptation. Instead, the implementation should be controlled by the system/application/interaction, lowering the ‘cost’ of change with opportunity to improve outcomes.
Best,
Dave
@Dave: “Order leads to controls/structures that accelerate entropy, but order is not the end – utility is.”
Very strongly agree. Far too many people in business mistake the means – ‘imposing order’, etc – for the real ends of utility and purpose.
In general I’d agree with your points re SOA etc, though there’s a lot of fine detail re tactics and suchlike that would take probably an hour at least of phone-conversation to tweak out. May I leave that until I’m feeling somewhat better than at present? 🙁 – though do remind me that it’s something that really does need further exploration. Thanks!
Anytime – it’d be nice to catchup.
Get well.
Dave
Tom. Let’s be clear, the Wikipedia definition really is correct. I only used it, because I couldn’t find the book that set me off on my entropy thing, The Black Hole War by Leonard Susskind (it’ll be in one of 10 large boxes). I would have directly quoted from that but the Wiki definition pretty much says the same thing. Leonard Susskind, by the way, is Felix Bloch Professor of Theoretical Physics at Stanford University, and Director of the Stanford Institute for Theoretical Physics. That certainly doesn’t make him right about everything but I think he knows his statistical mechanics.
I can’t go along with the inferences you draw from the Wiki definition. Makes no sense to me at all but I’m not going to say it’s wrong, because that would just be an assertion and a distraction from the core of the discussion.
I have already agreed that “order is a decrease in variety” in the sense in which you use it above. I might want to modify the statement slightly and say that “useful order” is the result of meeting variety with variety. Non-useful order is then the imposition of order on variety leading, as you say, to a decay in variety – but then that’s a decay in variety within an enterprise, which then cannot deal with variety in the environment. The latter is neither a good nor a bad thing – it’s just a fact. I thought we agreed about that.
So we could perhaps create a union of our perspectives by saying that entropy in your sense is a (undesirable) result within an enterprise of failing to deal with entropy (in my sense) in the environment.
@Stuart: “the Wikipedia definition really is correct” – very probably, and I’m neither a physicist nor a statistician, so I won’t argue with it or you on that.
The real point I’m on about here – and I think we’re fairly close to agreement on this? – is that whichever definition we use for entropy, the effect in business is that order suppresses available variety, especially in terms of the organisation’s ability to cope with variety in its environment. In particular, the organisation’s attempts to impose order ultimately lead to a reduction in ‘useful order’ – available-variety of ‘control’ at a sufficient level to ensure organisationally-desirable results in relation to environmental variety. This then necessitates a ‘refresh’ which is made available not by imposing further order, but by leveraging opportunities for reverse-entropy, via some form of ‘return towards chaos’.
@Stuart: “The latter is neither a good nor a bad thing – it’s just a fact. I thought we agreed about that.”
I think we do? – I’ll admit I’m getting a bit confused about terminology etc here… 🙁
@Stuart: “So we could perhaps create a union of our perspectives by saying that entropy in your sense is a (undesirable) result within an enterprise of failing to deal with entropy (in my sense) in the environment.”
That’s probably the best way to put it, yes. (We might need to refine that a bit further, but I don’t think this is the right medium to do it? – better with a whiteboard, or a notepad in a pub somewhere?)
Thanks for the critique, anyway – much appreciated!
I’ve been chewing on this one all weekend. The physics analogies (“chaos” and “entropy”) are useful, but only up to a point. It seems that the organic model comes closer to describing human enterprises. I get the impression of a somewhat lumbering host inhabited by quicker thinking viral bodies that range from parasitic to symbiotic.
@Gene: “I get the impression…” – a really good way to put it – thanks!
Hugh,
I was thinking almost the same: what do we gain, from a storytelling point of view, by using those analogies from physics? Those terms are so abstract for most people (including me) in our enterprise architecture and business world that this discussion itself looks chaotic. Is there not a more understandable way to tell the story behind chaos and order?
I like this quote from “What Lies Between Order and Chaos?” by James P. Crutchfield:
Certainly, whatever this dynamic is, it is not unfamiliar to us. The structural anthropologist Claude Levi-Strauss describes the process as he experienced it during his first treks in the 1930s into the Amazon:
“Seen from the outside, the Amazonian forest seems like a mass of congealed bubbles, a vertical accumulation of green swellings; it is as if some pathological disorder had attacked the riverscape over its whole extent. But once you break through the surface-skin and go inside, everything changes: seen from within, the chaotic mass becomes a monumental universe. The forest ceases to be a terrestrial distemper; it could be taken for a new planetary world, as rich as our world, and replacing it.
As soon as the eye becomes accustomed to recognizing the forest’s various closely adjacent planes, and the mind has overcome its first impression of being overwhelmed, a complex system can be perceived.”
The article ends very powerfully with:
What lies between order and chaos? The answer now seems remarkably simple: Human innovation. The novelist and lepidopterist Vladimir Nabokov appreciated, more deeply than many, the origins of creativity in this middle, human ground:
“There is, it would seem, in the dimensional scale of the world a kind of delicate meeting place between imagination and knowledge, a point, arrived at by diminishing large things and enlarging small ones, that is intrinsically artistic.”
Source: http://csc.ucdavis.edu/~cmg/compmech/tutorials/wlboac.pdf
Peter,
There’s definitely value in those analogies, my point was that they had limits. Entropy most definitely applies to certain aspects of the enterprise, but the enterprise itself is better described using the organic model. Chaos theory is nicely illustrative of human interactions, even if (IMHO) it doesn’t strictly fit the definition.
Thanks, Peter – that’s a great article by Crutchfield (one of the chaos-theory classics, and unusually readable for a formal scientific paper!).
@Peter,
Physics is a specialised science. But it has made some remarkable achievements which have made human life better. Interestingly, some of these achievements go beyond the “scope” of physics and are used by other “specialised” sciences as well as by interdisciplinary ones. Entropy is one of those. E.g. see how significant it is to Information Theory.
The power of a story comes from emergent properties of the system in which the story is used. Simply put, the three main parts of the system are the story, the listener(s) and the environment. The emergent property comes from the unique interpretation that the listener (or reader) makes, which is an act of co-creation. This interpretation is dependent on the listener’s experience and personality, as well as on the (bi-directional) influence of the third component – the environment. The whole has adaptability and other emergent properties.
If we apply this to here and now, to this article and this discussion, it seems we are not dealing with entropy but with the “story of entropy”, and our co-creating based on our mental models is what matters and what brought 15 comments in a short time.
@Ivo: “it seems we are not dealing with entropy but with the ‘story of entropy’, and our co-creating based on our mental models is what matters”
Very, very good point – many thanks for that.
I didn’t question the value of using analogies from physics, and I’m certainly not questioning the value of physics itself. I was questioning the way (the form) the analogy is used and discussed. I deliberately gave examples and a link to a whole article showing how you can describe things from other sciences about the relation between order and chaos in a more understandable manner. In my opinion that would allow more people to take part in the discussion about an important subject.
For me, as a layman in the area of physics and someone who must cope with English as a second language, the article and the following discussion are almost incomprehensible. I read “What Lies Between Order and Chaos?” by James P. Crutchfield with much pleasure, and that gave me insights that help me to understand the above article and discussion a bit better. But I still cannot relate to the article and discussion here, and don’t feel comfortable participating in the discussion.
I totally agree with Ivo when he says “The power of a story comes from emergent properties of the system in which the story is used. Simply put, the three main parts of the system are the story, the listener(s) and the environment.” I postulate that most “listeners” are laymen in physics, that the environment is the world of business/enterprise architecture, and that the power of the story is not as strong as it could be, because the article and the discussion put too much emphasis on the physics side of entropy.
It is not an attack on the knowledge and intentions of the participants, nor on the use of analogies from whatever field. It is just a request for a bit of empathy for simple souls like me 🙂
Peter,
Thank you for the article. I looked at it briefly, and as it looks very interesting I placed it in my reading backlog. One thing I noticed is that it comes from the Santa Fe Institute, a leading research centre on complex systems. Murray Gell-Mann, whom I cited in my first comment, is from there. It seems that recently they have been trying to make their work popular outside academic circles: last summer they published a wonderful panel discussion on YouTube, and there will be a free online course as well.
Nice post and great discussion!
Order is unnatural, whilst chaos is natural. Order restricts flexibility and increases “stiffness”. Order is acceptable for a short period of time, as it leads to achievement of goals efficiently. There is order in chaos – except that the fidelity of our viewing perspective makes it seem like chaos. E.g. what appears to be a circle is, at a sufficiently fine fidelity, a collection of *infinite* straight lines; and a surface that appears smooth (orderly) to the naked eye is very rough under an electron microscope (chaos).
@Pallab – thanks for this: I hope I’ve extended it a bit more in the subsequent post ‘More on reframing entropy in business’.
“There is order in chaos” – reminds me of a reviewer of James Gleick’s book ‘Chaos: The making of a new science’, who said “It turns out that behind order lies an eerie kind of chaos – and behind that chaos lies an even eerier kind of order”. There’s a lot we can learn from that in exploring how our organisations and enterprises actually work – as (to me) can also be seen in those articles by Crutchfield and Gell-Mann that Peter and Ivo point to above.