IT – it’s about (much) more than just ‘digital-stuff’
Information-technology – what is it?
There are so many arguments on this that it’s probably simplest to sidestep the question, and say that information-technology is the technology of information. Kinda straightforward, when we look at it like that.
But in which case, what is the information to which that technology would apply? To hear most self-styled ‘IT’-advocates, you’d probably get the impression that it’s only the kinds of information that would fit on some kind of screen.
Actually, no. There’s a lot more to it than that…
A whiteboard is information-technology. The sticky-notes on that whiteboard are information-technology. The pen and paper with which we make handwritten notes about what we see on the whiteboard, and the notebook in which we make those notes, are all information-technologies too.
A photocopier is information-technology. So is the printing-press that produces the free-newspapers that litter the subway-station, and the notice-board that asks us to take those newspapers away with us when we’re done with them.
Conversations between people are a kind of information-technology. A cafe is a platform for such information-exchange. Likewise the office water-cooler, of course.
Information-technologies don’t have to be ‘digital’. An analogue radio is information technology. A film-camera is information-technology.
Some information-technologies are entirely physical. A Yale-key is information-technology: it carries information that the holder has rights of access to the space beyond the lock. The cash notes and coins in your wallet or purse are an information-technology: they denote rights of access or transfer of ownership for certain types of societal resources.
Even computers don’t have to be digital. There are plenty of analogue-computers around, each with their own specific usages. And people can be computers, too: up until around the 1960s, perhaps particularly in engineering and aircraft-design, ‘computer’ was a job-title, not a machine.
As information-technologies, computers don’t have to compute, either: most so-called computers don’t do much of it at all. As indicated better by the French term ‘ordinateur’ – an ‘orderer’ – what they mostly work with is the ordering of information: sorting, searching, stuff like that.
Which in turn points to a crucial constraint on the usefulness of computers – especially digital-type computers: that they’re often not good at all with information that isn’t amenable to order, structure, predefined schema and suchlike. There’s a lot of information that doesn’t fit well within those constraints.
So why do we so often get the impression these days that ‘information-technology’ is synonymous with ‘digital-computer’?
Short answer: an awful lot of sales-hype, all of it underpinned by a rather nasty little term-hijack.
A term-hijack occurs when, given a context denoted by some term, a subset of that context purports to be the whole – blocking the view to anything else in that context. Perhaps the classic example is the term-hijack of ‘economics’ as ‘managing the money’, whereas the literal meaning of ‘economics’ is ‘the management of the household’. Managing the money is important, but it’s only one small part of managing a household – and often the easy part at that…
Likewise the notion that IT is only about digital-computers: it’s a term-hijack, a tiny subset pretending to be the whole, and blocking our view to everything else.
(It’s even worse in enterprise-architecture, with a tiny subset of a subset pretending to be the whole.)
Why does this matter?
It’s because the hype-machine around digital-technologies keeps pushing the notion that every concern can be resolved in a ‘digital’ way – indeed, that the ‘digital’ is not only the best way, but the only way, always, for everything.
Which, for many contexts, is just plain wrong.
— Digital isn’t always the only way:
Most so-called ‘digital transformation’ is actually better described as ‘multichannel’ – and only some of those channels would (or even could) use current ‘digital’ information-technologies.
In any human context, person-to-person connection may be needed – much of which won’t happen if digital-technology is the only intermediary. In a conference-call, there’s often crucial information such as body-language that can’t be carried down the line: distributed teams need an occasional in-person meetup in order to work well.
— Digital isn’t always the best way:
I walk through a gleaming new aircraft factory in Mexico. Every work-area has its own kanban-board, for tracking local work-flow. Yet whilst there are digital status-displays, the kanban-system itself is entirely manual: hand-written cards in ordered racks. Why? It’s because the act of writing and handling a card embeds personal responsibility in a way that merely pushing a button just can’t.
A greetings-card through the mail is experienced differently from an emailed link to the same graphic-image. Music in digital format often sounds cold, clinical, mechanistic: hence, as Geoffrey Colon put it in a recent LinkedIn post, analogue is fast becoming ‘the new black’. The same for movies: there are very good reasons why many directors still prefer to work with film.
— Digital isn’t always the right way:
By its nature as an ‘orderer’, most digital-technology is not well suited to things that aren’t amenable to order, to predefined structure. The real-world is riddled with uncertainty, uniqueness, unpredictability, unorder. And there’s always something that doesn’t fit our assumptions – which matters when the information-need must match up with every eventuality. The man turned back at the border, for example, with a so-called ‘fake’ passport, because it had no fingerprint data – but those data were missing, not because his passport was fake, but because he had no hands…
And no matter how big the so-called ‘big-data’, there’s often no substitute for human judgement, human sensemaking and serendipity. Trying to do everything ‘the digital way’ too often leads us to a mindless ‘me-too’ mediocrity, driven by predefined ‘best practices’ that probably aren’t best for anything at all.
— Digital doesn’t always work:
Digital is great when it works, but often worse than useless when it doesn’t – especially if we rely on it as our only information-technology for that need. (‘Digital wallet’, anyone?) Wherever we use digital-technologies, we’ll always need a backup option that can take over when the power’s gone down, the server’s gone offline, whatever.
I walk into my local Starbucks. The network’s down again, they say: they can’t take credit-cards or NFC-payments, but they can still take cash. They write the order in a pre-printed ledger that usually sits under the counter, ready for use in exactly this context; I pay cash, take my coffee, and go. Later, when the network is back up again, they’ll transfer those hand-written records to the regular system. Different information-technologies, running in parallel, switching smoothly between them according to the need: can your current information-systems do that?
Hence as enterprise-architects, we do need to be wary of the current over-hype on ‘digital-everything’. We’re not being ‘Luddites’ to do so. (Actually, we are, in the proper sense of the term – but that’s another story!) Instead, we’re taking the necessary and appropriate precautions against ‘solutioneering’ – the misframing of a problem solely in terms of a ‘solution’ that someone wants to sell.
To do this properly, as enterprise-architects we should always work in this step-by-step sequence:
- What is the actual need within this context? (for the customer, citizen, company, supplier, regulator, whatever)
- What is the information-need to support that actual-need? (for all stakeholders in that context)
- What information-technologies (always plural!) would be available to support that information-need?
The choice of information-technology should always come last in that sequence – not first. We forget that fact at our peril…