A Review of "The Unaccountability Machine"
On optimising for the wrong things
Amazon link
The central theme of today’s book: the perils of setting a single optimisation objective. This isn’t new territory (e.g. Seeing Like A State), but the proposed solution is.
Our story begins with World War 2. Its problems required a new discipline, operations research, to grapple with the sheer scale of modern industrial systems. A methodological snag: how do you model systems that both affect and are shaped by their environment?
Take an anti-aircraft gun: sighting onto an enemy plane might prompt evasive manoeuvres, producing oscillations between forecasts and corrections. As a scientist, one could simplify by modelling sequential steps: action, then observation. Or one could consider the system as a complex whole. Davies uses the wonderful analogy of a Rubik’s Cube. The former (sequential and parallel analysis) would view it as 26 small coloured cubes; the latter would view it in totality.
The sequential view has since won out (largely thanks to the parallel invention of the transistor, highly suited to this type of maths). However, Norbert Wiener and Stafford Beer (American and British respectively) sought to maintain and develop a holistic view of whole systems.
Outside of Project Cybersyn, an attempt to coordinate Chile’s nationalised industries before Pinochet’s coup, Beer’s ideas have found limited real-world implementation. However, his intellectual lineage has left us with two tools: Ashby’s law of requisite variety, and the Viable System Model.
To illustrate Ashby’s law, consider a train driver. Here, the “system” (the rails) takes care of steering. The operator need only control speed, and a single forward-or-reverse lever suffices. To drive a car, on the other hand, we need a wheel and pedals. In short, a regulator has to contain at least as much variety (degrees of freedom) as the complexity of information and requirements imposed on it: only variety can absorb variety.
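To put rough numbers on the law (a toy framing of my own, not Davies’s or Beer’s exposition): if the environment can produce N distinct disturbances and the operator has only M distinct responses, no strategy can collapse the outcomes into fewer than roughly N/M distinguishable states.

```python
# Toy illustration of Ashby's law of requisite variety (my own framing, not
# from the book): with n_disturbances possible disturbances and only
# n_responses possible responses, no regulator can collapse outcomes into
# fewer than ceil(n_disturbances / n_responses) distinguishable states.
import math

def min_outcome_variety(n_disturbances: int, n_responses: int) -> int:
    """Lower bound on the variety of outcomes, whatever strategy the
    regulator uses: only variety can absorb variety."""
    return math.ceil(n_disturbances / n_responses)

# Train driver: the rails remove most environmental variety, so one
# forward-or-reverse lever keeps the outcome pinned to a single state.
print(min_outcome_variety(n_disturbances=2, n_responses=2))   # -> 1

# Give a car driver only that same lever and outcomes can no longer be
# held to one desired state: the controls lack requisite variety.
print(min_outcome_variety(n_disturbances=12, n_responses=2))  # -> 6
```

The numbers are arbitrary; the point is the ratio between environmental variety and control variety.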
So far, so innocuous. However, does a Fortune 500 company hiring McKinsey indicate some systemic failure, or business as usual? Does the replication crisis stem from bad apples, or a failure of the publishing system? To answer these questions, Stafford Beer schematised the constituent parts of any system into five pieces:
System 1: the “execution” part, those who do things to the outside world. (Soldiers, violinists, bus drivers)
System 2: the “enabling” part, those who directly enable external action. (Quartermasters, COOs)
System 3: the “manager” part. Systems 1 and 2 require some direction for action “here and now”: System 3 provides that guidance, and mediates conflict and exceptions.
System 4: the “reconnaissance” part. #4 looks ahead to what may be coming: think the CIA for government or strategy departments for companies.
System 5: the “identity” part. Conflicts between System 3 (“here and now”) and System 4 (future planning) require some mediation. This occurs through appeals to culture, vision or purpose.
These systems aren’t formal boxes on an org chart: any one person can perform several of them. Moreover, the definitions apply at every level of analysis, from a team within a company up to a whole market or an entire economy. With this toolbox, we can now diagnose system failures as either a) loss of information (Ashby’s law), b) part failure (in Systems 1 or 2), or c) coordination failure (in System 3 between 1 and 2, or in System 5 between 3 and 4).
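As a purely illustrative sketch (the names and the decision rules below are my own shorthand, not Beer’s formalism or Davies’s), the five systems and the three failure modes can be written down as a tiny classifier:

```python
# A toy encoding of Beer's five systems and the three failure modes listed
# above. The names and the diagnose() rules are my own illustrative
# assumptions, not a faithful rendering of the Viable System Model.
from dataclasses import dataclass
from enum import Enum, auto

class System(Enum):
    EXECUTION = 1        # System 1: acts on the outside world
    ENABLING = 2         # System 2: directly enables that action
    MANAGEMENT = 3       # System 3: directs the "here and now"
    RECONNAISSANCE = 4   # System 4: scans for what may be coming
    IDENTITY = 5         # System 5: arbitrates 3 vs 4 via purpose and culture

class Failure(Enum):
    INFORMATION_LOSS = auto()      # signal discarded on its way up (Ashby)
    PART_FAILURE = auto()          # a System 1 or 2 unit simply not working
    COORDINATION_FAILURE = auto()  # 3 failing to mediate 1 & 2, or 5 failing 3 & 4

@dataclass(frozen=True)
class Symptom:
    signal_reached_decision_maker: bool
    failing_units: frozenset

def diagnose(symptom: Symptom) -> Failure:
    if not symptom.signal_reached_decision_maker:
        return Failure.INFORMATION_LOSS
    if symptom.failing_units & {System.EXECUTION, System.ENABLING}:
        return Failure.PART_FAILURE
    return Failure.COORDINATION_FAILURE

# A signal that never reaches anyone who decides is diagnosed as lost information:
print(diagnose(Symptom(signal_reached_decision_maker=False,
                       failing_units=frozenset())))   # -> Failure.INFORMATION_LOSS
```

Nothing here is load-bearing; it only makes the three diagnostic categories concrete before applying them below.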
The Unaccountability Machine applies these ideas to contemporary society through the notion of an “accountability sink”, a kind of coordination failure. Modern systems handle complexity through processes that remove responsibility from any one person. We may view this as Ashby’s law in effect: complex information is simplified into cruder “if-then” statements.
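To make the “if-then” point concrete, here is a deliberately crude sketch (the ticket fields and routing rules are invented for illustration, not taken from the book): a rich signal is squeezed through a fixed script, the detail is discarded, and no individual ever decides to ignore it.

```python
# An "accountability sink" as lossy compression (invented example): the
# script inspects two coarse fields, so whatever lives in the free text
# never reaches anyone empowered to act on it, and no single person ever
# made the decision to drop it.
def triage(ticket: dict) -> str:
    if ticket.get("category") == "billing":
        return "send refund-policy template"
    if ticket.get("days_open", 0) > 30:
        return "escalate to tier 2"
    return "close as resolved"

ticket = {
    "category": "other",
    "days_open": 3,
    "free_text": "Every delivery for six months has gone to the wrong depot.",
}
print(triage(ticket))  # -> "close as resolved"
```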
However, this has many unintended consequences. Politics should, in theory, express the will of the electorate; in practice, no one really believes a single vote or phone call changes much (loss of information). Moreover, politicians have ceded control of infrastructure like money and energy to third parties (part failure). Markets are no better: profit maximisation lets firms focus on one external signal to the exclusion of many others.
Finally, when a fundamental system goes wrong, no one foresees it or has the agency to correct it (coordination failure). Economics continually misdiagnoses this as a failure of incentives, rather than of information or system design. Take financial crises, for example: so far, regulation seems to have had little effect (and is missing the brewing storm in CLOs). No one wants another 2008, but everyone is working towards one.
I’ve done only partial justice to the book. As with many “big idea” books, its diagnosis is more convincing than its solution, and it’s unclear to me how anything short of a fundamental system reset could resolve these issues. Still, it’s a highly accessible introduction to cybernetics as an alternative paradigm for understanding society.


