"The Unaccountability Machine" by Dan Davies - Book Review

Summary
"Longlisted for the 2024 Financial Times Book of the Year. How life and the economy became a black box--a collection of systems no one understands, producing outcomes no one likes.
Passengers get bumped from flights. Phone menus disconnect. Automated financial trades produce market collapse. Of all the challenges in modern life, some of the most vexing come from our relationships with automation: a large system does us wrong, and there's nothing we can do about it.
The problem, economist Dan Davies shows, is accountability sinks: systems in which decisions are delegated to a complex rule book or set of standard procedures, making it impossible to identify the source of mistakes when they happen. In our increasingly unhuman world--lives dominated by algorithms, artificial intelligence, and large organizations--these accountability sinks produce more than just aggravation. They make life and the economy unknowable--a black box for no reason.
In The Unaccountability Machine, Davies lays bare how markets, institutions, and even governments systematically generate outcomes that no one--not even those involved in making them--seems to want. Since the earliest days of the computer age, theorists have foreseen the dangers of complex systems without personal accountability. In response, British business scholar Stafford Beer developed an accountability-first approach to management called "management cybernetics," which might have taken off had his biggest client (the Chilean government) not fallen to a bloody coup in 1973.
With his signature blend of economic and journalistic rigor, Davies examines what's gone wrong since Beer, including what might have been had the world embraced cybernetics when it had the chance. The Unaccountability Machine is a revelatory and resonant account of how modern life became predisposed to dysfunction."
Review
This non-fiction book taps into a feeling that many people share about the contemporary world: that something has gone deeply wrong, and we do not know how to fix it. Drawing on the academic literature on cybernetics, the author explains how large systems such as corporations and governments work, and how trends of the past few decades have weakened the feedback loops that would allow those systems to self-correct. The most memorable concept is the "accountability sink": large organizations create complex structures that dilute responsibility, so that when things go wrong there is no one to blame, because poor results cannot be traced back to specific individuals or decisions, only to the system itself. A good example is a flight cancellation that leaves customers arguing with employees who had no part in the decision. That said, the book is not always a pleasurable read; it can be obscure and confusing, especially for readers unfamiliar with the academic field it draws on.
Key Takeaways
- Modern corporations have incentives to create accountability sinks
- Systems perform the function they are designed for
- As a society, we need to restore the feedback mechanisms that let systems producing bad results correct themselves
Who Should Read This
Readers looking for an introduction to cybernetics in the context of contemporary economics and business.
Favourite Quotes
"The people in the system are good, but the system itself is not."
"If you read the great books of management with this in mind, you’ll notice a strong common theme; the stars of this literature all try to get managers to understand that they must create systems which regulate themselves rather than requiring constant supervision. Most of what’s worth reading in management science is about stopping decision-making systems from becoming overwhelmed."
Rating
⭐⭐⭐ (3/5 stars)