This is an excellent survey of adaptation as a strategy for dealing with complex environments in which top-down rational planning simply can't find solutions. There are many excellent stories, but this one bit speaks to me, a 1960s-generation person:
When the US Army faced the 'disruptive innovation' of guerrilla warfare in Vietnam, there was great reluctance to accept that it had changed the nature of the game, making obsolete the Army's hard-won expertise in industrial warfare. As one senior officer said, "I'll be damned if I permit the United States army, its institutions, its doctrine, and its traditions to be destroyed just to win this lousy war." That ranks up there with the infamous statement by a US major, "It became necessary to destroy the town to save it," as reported by Peter Arnett.
This book is richly illustrated with examples of how an evolutionary strategy can find a solution where managerial dictum or top-down planning fails miserably. Some of the best examples are from business.
He does a very good job of reviewing:
- The disaster of the US invasion of Iraq and how a bottom-up effort by rebellious colonels and captains finally steered US military tactics off their dangerous track and toward a more successful approach.
- The regulatory failures and coupled risks that turned a meltdown in the confined "sub-prime real estate" market into the globe-straddling collapse of financial markets in 2008. He walks the reader through the failure to listen to whistle-blowers and the reluctance to effectively change the banking rules to prevent another catastrophe.
- The mindless simplifications of a greenhouse-gas enthusiast compared with the known complexities of identifying a proper "green strategy". He walks through a day's worth of choices among "green alternatives" and shows why each and every one was wrong, because the underlying reality is far more complex than the simplistic green enthusiast could ever imagine.
- Nuclear safety, showing how the various catastrophes were waiting to happen because the systems are designed with too much complexity and coupled failure modes.
- Two big oil rig disasters, the Piper Alpha in July 1988 and the Deepwater Horizon in April 2010. He walks through the failures in design and safety systems and shows why these were accidents waiting to happen.
I worked in a company where they paid lip service to the idea that "there is no failure, just a learning opportunity" and that projects, especially in the R&D lab, should expect a high failure rate. But in reality, failures were punished, so creativity was suppressed and lessons really weren't learned. Tim Harford gives a glowing review of Google as a learning environment with adaptive engineering practices, but I'm cynical. It is hard for managers to accept failure. Corporations are always going to get atherosclerosis. The big old successful corporations are always going to fall to the young, rising whippersnappers.
I also worked closely with QA (Quality Assurance) people and watched them play their role. In theory they were the frontline defence against obfuscation and deception on the part of managers and teams that were failing but wanted to pretend that things were going tickety-boo. The org chart showed them reporting independently right up to the CEO to ensure independent and timely information about project problems. But in reality project managers had a "right" to demand that issues first be heard by them, and they could muscle most QA auditors into silence. Similarly, I was involved in an ISO 9000 initiative within the company and quickly discovered that most of our "learning organization" capabilities, such as our extensive audited written procedures, were in fact window-dressing. In short: it is hard to build and maintain a truly adaptive organization that uses evolutionary strategies for problem-solving. Humans don't like uncertainty, and they love hierarchical organizations. I like the message in Tim Harford's book, but I'm sure it will get more lip service than real implementation.
There is much wisdom in the book and much to learn; I strongly recommend it to everyone. It will open your eyes to the complexity that is out there. It will stun you to realize how badly our engineered "safety systems" have failed. And it will give you an appreciation of the need for more experimentation and a healthy acceptance of failure as a technique for learning.