If it can be built, it can fall apart: a cautionary study in how complex systems can easily go awry. As systems become more complex, guided by artificial intelligence and algorithms as well as human experience, they become more likely to fail. The result, write one-time derivatives trader and commercial pilot Clearfield and Tilcsik (Univ. of Toronto Rotman School of Management), is that we are now “in the golden age of meltdowns,” confronted on all sides by things that fall apart, whether the financial fortunes of entrepreneurs, the release valves of dam plumbing, or the ailerons of jetliners. The authors examine numerous case studies of how miscommunications and failed checklists figure into disaster, as with one notorious air crash in which improperly handled oxygen canisters produced a fatal in-flight fire: “The investigation,” they write, “revealed a morass of mistakes, coincidences, and everyday confusions.” Against this, Clearfield and Tilcsik helpfully propose ways in which the likelihood of disaster or unintended consequences can be lessened: cross-training, for instance, so that members of a team know something of one another’s jobs and responsibilities, and iterative processes of checking and cross-checking. At times, the authors venture into matters of controversy, as when they observe that mandatory diversity training yields more rather than less racist behavior and suggest that “targeted recruitment” of underrepresented groups sends a more positive message: “Help us find a greater variety of promising employees!” Though the underlying argument isn’t new—the authors draw heavily on the work of social scientist Charles Perrow, particularly his 1984 book Normal Accidents—their body of examples is relatively fresh, if sometimes not so well remembered today—e.g., the journalistic crimes of Jayson Blair, made possible by a complex accounting system that just begged to be gamed.
Programmers, social engineers, and management consultants are among the many audiences for this useful, thought-provoking book.