Engineering in reverse

Authors: Jonathan Grudin
Posted: Thu, January 09, 2014 - 11:34:14

As a new year starts, we may review the year past, taking note of passages and travel, selecting events that provide humorous, solemn, embarrassing, or celebratory glances back. A crafted retrospective might be accompanied by a resolution to do better.

More broadly, much time is spent analyzing the past. Acclaimed successes—a project or product, a career, a discipline—we wish to understand and emulate. We can also learn from failures—a terminated project, someone who missed being a contender, an unsuccessful line of research. Any project can reveal possible efficiencies; any life can be learned from.

Reverse engineering successes

Success sells. Countless business books promote management practices such as business process reengineering or building diversity, illustrating them with case studies of successful application. Magazines promise to reveal the strategies of successful businesses and executives. Research papers identify factors shared by successful ventures: open-source development projects, social media sites, and so on. Readers hope that understanding past successes will improve the odds for their next endeavor.

A previous post on confirmation bias quoted Francis Bacon describing a success—a man on a storm-tossed ship praying and being saved—and noted that we can’t draw a causal connection because we don’t hear from those who drowned after saying their vows. Different factors could save a man; a successful project or enterprise could owe success to an almost infinite range of factors.

Finding a practice shared by successful ventures tells us little, because there are so many factors that could contribute to the outcome. A big step toward producing a useful analysis of successes is to simultaneously study unsuccessful ventures. If a practice is present in the former and not the latter, its positive contribution is much more plausible—but this is rarely done. It is not inspirational to read about failures and it can be difficult to get people to discuss them objectively.

What phases of successful software engineering projects are the most expensive? Operation and maintenance—and analyses showed those costs would be far less had the initial design been better. The conclusion—put more effort into designing it right—is congenial to HCI professionals, who all too often are asked to paper over deep problems with surface user interface adjustments, help text, and training. However, is this conclusion valid? Perhaps not. In environments where one in ten new ventures succeeds, reverse engineering the successes is risky. Why did the 90% fail? How many spent so much time on design that they missed a go/no-go decision point, lost the confidence of management, or lost out to a rival project that presented a design that looked good enough? Without analyzing failed projects, we don’t know whether spending more time on design is good advice. Reverse engineering of successful software projects was worthwhile, but not enough.

However, analyzing failed projects has challenges, too.

Reverse engineering failures

“Success has many parents, but failure is an orphan.”

Some companies claim to conduct project “post-mortems.” When a product or project collapses, senior management would like to know what went wrong. However, to avoid acrimonious finger-pointing and further demoralization of team members, the preference is to get everyone looking forward and engaged on new projects as quickly as possible. Dwelling on what went wrong could make people overly cautious or averse to documenting activity for fear of subsequent retribution. And no one wants news about problems to reach the press, customers, or funding agencies. Twenty-five years ago (at a different company), when a high-level effort was cancelled as it neared completion, we were instructed to destroy the extensive record of our work.

The collapse of an organization is also difficult to dissect. The aforementioned enterprise and another that I worked for were extremely successful for many years, then went bankrupt. Their records vanished. When I heard that one was shutting down, I phoned a former colleague to ask her to preserve some materials. “You’re two days too late,” she said. “It all went to the dump.” Similarly, when AFIPS, the parent organization of ACM and IEEE, collapsed financially and went out of business in 1990, its records and collections became landfill. Not only is it difficult to piece together what happened; years later there was even uncertainty about copyright ownership of its conference proceedings.

Reasons for burying the past include legal liability. Consider near-accidents in commercial aviation. The potential benefit of logging and understanding them is clear, but so are the disincentives for reporting them. To address this, a collection of reports is maintained by a respected third party, NASA, which assures anonymity and freedom from retribution when pilots file “after-incident reports.”

The complexity of reverse engineering a failure was elegantly described by the physicist Richard Feynman when investigating the 1986 Space Shuttle Challenger disaster. The commission determined that the primary O-ring did not seal properly in cold weather. In examining O-ring engineering and management, they found that vulnerabilities were understood but that a series of faulty management decisions led to the risk being underestimated.

This seemed a successful resolution. No, said Feynman. Was the faulty decision-making an unfortunate sequence of rare events, or business as usual? The commission randomly selected other engineering elements of the shuttle and conducted comparable analyses to determine whether similar forces led to the underestimation of other potential catastrophic failures. In all but one element they found comparable problems. This highly unusual, thorough approach identified systematic higher-level issues.

Reverse engineering disciplines

The sciences strive for rigor, elegance, prestige, and funding. Mathematics and physics are at the pyramid’s apex, widely envied and mimicked. Computer science theory branched off from mathematics. Just as some mathematicians look down on CS theory, some CS theoreticians hold other branches of computer science in dim regard: the mechanics of hierarchy. While earning degrees in physics and mathematics, I shared my colleagues’ low regard for psychology. Later, working as a software developer and worried about our species, I read more widely and came to a different view.

On returning to university to study psychology, I found that many of my colleagues had a misplaced “physics envy” and were too easily impressed by mathematical expressions. In addition, they misunderstood the history of the hard sciences. They reverse engineered these successful disciplines based on limited information. They assumed that the rigorously defined abstract terminology, theory, and hypothesis-testing of today were the root source of progress. Tracing a lineage—Einstein and Gödel, Newton and Leibniz, Archimedes and Pythagoras—it can appear to be a succession of major advances separated by periods of steady, incremental progress in which theories and theorems were proposed and tested experimentally, or, in the case of mathematics, proven or disproven. In Thomas Kuhn’s terms, “scientific revolutions” and “normal science.”

This is seriously misleading. Confusion and unproductive paths affected mathematics over the millennia prior to the development in the 19th century of systematic approaches to notation, concept, and proof. In the natural sciences, physics, chemistry, and biology were for centuries impeded, not advanced, by theory-building and hypothesis-testing. The theoreticians were astrologers, alchemists, and theologians. What was needed was descriptive science: collecting and organizing observations. Tycho Brahe’s meticulous astronomical measurements, Linnaeus’s painstaking collection of animals and plants, and Mendeleyev’s arrangement of elements by their properties were not informed by, and did not in their hands lead to, useful theory, yet they paved the way for the emergence of the theoretical sciences. In the late 20th century, Thomas Kuhn among others described psychology as “pre-theoretical,” suggesting that the proper focus is descriptive science, collecting and organizing observations.

The theory-driven field of astrology still gets regular coverage in major newspapers. In some areas of computer science and related fields, “building theory” and hypothesis-testing are heavily promoted. The results are not always more useful than horoscopes. Students are advised, “No need to look in the real world for a problem to address: Find a theory in the literature that might apply in a tech setting, design a controlled experiment with uncertain ecological validity, conduct analyses that are susceptible to confirmation bias, claim causal vindication from correlational data...” Then take a break to review papers, rejecting strong descriptive scientific contributions that “lack theory-building.”

Graduate students with beautiful data have approached me in desperation, looking for a theory that their data could inform. Their committee insists. This is a tragic consequence of emulating successful disciplines by selective reverse engineering.

Reverse engineering lives

Biography and autobiography are retrospective views of the lives of the famous and occasionally the infamous, potential role models or object lessons. Although a good biography identifies blemishes as well as virtues, biographers generally have a positive view of their subjects and autobiographers even more so. Politicians, business executives, and professors often give talks recounting their paths to prominence. They offer advice, such as “don’t follow the safe path—pursue your passion.” 

Once again, these exercises in reverse engineering come apart under inspection. First, we do not read biographies or inspirational speeches from people who did not succeed. (Even the infamous succeeded in their perfidy, or we wouldn’t find them interesting.) We do not read about those who pursued their passion to no avail. As a professor, some of my most sorrowful interactions were with grad students who would not be talked out of paths (such as building speech recognition systems) that I knew would not pan out. Second, how accurate are the accounts of successful people? Luminaries who advise young scientists to approach research idealistically often seem to have been adept at the politics of science.

Is it a problem if speakers view their pasts through rose-tinted glasses? Yes, if young people take them seriously. I saw some of the most talented and idealistic people I knew, who believed that merit would prevail and politics could be ignored, chewed up by the academic system. Most were women, either because women were more prone to idealistic views of science or because the system was more likely to find a place for a politically inept man than for a woman. Most likely both. Perhaps times have changed.

I am not recommending ignoring passion and embracing opportunism, but everyone should see the water they swim in and know how to increase the odds that their merit is recognized. Then make an informed decision about how to proceed. Realize the importance of connecting to congenial, helpful people, and also that scientists can spend decades working diligently and brilliantly with nothing to show for it.

I will close with a startling example, from historian Colin Burke’s monograph Information and Secrecy: Vannevar Bush, Ultra, and the Other Memex. Bush was a highly successful MIT professor and administrator who oversaw government research. Many computer scientists were inspired by his 1945 essay “As We May Think.” It described the Memex, a futuristic information retrieval system based on optomechanically manipulated microfilm records, a system with many of the qualities of the Web today. Not widely known is that Bush impeded early semiconductor research, feeling that microfilm was the future. More significantly, Burke describes 20 years of classified projects promoted and led by Bush in which phenomenal sums were spent trying to build parts of the Memex. Many brilliant scientists worked for decades at MIT and elsewhere on optomechanical systems, making astonishing innovations—but falling far short of the Memex. It was impossible. Decades of work and few publications. Information retrieval shifted from optomechanical to semiconductor systems. We rely on the reverse engineering of success and do not see the dead ends.

In summary, looking back is a tricky undertaking. Yet I don’t want to begin 2014 on a somber note and have often emphasized that history is a source of insight into the forces that explain the present and will shape the future. This is a remarkable time—so much is happening and it is so readily accessible. The task of staying abreast of pertinent information is intimidating, exhilarating, and necessary. The future should smile on those who see patterns in the activity that unfolds day by day.

Thanks to Steve Poltrock, Phil Barnard, and John King for comments on a previous draft.


Jonathan Grudin

Jonathan Grudin works on support for education at Microsoft. Access these and related papers at under Prototype Systems.
