People: timelines

XIII.6 November + December 2006
Page: 50

The demon in the basement

Jonathan Grudin


People and technology: Who's in control? What's in control? Or are we out of control, reaching a tipping point, approaching a singularity? This column and the next look to the past with an eye on the future.

Technological Determinism: Where Do You Stand?

I have shown audiences the slide reproduced in Figure 1 and asked them to raise a hand when I reach the position closest to theirs. Hard determinism argues that behavior is inevitably pushed in a particular direction when a technology is adopted; soft determinism describes a significant tendency for that to happen. How would you vote?

Typically, no one votes for 1. A few vote for 2. A strong majority always favors 3, co-determinism. In most audiences, one person hesitantly raises a hand for 4. I admire this expression of free will, but my vote is 1.5: I believe technology has a very strong influence.

The literature is not on my side. Debates on technological determinism, mostly in the 1970s and 1980s, eventually settled at around 3.5. Governments and corporate management, the argument went, would use technology to strengthen control and reinforce existing power structures. "What emerges is a strong and observable tendency toward use of computing technology to reinforce the decision authority status quo" [1].

In the 1990s, co-determinist positions emerged. Structuration theory held that behavior and technology co-evolve, each influencing the other. Wanda Orlikowski wrote, "The ongoing interaction of technology with organizations must be understood dialectically, as involving reciprocal causation, where the specific institutional context and the actions of knowledgeable, reflexive humans always mediate the relationship" [2]. Actor-network theory [3] posited that understanding came from identifying networks in which technological artifacts and people are treated uniformly. The influence of actors is not necessarily equal, but the analysis is uninteresting if one actor consistently dominates.

Now, in the 2000s, I argue that a major technology has a strong deterministic influence once adopted. We can avoid adopting a technology, individually or collectively, up to a point. Avoiding technologies such as metal, gunpowder, or electricity grew more difficult once neighboring groups had them.

There is a progression here: In the 1980s, people are in charge; in the 1990s, technology has equal footing; in the 2000s, the influence of digital technologies may be seen as stronger. Effects of digital technology may be felt especially rapidly, because extraordinary increases in power and decreases in price enable the technology to spread with unparalleled speed. A few qualifications:

  1. Outcomes are determined by the culture and the broader technological context at the time of adoption, not by the technology alone. This does not mean that people have control, but it does mean that outcomes can differ.
  2. The claim is made for digital technology broadly, not for every application or feature.
  3. As technology is more deeply embedded in work, domestic, and leisure activities, unavoidable effects are more likely. A mainframe used in the 1970s by a few people to handle payroll may have had little impact; when most employees use intranets and the Internet daily for communication and production, effects may be unavoidable.

The third point suggests that the shift in positions on technological determinism may be a subtle consequence of Moore's Law. Although somewhat tediously familiar, Moore's Law has disrupted professions and industries. The disruptive effects are often not attributed to it, just as few would attribute the theoretical drift outlined above to hardware changes.

Exponential growth is neglected for a reason, with consequences that can be unfortunate. My hope is that this column will promote a greater understanding of the underlying dynamics, so that you will do better than I did when technology-driven change took me by surprise and forced career changes.

Understanding Rapid, Nonlinear Growth

I use the terms "Moore's Law" and "exponential growth" broadly to refer to the familiar range of phenomena that have doubled every year or two [4]. I am not concerned here with which aspects of digital technology have and haven't developed exponentially, or whether exponential growth will continue. The focus is on understanding the implications of exponential growth when it does occur. Your job will be to decide where to apply that understanding.

A Psychological Surprise

In the late 1970s, concerned about population growth and environmental changes, psychologists found that people reason very poorly about exponential growth [5]. That was not a great surprise. After all, we rarely encounter nonlinear growth in directly perceptible ways in nature. It happens—cell count doubles repeatedly in early fetal growth, salmonella bacteria multiply rapidly on warm, nutrient-rich chicken. But these dynamics are invisible.

The surprise was that giving people more information did not help! If anything, it produced worse reasoning. Bad news! People concerned with possible unforeseen consequences of exponential growth, such as Ray Kurzweil [6], typically wave charts at us and complain that we don't get it. The studies suggest that waving charts at us makes us stupider.

I've concluded that the wrong charts are being waved. Exponential growth depicted on normal X and Y axes, as on the left in Figure 2, quickly becomes unwieldy because the curve rises so steeply. The alternative, a log-linear plot, solves this problem, but the appearance of linearity leads us to overlook the dramatic rise. Salmonella can double for a long time without any effect, but given enough time, it eventually doubles to a level that makes you sick. Then, one more doubling can make you very sick indeed.
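The salmonella example can be made concrete with a few lines of code. This is a minimal sketch of the "doubles until it crosses a threshold" dynamic; the starting count and the illness threshold are hypothetical round numbers chosen for illustration, not biological data.

```python
# Sketch: repeated doubling toward a threshold. The numbers here
# (start = 10, threshold = 1,000,000) are illustrative only.

def doublings_until(start, threshold):
    """Number of doublings before a population first reaches a threshold."""
    n, value = 0, start
    while value < threshold:
        value *= 2
        n += 1
    return n

start, threshold = 10, 1_000_000
n = doublings_until(start, threshold)
print(n)                     # 17 doublings needed
print(start * 2 ** (n - 1))  # one doubling earlier: 655,360, still below threshold
```

One doubling short of the threshold, the count is still below half of it, which is why the final doubling feels like a sudden jump rather than the continuation of a long, steady process.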

Visualizing Nonlinear Growth

Figure 3 shows 10, 30, and 50 years of Moore's Law. To compare them, the Y-axes are independently scaled. This reveals two critical features: (i) the curves differ only in the length of the segment that hugs the X-axis, the time spent close to zero relative to what is coming; (ii) when the rise arrives, it is steep.

If a phenomenon that increases exponentially—power or memory capacity, numbers of Internet users or Web sites—eventually has a substantial impact, the longer it takes to get there, the longer we experience the flat line squeezed down to the X-axis where nothing seems to be happening. We are lulled into a false sense of security. We hear about Moore's Law year after year, but nothing happens. Then, suddenly, comes a dramatic leap to a tipping point. Once in range, a few doublings take us to the point of impact from effectively zero. This is illustrated in Figure 4. At the top is the situation after 45 years have elapsed: 30 doublings, one every 18 months. The red line marks a critical level. We are nowhere close; looking back, we think "four and a half decades of exponential growth, big deal!" Yet we see below that we were on track to reach the bombshell in a few short years, without realizing it.
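The "hugging the X-axis" effect in Figures 3 and 4 can be checked with simple arithmetic. The sketch below assumes the schedule just described, one doubling every 18 months for 45 years, and computes how small a fraction of the final value has accumulated even late in the run.

```python
# How much of the final value exists at earlier points in a 45-year run
# of doublings every 18 months (30 doublings total)?

DOUBLING_MONTHS = 18
TOTAL_DOUBLINGS = 30  # 45 years at one doubling per 18 months

def fraction_of_final(doublings_elapsed, total=TOTAL_DOUBLINGS):
    """Fraction of the final value reached after a given number of doublings."""
    return 2 ** doublings_elapsed / 2 ** total

for years in (15, 30, 40, 45):
    d = years * 12 // DOUBLING_MONTHS
    print(f"after {years} years ({d} doublings): "
          f"{fraction_of_final(d):.4%} of the final value")
```

Even 40 years in (26 doublings), the quantity stands at about 6 percent of its final value; the last three doublings contribute the final seven-eighths. Seen from year 40, four decades of exponential growth really do look like "big deal!"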

This assumes that a bombshell exists, a tipping point at which the growth makes a big difference. If not, none of this matters. We eat the chicken and we're fine. But we've already seen many Moore's Law-driven tipping points. We often don't recognize them for what they are. My career was twice diverted by bombs exploding in its path. Most of us will likely have similar experiences in the years ahead.

Tipping points have appeared in every Timelines column. In March I noted that the original Mac did not succeed, but that 18 months later a Mac with twice the memory, enough to run desktop publishing applications, did. (At the time I was publishing soon-to-be-insignificant research on command-line interfaces.) The May column laid out the rise and fall of research-and-development fields tied to platforms that grew smaller and less expensive every decade. July's column discussed the unexpectedly rapid shift away from mainframes, tied to the demise of the premier US computer professional society and its conference/trade show. In September, Moore's Law enabled a chess-playing machine to deliver on a promise that was a couple of decades overdue.

Moore's Law bombs hit hardware, software, and then UI professions, and have continued up the digital food chain. Long-play records, tape recorders, VHS, film cameras—entire industries—have gone away. Bankruptcies in the photographic-equipment business indicate that the impact came more quickly than anticipated. The huge music industry was shaken by file-sharing; another few Moore's Law cycles and Hollywood began to wonder where this would end up. Is management in control of these forces? Some governments may buy time by inhibiting technology use, but not much. It has been argued that information technology hastened the breakup of a Soviet Union that was still trying to control information dissemination by restricting photocopier use.

This is only the beginning. Don't discount the inertia of organizations and institutions, but even the most glacial—the university system, for example—will be deeply affected. History suggests that profound change will come faster than most people expect.

But perhaps slower than some expect. Are we headed toward a singularity? Will we soon have computers as smart as people? No, not likely. Perhaps I've been lulled into a false sense of security by the very phenomena I've described, expecting tomorrow's weather to be the same as today's, but I doubt it. Nevertheless, major shocks are coming. The next column will project forward from past trends.

A history can focus on engineering, conceptual development, or social trends. Engineering history—who did what, when, and how—is closest to technology. Conceptual history, which follows the evolution of the ideas that drive uses of technology, has been slower; many HCI ideas were foreshadowed by Ivan Sutherland, Douglas Engelbart, and others in the 1960s and 1970s. Social history follows broad trends in technology use, identifying key actors—individuals, organizations, and institutions—whose interests and practices shape the field. Change there is slowest, but can develop momentum.

The Demon and History

Social analysts tend to eschew technological determinism; over the years they have been invaluable in pushing us to step back from fascinating, attention-grabbing technologies to see the full range of actors in broader contexts. However, as we consider the uses of digital technology, a key actor is that demon in the basement who has been enforcing Moore's Law. It has been relatively easy to ignore the steady, predictable hammering down there. But for almost half a century the demon has been gathering strength. It's starting to shake the foundation.

References

The demon metaphor arose in an interview on the influence of Rob Kling.

1. George, Joey F. & King, John L. (1991). Examining the computing and centralization debate. Communications of the ACM, 34, 7, 62-72.

2. Orlikowski, Wanda (1992). The duality of technology: Rethinking the concept of technology in organizations. Organization Science, 3, 3, 398-427.

3. Law, John (1987). Technology and heterogeneous engineering: The case of Portuguese expansion. In W.E. Bijker, T.P. Hughes, and T.J. Pinch (Eds.), The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology. MIT Press.

4. Moore limited his law to transistor density, and exponential growth technically includes linear and no growth (exponents of 1 and 0), so I am using these terms in a loose but familiar way.

5. Wagenaar, Willem A. & Sagaria, Sabato D. (1975). Misperception of exponential growth. Perception and Psychophysics, 18, 416-422.
Timmers, Hans & Wagenaar, Willem A. (1977). Inverse statistics and misperception of exponential growth. Perception and Psychophysics, 21, 558-562.
Wagenaar, Willem A. (1978). Intuitive predictions of growth. In Dietrich F. Burkhardt & William H. Ittelson (Eds.), Environmental assessment of socioeconomic systems. Plenum.
Wagenaar, Willem A. & Timmers, Hans (1978). Extrapolation of exponential time series is not enhanced by having more data points. Perception and Psychophysics, 24, 182-184.
Wagenaar, Willem A. & Timmers, Hans (1979). The pond-and-duckweed problem: Three experiments in the misperception of exponential growth. Acta Psychologica, 43, 239-251.

6. Kurzweil, Ray (2005). The singularity is near: When humans transcend biology. Penguin.

Author

Jonathan Grudin

About the Author:

Jonathan Grudin is a senior researcher in the Adaptive Systems and Interaction group at Microsoft Research. His Web page is

Figures

Figure 1. For major technologies, which position is closest to yours?

Figure 2. Standard depictions of exponential growth: linear and log-linear plots.

Figure 3. 10, 30, and 50 years of Moore's Law.

Figure 4. Above, looking back a few years before reaching a critical level (red line), Moore's Law has been active for decades with no appreciable change. A few years later, the critical point is reached (below).


©2006 ACM  1072-5220/06/1100  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2006 ACM, Inc.
