The Interactions Timelines forum, comprising 38 contributions by 28 authors over eight years, spanned the history of human-computer interaction and related topics. The November-December column on women who pioneered human-centered design is the last.
History piles up faster than it is written down. My detour from the present through the past started with half a dozen questions about how we arrived where we are and why we did not reach other places. I tracked down written records and the people involved. Answers to the initial questions appeared in columns over the first year and contributed to a longer account. Realizing that history is fundamentally a matter of perspective, I then enlisted friends and acquaintances with different viewpoints to write columns on a variety of topics.
As described below, I believe that an era has ended. As I put in to port, younger sailors with eyes on new horizons, asking different questions, can take the helm and identify salient trajectories. The Web, Internet, and online used book stores are remarkably powerful tools for such research in our field.
It was magical. In the early 1960s, the march of miniaturization began. Computers built with transistors had just arrived. Integrated circuits had just been patented. Before that, a vacuum tube computer less powerful than a graphing calculator filled a building. A technician reportedly wheeled a shopping cart around to replace tubes as they burned out. Computers were called “giant brains,” but powerful machines were the stuff of science fiction. Then everything changed.
Even before Moore formulated his law, imaginations were unleashed. A wave of visionary writing flowed from scientists, calling on researchers and developers to push back the frontiers of interactive computing.
Forty years before Toy Story, Ivan Sutherland speculated about computer-generated movies as he built the first graphical user interface elements. A quarter century before mice were widely used, Doug Engelbart built one, and he demoed word processing features, one-handed text input, and live video integrated with computing in ways that only became mainstream 20 to 40 years later. Ted Nelson envisioned a powerful globe-spanning network forty years before Web 2.0. Alan Kay’s Dynabook preceded ebooks by 40 years.
Psychologists drawn to HCI in the early 1980s, including me, had not heard of this work, but the graphics pioneers who joined CHI as the GUI took hold brought us up to speed. Histories of HCI have all led with excerpts from the writings of “the visionaries” and referred to Engelbart's breathtaking 1968 demo.
After two decades, the future scenarios began to be realized. Another quarter century later, almost everything imagined back then is in use. (The notable exceptions are fluent natural language understanding and intelligence that surpasses ours, envisioned by J.C.R. Licklider, Nicholas Negroponte, Marvin Minsky, Allen Newell, Herb Simon, and others. However, few in HCI worked toward those goals. In 1960, Licklider wrote that the wait for truly intelligent machines “may be 10 or 500 (years)”; HCI researchers understood that people and the world are complex and 500 might be the better bet.)
This is an impressive achievement: We accomplished what we set out to do. What now? Like a 19th-century Jules Verne story predicting the invention of the airplane, an essay written half a century ago that predicts the state of the world several years ago is interesting but not awe-inspiring. Few new visions have appeared. Mark Weiser outlined ubiquitous computing around 1990, before the Web. It too has been realized, extended by the “Internet of Things” but without a widely embraced overarching framework to guide research and development.
Can a vision be crowdsourced?
In the 1960s, media promoted charismatic, visionary leadership. John Kennedy challenged us to put someone on the moon and to ask what we could do for our country. The ambitious European Union was coming together. Mao launched the Cultural Revolution: “Destroy the old world. Forge the new world.” Some of the visions worked out better than others. Today, the camera has been pulled back, ready to expose the clay feet beneath the bold gesture. Presidents and prime ministers are less admired, and confidence in central planning is low. The next conference deadline drives more research than visions do.
Are we making individual choices, or acting as crowds in response to shifting contexts? An individual ant’s path can appear to be random, even as an intelligent collective purpose emerges from the behavior of the colony.
For decades we shared a framework, whether or not it was consciously articulated. For better or worse, the current situation is different. To chart your path, you may find historical traces useful for mapping trajectories and anticipating where we are headed. Research efforts that appear to be unrelated are increasingly accessible and amenable to quantitative and qualitative analysis. You may find patterns.
Note: After this was submitted, Roger Cohen’s New York Times column “A Time for Courage” made similar points about the decline in political leadership.
Interactions history articles, 2006–2013
Columns I authored are accessible without charge from my website.
Is HCI homeless? In search of inter-disciplinary status. By Jonathan Grudin.
Contributions from human factors, management, and computer science, with recent involvement of design and information science.
The GUI shock: Computer graphics and human-computer interaction. By Jonathan Grudin.
Computer scientists joined the psychologists populating CHI; why it happened when it did.
A missing generation: Office automation/information systems and human-computer interaction. By Jonathan Grudin.
The progression of hardware and HCI, focusing on the once-powerful, now-extinct minicomputer platform of the 1970s and 1980s.
Death of a sugar daddy: The mystery of the AFIPS orphans. By Jonathan Grudin.
Problems arose because the dying parent of ACM and IEEE did not name an heir.
Turing maturing: The separation of artificial intelligence and human-computer interaction. By Jonathan Grudin.
Two fields interested in intelligent uses of technology: Can they get along?
The demon in the basement. By Jonathan Grudin.
Detailed effects of Moore’s law are seriously underexamined, I claim.
Living without parental controls: The future of HCI. By Jonathan Grudin.
After a year of plotting trajectories, speculation as to where we are headed.
An unlikely HCI frontier: The Social Security Administration in 1978. By Richard W. Pew.
A human factors pioneer describes an effort that preceded CHI.
NordiCHI 2006: Learning from a regional conference. By Jonathan Grudin.
Anticipating that domain-specific HCI research will become more prevalent.
HCI is in business—focusing on organizational tasks and management. By Dov Te’eni.
HCI became a research thread in management information systems before computer science.
Meeting in the ether. By Bruce Damer.
A history of social virtual worlds: early experiments and the waves of the mid-90s and mid-00s.
Five perspectives on computer game history. By Daniel Pargman and Peter Jakobsson.
An ambitious exploration of computer game progression along five dimensions.
Unanticipated and contingent influences on the evolution of the internet. By Glenn Kowack.
The most downloaded history column, an original analysis by an Internet pioneer.
Themes in the early history of HCI—some unanswered questions. By Ronald M. Baecker.
A timeline of HCI events, identifying unconnected dots in the conceptual history.
Travel back in time: Design methods of two billionaire industrialists. By Jonathan Grudin.
When young, Henry Ford and Howard Hughes pursued iterative and participatory design with singular results.
Tag clouds and the case for vernacular visualization. By Fernanda Viégas and Martin Wattenberg.
The rapid evolution of an unusual design form.
Why Engelbart wasn't given the keys to Fort Knox: Revisiting three HCI landmarks. By Jonathan Grudin.
Understanding past work and outcomes requires consideration of the context of when the work was done.
An exciting interface foray into early digital music: The Kurzweil 250. By Richard W. Pew.
Interface challenges and work on the first 88-key professional-quality digital synthesizer.
Sound in computing: A short history. By Paul Robare and Judy Forlizzi.
Sound in computing evolved from electromechanical to digital, from rare to everywhere.
The information school phenomenon. By Gary M. Olson and Jonathan Grudin.
The proliferation of schools of information, a research field now merging with HCI.
Wikipedia: The happy accident. By Joseph Reagle.
Histories of Wikipedia entries are easy to retrace; the history of Wikipedia is less so.
Understanding visual thinking: The history and future of graphic facilitation. By Christine Valenza and Jan Adkins.
Graphic artists don’t often switch media to put their accomplishments into words; this is a welcome contribution.
Reflections on the future of iSchools from inspired junior faculty. By Jacob O. Wobbrock, Andrew J. Ko, and Julie A. Kientz.
A conversation in this history forum—how will information schools fit into the future of HCI?
As we may recall: Four forgotten pioneers. By Michael Buckland.
Pre-digital efforts to build large-scale information systems are a fascinating, neglected story.
Reflections on the future of iSchools from a dean inspired by some junior faculty. By Martha E. Pollack.
Further reflections on the role and diversity of information schools.
What a wonderful critter: Orphans find a home. By Jonathan Grudin.
An old yet familiar refrain on development, and the AFIPS legacy is resolved after twenty years.
CSCW: Time passed, tempest, and time past. By Jonathan Grudin.
CSCW evolution and interaction across two continents, viewed through a techno-cultural prism.
Project SAGE, a half-century on. By John Leslie King.
A massive 1950s defense project created computing professions and spawned interface techniques.
MCC's human interface laboratory: The promise and perils of long-term research. By Bill Curtis.
A frank account of the rise and fall of a prominent HCI-AI laboratory of the 1980s.
Multiscale zooming interfaces: A brief personal perspective on the design of cognitively convivial interaction. By James D. Hollan.
A personal view of an interface approach that became more powerful as it became more abstract.
The DigiBarn computer museum: A personal passion for personal computing. By Bruce Damer.
An insanely great physical computer museum and website.
Kai: How media affects learning. By Jonathan Grudin.
A dialogue that examines what Socrates and Plato really said and what it can tell us millennia later.
Design case study: The Bravo text editor. By William Newman.
One of the most influential projects of the early GUI period, in meticulous detail.
A personal history of modeless text editing and cut/copy-paste. By Larry Tesler.
Features now taken for granted resulted from painstaking work on once-open questions.
Punctuated equilibrium and technology change. By Jonathan Grudin.
The underlying technology changes yearly, major surface changes occur every decade—with subtle effects.
Journal-conference interaction and the competitive exclusion principle. By Jonathan Grudin.
Selective conferences stress journals and leave a sparsely populated community-building niche.
The first killer app: A history of spreadsheets. By Melissa Rodriguez Zynda.
Spreadsheets, rarely mentioned at CHI, were instrumental in launching the personal computer era.
Two women who pioneered user-centered design. By Jonathan Grudin and Gayna Williams.
An astonishing virtuoso, Lillian Gilbreth founded modern human factors; Grace Hopper invented technology to free people to do their work.
1. Some of these men had been inspired by Vannevar Bush’s 1945 essay “As We May Think.” Bush outlined a microfilm-based opto-mechanical system, but his vision was appropriated by the semiconductor brigade.