XXI.1 January - February 2014
Page: 22

What’s in a word?

Lone Hansen

My smarter students have figured it out: If in class they casually include the words natural or objective in a sentence that also includes research and/or technology, and if they manage to look as if they mean it, the following five minutes will be action-packed. I will begin a rant—about how nothing is objective because everything is biased, about how what we think comes naturally to us is difficult for others to do, and about how even seemingly objectively made artifacts like maps are always based on conventions imbued with ideologies specific to their time and place. And while at this point my smarter students will be passing around the popcorn, happy that they managed to get me going, I will be telling them that words matter, that it is important they always pick words with care, and that they should use the full spectrum of what language allows for. I will finish by saying that the last thing we need is for them not to care about words and not to realize the power of language, as it is one of our primary sources of knowledge and insight.

I hereby confess that I often want to shout this out to my research community as well.

While we all slip once in a while, you simply should not be allowed to state in a research paper that you have done “an objective study of [something],” because that is not possible. Read or listen to ANT and STS researchers like Bruno Latour, or any semiotician worth their salt, and they will tell you that no ways of studying and interpreting the world are objective, as in: non-biased. As soon as something is represented in language, actions, or pictures, it is always already mediated and can never be anything but a representation. And as Latour said in the closing keynote at CHI ’13 [1], everything is connected in ways we will never be able to fully grasp. No matter how much data we have and no matter how much we try to stay on the outside of what we study, all we can do is generate particular representations of networks, of data, of ourselves, and of the world. An object might present itself to me, but as soon as I even think of doing anything with it, I understand only fractions of it, in very disconnected ways. These slices of represented reality will probably be very detailed, but they will never be non-biased.

Following this, if I were the HCI Dictator, terms like natural user interfaces/interaction (NUI) would be stricken from the vocabulary. While there is nothing wrong with researching and developing the systems and interactions that are currently called natural, there is everything wrong with claiming that some ways of interacting or some computer systems are more natural than others. In 2010, Donald Norman argued in this magazine that “natural user interfaces are not natural” [2]. The basic argument was the one he first put forward in the legendary book The Psychology of Everyday Things [3]: Every interaction with a computer is based on representational conventions. Though interaction also makes use of physical movement, the mental mapping of this process is what makes us understand how to use an interface. Norman argued in his 2010 column that natural is a terrible word to use, because a so-called natural interface relies on conventions as much as any other interface does. In line with this, Jetter, Reiterer, and Geyer [4] argue that the term blended interaction can explain why some things are easier to learn: They fit easily into our conceptual system. The more we recognize, the easier it is for us to “blend” one understanding with another and thus create a third entity that we can integrate and apply.

In other words, familiarity, not naturalness, is crucial. Like words, culturally based understanding matters.

I would extend this critique to also address what the word natural implies, even if it shouldn’t. First off, it implies that there is something that is not natural; that there is such a thing as an unnatural interaction or interface. To some, it also implies that some users are more (un)natural than others—as when systems like the Kinect make interaction very hard for anyone who is missing an arm or is in a wheelchair. If use or users can be (un)natural, what effects does this have on the politics of interfaces and technologies—a topic that is notoriously under-addressed in most of HCI? What criteria allow someone to decide the level of naturalness? Does it implicitly become a techno version of Orwell’s “some are more equal than others”?


Research-wise, natural implies that this is the end point: After a long journey that began with punch cards, keyboards, and command lines, technological evolution has reached the point where interaction has become natural. What, then, comes next, I wonder? Is that it? Does it make any sense to search for a new paradigm that would then be unnatural?

In other words, the word natural—just like the word objective—makes an implicit claim to fame and relevance that only makes it possible for others to dismiss a design as a bad design, not as a bad paradigm. To a research community this is bad news.

Furthermore, and following the argument made by Latour in Paris, the words natural and objective both imply that it is possible to transcend the point from which we stand—that we can occupy a place with no paradigmatic blind spots (see Niklas Luhmann’s notion of distinctions). Since an objective study is understood as belonging to a higher level than a subjective study, the word objective implies that setting ourselves aside as humans and researchers is something to strive for. To a research community that identifies as human-computer interaction, this is bad news.


I realize this might not be what people using those words wish to say, and this might be the worst part. Because it means we as researchers have done exactly what I ask my students not to do: ignore the power of language and, instead of using it actively to gain knowledge and insight, treat it as a simple vehicle of seemingly neutral communication, as if Lakoff and Johnson had never existed [5]. However, we do not only live by metaphors; we also literally construct ways of living by materializing metaphors with which users can interact and through which we enable ourselves to understand the impact we have on the situations we design for.

The bad news is that we are currently not very good at setting a stage for ourselves on which we can discuss in nuanced ways what we do. The good news is that it isn’t very hard to change. I am happy to confess that while I strongly believe that natural isn’t objectively better, I as strongly believe that a lot of us know this already—all we need to do is to start caring about which words we pick from the language we already have at our disposal.


Thanks to Clemens Nylandsted Klokmose, Daniel Fallman, Gilbert Cockton, Nicolai Brodersen Hansen, and others for discussing this topic with me on Facebook and elsewhere, thus providing fuel for a confession.


1. Latour, B. From Aggregation to Navigation—A Few Challenges for Social Theory. Closing Keynote Plenary, CHI’13. Video of the talk:

2. Norman, D.A. Natural user interfaces are not natural. Interactions 17, 3 (May 2010), 6–10.

3. Norman, D.A. The Psychology of Everyday Things. Basic Books, New York, 1988.

4. Jetter, H.C., Reiterer, H., and Geyer, F. Blended interaction: Understanding natural human-computer interaction in post-WIMP interactive spaces. Personal and Ubiquitous Computing (accepted Sept. 2013).

5. Lakoff, G. and Johnson, M. Metaphors We Live By. University of Chicago Press, Chicago, 1980.


Lone Koefoed Hansen is an associate professor in digital design and aesthetics at Aarhus University, where she is also senior key researcher in the Participatory Information Technology Centre, PIT.


The Digital Library is published by the Association for Computing Machinery. Copyright © 2014 ACM, Inc.
