
Report from DIS 2014 part 1: Moral status of technological artifacts


Authors: Deborah Tatar
Posted: Fri, July 25, 2014 - 11:25:01

Peter-Paul Verbeek gave the opening keynote speech last month at the DIS (Designing Interactive Systems) conference in Vancouver. His topic was the moral status of technological artifacts. Do they have any? 

He argues that, yes, they do. The argument runs that humans and objects are co-creations or, as he prefers, hybrids. Just as J. J. Gibson long ago argued for an ecology between the human and the environment—the eye is designed to detect precisely those elements of the electromagnetic spectrum that are usefully present in the environment—so too are humans and designed objects culturally co-adapted. This, by itself, is not revolutionary. In fact, it is on the basis of this similarity that Don Norman brought the term affordance into human-computer interaction. Verbeek’s argument brings together both the early, easier-to-swallow idea in cognitive psychology and human-computer interaction, that the ways we arrange the space around us are extensions of our intelligence, and the post-modern philosophical move. Suchman [1] uses the word re-creations rather than hybrids to describe the intertwining of (high-tech) artifact and person. However, Suchman, who spent many years embedded in design projects, emphasizes our active role in such re-creations, the things that we do, for example, in order to be able to imagine that robots have emotions. 

But Verbeek does not stop there. He moves towards an enhanced framework from which to understand this hybrid relationship. He draws on Don Ihde (whose work is a good starting point for the interested reader) to identify different kinds of relationships between the designed artifact and the human. The artifact may be part of the human, bearing an embodied relationship, as with glasses. But now we have to think embodied, as in Google Glass? The artifact may have a hermeneutic relationship to the human, bringing information into the bright circle of our recognition for our consideration, or excluding it, as with the thermometer. Now, we ask a Fitbit? The artifact may have a contrastive role, called alterity or otherness, as in a robot. Last, the artifact may provide or create background—maybe even the soundtrack of our lives as we jog-trot down the beach along with Chariots of Fire. In the context of these distinctions, Verbeek asks what we know, do, or hope for with respect to these technologies. These are excellent questions, and lest we be too hasty in our answers, his examples summarize unintended effects and show how new technologies create new dilemmas and possibilities. Courtesy of modern medical testing, for example, much congenital disease is moving from its status as fate to a new status as decision. Fetal gender decisions will soon be playing out in homes near you—and everywhere else. The decisions about whether to have a girl or a boy are local, but society has an interest in anticipating and gauging the effects. (One of the reasons for the oft-depicted plight of women in Jane Austen’s England was the dearth of men caused by England’s imperial struggles.)

What are the consequences of Verbeek’s analysis? Here is where the design dilemmas start to build. Let us suppose that we just accept as normal the idea that we are hybrids of artifacts and biology. Fair ’nuff. But Verbeek goes beyond this. He rejects the separation of the idea of human and machine in the study of human-machine interaction. The difficult part is that the designer does not stand in the same relationship to the two components of the hybrid. The individual designer controls the machine, but only influences the person. The very power of the relationship, the features that lead us to conceptualize it as so strong as to constitute hybridity, is precisely what creates the need to study it. 

What makes me impatient about Verbeek’s approach is that, as I understand it, he does not prioritize recent changes in the power of the machine. For many years, some of us (cf. Engelbart’s vision of human augmentation) imagined a future in which people could do precisely what they were already doing, and get something for free via the marvels of computing. We could just keep our own calendar, for example, and have it shared with others. 

But we do not hear this rhetoric any more. Now the rhetoric is one of expectation: that even our most private actions will be carried out in precisely those forms that the system can share widely. The intransigence of the computer wears us down even where we would prefer to resist and where another human being would give us a break. Verbeek’s position—like Foucault’s—feeds into the corporate, systemic power-grab by weakening our focus on those design and use actions that we can indeed take. 

If I am frustrated with Verbeek, I am more frustrated with myself. Our own “Making Epistemological Trouble” goes no further towards design action than to advance the hope that the third paradigm of HCI research can, by engaging in constant self-recreation, stir the design pot. These are the same rocks against which so much feminist design founders. In Verbeek’s view, we designers can have our choice of evils in influencing hybridity. Influence can be manifested as coercive, persuasive, seductive, or decisive (dominating). 

The designer may think globally but must act locally. The thing is that design action is hard. Moral design action is harder. In the May/June 2014 issue of Interactions, I published a feature that advanced a theory of what we call “human malleability and machine intransigence.” The point here is to draw attention to one class of design actions that individual designers can often take: those that allow users to reassert what is important to their identity and their vision of themselves in interacting with the computer. 

Often when there is a dichotomy (focusing on human-machine interaction vs. rejecting the human-machine dichotomy), there are two ways of being in the middle. One way is to just reject the issue altogether. “It’s too complicated.” “Who can say?” “There are lots of opinions.” But the other is to hold fast to the contradiction. In this case, it means holding onto the complexity of action while we think out cases. And it means something further than this. It means that individual thinkers, like you and me, regardless of our corporate status or indebtedness, should resist the temptation to be silenced by purely monetized notions of success. To end with one small but annoying example, it is a tremendous narrowing of the word helping to say that corporations are helping us by tailoring the advertisements that we see to things that we are most likely to buy. Yeah, sure. It’s helping in some abstract way, but that does not justify ignoring the manifest injustices inherent in the associated perversion of shared knowledge about the world. I think that Verbeek may have been saying some of this when he talked about the need to anticipate mediations, assess mediations, and design mediations, but my impatience lies in the fact that I want it said loudly, repeatedly, and in unmistakable terms. 

Endnote

1. Suchman, L. Human-Machine Reconfigurations: Plans and Situated Actions. Cambridge University Press, New York, 2007.

Thanks to Jeffrey Bardzell for comments on an earlier version of this!



Deborah Tatar

Deborah Tatar is a professor of computer science and, by courtesy, psychology, at Virginia Tech.


