Research alert

XI.6 November + December 2004
Page: 11

Usability and collaborative aspects of augmented reality


Author:
Morten Fjeld

In the past few years, augmented reality (AR) has received increasing attention from research and industry. By nature, AR is a highly interdisciplinary field engaging signal processing, computer vision, computer graphics, user interfaces, human factors, wearable computing, mobile computing, information visualization, and the design of displays and sensors. AR concepts are applicable to a wide range of areas, including the automotive industry, surgery, and office environments. In addition to a series of symposia and workshops devoted to this field, several journals have offered special issues on AR. This paper—which originally appeared as an introduction to a special issue of the International Journal of Human-Computer Interaction (IJHCI)—focuses on usability and collaborative aspects of AR; specifically, it summarizes the work of the six authors whose papers appeared in that issue [6].

Previous special issues have dealt with other aspects of contemporary AR research and development, such as computer-augmented environments, with topics ranging from mobile computing, gestures, tracking requirements, and ubiquitous computing to the mixing of paper and digital information [13]. Another issue offered papers on wearable systems, a magnifying-glass approach, registration errors, confluence issues, and judgment of distance to nearby objects [2]. Finally, a third special issue offered papers on outdoor mobile systems, tracking in unprepared environments, robust tracking, optical and magnetic tracking, occlusion in collaborative environments, 3D aural augmentation, and direct manipulation [7].

These days, some topics, such as registration errors, are being researched less, while others, such as tracking and mobile outdoor devices, remain prevalent in AR research and development. As AR technology advances, it has become increasingly accessible to a wider public and offers promising capabilities for supporting collaborative work. The field of tangible user interfaces, for instance, is a practical example of how AR technology can mediate collaborative work [5], [12].

In the design process of an AR application, a series of questions related to human-computer interaction (HCI) calls for attention: Who are the users and what are their needs? How can a system be designed to work effectively and efficiently for these users? How are effectiveness and efficiency measured in AR applications? Do users prefer an AR system or an alternative tool to go about their work? And finally, with what kinds of tasks and what kinds of alternative tools should the usability of AR applications be tested?

Additionally, perceptual issues demand further attention, relating predominantly to the user's visual and auditory capacities [1]. Embodiment and embodied interaction must also be considered; they have recently been addressed by Dourish, who suggests that users create and communicate meaning through their interaction with a system [4]. Lastly, issues related to work and collaboration require additional investigation and are treated by the six papers summarized here.

These papers touch upon most of the HCI questions and topics previously mentioned. Some papers are more visionary and focus on novel enabling technology for collaboration, while others offer solid empirical work, presenting experimental studies with alternative applications. In reading some of the studies included in the IJHCI special issue, the reader will recognize that the topics of usability and collaboration are dealt with simultaneously.

Special Issue Papers

The need for studies evaluating the effect of computerized tools on human cooperation and communication is well justified and documented in the first paper, prepared by Billinghurst, Belcher, Gupta, and Kiyokawa [3]. The authors report on two experiments: the first comparing collaboration with AR technology (Figure 1) to more traditional unmediated and screen-based collaboration, and the second comparing collaboration across three different AR displays. In both experiments the authors used process measures (capturing the process of collaboration through the number and type of gestures and the number of deictic phrases spoken) and subjective measures, in addition to more traditional performance measures. They found that users exhibited many of the same behaviors in a collaborative AR interface as they did in face-to-face unmediated collaboration. However, user communication behavior changed with the type of AR display used. The authors then describe implications of the results for the design of collaborative AR interfaces and present directions for future research. The variety of relevant measures used in these studies sets the work apart from most AR research.

In the second paper, Kooper and MacIntyre envision how the current World Wide Web (WWW) might evolve over time into a Real-World Wide Web (RWWW) [8]. The authors describe a prototype AR system that allows users to interact with a 3D, spatialized, WWW-based information space. They make assumptions concerning the characteristics of such a system, discuss the implications of those assumptions for AR interfaces, and describe the experience of creating a prototype RWWW browser. The RWWW browser uses spatially located anchors to indicate the location of information nodes. This provides users with an unobtrusive indication of the information available that does not significantly interfere with their view of the world. A major interactive aspect of anchors is the use of glance and gaze selection, which are interesting steps toward a usable realization of an RWWW. While the system is designed to work without additional displays, the authors indicate how handheld devices can be used for convenient configuration and control of the system.

In the third paper, Menozzi, Hofer, Näpflin, and Krueger examine what happens to users of AR when two sources of information interfere [9]. They investigate how interference between real-world and artificial information affects performance in a visual search task. The search task was carried out under three conditions, with a video recording of a driveway serving as the background. In one condition the recording was replayed continuously. In a second condition, static images sampled from the recording at five-second intervals were shown as the background. A uniform gray background served as the baseline condition. As expected, detectability of the target was highest in the baseline condition, reduced in the presence of static images, and lowest with continuous playback. However, subjects were found to be more efficient when targets were presented in the lower, rather than the upper, part of the screen. The authors conclude that performance in detecting artificial information depends not only on spatial characteristics but also on temporal variations of the background on which the artificial information is superimposed. They then indicate under what circumstances artificial information in AR systems should be avoided. The results are well documented and will be of interest to designers of mobile AR systems.

In the fourth paper, Pedersen, Buur, and Djajadiningrat present a design case involving two conceptual interaction designs for a frequency converter (used to control the speed of electric motors in many industrial applications) [10]. These converters are found in environments unfamiliar to design teams, making it difficult for designers to suggest usable interaction concepts. The authors suggest methods to bridge the designers' imagination and the users' insight into the use context. Working in the Scandinavian tradition of participatory or cooperative design, the design team observed and talked to users, sketched and produced mock-ups, acted out scenarios, and received user feedback during field trips. By practicing design in the field, designers gain a direct physical experience of the circumstances and a non-represented, non-abstracted introduction to the problems at hand. To gain the maximum benefit from AR in the design of professional tools, the authors argue, knowledge of state-of-the-art technology is necessary but not sufficient. Technical insight must therefore be complemented by design approaches that provide insight into the users, their work practice, and the use context. The methods suggested will most likely provide substantial help in designing successful "technology rich" applications. Through the work presented, the authors explore an interesting problem that is not adequately addressed in the existing literature.

In the fifth paper, Thomas, Quirchmayr, and Piekarski present a model for bringing the coordination power of workflow management systems to outdoor wearable AR systems [11]. They show how mobile equipment may be integrated with adaptive, context-aware work environments. A scenario of a medical-emergency task illustrates the functionality of this form of collaborative system. Information stickers are introduced to support data collection in medical emergencies through a hands-free user interface for medical workers (Figure 2). A key feature is access to relevant information both for users in the mobile environment and for those in the advanced control room. An additional advantage is the automatic recording of on-site data, which helps build the medical record of a patient without interfering with the work of the emergency team. The user-interface technologies the authors propose to investigate include multimedia, AR information stickers, and the allocation of patient medical records to specific locations on the human body. These are novel and innovative uses of AR technology. The strength of this work lies in the description of the application scenario. The authors demonstrate strong expertise in developing outdoor AR interfaces and describe how some elements required for the envisioned large-scale system were prototyped. Finally, this paper shows how outdoor collaborative AR can be embedded into a larger workflow system.

In the sixth paper, Wiedenmaier, Oehme, Schmidt, and Luczak show how AR for assembly processes can be a new kind of computer support for a traditional industrial domain [14]. The article concisely links AR to the real-world task of assembly, naming this support ARsembly. It describes a typical scenario for assembly and maintenance personnel and how AR might support both (Figure 3). For this purpose, tasks with different degrees of difficulty were selected from an authentic assembly process in the automotive industry. Two other kinds of assembly-support media (a paper manual and a tutorial by an expert) were examined for comparison with ARsembly. The results showed that assembly times varied according to the support condition. AR support proved more suitable for difficult tasks than the paper manual, whereas for easier tasks AR support did not appear to be significantly advantageous. Some of the information obtained in this investigation also points to important considerations for improving future ARsembly applications. The authors make a valuable contribution in presenting empirical results comparing different types of support for assembly processes. They also show some evidence that a particular AR system can, in some situations, have advantages over traditional paper assembly manuals. Their work shows where AR is suitable and where it is not. For AR to achieve widespread application, it is important to take it "out of the lab" and into the "real world."

Summary

The papers in the IJHCI special issue weave a framework of typical AR design issues and considerations. They show how to design usable AR applications and how to support collaborative work by putting AR technology to work. Several themes recur and thus deserve particular attention. In order to follow the description of these themes, the reader is directed to Table 1. All of the papers study the design and use of AR applications. Some papers investigate different forms of collaboration (1, 4, and 5). Two papers present work with a direct relevance for industrial settings (4 and 6), one for the electrotechnical (4) and one for the automotive industry (6). Several papers examine mobile computing (2, 3, and 5) and different uses of Web applications are highlighted in three papers (2, 5, and 6). Applications for the medical field are examined in one paper (5) and several papers present issues related to transportation (3, 5, and 6). One paper focuses on user-centered design (4). The effect of display technology is investigated in two papers (1 and 3). Finally, the employment of handheld devices is presented in two papers (2 and 5).

The use of augmented reality technology shows wide-ranging possibilities. If care and attention are paid to the usability of such technologies in real-world settings, their evolution should be on a promising track. The implications of AR for collaborative work still call for further investigation. How the use of AR systems in different work contexts will affect the way we go about our everyday occupations also merits further attention.

References

1. Azuma, R. T. (1997). A survey of augmented reality. Presence: Teleoperators and Virtual Environments, 6(4), 355-385.

2. Barfield, W., Feiner, S., Furness III, T., & Hirose, M. (Eds.). (1997). Augmented reality [Special issue]. Presence: Teleoperators and Virtual Environments, 6(4).

3. Billinghurst, M., Belcher, D., Gupta, A., & Kiyokawa, K. (2003). Communication behaviors in co-located collaborative AR interfaces. International Journal of Human-Computer Interaction, 16(3), 395-423.

4. Dourish, P. (2001). Where the action is: The foundations of embodied interaction. Cambridge, MA: MIT Press.

5. Fjeld, M., Lauche, K., Bichsel, M., Voorhorst, F., Krueger, H., & Rauterberg, M. (2002). Physical and virtual tools: Activity theory applied to the design of groupware. Computer Supported Cooperative Work (CSCW), 11(1-2), 153-180.

6. Fjeld, M. (Ed.). (2003). Augmented reality: Usability and collaborative aspects [Special issue]. International Journal of Human-Computer Interaction, 16(3), 387-393.

7. Hildebrand, A., & Gervautz, M. (Eds.). (1999). Augmented reality [Special issue]. Computers & Graphics, 23(6).

8. Kooper, R., & MacIntyre, B. (2003). Browsing the real-world wide web: Maintaining awareness of virtual information in an AR information space. International Journal of Human-Computer Interaction, 16(3), 425-446.

9. Menozzi, M., Hofer, F., Näpflin, U., & Krueger, H. (2003). Visual performance in augmented reality systems for mobile use. International Journal of Human-Computer Interaction, 16(3), 447-460.

10. Pedersen, J., Buur, J., & Djajadiningrat, T. (2003). Field design sessions: Augmenting whose reality? International Journal of Human-Computer Interaction, 16(3), 461-476.

11. Thomas, B.H., Quirchmayr, G., & Piekarski, W. (2003). Through walls communication for medical emergency services. International Journal of Human-Computer Interaction, 16(3), 477-496.

12. Ullmer, B., & Ishii, H. (2000). Emerging frameworks for tangible user interfaces. IBM Systems Journal, 39(3-4), 915-931.

13. Wellner, P., Mackay, W., & Gold, R. (Eds.). (1993). Augmented reality and ubiquitous computing [Special issue]. Communications of the ACM, 36(7).

14. Wiedenmaier, S., Oehme, O., Schmidt, L., & Luczak, H. (2003). Augmented reality (AR) for assembly processes design and experimental evaluation. International Journal of Human-Computer Interaction, 16(3), 497-514.

Author

Morten Fjeld
Computer Science, Interaction Design
Chalmers University of Technology
Gothenburg, Sweden
morten@fjeld.ch

Figures

Figure 1. Collaboration with AR technology. The subjects wore a head-mounted display with a small video camera attached. (Courtesy of M. Billinghurst, D. Belcher, A. Gupta, and K. Kiyokawa.)

Figure 2. To assess the current situation, the recovery doctor views the AR information stickers placed by the paramedic as world-relative information. Shown here is a prototype view of two forms of information for the recovery doctor. (Courtesy of B.H. Thomas, G. Quirchmayr, and W. Piekarski.)

Figure 3. A hybrid AR-HMD and touch-panel-display system was chosen for the AR-supported assembly. This hybrid system makes wearing an HMD during the entire assembly unnecessary. The touch-panel display shows the easier assembly steps, as in an electronic manual, and is fixed to the frame for the door assembly. (Courtesy of S. Wiedenmaier, O. Oehme, L. Schmidt, and H. Luczak.)

Tables

Table 1. Articles in the IJHCI special issue.

