ACM symposium

IX.5 September 2002

First UIST interface-design contest




The 14th annual ACM Symposium on User Interface Software
  and Technology (UIST 2001) was held this year at Disney World
  in Orlando, Florida, November 11 to 14. The conference
  included a new attraction at its opening reception, a
  user-interface (UI) design contest. Contestants had several
  months to design and build a real-time control interface to a
  custom game application. During the contest they used their
  interfaces to play a suite of game scenarios. Game scores
  were used to rank the teams and their interfaces. Thanks to
  generous sponsorship from ACM SIGCHI, all participants were
  given a T-shirt, all student participants received free
  registration to the UIST Symposium, and prizes were awarded
  in the following categories: best overall UI; second-best UI;
  best single-user UI; and best student-designed UI.

 

Our hope in designing the contest was that a clearly
  superior interface would not be obvious and that contestants
  would attempt a variety of UI designs. Variety is what we
  got, even more than we anticipated: Eight teams competed,
  using most of the weapons in the interface designer’s
  arsenal. The game, the contest, and the various interfaces
  are described in the following paragraphs.

 

The Application

 

Real-time control applications pose significant challenges
  for UI designers. Examples of such applications are
  air-traffic control, computer games, and process and plant
  control. Despite their diversity, the common theme in these
  applications is the real-time manipulation of dynamic
  entities through a user interface.

 

The real-time application we developed for the contest is
  a game in which one or more human players control the
  velocity of five game pieces in a two-dimensional playing
  arena (see Figure 1). The player’s goal is to
  move his pieces beyond the end-line while avoiding capture by
  computer-controlled pieces, which pursue the closest
  human-controlled piece within direct line of sight and move
  randomly when no player’s piece is in view. The game has a
  time limit within which all scoring must be achieved. Opaque
  obstacles, different numbers of computer-controlled pieces,
  and different relative velocities of the human-controlled and
  the computer-controlled pieces make this a challenging game,
  with both strategic and tactical elements. For example,
  usually the human player must offer up some pieces as decoys
  to distract the computer-controlled pieces, while carefully
  maneuvering his other pieces toward the end-line. Thus the
  human player must choreograph complex motions of the pieces
  and execute them promptly via the interface. On the
  assumption that a good interface makes control of the pieces
  easier by an experienced player, game scores were used as a
  proxy for the quality of the interface design.
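
In concrete terms, the pursuit rule can be sketched as follows. This
  minimal Python fragment is an illustration written for this article,
  not the contest's actual game code; the coordinate tuples, the
  maximum speed, and the line_of_sight test are assumptions.

    import math
    import random

    def computer_piece_velocity(chaser, humans, obstacles, max_speed, line_of_sight):
        """Chase the closest visible human-controlled piece at full speed,
        or wander randomly when no human piece is in view."""
        visible = [p for p in humans if line_of_sight(chaser, p, obstacles)]
        if visible:
            # Pursue the nearest visible human-controlled piece.
            target = min(visible, key=lambda p: math.dist(chaser, p))
            dx, dy = target[0] - chaser[0], target[1] - chaser[1]
            norm = math.hypot(dx, dy) or 1.0
            return (max_speed * dx / norm, max_speed * dy / norm)
        # No human piece in sight: pick a random heading.
        angle = random.uniform(0.0, 2.0 * math.pi)
        return (max_speed * math.cos(angle), max_speed * math.sin(angle))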

 

The system architecture for this application had to
  accommodate a wide variety of UI designs without conferring
  special advantages on any particular design. A game server
  sits on a local-area network (LAN). It runs the game and
  responds to two kinds of network messages, one requesting the
  current state of all dynamic pieces and one changing the
  velocities of the human-controlled pieces. Contestants
  implemented their UI as a separate node on the LAN that
  communicated with the game server via the aforementioned
  network messages. Thus the developers had
  complete freedom in the design of their interface.
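
To make the contract concrete, here is a hedged Python sketch of a
  contest client. The article specifies only the two message types;
  the host name, port, and line-oriented text format used below are
  assumptions made for the example, not the server's documented
  protocol.

    import socket

    class GameClient:
        """Minimal sketch of a UI node talking to the game server over the LAN."""

        def __init__(self, host="game-server.local", port=9000):  # assumed address
            self.sock = socket.create_connection((host, port))
            self.stream = self.sock.makefile("rw")

        def query_state(self):
            # Message type 1: request the current state of all dynamic pieces.
            self.stream.write("GET_STATE\n")          # hypothetical message name
            self.stream.flush()
            return self.stream.readline().strip()     # raw state reply from the server

        def set_velocity(self, piece_id, vx, vy):
            # Message type 2: change the velocity of one human-controlled piece.
            self.stream.write(f"SET_VELOCITY {piece_id} {vx} {vy}\n")  # hypothetical format
            self.stream.flush()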

 

When we designed this game, we anticipated that the
  following UI technologies might be useful for it: novel
  visualizations of the game state; pen-based input; two-handed
  input; multiuser interfaces; multimodal interaction; and the
  incorporation of intelligent control into the interface.
  As it turned out, all of these technologies featured in at
  least one of the entered designs.

 

The Contest

 

Contestants were shown the five game-board configurations
  30 minutes before the start of the contest so that they could
  plan strategies (see Figures 1 and 2). The board configurations were created to test
  specific interface attributes, such as the ability to quickly
  set different paths for all pieces, the ability to set paths
  for groups of pieces simultaneously, fine control for
  maneuvering pieces in tight spots, and the ability to control
  decoy pieces with minimal attention and effort. For each
  board a recommended strategy (such as how many decoys to use
  and how to position them) was also announced, to minimize the
  advantage a team might obtain by simply devising a better
  game plan. And in fact most teams followed our
  recommendations, so that the quality of the interface was
  more relevant than game-playing strategy in determining
  scores.

 

Scoring for the game was based on the number of pieces a
  contestant successfully maneuvered across the field. Ties were
  broken by the time at which the last piece crossed the finish
  line. Each contestant was given two
  attempts at each of the boards, with only the best score
  counting. Final scores were based on the cumulative scores
  from all rounds.
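
The ranking rule is simple enough to state in code. The Python
  sketch below assumes each attempt is recorded as a
  (pieces_across, last_crossing_time) pair and that the cumulative
  tiebreaker is total crossing time; the data layout is illustrative,
  not the contest's actual scorekeeping.

    def better(a, b):
        """Prefer more pieces across; on a tie, prefer the earlier last crossing."""
        return a if (a[0], -a[1]) >= (b[0], -b[1]) else b

    def board_score(attempt1, attempt2):
        """Only the best of a contestant's two attempts on a board counts."""
        return better(attempt1, attempt2)

    def final_score(boards):
        """Cumulative score over all boards: total pieces, total time as tiebreaker."""
        best = [board_score(a1, a2) for a1, a2 in boards]
        return (sum(p for p, _ in best), sum(t for _, t in best))

    # Example: two boards, two attempts each; compare results by
    # (pieces, -time) so that more pieces wins and less time breaks ties.
    print(final_score([((3, 41.2), (4, 55.0)), ((5, 30.1), (5, 28.5))]))  # -> (9, 83.5)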

 

A preliminary round of competition was held in the
  afternoon to determine the finalists for the evening round.
  Five individual contestants and three teams participated in
  the preliminary round. Of these, the top four entries advanced
  to the final round, which was held in conjunction with the
  opening reception for the UIST Symposium. The timing and
  location ensured a throng of spectators, who followed the
  games on a projection screen.

 

The Contestants

 

Our initial speculations about possible UI designs for the
  game focused on multimodal interaction. In particular, the
  combination of speech and direct manipulation seemed to hold
  promise: speech commands could be used to select individual
  pieces or groups of pieces, and direct manipulation (using a
  touch screen or a mouse) could be used to indicate
  trajectories. One of us implemented such an interface to aid
  in the development of the game boards and to tune the game
  parameters. Although ineligible for any prize, this interface
  was included in the contest to see how it performed.

 

To our surprise, it did not perform very well and was
  eliminated in the preliminary round. Even during development
  the unsuitability of a touch screen for indicating piece
  trajectories became obvious; the player’s hand obscured too
  much of the screen while pointing, making it difficult to
  observe the movement of the pieces. Moreover, the low spatial
  resolution of a touch screen made fine control of the pieces
  problematic. A mouse was found to be more useful for
  indicating trajectories. Speech commands did offer some
  advantages in selecting pieces—for example, selecting
  all of the pieces simultaneously could be accomplished by a
  single command—but the inherent latency and fragility
  of current speech-recognition technology, especially in noisy
  environments, made the interface uncompetitive with other
  designs. A single misinterpreted command was usually enough
  to ensure a bad score. This experience was duplicated by a
  student team from the University of California at Berkeley,
  which used the mouse to select pieces and speech to set their
  trajectories in a single-user interface. This interface
  performed even worse. However, it won praise for whimsy: The
  voice commands to move pieces far to the left and far to the
  right were "Clinton" and "Buchanan,"
  respectively!

 

The other approach that we expected to do well was
  multiuser interaction. This idea turned out to be more
  popular and more useful. Three different teams entered a
  multiuser design. Chris Wren (Mitsubishi Electric Research
  Labs) and Andrew Wilson (Microsoft Research) developed an
  interface based on Atari joystick controllers. Each joystick
  controlled one piece, so each member of the five-person team
  could focus on a single piece. This distribution of
  responsibility and control seemed to be a good idea. However,
  this approach had a significant drawback: To complete any
  strategy, all members of the team had to communicate and
  coordinate. The Wren/Wilson team had evidently given little
  thought to this issue—some of the team members were
  recruited from the audience immediately before the first
  game—and so struggled to coordinate their efforts.

 

A combined team from Carnegie Mellon University (CMU) and
  the MIT Media Lab used a similar distributed-control user
  interface. Each member of the five-person team used a
  mouse-based interface on a separate PC to control one game
  piece. This team had evidently trained on the task, because
  their coordination and game play were superior. They won the
  prize for best overall interface design.

 

The third multiuser interface came from Xerox PARC. It too
  used five joysticks, one per game piece. However, it differed
  from all other systems in that it used a custom display that
  contained several visual annotations. The annotations were
  designed to allow players to notice which computer-controlled
  pieces were pursuing their pieces and what might be their
  best escape routes. In Figure 3 the yellow
  boxes surround the human-controlled pieces and the red
  circles surround the computer-controlled pieces. The boxes
  and circles indicate the maximum distance that each piece can
  travel in one second.[1] The black
  dots on the boxes and circles indicate the velocity of each
  piece.

 

Another visual annotation concerns "radar lock":
  When a computer-controlled piece locks onto a player’s piece,
  a black line is drawn from the computer’s piece to the
  player’s piece to indicate the lock. Additionally, a circle
  with that line as its radius is drawn to indicate that any
  player’s piece entering the circle (assuming no intervening
  obstacles) will become the closest piece and will then become
  the new target of the computer’s piece.
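
The geometry behind this annotation is straightforward. The short
  Python sketch below, an illustration rather than PARC's code,
  computes the lock radius and tests whether another player piece
  would steal the lock.

    import math

    def lock_radius(chaser, target):
        """Length of the lock line, which is also the radius of the drawn circle."""
        return math.dist(chaser, target)

    def would_steal_lock(chaser, target, other):
        """True if a piece at `other` lies inside the circle and would, barring
        intervening obstacles, become the chaser's new (closest) target."""
        return math.dist(chaser, other) < lock_radius(chaser, target)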

 

The PARC interface can also be used in single-user mode,
  in which case other visual cues are relevant. Furthermore, in
  the single-user version the player’s control is augmented by
  some intelligent behaviors. A pink line indicates the
  trajectory of a piece to the next destination on its
  player-specified route. Each piece tries to move toward its
  destination unless an obstacle intervenes, in which case it
  follows either "port" or "starboard"
  buoys (red and green, respectively) around the obstacle
  before resuming its course.[2]
  Also, if a piece gets too close to a computer-controlled
  piece, it moves away from it for a specified period of time
  before resuming its course.

 

It is not clear how much the visual annotations and
  automatic behavior helped. The multiuser interface performed
  better than the single-user interface, which suggests that
  automatic behaviors were not as useful as having teammates
  and distributed control of the pieces. Neither the multiuser
  interface nor the single-user interface was best in its
  respective class, which suggests that visualization aids were
  not a decisive advantage for this application.

 

The final two interfaces are both for single users.
  Kentarou Fukuchi from the Tokyo Institute of Technology
  designed a tangible UI in which pieces are controlled by
  manually moving physical tokens on top of a transparent
  screen. The tokens are tracked by computer vision: A camera,
  mounted under the screen, detects the positions of the
  tokens. The user can use both hands to move tokens
  simultaneously. This design won the prize for best
  student-designed interface; the CMU-MIT team won for best
  overall design.

 

An interesting aspect of Fukuchi’s interface is that it is
  especially useful for symmetric movement of the pieces; this
  can be achieved simply by taking the same actions with both
  hands simultaneously, a natural thing to do. This feature
  turned out to be advantageous for the fifth and final game
  board, in which the recommended strategy called for moving
  two pieces down the left edge while simultaneously moving two
  pieces down the right edge, with the fifth piece serving as a
  decoy down the middle.

 

The final design illustrates well how simple and elegant
  engineering is often the key to superior UI design. Takeo
  Igarashi from the University of Tokyo won both the second
  overall prize and the prize for the best single-user
  interface. In Figure 4 the black lines
  indicate trajectories that guide the player’s pieces, while
  the red lines indicate barricades that block them. The pieces
  try to follow the nearest trajectory but are not allowed to
  pass barricades. These two simple primitives support a range
  of control options. A single stroke can make all five pieces
  move in unison, whereas differential control over smaller
  groups of pieces can be achieved by drawing more strokes. The
  barricade strokes are useful for making pieces stop and
  loiter, which is useful behavior for decoys. The user sketches
  trajectories as freeform strokes, either with a pen or by
  dragging with the left mouse button. Likewise,
  barricades are drawn with a pen or by dragging with the right
  mouse button. The user can erase trajectories and barricades
  by clicking them.
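
A rough Python sketch of how these two primitives could drive a
  piece is given below. It is one interpretation of the published
  description, not Igarashi's implementation; the stroke
  representation (sampled points) and the barricade-crossing test are
  assumptions.

    import math

    def nearest_point(pos, stroke):
        """Closest sampled point of a freeform stroke, stored as a list of (x, y) points."""
        return min(stroke, key=lambda p: math.dist(pos, p))

    def next_heading(pos, trajectories, barricades, crosses):
        """Head toward the nearest trajectory point whose approach crosses no
        barricade; return None (stop and loiter) if every trajectory is blocked."""
        candidates = [nearest_point(pos, s) for s in trajectories]
        reachable = [p for p in candidates
                     if not any(crosses(pos, p, b) for b in barricades)]
        if not reachable:
            return None
        target = min(reachable, key=lambda p: math.dist(pos, p))
        dx, dy = target[0] - pos[0], target[1] - pos[1]
        norm = math.hypot(dx, dy) or 1.0
        return (dx / norm, dy / norm)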

 

Conclusion

 

The UIST interface-design contest was not a careful
  experiment, so no definitive conclusions about the relative
  merits of different interface technologies can be drawn from
  it. However, it was an entertaining and engaging activity for
  participants and spectators alike. It also provided an
  interesting survey of some of the most promising UI
  techniques as they were applied to a common problem.

 

We hope that this and future interface-design contests may
  provide useful material for design-oriented HCI courses. The
  contest software and documentation are available on the UIST
  Web site, www.acm.org/uist. Professor Rob St. Amant from
  North Carolina State University has already used this year’s
  contest in an undergraduate course in human-computer
  interaction; his students’ efforts can be viewed at
  www.csc.ncsu.edu/faculty/stamant/uistgame/index.html.

 

A second contest, to be held at next year’s UIST
  Symposium, is being planned. Details are available on the
  UIST Web site.

 

Acknowledgments

 

Thanks to Khai Truong and Jason Hong for testing early
  versions of the contest software. Thanks also to Mir Farooq
  Ali and the other student volunteers for help during the
  contest. Jock Mackinlay added to the atmosphere of the event
  by serving as master of ceremonies. Stan Pozerski helped
  configure the game server and other hardware for us. The
  staff at Disney’s Boardwalk Inn made the contest run
  smoothly. Janet O’Halloran and Karen Dickie provided
  miscellaneous essential help. And thanks once more to ACM
  SIGCHI for sponsoring the contest.

 

Calendar

 

October 6-9

  2002 IEEE International Conference on Systems, Man and
  Cybernetics: Bridging the Digital Divide

  Hammamet, Tunisia

  http://smc02.ec-lille.fr/home.html

 

October 9-11

  Fifth Annual International Workshop

  PRESENCE 2002

  Porto, Portugal

  http://www2.ufp.pt/presence2002/

 

October 14-16

  IEEE Fourth ICMI '02

  International Conference on Multimodal Interfaces

  Pittsburgh, PA, USA

  www.is.cs.cmu.edu/icmi/

 

October 14-16

  Creativity and Cognition 4

  Loughborough University, UK

  http://creative.lboro.ac.uk/ccrs/CC02.htm

 

October 14-17

  User Interface 7 East

  Cambridge, MA, USA

  www.uiconf.com

 

October 23-25

  NordiCHI 2002

  Design versus Design

  The 2nd Nordic Conference on Computer-Human Interaction

  Aarhus, Denmark

  www.nordichi.org/

 

October 24-25

  Theories of Computer-Mediated Work

  Aarhus, Denmark

  www.chmi.dk/events/theories_of_computer.htm

 

October 27-30

  UIST '02

  15th Annual Symposium on User Interface Software and Technology

  Paris, France

  www.acm.org/uist/

 

November 1-4

  APCHI 2002

  5th Asia Pacific Conference on Computer Human Interaction

  Beijing, China

  http://iel.iscas.ac.cn/apchi/

 

November 13-15

  WWW/Internet 2002

  IADIS International Conference

  Lisbon, Portugal

  www.iadis.org/icwi2002/

 

November 16-20

  CSCW 2002

  Conference on Computer Supported Cooperative Work

  New Orleans, LA, USA

  www.acm.org/cscw2002/

 

November 25-27

  HF2002

  Human Factors Conference

  The Hotel Sofitel, Melbourne

  Melbourne, Australia

  www.iceaustralia.com/HF2002/

 

Authors

 

Marty Frenzel, Joe Marks, and Kathy Ryall

  Mitsubishi Electric Research Laboratories

  & Harvard University Extension School

  mfrenzel@atg.com

  marks@merl.com

  ryall@merl.com

 

Footnotes

 

[1] In the original implementation of the game, the
  computer-controlled pieces were assigned a single value for
  maximum velocity, whereas the human-controlled pieces were
  assigned separate maximum velocities for the x and y directions.
  This meant that the human-controlled pieces were faster on the
  diagonal than in any of the four compass directions. The bug was
  fixed for the actual contest.

 

[2] Even in the multiuser
  version (which does not use any automatic behaviors), the
  buoys are helpful because they indicate how close to an
  obstacle a piece can get without bouncing off it. The visual
  depiction of the pieces (ships and submarines), obstacles
  (islands), and buoys—and even the background
  color—were inspired by nautical maps.

 

SRC="thumbs/f1.jpg" BORDER="0" VSPACE="5" HSPACE="5"
ALIGN="LEFT" ALT="F1">

Figure 1. The initial configuration of
  the first game board from the contest.

CLEAR="ALL">

 

SRC="thumbs/f2.jpg" BORDER="0" VSPACE="5" HSPACE="5"
ALIGN="LEFT" ALT="F2">

Figure 2. Initial configurations of the
  other four contest game boards. For each board the speed
  advantage of the computer-controlled pieces over the
  human-controlled pieces was set to ensure significant
  challenge for the human player(s).

 

SRC="thumbs/f3.jpg" BORDER="0" VSPACE="5" HSPACE="5"
ALIGN="LEFT" ALT="F3">

Figure 3. The custom display used by
  the Xerox PARC interfaces.

 

SRC="thumbs/f4.jpg" BORDER="0" VSPACE="5" HSPACE="5"
ALIGN="LEFT" ALT="F4">

Figure 4. A sketch-based UI.

CLEAR="ALL">

 

©2002 ACM  1072-5220/02/0900  $5.00

 

Permission to make digital or hard copies of all or part
  of this work for personal or classroom use is granted without
  fee provided that copies are not made or distributed for
  profit or commercial advantage and that copies bear this
  notice and the full citation on the first page. To copy
  otherwise, to republish, to post on servers or to
  redistribute to lists, requires prior specific permission
  and/or a fee.
