Forums

XXV.3 May-June 2018

Human-drone interaction: Let’s get ready for flying user interfaces


Authors:
Markus Funk


Many advances in human-computer interaction (HCI) are driven by new technology. Recently, a new category of devices became small enough, cheap enough, and robust enough to be available for the mass market: drones [1].


Initially, drones were built for non-civilian use, but over the past 10 years they have also become toys flown by hobbyists. With both drone technology and computing hardware becoming lighter, drones can now handle a payload sufficient to carry sensors and actuators. This makes drones interesting for user interface researchers and designers of interactive systems, as drones are now able to bring user interfaces into the third dimension in a scalable way. These developments have opened up a new sub-field in interaction research: human-drone interaction.

The greatest benefit of human-drone interaction is that drones can precisely transport user interfaces to any point and orientation in 3D space, making drones a new medium for both input and output. Not only can hardware for sensing user input be positioned anywhere in 3D space, but actuators, displays, and other output devices can also be placed at any position around the user. While drones cannot create much resistive force for haptic user interfaces, they can rapidly and scalably bring sensors and actuators to a position in 3D space that would otherwise require a mechanical construction. This is a significant benefit for HCI in general, as almost any 2D user interface can be brought into the third dimension by attaching it to a small flying robot. We are already seeing flying cameras, flying sensors, flying haptic props, and flying displays.

Compared with traditional robots, drones can be very small and are able to fly. We therefore argue that human-drone interaction has become a special sub-field of human-robot interaction, bringing new challenges and new opportunities for designing interaction.

Basic Concepts of Interacting with Drones

The most basic feature of human-drone interaction is a user or system telling a drone to change its position. With more intelligent systems, how one controls the drone changes. There are four different ways of controlling a drone’s position:

  • Direct position control. In the traditional form, the user directly controls the flight path using an interface (e.g., a joystick). The drone does not need to carry additional location sensors, as the user is flying it within their line of sight (LoS) or in first-person view (FPV).
  • Absolute position control. With more advanced systems, the user can tell the drone where to go just by defining location targets that the drone flies to autonomously using on-board localization. This can be based on, for example, GPS or a more complex camera-based optical tracking system (e.g., SLAM).
  • Relative position control. With even more tracking capabilities, drones can be positioned relative to a tracked object. A user can therefore issue commands to a drone that are relative to a person, position, or object. A user could tell the drone to “follow me at a distance of 10 meters” or “stay next to this tree” (a sketch of this control mode follows the list).
  • Task-oriented control. In the most advanced setting, drones can be controlled by giving them a task that the drone needs to achieve. Such a task could be “build a wall out of these plastic cubes.” The drone would then continuously pick up cubes and place them at the specified position in order to build a wall. Another scenario of task-oriented control is when multiple drones work together to form a display in the sky, where every drone represents a flying pixel (e.g., the SPAXELS project; https://www.spaxels.at).
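
To make the taxonomy concrete, below is a minimal sketch of relative position control in Python. All names and values are illustrative assumptions, not part of any real drone API: in a real system the positions would come from a tracking system, and the computed setpoint would be handed to the drone’s autopilot.

```python
import math

# Illustrative tracked positions (x, y, z) in meters; in a real system these
# would come from GPS or an external tracking system.
person = (4.0, 2.0, 0.0)
drone = (0.0, 0.0, 3.0)

FOLLOW_DISTANCE = 10.0  # "follow me at a distance of 10 meters"
ALTITUDE = 3.0          # keep a fixed flight height

def follow_setpoint(person_pos, drone_pos, distance, altitude):
    """Compute a target position `distance` meters from the person,
    on the side of the person where the drone currently is."""
    dx = drone_pos[0] - person_pos[0]
    dy = drone_pos[1] - person_pos[1]
    norm = math.hypot(dx, dy) or 1.0  # avoid division by zero
    return (person_pos[0] + dx / norm * distance,
            person_pos[1] + dy / norm * distance,
            altitude)

# Recomputing this setpoint on every tracking update keeps the drone
# at the commanded distance while the person moves.
target = follow_setpoint(person, drone, FOLLOW_DISTANCE, ALTITUDE)
print("fly to", target)  # hand the setpoint to the drone's position control
```

Seen this way, absolute position control is simply the special case in which the setpoint is given directly rather than computed relative to a tracked object.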


There are more interaction possibilities than just changing a drone’s position. Drones can be extended with other sensors and actuators, opening up a whole new design space of interaction modalities for input and output with drones.

How Are Drones Used in HCI?

Using drones for research projects has become more popular in recent years. For example, drones have been used to carry cameras, screens, and projectors; to provide tactile interfaces; and simply for their presence in 3D space. In the following, I list a few examples from the HCI research community.


Drones can now handle a payload sufficient to carry sensors and actuators.


Flying cameras. Drones can carry a camera to any position and orientation in 3D space. This feature was used in the DroneLandArt project (https://www.youtube.com/watch?v=MwOeWzrHpF0) to create outdoor art installations. The installations were not visible to an observer on the ground, but when a drone filmed the environment from a specific aerial angle, the art appeared.

Flying screens and projectors. Screens and projectors are the traditional output media for displaying content from a computer to a user. Research in human-drone interaction has investigated attaching screens to drones. For example, Schneegass et al. [2] envisioned that pervasive flying screens, so-called midair displays, could show evacuation instructions in emergency scenarios, or could guide the way at sporting events or tourist sites. Gomes et al. [3] use their BitDrones to carry Skype conversations on a flying screen. This allows remote callers to position themselves freely in the physical environment of the person they’ve called.

Further, Knierim et al. [4] attached a projector to a drone to provide mobile, in situ projected navigation instructions. With navigation instructions projected into their physical environment, users no longer need to look at their smartphones and can instead concentrate on enjoying their surroundings (Figure 1).

Figure 1. A user receiving navigation instructions from a drone-carried projector [4].

Flying tactile props. In augmented reality and virtual reality especially, one of the dominant current research questions is how to provide haptic feedback for digital content. In the Tactile Drones project, Knierim et al. [5] use drones to fly haptic props to a position in 3D space and align them with the position of the digital content (Figure 2). Drones are thereby able to give a physical presence to digital content in augmented and virtual reality. In a follow-up project, drones overlaid with augmented reality information were also used as an input device [6], with the user’s hand moving the drone in 3D space (Figure 3).

Figure 2. Drones are used as a tactile feedback element in a VR application [5].
Figure 3. Using a quadcopter as a flying tangible control for providing input to an interactive system [6].

In-air companionship. Beyond carrying hardware, drones can also be used as flying companions that are simply present in space. In the work “Jogging with a Quadcopter,” Mueller et al. [7] use a quadcopter to motivate runners to keep a constant pace and to guide the way. In this project, the presence of the drone itself serves as a motivational device.

Prototyping Human-Drone Interaction

For building and prototyping flying user interfaces for human-drone interaction, we need to pay attention to three aspects: controlling the drone, knowing where the drone is, and providing communication between the drone and other systems.

How the drone is controlled in an interactive system depends on which basic control concept the prototype uses. For direct position control, traditional joystick control might be sufficient. In the early days of human-drone interaction especially, researchers mostly performed Wizard of Oz studies, in which an experimenter controlled the drone as if it were acting autonomously [8].

When creating autonomous systems, for example those using absolute position control, relative position control, or task-oriented control, the drone needs to be controlled by a computer. In the early days, this was done by hacking the drone and reverse engineering its remote control. Nowadays, most drones used for research prototypes come with an API for programming them, as the sketch below illustrates.
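
Here is a minimal sketch of such computer control in Python. The Drone class is a stand-in stub, not any vendor’s real SDK; actual drone APIs differ, but most expose similar takeoff, goto, and land primitives.

```python
# Minimal sketch of computer-controlled absolute position control.
# The Drone class is a stand-in stub, not a real vendor SDK.

class Drone:
    """Stub standing in for a vendor SDK client."""
    def takeoff(self):
        print("takeoff")
    def goto(self, x, y, z):
        # A real drone would fly to this absolute target autonomously,
        # using its on-board localization (e.g., GPS or visual SLAM).
        print(f"flying to ({x}, {y}, {z})")
    def land(self):
        print("land")

# Absolute waypoints (x, y, z) in meters.
waypoints = [(0.0, 0.0, 2.0), (5.0, 0.0, 2.0), (5.0, 5.0, 2.0)]

drone = Drone()
drone.takeoff()
for x, y, z in waypoints:
    drone.goto(x, y, z)
drone.land()
```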


Social acceptability will rise as drones are used as accessibility devices.


Depending on the application scenario (absolute, relative, or task-oriented control), a tracking mechanism is needed. We distinguish between on-board and external tracking systems. On-board tracking uses sensors mounted on the drone: for example, a GPS receiver that keeps track of the drone’s position, or a camera-based, drone-mounted tracking system that builds a SLAM model.

External tracking means that cameras placed in the environment track the drone. One example of an external tracking method is an OptiTrack system, which transmits the drone’s position to a computer; in a feedback loop, the computer then decides which action the drone should perform next, as sketched below.
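
The following sketch illustrates such a feedback loop with a simple proportional controller in Python. Both get_tracked_position and send_velocity are hypothetical stand-ins (here they simulate the loop), since the real calls depend on the tracking system and drone SDK in use.

```python
import time

# Minimal sketch of an external-tracking feedback loop: read the tracked
# position, compare it with the target, and command a velocity that is
# proportional to the remaining error.

TARGET = (2.0, 1.0, 1.5)  # desired drone position in meters
KP = 0.8                  # proportional gain
TOLERANCE = 0.05          # stop within 5 cm of the target

_pos = [0.0, 0.0, 1.0]  # simulated drone position for this sketch

def get_tracked_position():
    """Stand-in for a pose update from an external tracker."""
    return tuple(_pos)

def send_velocity(vx, vy, vz, dt=0.1):
    """Stand-in for a velocity command to the drone; here it simply
    advances the simulated position."""
    _pos[0] += vx * dt
    _pos[1] += vy * dt
    _pos[2] += vz * dt

while True:
    x, y, z = get_tracked_position()
    ex, ey, ez = TARGET[0] - x, TARGET[1] - y, TARGET[2] - z
    if max(abs(ex), abs(ey), abs(ez)) < TOLERANCE:
        break  # close enough to the target
    send_velocity(KP * ex, KP * ey, KP * ez)  # velocity proportional to error
    time.sleep(0.01)  # a real loop would run at the tracker's frame rate

print("reached target:", get_tracked_position())
```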

The Future of Human-Drone Interaction

Human-drone interaction is a relatively new field with many interesting application areas. Delivery companies are already experimenting with automated delivery systems using drones. Further, transportation companies are building air taxis based on drone technology. With these use cases, we expect to see advances in drone-carried positioning technology in the next few years, enabling drones to autonomously navigate indoor environments and to independently find charging spots for their batteries. While the public currently perceives drones as dangerous, annoying, and loud, we argue that drones will become increasingly socially acceptable. Protective cages will increase the safety and social acceptability of drones. Social acceptability will also rise as drones are used as accessibility devices [9] (see sidebar). Given the popularity of drones and their countless use cases, now is a good time for HCI researchers to start exploring the sub-field of human-drone interaction and making user interfaces fly.

References

1. Throughout this article, and consistent with everyday language usage, we use the word drone to refer to unmanned aerial vehicles, e.g., quadcopters. In no way are we referring to military drones.

2. Schneegass, S., Alt, F., Scheible, J., and Schmidt, A. Midair displays: Concept and first experiences with free-floating pervasive displays. Proc. of The International Symposium on Pervasive Displays. ACM, 2014, 27.

3. Gomes, A., Rubens, C., Braley, S., and Vertegaal, R. BitDrones: Towards using 3D nanocopter displays as interactive self-levitating programmable matter. Proc. of the 2016 CHI Conference on Human Factors in Computing Systems. ACM, 2016, 770–780.

4. Knierim, P., Maurer, S., Wolf, K., and Funk, M. Quadcopter-projected in-situ navigation cues for improved location awareness. Proc. of the 2018 CHI Conference on Human Factors in Computing Systems. ACM, 2018.

5. Knierim, P., Kosch, T., Schwind, V., Funk, M., Kiss, F., Schneegass, S., and Henze, N. Tactile drones: Providing immersive tactile feedback in virtual reality through quadcopters. Proc. of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems. ACM, 2017, 433–436.

6. Knierim, P., Kosch, T., Achberger, A., and Funk, M. Flyables: Exploring 3D interaction spaces for levitating tangibles. Adjunct Proc. of the 12th International Conference on Tangible, Embedded, and Embodied Interaction. ACM, 2018.

7. Mueller, F.F. and Muirhead, M. Jogging with a Quadcopter. Proc. of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, 2015, 2023–2032.

8. Cauchard, J.R., Zhai, K.Y., and Landay, J.A. Drone & me: An exploration into natural human-drone interaction. Proc. of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing. ACM, 2015, 361–365.

9. Avila, M., Funk, M., and Henze, N. DroneNavigator: Using drones for navigating visually impaired persons. Proc. of the 17th International ACM SIGACCESS Conference on Computers and Accessibility. ACM, 2015, 327–328.

Author

Markus Funk leads the Human-Computer Interaction group at the Telecooperation Lab of TU Darmstadt. His research interests include augmented reality, virtual reality, and human-drone interaction. funk@tk.tu-darmstadt.de

Sidebar: DRONES FOR ACCESSIBILITY

The ability to automatically position a computer-actuated device anywhere in 3D space is useful not only for delivering goods, providing entertainment, and taking pictures, but also for accessibility. In the DroneNavigator project [9], Avila et al. used a drone to navigate visually impaired and blind persons through indoor environments, using the sound that the drone naturally emits as a cue. In a follow-up to the DroneNavigator project, the drone was additionally attached to the visually impaired person with a leash, making multimodal navigation with tactile and aural feedback possible. Just like a guide dog, the drone can lead a person through an unknown environment.

A visually impaired user with a drone as an accessibility device, which emits aural navigation instructions [9].


Copyright held by author. Publication rights licensed to ACM.
