Features

XXXI.2, March–April 2024
The Telepresence of Furniture in Extended Reality


Authors:
Ian Gonsher


The COVID-19 pandemic has produced many lasting social and cultural changes. Among them is the increased role telepresence plays in mediating interpersonal relationships. Lockdowns served as real-time social experiments, validating at scale the notion that physical proximity is not always necessary for meaningful encounters. This normalization of videoconferencing, as well as other forms of telepresence, challenges designers to think differently about how we gather.

Insights

The integration of mixed reality into furniture can provide a viable alternative paradigm to the emergence of "face computers" and other wearables.
How might telepresence be designed to be a more equitable experience for both remote and local users?
Attention to design principles such as scale, movement, and context can create affordances for a more immersive and embodied telepresence experience.

Over the past several years, the design communities at Brown University and the Rhode Island School of Design have been exploring new modalities for telepresence in extended reality through a series of prototype studies. These furniture and furniture-like objects overlay virtual content onto physical spaces, incorporating an awareness of scale, movement, and context. The designs assume that the thoughtful design of the built environment sets the conditions for interpersonal communication. These prototype studies demonstrate that ubiquitous computing doesn't need to be a device carried on the body; extended reality can be engineered directly into the built environment.

Initial design research, which predated the pandemic, explored ways the videoconferencing experience might be enhanced. Early on, there was particular interest in reimagining the future of work, asking how and where we might work. Laptop monitors, and to a lesser degree phone and tablet screens, often serve as sites for remote meetings. And yet the experience of these small "windows" feels constrained when compared with unframed encounters at larger scale. These constraints also make nonverbal cues such as body language harder to pick up on. When compressed into such a tight space, the voices of local users tend to dominate over those of remote users, introducing professional inequalities into the experience.


Large screens are a simple way to scale to a more immersive experience, and movement enhances the telepresence effect. The Large Screen Mobile Telepresence Robot (LSMTR; Figure 1) is a study that marries these design elements of scale and movement. Roughly the size of a standing human, the LSMTR lends presence in ways that a talking head on a small screen cannot achieve [1], making the experience significantly more natural and immersive. Movement gives greater agency to the remote pilot, expanding the ways they might interact and collaborate with others at a distance. Scale evokes the presence of the body.

Figure 1. Large Screen Mobile Telepresence Robot.

Building upon the insights of the LSMTR prototype, questions arose about the ways furniture might serve as a site for telepresence. Furniture is ubiquitous. It hides in plain sight. It establishes the context for most quotidian activities and behaviors. Furniture, and the built environment more generally, is the background upon which daily life plays out. It provides a stage for the ways people interact with one another. Affordances for conversation are built into the way a table and a set of chairs are designed and arranged, for example. Furniture thus offers intriguing possibilities for a model of ubiquitous computing that does not rely on wearables or portable devices but rather integrates these capabilities directly into the built environment.




These insights grounded the development of TBO—a contraction of "table robot"—which, like its predecessor the LSMTR, offers a telepresence experience as it moves within its environment. However, TBO (Figure 2) is first and foremost a piece of furniture—a table. Its telepresence features appear only when summoned. Instead of a screen, TBO uses a projector to achieve a larger, more immersive image that can be cast onto nearby walls and surfaces [2]. A screen or monitor is always bounded by a kind of frame, but projection onto local surfaces eliminates those boundaries. Most of the time, TBO is just a typical table. Like conventional furniture, it fits into a system of objects that references a tradition of furniture making and domestic interiors. The use of materials such as wood, for example, fits neatly into this form language.

Figure 2. Left: TBO in furniture mode. Right: TBO in telepresence mode.

But when emerging technologies are integrated into these long-established design conventions, new possibilities arise. TBO is much more than a conventional table. Activated with a simple voice command, TBO can move into position using simultaneous localization and mapping: It can find an unobstructed spot parallel to a wall or other surface, at the appropriate distance, and project onto it. The size of the image can be adjusted based on TBO's distance from the wall or surface (although the image's relative brightness in bright ambient light still needs to be resolved). This makes videoconferencing a more immersive experience, and one that is better integrated into the deep structure of the built environment. Additionally, TBO explores the viability of a voice-activated AI assistant that is integrated into the built environment. Although the initial prototype's AI features and natural language processing are somewhat limited, TBO validates the concept and sets the stage for further research into AI and extended reality engineering.
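To make TBO's positioning behavior concrete, here is a minimal Python sketch of the underlying projection geometry, assuming a projector with a fixed throw ratio (throw distance divided by image width); the throw ratio and target image width below are illustrative assumptions, not specifications of the actual prototype.

    # A minimal sketch of the projection geometry TBO's positioning implies.
    # Throw ratio = throw distance / image width. All values are illustrative
    # assumptions, not measurements of the TBO hardware.

    def stop_distance(target_width_m: float, throw_ratio: float) -> float:
        """Distance from the wall needed to project an image of the target width."""
        return target_width_m * throw_ratio

    def image_width(distance_m: float, throw_ratio: float) -> float:
        """Projected image width at a given distance from the wall."""
        return distance_m / throw_ratio

    THROW_RATIO = 1.2   # assumed: typical of a small portable projector
    TARGET_WIDTH = 1.6  # assumed: wide enough for a roughly life-size figure

    print(f"Stop {stop_distance(TARGET_WIDTH, THROW_RATIO):.2f} m from the wall")
    print(f"At 2.0 m, the image is {image_width(2.0, THROW_RATIO):.2f} m wide")

Under these assumptions, a 1.6-meter-wide image requires stopping roughly 1.9 meters from the wall, which is the kind of constraint the robot's navigation must satisfy as it searches for an unobstructed surface.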

When the conversation is finished, TBO returns to its original location and once again becomes an inconspicuous part of the decor. The TBO prototype demonstrates that robotic furniture is a viable alternative to common conventions for telepresence, and advances new directions for ubiquitous computing and AI user experiences, eschewing wearables and other portable computers in favor of building these technologies directly into the built environment. TBO asks how telepresence might exist in extended reality. One can imagine furniture-like objects functioning as a kind of universal interface—as an AI assistant—when called upon, while at all other times presenting themselves simply as functional and beautiful pieces of furniture.

Blending furniture typologies with robotics and telepresence applications provides an opportunity to imagine new modalities for extended reality. Both the LSMTR and TBO have a conventional spatial relationship to the user insofar as users sit in front of the screen or projected image, fixing the user and interface across from one another. But when one user is in the physical company of more than two or three other people, the interactions within a given space become far more dynamic and demand a more complex spatial orientation than the bilateral symmetry of the monitor/user interface can provide. The surrounding spatial context becomes important to consider.

Mixed Reality Passthrough Windows (MRPWs), or Janus screens, as they have come to be known—in reference to the Roman god with two faces, one facing forward and the other backward—gave further insight into how screens might be used to create a more spatial and immersive experience for extended reality and telepresence [3]. MRPWs feature two LCD screens situated back-to-back, with two mounted cameras also facing in opposite directions. This creates the effect of looking through a window, upon which virtual content can be overlaid. Three MRPW versions were developed, each exploring a different aspect of immersive telepresence in extended reality: an MRPW that is integrated into a table (Figure 3), an MRPW that is integrated into a laptop (Figure 4), and an unrealized mobile version that builds upon the LSMTR design typology (Figure 5).

Figure 3. Janus Table.
Figure 4. Mixed Reality Passthrough Window laptop prototype.
Figure 5. Mixed Reality Passthrough Window robot concept.
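To illustrate the passthrough effect at the heart of the MRPW, here is a minimal Python sketch, using OpenCV, that blends virtual content over the feed from a camera facing away from one of the two screens. The camera index and overlay image are illustrative assumptions; a full Janus screen would run a second, mirrored pipeline for the opposite face.

    # A minimal sketch of one face of a Janus screen: show the feed from the
    # camera facing away from this screen, with virtual content blended on top.
    # The camera index and overlay file are assumptions for illustration.
    import cv2

    rear_camera = cv2.VideoCapture(0)            # assumed: camera on the far side
    overlay = cv2.imread("virtual_content.png")  # assumed: rendered virtual layer

    while True:
        ok, frame = rear_camera.read()
        if not ok:
            break
        # Match the overlay to the camera frame, then alpha-blend the two layers.
        layer = cv2.resize(overlay, (frame.shape[1], frame.shape[0]))
        composite = cv2.addWeighted(frame, 0.7, layer, 0.3, 0)
        cv2.imshow("front face", composite)
        if cv2.waitKey(1) == 27:                 # Esc to quit
            break

    rear_camera.release()
    cv2.destroyAllWindows()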

Videoconferencing has changed the way we work. It is no longer always necessary to go into the office to attend meetings; this can be done virtually with common videoconferencing apps such as Skype and Zoom. But this has come at something of a cost, introducing new inequalities into the way these conversations occur. Local users sometimes dominate the conversation at the expense of remote users, especially when meetings involve more than a few participants or when an understanding of the physical context is essential to the conversation.

Building upon the furniture design paradigm explored in the TBO project, a Janus screen was integrated into a round table. Like King Arthur's Round Table, the circle is nonhierarchical within the physical space. Orienting the Janus screen to face both sides of the circle establishes a more equitable field of vision for both remote and local users. The mass of the Janus screen, about the scale of an oversize desktop monitor and offset to one side, evokes the presence of a body, much like the scale and movement of the LSMTR. Local users can look through the Mixed Reality Passthrough Window to local colleagues on the other side of the table, as well as to their remote colleagues on the screen, seen from both sides and visible to all. The remote users see all local users, on both sides of the table.

There may be occasions when presentations can't be held in a space with this kind of extended reality videoconferencing infrastructure. To address this concern, a conventional laptop version of the Janus screen was developed [4]. It works like any other laptop but has an outward-facing screen in addition to the screen facing the primary user. Like the Janus table, the Janus laptop furnishes a screen that can be seen through, like a window, and upon which content can be overlaid, like a public monitor. The second, outward-facing screen is particularly useful for making presentations to local users, even when telepresence and videoconferencing are not necessary, especially in spaces where other monitors or projectors are not available. This feature is also present on the Janus table, where the larger screen gives greater impact to presentations.

Awareness frames reality. The ways designers and engineers frame an awareness of others through telepresence technology, both physically and remotely, set the conditions for the kinds of interpersonal experiences that are possible. These prototype studies establish three design principles for the development of telepresence in mixed reality: scale, movement, and context. Commensurate scale sets the initial conditions for encounters between remote and local users; images that are smaller than life size disrupt the expectation of verisimilitude. Integrating movement into users' experiences allows for more dynamic and collaborative interactions. And context, which is to say the user's physical and virtual proximity to the space around them, to the built environment and the furniture already in place, allows for an experience that aligns more comfortably with expectations about unmediated reality.
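As a rough quantification of the scale principle, the following Python sketch compares the visual angle subtended by a physically present interlocutor with that of the same person compressed into a laptop-size window; the heights and distances are illustrative assumptions.

    # Compare the visual angle of an in-person encounter with a small window.
    # Heights and distances are illustrative assumptions.
    import math

    def visual_angle(height_m: float, distance_m: float) -> float:
        """Visual angle (in degrees) subtended by an object of a given height."""
        return math.degrees(2 * math.atan(height_m / (2 * distance_m)))

    in_person = visual_angle(0.9, 1.5)   # assumed: seated torso and head at 1.5 m
    on_laptop = visual_angle(0.18, 1.5)  # assumed: 18 cm video window at 1.5 m

    print(f"In person: {in_person:.1f} degrees")      # about 33 degrees
    print(f"Laptop window: {on_laptop:.1f} degrees")  # about 7 degrees

Under these assumptions, the laptop window subtends roughly a fifth of the visual angle of an in-person encounter, one way of stating why small screens disrupt verisimilitude.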


Acknowledgments

Mandy He, Kevin Hsu, Leon Lau, Arun Kavishwar, Jinha Kang, Maya Fleischer, Michael Chandler, Li June Choi, Vanessa Chang, Asad Khan, James Li, Ivan Pineda-Dominguez, Matthew Lee, Yuxin Han, Jung Yeop Kim, Karthik Desingh, Aaron Gokaslan, Yumeng Ma, Celina Ye, Tianyi Shao, Ada Chen, Ray Sun, Jae In Kim, Jackson Delea, and Peter Haas contributed to these projects.

References

1. Gonsher, I., Han, Y., Desingh, K., and Gokaslan, A. Prototyping mixed reality large screen mobile telepresence robots. Proc. of the 5th International Workshop on Virtual, Augmented, and Mixed Reality for HRI, 2022.

2. Gonsher, I. and Kim, J.Y. Robots as furniture, integrating human-computer interfaces into the built environment. Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction. ACM, New York, 2020, 215–217.

3. Gonsher, I. et al. Integrating interfaces into furniture: New paradigms for ubiquitous computing, mixed reality, and telepresence within the built environment. Paper presented at the Media Architecture Biennale 2023, Toronto, Ontario, Canada.

4. Gonsher, I., Ma, Y., Pineda-Dominguez, I., Lee, M., and Han, Y. The mixed reality passthrough window: Rethinking the laptop videoconferencing experience. Human Interaction and Emerging Technologies (IHIET-AI 2023): Artificial Intelligence and Future Applications 70 (2023).

Author

Ian Gonsher is an assistant professor of practice in the School of Engineering and Department of Computer Science at Brown University. His teaching and research examine the creative process as applied to interdisciplinary design practices and the development of emerging and speculative technologies. [email protected]


This work is licensed under a Creative Commons Attribution 4.0 International License.
