Forums

XXIII.2 March + April 2016

Adaptive architecture


Authors:
Holger Schnädelbach

The concerns of architecture, pervasive computing, and interaction design have been overlapping for quite some time. This forum provides a welcome outlet to discuss the impacts of interactive technologies becoming embedded into our surroundings and the use of interactive technologies to reinvent the built environment, as forum editor Mikael Wiberg has framed it [1]. Here I want to focus on the feedback loops that emerge between people and their built environments when the latter are technically augmented—that is, sensed and actuated by technical means. For this, it is useful to first outline where I see this work fitting into the wider landscape of architecture and interaction.


Work at the Mixed Reality Lab at the University of Nottingham considers the whole spectrum of adaptivity in architecture, seeking to bridge the concerns of a broad range of proposals throughout architectural history. In this context, adaptive architecture has been defined as concerned with buildings specifically designed to be adaptive to their environments and to their inhabitants [2], deliberately employing a very broad term. This definition recognizes that architecture not augmented with computing is already highly interactive, as Richard Coyne described in a previous article for this forum [3], pointing out its inherent support for play and conflict. Adaptive architecture therefore includes very diverse examples, such as the “manual” adaptivity present in the Rietveld Schröder house from the early 20th century and the computationally driven adaptations proposed for the smart homes of the 2000s. It ranges from the everyday adaptivity found in modern office buildings to the highly experimental projects emerging from a growing number of architecture schools globally, such as TU Delft, UCL, and MIT.

As a community, we need to continue charting the field, maintaining awareness of existing work and the many connections to it that we can build on, to avoid going over the same ground again and again. To that end, a number of excellent books are available—too many to mention in this short piece—that trace the field of adaptive architecture under the banners of flexible, interactive, responsive, and robotic architecture. There have also been dedicated conferences, such as the Adaptive Architecture conference series, as well as, more recently, workshops at CHI.

Our own contribution to this effort is a tool (www.adaptivearchitectureframework.org; drawing on [2]) that allows people to collaborate on that charting exercise through crowdsourced information. The Adaptive Architecture Framework proposes four core questions:

  • In reaction to what is architecture being made adaptive?
  • How is it done?
  • What adapts?
  • What are the effects of adaptations?

We illustrate these themes with examples from across architectural history. The tool will hopefully continue to grow and develop with support from the community.
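
One way to make the charting exercise concrete is to think of a crowdsourced entry in the framework as a record tagged along the four questions above. The sketch below is purely illustrative: the field names and example values are my own assumptions and do not reflect the actual data model of adaptivearchitectureframework.org.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AdaptiveArchitectureEntry:
    """Illustrative record for one charted example of adaptive architecture.

    The four fields mirror the framework's four core questions; the concrete
    schema of the online tool may well differ.
    """
    name: str
    adapts_in_reaction_to: List[str]   # In reaction to what is it adaptive?
    how: List[str]                     # How is the adaptation done?
    what_adapts: List[str]             # What adapts?
    effects: List[str] = field(default_factory=list)  # Effects of adaptations

# Example entry, using the Rietveld Schröder house mentioned in the text.
schroeder_house = AdaptiveArchitectureEntry(
    name="Rietveld Schröder House",
    adapts_in_reaction_to=["inhabitants' changing daily activities"],
    how=["manual actuation of sliding and folding partitions"],
    what_adapts=["interior partitions", "room layout"],
    effects=["rooms can be opened into one larger living space"],
)
print(schroeder_house.name, schroeder_house.what_adapts)
```

Even this crude tagging makes examples from very different eras comparable along the same four questions, which is the point of the charting exercise.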

Adaptive Architecture Feedback Loops

While charting the design space is important, it does not actually get to the crux of the challenge: What does it mean to inhabit adaptive architecture? How does adaptive architecture affect quality of life, well-being, and a shared sense of place? There is nothing in the technology itself that would determine the answers to these questions; it is up to us, as architects and interaction designers, to answer them. I will not be able to address them comprehensively here, but I would like to look at these issues through one particular lens: the interaction feedback loops that emerge between people’s behaviors and the adaptations designed into their environment, whether those adaptations are manually actuated or digitally driven.

The existence of longer-term feedback relationships has already been established; space syntax research, for example, has focused on how social rules and norms shape the topology of the architectural spaces around us and how the resulting topology shapes people’s spatial behavior [4]. On a different spatial and temporal scale, interaction designers have considered the ways in which we shape and are influenced by the affordances of our physical and digital surroundings. Adding computational technologies into the mix, combining sensors, software infrastructure, and actuators with the built environment, allows for much more immediate feedback loops.

My concern here is first to relate the movement of people to movement in architecture and second to discuss the ways in which personal data (of which movement information is a key part) is at the center of our relationship with digitally driven adaptive architecture.

Embodied Movement Interaction

Examples across the historical spectrum of adaptive architecture demonstrate how people’s actions can change the topologies and affordances of the spaces they occupy, and how those changes in turn shape people’s actions. Analyzing this dynamic in manually adaptive and digitally driven adaptive built environments highlights the importance of movement in people and in architecture, and the opportunities for coupling these two movement spectra.


There are two reasons for this interest in movement. The first is that all expressions of our behaviors are motor acts (while not all behaviors are expressive, e.g., thinking), resulting in movements somewhere in our bodies [5]. Whenever we interact with others or with the environment, movement is involved, and a broad classification of such movements will be useful for the discussion that follows. Human movements occur on a continuum of scales, ranging, for example, from movements of the eyelids to walking, where the whole body is moving. The detectability of human movement places it on a spectrum from least expressive, such as movements in the gut, to most expressive, such as smiles, grimaces, and the other faces we make. One can also reason about the level of control we have over those movements: Our heartbeat is autonomic, while we tend to speak only when we want to. At this stage, it is important to note that this categorized movement range is very broad; it describes an average range across which individuals can vary substantially, and for many different reasons. Thomas Fuchs has detailed how interpersonal interactions are coupled through interbodily resonance, drawing largely on interaction partners’ expressive movements and the unconscious perception of those movements [6]. In this view, movement at different scales, levels of expressiveness, and levels of conscious control becomes coupled between interaction partners.
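
To make this classification concrete, the sketch below encodes the three dimensions discussed above (scale, expressiveness, and conscious control) as a simple data structure. The dimensions come from the text; the specific enumeration values and example labels are my own assumptions, intended only to show how such a classification could be made operational.

```python
from dataclasses import dataclass
from enum import Enum

class Scale(Enum):            # from eyelid movements up to whole-body walking
    MICRO = 1                 # e.g., eyelids, internal movements
    LIMB = 2                  # e.g., gestures
    WHOLE_BODY = 3            # e.g., walking

class Expressiveness(Enum):   # how detectable the movement is to others
    LOW = 1                   # e.g., movements in the gut
    HIGH = 2                  # e.g., smiles and grimaces

class Control(Enum):          # degree of conscious control
    AUTONOMIC = 1             # e.g., heartbeat
    VOLUNTARY = 2             # e.g., speech

@dataclass
class MovementCategory:
    label: str
    scale: Scale
    expressiveness: Expressiveness
    control: Control

# Illustrative placements only; individuals vary substantially.
examples = [
    MovementCategory("heartbeat", Scale.MICRO, Expressiveness.LOW, Control.AUTONOMIC),
    MovementCategory("smile", Scale.MICRO, Expressiveness.HIGH, Control.VOLUNTARY),
    MovementCategory("walking", Scale.WHOLE_BODY, Expressiveness.HIGH, Control.VOLUNTARY),
]
for m in examples:
    print(m.label, m.scale.name, m.expressiveness.name, m.control.name)
```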

This brings me to the second reason for the interest in movement: the fact that elements in architecture are increasingly designed to move. In manually adaptive architecture such as Rietveld’s Schröder house, Holl’s Fukuoka housing, and Ban’s Naked House, the movement of architectural elements is actuated directly through human movement, with a one-to-one mapping between human and architectural movement; triggering architectural movement requires physical effort and force. In technologically driven architectural prototypes such as TU Delft’s Muscle Tower, Ruairi Glynn’s Reciprocal Space, and our own ExoBuilding, human movements from respiration to whole-body movements are technologically coupled to architectural movements. This involves a sensing infrastructure, software middleware, and a system for actuation. Such technological mediation allows translations to be introduced, for example amplifying the scale of human movements into much larger architectural movements.
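
A minimal sketch of what such a technologically mediated coupling might look like in software is given below. It assumes a hypothetical sensor and actuator interface: the functions read_respiration_depth and set_roof_height are invented for illustration and do not correspond to ExoBuilding’s actual implementation. The point is only the translation step, in which a small human movement is scaled and smoothed into a larger architectural one.

```python
import random
import time

# Hypothetical sensing and actuation stubs; a real system would talk to
# sensor middleware and motor controllers instead.
def read_respiration_depth() -> float:
    """Return a normalized chest-expansion value in the range 0.0..1.0."""
    return random.random()

def set_roof_height(metres: float) -> None:
    """Command the kinetic structure to move to the given height."""
    print(f"actuating roof to {metres:.2f} m")

GAIN = 1.5        # amplification: small human movement -> large architectural movement
OFFSET = 2.0      # minimum roof height in metres
SMOOTHING = 0.2   # low-pass factor to keep architectural movement gentle

def run(cycles: int = 10) -> None:
    smoothed = 0.5
    for _ in range(cycles):
        depth = read_respiration_depth()
        # Smooth the signal so the structure does not jitter with sensor noise.
        smoothed += SMOOTHING * (depth - smoothed)
        # Translate: amplify the breathing movement into roof movement.
        set_roof_height(OFFSET + GAIN * smoothed)
        time.sleep(0.1)

if __name__ == "__main__":
    run()
```

The choices of gain, offset, and smoothing are exactly where the designer shapes the character of the coupling, from a literal mirroring of the body to a much looser resonance.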

In such work—and similar to interaction between multiple people—interaction with adaptive architecture becomes movement-coupled. Movement in people is translated into movement in architecture, and an action-reaction feedback loop emerges. What might such a coupling between human and architectural movement be like? The work with ExoBuilding has demonstrated the potential for such couplings to induce deep relaxation in inhabitants under certain circumstances [7]. Although this is only a single point of reference, and generally there is far too little study data available at the moment, such couplings continue to be proposed. As a direct response, we have started experimenting with a more general-purpose movement-mapping platform to conduct basic research in this area.

Personal Data—Big Data

Beyond the suggested movement couplings, people’s movement behaviors can also reveal important information about them, including their location and their spatial relationship to other people, objects, and places. When adaptive architecture is digitally driven, such movement information is sensed, recorded, and stored so that it can drive actuations of architectural elements. It can be combined with other personal data, such as physiological, identity, activity, and social networking data, technically available through sensors embedded into places and carried by people.

Measuring, recording, and storing personal data for actuation gives digitally driven adaptive architecture unprecedented access to that data. Compared with the slower interaction between people and place described through space syntax and affordances, it establishes a much more immediate feedback loop between people’s behaviors and the behaviors of the built environment. This feedback loop is driven by personal data in the broadest sense, as illustrated in Figure 1.

Some technical infrastructure is used to acquire personal data of various types. Such data might be manipulated, interpreted, aggregated, and stored before it is used to actuate aspects of the built environment. Technical actuations of, for example, the lighting or architectural elements then affect the environment. The resulting changes in that environment feed back on the inhabitants from whom the personal data was acquired, and so it continues.
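
Expressed as code, and only as a sketch (the data types, the aggregation step, and the actuation target are all assumptions made for illustration), the feedback loop of Figure 1 has roughly this shape:

```python
from statistics import mean
from typing import Dict, List

def acquire_personal_data() -> Dict[str, float]:
    """Stand-in for the sensing infrastructure (location, physiology, activity...)."""
    return {"occupancy": 3.0, "mean_heart_rate": 72.0, "movement_level": 0.4}

def interpret(history: List[Dict[str, float]]) -> Dict[str, float]:
    """Aggregate and interpret stored personal data before actuation."""
    return {key: mean(sample[key] for sample in history) for key in history[-1]}

def actuate_environment(model: Dict[str, float]) -> None:
    """Stand-in for actuation, e.g., adjusting lighting or moving elements."""
    lighting = 0.3 if model["movement_level"] < 0.5 else 0.8
    print(f"setting lighting level to {lighting}")

def feedback_loop(steps: int = 5) -> None:
    stored: List[Dict[str, float]] = []
    for _ in range(steps):
        stored.append(acquire_personal_data())   # acquire and store
        model = interpret(stored)                # manipulate, interpret, aggregate
        actuate_environment(model)               # actuate the built environment
        # The changed environment now feeds back on the inhabitants, whose
        # behavior changes what is sensed in the next iteration.

if __name__ == "__main__":
    feedback_loop()
```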

This kind of coupling between human and architectural behaviors through personal data is useful for abstractly describing interaction with many examples of digitally driven adaptive architecture, regardless of their particular way of actuating the environment. It also highlights the potential for applying Fuchs’s model of interbodily resonance to people’s interaction with adaptive spaces. The resonance between person and space becomes modulated by technology and data in a very particular way. The potential for adaptive spaces to become embodied interaction partners can be seen in the aforementioned experimental work [7].

However, this interaction is limited to a local perspective and a local feedback loop, at a scale that keeps the described embodied interaction legible, even when multi-occupancy is taken into account. Through its sensing infrastructure, adaptive architecture is also part of a much larger concern. The built environment is the site of the global technical infrastructure that turns personal data (acquired from personal devices and embedded sensors—the Internet of Things) into “big data” via infrastructure embedded into the built environment (from rooms to smart cities), stored in the cloud and mined by global IT corporations.

Some of this technical infrastructure, located in the built environment and in relative proximity to people (e.g., ID card readers and motion sensors, but also Wi-Fi, mobile networks, and CCTV), lets people actively provide personal data; some of it acquires personal data while individuals remain largely passive (e.g., face recognition applied to CCTV footage). Largely invisibly, some of that data is transmitted from people via the built environment to the cloud, where it is stored, mined, and used for user profiling, regulation, and decision-making. From a standpoint that begins to see buildings as interaction partners, the built environment becomes “complicit” in this invisible journey of personal information from individuals to second and third parties, where it is stored for purposes that remain illegible and over which people have very little recourse.

Adaptive Architecture as Personal Data Interface

This is particularly problematic when generic user models drawing on big data are employed in user interaction, because people differ too widely for such models to be accurate often enough. Elizabeth Churchill has called for the development of scrutable user models [8], giving people much clearer input into the capture, storage, and modeling of their data and allowing them to correct such models. With computing becoming embedded into the environment in adaptive architecture, there is enormous potential in designing the environment as the interface not only to such models but also to the personal data itself. This is the case for the more locally focused feedback loops outlined above, where we should, for example, design tools that let inhabitants decide which of their personal data streams are mapped to actuate which part of the space they occupy. It is also the case for the hidden role of buildings in the journey from personal data to big data, where we need to design legible forms of interaction with that journey.
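
What could such a tool look like? The sketch below is one hypothetical way of letting inhabitants scrutinize and edit which of their personal data streams drive which part of the space; the stream and element names are invented for illustration, and a real tool would of course need an interface far richer than a dictionary.

```python
from typing import Dict, Optional

# Inhabitant-editable mapping: personal data stream -> actuated element,
# or None to opt out. Stream and element names are illustrative only.
my_mappings: Dict[str, Optional[str]] = {
    "respiration": "office_partition_height",
    "heart_rate": None,          # opted out: never used for actuation
    "location": "corridor_lighting",
}

def route(stream: str, value: float, mappings: Dict[str, Optional[str]]) -> None:
    """Only forward a personal data stream to actuation if its owner mapped it."""
    target = mappings.get(stream)
    if target is None:
        print(f"{stream}: withheld from actuation (no mapping set by inhabitant)")
    else:
        print(f"{stream} -> actuating {target} with value {value}")

# The inhabitant can inspect and change their mappings at any time.
route("heart_rate", 68.0, my_mappings)
my_mappings["heart_rate"] = "meeting_room_ambient_colour"
route("heart_rate", 68.0, my_mappings)
```

Even a mapping as crude as this would make the first leg of the personal data journey legible and negotiable for the person it describes, which is precisely what is missing from the cloud-bound journey sketched above.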

References

1. Wiberg, M. Interaction design meets architectural thinking. Interactions 22, 2 (2015).

2. Schnädelbach, H. Adaptive architecture—A conceptual framework. Proc. of the MediaCity. Bauhaus-Universität Weimar, 2010, 523–555.

3. Coyne, R. Places to play. Interactions 22, 6 (2015).

4. Hillier, B. and Hanson, J. The Social Logic of Space. Cambridge Univ. Press, Cambridge, U.K., 1984.

5. Solodkin, A., Hlustik, P. and Buccino, G. The Anatomy and Physiology of the Motor System in Humans. Cambridge Univ. Press, 2007.

6. Fuchs, T. The Phenomenology of Affectivity. Oxford Univ. Press, 2013.

7. Schnädelbach, H., Irune, A., Kirk, D., Glover, K., and Brundell, P. ExoBuilding: Physiologically driven adaptive architecture. ACM Transactions on Computer-Human Interaction (TOCHI) 19, 4 (2012), 1–22.

8. Churchill, E.F. Scrupulous, scrutable, and sumptuous: Personal data futures. Interactions 21, 5 (2014), 20–21.

Author

Holger Schnädelbach (@hschnadelbach) is a senior research fellow at the Mixed Reality Lab, School of Computer Science, University of Nottingham. He is an architect with more than 15 years’ experience in HCI research focusing on the intersection of information technology and architecture. holger.schnadelbach@nottingham.ac.uk

Figures

Figure 1. Adaptive Architecture Feedback Loop: Personal data acquired from inhabitants actuates architectural elements, which changes the environment, in turn feeding back on inhabitants.

©2016 ACM  1072-5520/16/03  $15.00
