Authors:
Eva Eriksson, Lone K. Hansen, Peter Dalsgaard
In this feature, we apply a decennial perspective to discuss the interweaving of technology, everyday practices, and societal infrastructures. We ask what happens when technologies become so embedded in everyday life that we stop talking about them, and who gets left out in this process. The occasion of the Aarhus decennial conference 2025 made us look at the intersection of technology, design, users, and societal aspects—a perspective informed by a 10-year outlook. Specifically, we discuss how we, as HCI researchers, should not allow a technology to fade from sight while its social effects intensify.
Securing personal digital devices is one of the first things that happens during an intake at a women's shelter in the Global North. Spyware is detected and removed, and devices are decoupled from family accounts and cloud services. Following shelters' elaborate guidelines on how and when to untangle services, devices, and systems is tedious work because many services and products are optimized for invisible linking and data collection. Abusers take advantage of this, and it is of little use to hide physically if you can be found digitally.
→ Once a technology stops being named, its biases, built-in choices, and exclusions become harder to contest.
→ Applying a 10-year lens reveals convenience for many but a loss of agency for those at the margins.
→ Without sustained engagement from HCI researchers, marginalized users bear the cost of decisions they did not make.
Similarly, when an abusive partner leaves a household, some survivors find that they are no longer in control of doors, heating, or music. The ease and convenience of connecting laundry machines, heating controls, lights, lawn mowers, and cars to an Internet of Things (IoT) infrastructure clearly comes with drawbacks. Reports show that the more connected a home is, the less connected the women who live there are: In heterosexual couples' homes with smart home equipment, the men more often set up and manage the system than the women do. For some users, this becomes a nightmare: In order to restrict an infrastructure, you need to understand it. And there is little help available from the IoT systems we have built over the past decade, partly because they are designed to disappear seamlessly into everyday life.
This shows that design matters, and it is not a new lesson. Rather than analyzing one design at a time, applying a decennial perspective makes clear how all technological paradigms are entangled with societal, individual, and technical structures. From this perspective, we see how the ways in which technologies disappear from sight are not neutral.
The IoT field matured 10 years ago, when we went from research experimentation with user interface and Internet Protocol version 6 scenarios to integrating them into consumers' everyday life. At roughly the same time, the IoT Conference shifted in 2014 to an annual event, after taking place biennially since 2008. In the U.S., Big Tech stepped into the consumer market: Apple launched HomeKit in 2014, Google acquired Nest that same year, and Amazon bought Ring in 2018, collectively boosting the market for home IoT and rebranding it as home automation or the smart home. The uptake of this technological paradigm has driven development to a point where, in 2025, unless we experience IoT from a particular perspective, such as how it mediates relations between abusers and survivors, it has largely dissolved into the backdrop of our daily environments. Connectivity is barely mentioned as a feature, and we take for granted that most machines will have an app attached to them. IoT is present, but it is unnoticed and unnamed.
When technologies become taken for granted and stop being a concern for those who benefit from them, the opposite is true for those who experience frictions and fractures.
Currently, generative AI appears to be on a similar path, moving at an accelerated pace into everyday products and services that affect consumers and citizens. In a very short period of time, the effects of GenAI have become pervasive, even as GenAI itself is already barely mentioned as a feature. Vast resources enable GenAI to simply be there, and only glitches prompt us to notice and name it.
Being part of the organizing team of the Aarhus decennial conference 2025—a conference that has taken place every 10 years since 1975—has prompted us to think about the intersection of technology, design, users, and societal aspects, both retrospectively and prospectively. Analyzing discourses and foci over the course of a decade—long enough for a noticeable change to happen but short enough for the past to not feel distant—unlocks different perspectives on these intersecting matters. We use that decennial perspective on the technological paradigms of IoT and GenAI as a way to understand how we might better pay attention to what happens at the margins of invisibility. These technologies have different effects on different types of people. When technologies become routine or when they work without requiring user interaction, the systemic lack of noticing introduces new forms of friction or exacerbates existing inequalities. This dimension highlights the uneven impacts of design and technological adoption and raises questions about who benefits and who is left behind. We posit that when technologies become taken for granted and stop being a concern for those who benefit from them, the opposite is true for those who experience frictions and fractures. Technology is invisible to and often unarticulated by its beneficiaries, but acutely visible to and only sometimes articulated by those it marginalizes.
With this article, our aim is to invite critical reflection on how HCI and design can remain attuned to these shifts in visibility, as they influence how we engage with the technologies that shape our lives. Through this, we ultimately ask how we, even when we are not GenAI researchers, can use our methods, approaches, and knowledge to improve the coming decade of GenAI: to become more attuned to the margins and centers that every technology inevitably introduces, and to the shifting visibility of technology in our lives. Seen through a 10-year arc, disappearance is not a quiet success but a signal to examine who currently lacks leverage over the system.
From Hype to Background: How Technologies Disappear
What happens when technologies move from front and center, buzzing with the promise of disruption and innovation, to being ubiquitous and integrated into our infrastructures, disappearing into our everyday practices? Using the lenses of domestication [1] and infrastructural embedding [2,3] in a decennial perspective, we point to how a technology's path from visibility to invisibility, while signaling success in some contexts, can obscure the struggles of those for whom it fails to materialize as beneficial. While new technologies hold the potential to provide significant benefits to many people and communities, it is crucial to consider those left on the margins of technology development and adoption. Not every individual and community experiences technology as empowering or beneficial; for some, it may lead to unintended negative consequences, frictions, or disempowerment.
In work central to sociotechnical systems theory, Wiebe E. Bijker and colleagues [2] offered a comprehensive perspective on how technologies and social systems coevolve, emphasizing the processes through which technologies become embedded in society. Their framework on the social construction of technological systems highlights the reciprocal relationship between societal forces and technological developments, showing how these interactions guide the transformation of technologies from novel innovations to integral, often invisible components of everyday infrastructures. Similarly, Geoffrey C. Bowker and Susan Leigh Star [4] discuss the embedding of technologies into infrastructure, noting that as technologies become essential to sociotechnical systems, they fade into the background. This also connects with Roger Silverstone and Leslie Haddon's [1] concept of domestication, whereby technologies are appropriated into the routines and rituals of everyday life, thereby losing their initial novelty. Taking an infrastructuring perspective on technologies means moving beyond viewing infrastructures merely as networks of technical objects. Instead, this perspective foregrounds technology as a political, value-laden, and performative configuration, and emphasizes the processes through which algorithms actively shape reality.
GenAI as Infrastructure in the Making
In 2025, GenAI stands at a threshold similar to the one IoT stood at in 2015. It is no longer a niche technology being explored and developed in research labs. Rather, it is quickly becoming available to everyone, but it is also disputed and questioned. Additionally, it seems to be developing at a much faster pace than IoT did, and with a broader scope. What distinguishes GenAI is not only the scale of its uptake but also the breadth of domains it touches—from creative expression to institutional operations. What we are witnessing is not merely a rapid spread of new tools but the early stages of infrastructuring. It is becoming so seamlessly integrated into workflows, services, and institutional routines that it is taken for granted, even as it subtly reshapes practices and relations across society. A key driver in this process is the integration of GenAI into productivity tools that are already widely adopted. From writing assistants in word processors and messaging services to automated email responses, slide design suggestions, and code generation, GenAI is increasingly embedded not as standalone software but as background functionality in systems people already use. This mirrors earlier transitions observed with spellcheckers, search engines, and recommender systems, which were once novelties but are now infrastructural. In this sense, GenAI is moving from visible subject of hype to invisible force shaping how work is done.
This infrastructural shift raises several concerns, including blackboxing. The deep embedding of GenAI systems makes their operations harder to scrutinize. Decisions, such as which phrasing is "better," which job applicant is a better fit, or which image best illustrates a concept, are delegated to models whose inner workings are opaque by design. Bruno Latour [5], among others, described blackboxing as a process through which a system becomes accepted as a working whole, even as its internal complexity is hidden. When GenAI becomes background infrastructure, blackboxing occurs at both technical and social levels: The model's mechanics are obscured, and the fact that its outputs are algorithmically generated can be forgotten.
The implications are not abstract, and the consequences are materially evident. GenAI depends on resource-intensive infrastructure, including data centers, power grids, and water supplies, with far-reaching environmental and geopolitical effects. The environmental impact of GenAI is substantial and may compound other ongoing crises. Additionally, the effects of GenAI are surfacing in infrastructural practices and policies at sociotechnical levels, with real consequences for people's lives. One example is in hiring and recruitment: Automated systems are increasingly used to screen applications and rank candidates, ostensibly to increase efficiency. These systems, however, often replicate and reinforce existing social biases, as they are trained on historical datasets that reflect discriminatory hiring practices. Studies show that applicants with names associated with racialized minorities are significantly less likely to be selected by AI-based screening tools. These systems may appear neutral and efficient, but their decisions are shaped by patterns of exclusion that are difficult to trace or contest once embedded. Another area of concern is the displacement and deskilling of labor. Tasks that once required specialized expertise (e.g., transcription, translation, copywriting, image editing, and parts of legal analysis) are increasingly performed by GenAI systems. While this can increase accessibility and reduce costs, it also raises questions about the long-term effects on knowledge, training, and professional autonomy. In our own scholarly domain, students who once transcribed interviews as part of their research training now rely on AI tools to do the job in seconds. The benefit in terms of saved time is clear, but the opportunity to use transcription to practice close listening, analytical precision, and qualitative judgment is lost. What appears as convenience in one area may represent a hollowing out of practice in another.
The process of infrastructuring is also shaped by regulatory and institutional forces. As public institutions and governments begin to adopt GenAI (e.g., in digital service delivery, education, and public health), the technology gains legitimacy and permanence. Once a model is embedded into a bureaucratic workflow, it becomes very difficult to remove or substantially revise. This kind of institutionalization also reconfigures questions of digital sovereignty and accountability. The companies developing GenAI tools do not just supply software; they shape the very parameters within which decisions are made and services are delivered.
The process through which technologies become invisible and taken for granted is contingent, asymmetrical, and shaped by intersecting technical, social, and institutional dynamics. When we look back over the past decade, the case of IoT illustrates how a technology's disappearance from public discourse coincides with its deep embedding in infrastructures, routines, and systems of control. This disappearance is not neutral. In fact, it tends to benefit those who are already empowered, while amplifying the marginalization of those who are not. This observation reframes Melvin Kranzberg's famous adage: "Technology is neither good nor bad; nor is it neutral" [6]. If we take the decennial perspective seriously, we see that the trajectory from hype to normalization leads to shifts in who gets seen and who holds influence, agency, and responsibility. As a technology fades into the background, so too do the design decisions, trade-offs, and exclusions embedded in it. And when something is no longer a topic of discussion, it becomes harder to contest, reshape, and resist [1,3]. To paraphrase and extend Kranzberg's observation: The processes by which technologies become taken for granted are neither good nor bad; nor are they neutral.
This has clear implications for HCI and design research. Our field has often focused on the development and early adoption of new technologies: on prototyping, testing, and envisioning futures. But the challenge lies in staying with technologies during and after their stabilization. As researchers, we must commit to and sharpen our methods for engaging with the slow, cumulative consequences of systems, even when they are no longer novel. That extends beyond technical infrastructures and demands attending to the lived frictions, dependencies, and reconfigurations they produce over time. This means staying close to the people and communities who do not experience invisibility as comfort or efficiency, but as a loss of agency or control. The encouraging part is that we are not starting from scratch. Methods and perspectives such as participatory design, ethnomethodology, critical analysis and design, feminist HCI, feminist science and technology studies, sociotechnical imaginaries, and infrastructural inversion are already part of our tradition. But they need to be mobilized with a different temporal lens.
There is a pedagogical dimension to this as well. We must train students to better notice what becomes taken for granted and to pay attention to the subtle but significant consequences of technology normalization, especially for those at the margins of access, knowledge, and influence. The 10-year frame reminds us that today's small frictions are tomorrow's entrenched barriers, and that acting now can prevent a repeat of the IoT pattern. If GenAI is already on its way to becoming taken for granted by 2035, we must strive to embed marginalized perspectives into it differently than we did with IoT. We must keep looking, asking, and designing in ways that foreground the needs and voices of those most at risk of harm, exclusion, and marginalization. What disappears from view does not disappear from effect.
1. Silverstone, R. and Haddon, L. Design and the domestication of information and communication technologies: Technical change and everyday life. In Communication by Design: The Politics of Information and Communication Technologies. R. Mansell and R. Silverstone, eds. Oxford University Press, 1996.
2. Bijker, W.E., Hughes, T.P., and Pinch, T.J., eds. The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology. MIT Press, 1987.
3. Star, S.L. and Ruhleder, K. Steps toward an ecology of infrastructure: Design and access for large information spaces. Information Systems Research 7, 1 (1996), 111–134; http://bit.ly/4n8K7FG
4. Bowker, G.C. and Star, S.L. Sorting Things Out: Classification and Its Consequences. MIT Press, 1999.
5. Latour, B. Pandora's Hope: Essays on the Reality of Science Studies. Harvard University Press, 1999.
6. Kranzberg, M. Technology and history: "Kranzberg's Laws." Technology and Culture 27, 3 (1986), 544–560.
Eva Eriksson is an associate professor of interaction design at Aarhus University in Denmark, with a Ph.D. from Chalmers University of Technology. She is a principal investigator in several research projects at the intersection of HCI, child-computer interaction, interaction design, participatory design, and public knowledge institutions. [email protected]
Lone K. Hansen is an associate professor at Aarhus University. She conducts research in digital culture, art, and design, with a focus on feminism, the more-than-human, and the interweaving of technology and culture. [email protected]
Peter Dalsgaard is a professor of interaction design at Aarhus University and director of the Centre for Digital Creativity. His work explores the design and use of digital systems from a humanistic perspective, with a focus on collaborative design and creativity. [email protected]
This work is licensed under a Creative Commons Attribution 4.0 International license.