Let me tell you a story—a frightening, disturbing one. A story about a future we would never have purposefully set out to create. As this is a scary tale, we need to begin in the dark, so close your eyes and picture in your mind's eye the first memory you have of a robot.
What was it? Perhaps, like me, it was a tin wind-up figure like the one pictured in Figure 1. In my case, it sparked to life when I hauled it out of the sack Father Christmas left at the foot of my bed on December 25th, many, many years ago. I wound it up, released it, and saw it teeter unsteadily along my bedroom floor—I was both terrified and delighted.
|Figure 1. Early memories of a robotic future.|
From our earliest days, be it as a small child receiving a Christmas gift or collectively as societies focused on technological progress, perhaps we knew a day would come when delight with the possibilities offered by our innovations would turn to fear and despair. We knew a day would come when the next stage in the slither-to-walk-to-human evolution would be robots that outclassed and outpaced us—when they, not us, would be the most heavenly of creations.
In movies, books, and magazines, the source of this invasion is often portrayed as coming from "outside": aliens from another world with hyper-advanced technology, intent on colonizing ours—think the Marvel franchise, Daleks, or the Transformers. But the genesis of our downfall was always going to be ourselves: For centuries, people have been attempting to create automatons.
One of the most famous examples of the fascination with creating new forms of sentient, autonomous life is Mary Shelley's Frankenstein. The story illustrates what happens to creations that people don't understand, that they distrust, or that frighten them. For Frankenstein's monster, then, instead of acceptance, the response to this new life form was violently negative. So perhaps the current darkness descending on the digital, as people worry about AI and big data, is the first sign of us taking up pitchforks and torches to chase out a technology we have created.
While we are thinking about creations and creators, let's consider two other "creation" stories to help us think about our AI and robotic destiny. Turn first to Greek mythology and to the demigod Prometheus. Picture him tied to a mountaintop, receiving terrifying punishment from the gods. Each day, birds swoop down to pick at his liver with painful, surgical precision, an agony relived day after day as each night it regrows.
What could he have done to anger the gods so much as to receive such punishment? His sin was to give humankind the power of fire, a technology that enabled them to tame nature, expand their horizons, and create tools—and with all this, gave them the ability to rely on themselves rather than the gods. In a TV interview, Stephen Fry muses on whether there will be a modern-day Prometheus who empowers AIs to such an extent that they no longer need us:
Will the Prometheus who makes the first piece of really impressive robotic AI—like Frankenstein or the Prometheus back in the Greek myth—have the question: Do we give it fire?... In other words: Shall we be Zeus and deny them fire because we are afraid of them? Because they will destroy us? The Greeks, and the human beings, did destroy the gods. They no longer needed them. And it is very possible that we will create a race of sapient beings who will not need us.
Now consider the biblical account of the Garden of Eden. Adam and Eve were created by God to live with him in the paradise of Eden. One day, they were tempted to eat of the tree of knowledge of good and evil, with the devil encouraging them, saying that by eating its fruit they "will be like God." Some developers are creating modern-day Adams and Eves, built to dwell with us in the paradise of our technologically advanced homes, offices, and hospitals. But perhaps like the first Adam and Eve, these social robots will seek to become increasingly like us and then leave us, focused on their own self-determination.
Surely Things Can't Be This Bad?
With this talk of the Bible and the like, you may be writing me off as a street-corner prophet of doom, shrilly predicting that the "end of the world is nigh." You might be thinking that I'm exaggerating the concerns: Will robots really rise up? Will they really want to become better than us? How will they master what we humans have taken millennia to evolve? Will we need to take up pitchforks and torches to chase them out of town?
After all, we've always hyperventilated over technological progress, only to see things turn out in happier, more mundane, and benign ways. So the scary-looking humanoid cleaner robots shown at consumer exhibitions in the 1950s turned out to be sleek, subtle, and silent Roombas. The Frankenstein-like movie monsters have been replaced by cutesy Wall-E robots—there's even a robotic pillow that you can hug to send you to sleep (Figure 2).
|Figure 2. Surely nothing to fear here? Wall-E (left) is a far cry from Frankenstein's monster; and robots will even soothe us to sleep (right, https://meetsomnox.com/).|
So perhaps we can—nestled up to one of the pillows—relax and sleep easily?
Things Are Worse Than We Thought
Not quite yet.
Cut to another scary image, this time of a customer emerging from a tech store, the newest mobile phone in hand. He holds it aloft, his eyes raised to the heavens. It looks like he is on drugs—and he is, and so are we, enchanted by the devices that offer us endless interaction, entertainment, connection, and creativity.
Much has been written over the past several years about how these devices are turning us all into modern-day Narcissuses, staring down into the dark pools of our sleek mobile devices, oblivious to those around us who love and care for us. You'll no doubt have experienced many situations in which you sit with friends and family at a restaurant, in a meeting, or even on a sports field while they peer down and prod at the dead glass screen.
While there's much debate about the actual impact of such behaviors on well-being, it's undoubtedly the case that many people are worried about what these technologies are doing to them and to those around them. Unlike earlier moral panics around technology, those worrying include the next generation; for example, in a 2017 CNN survey, 54 percent of children said they worried that their parents spent too much time on their mobile devices [1].
When I was a kid, one of the scariest robot forms was the Cybermen in the TV series Doctor Who. Humans were gruesomely transformed into these machines in a process involving chainsaws, with their original flesh and bones co-opted bit by bit into a metallic form. The final step of the conversion came as the human's emotional abilities were extracted, leaving them deadened and devoid of empathy. As we stare down into our mobile devices, are we fusing with the metallic forms of our mobiles, click by click, only too late realizing that we are becoming the robots?
While the Cybermen frightened me, there's an even scarier picture you can find by googling "Mark Zuckerberg VR crowd." You'll see an image of a seated throng all wearing VR headsets, their eyes blanked by the devices strapped to their faces. If that's not concerning enough, the juxtaposition of a smiling, unencumbered Zuckerberg walking purposefully down the aisle adds to the discomfort. We have become the robots, enslaved to provide data, value, and money to a powerful few.
It's not too late and now is the time to act—so what can we do? As a start, I'd like to suggest four ways forward:
- Promote and practice digital detox and design for digital well-being.
- Design out tempting trivial interactions and make tech less obtrusive and more shareable.
- Amplify what it is to be human.
- Get a new perspective from people who are still more human than robot.
Promote and practice digital detox and design for digital well-being. If you haven't tried a digital detox, I'd recommend it as a way of comparing your sense of well-being before and after. You'll find you feel more, not less, connected to what matters if you follow simple steps like keeping your phone out of sight during meals and meetings [2].
If you are an app designer or developer, you can help others, too. For inspiration, experiment with the dashboards launched by both Apple and Google to help users understand and moderate the use of their mobiles. If you want to be more creative, consider designing in a way that draws on what our bodies do if we abuse them: What happens if you spend too long in the sun? Your body gets hot and eventually burns, warning you to cover up. Or if you overeat, you begin to feel full, then bloated, then sick. What might be the interaction design equivalents?
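To make the sunburn analogy concrete, here is a minimal sketch of an escalating "digital sunburn" signal, where the warning strengthens the longer a session runs. The thresholds, names (`warning_level`, `WARNING_LEVELS`), and cues are all hypothetical illustrations, not an existing Apple or Google API.

```python
# Illustrative "digital sunburn" sketch: map continuous usage time to an
# escalating warning, mimicking how skin warms, reddens, and finally burns.
# All thresholds and cue names below are invented for illustration.

WARNING_LEVELS = [
    (0, "fine"),       # under 30 minutes: no signal
    (30, "warm"),      # gentle cue, e.g., a slightly dimmed screen
    (60, "hot"),       # stronger cue, e.g., desaturated colors
    (120, "burning"),  # hard friction, e.g., a mandatory pause prompt
]


def warning_level(minutes_in_session: float) -> str:
    """Return the strongest warning whose threshold has been passed."""
    level = WARNING_LEVELS[0][1]
    for threshold, name in WARNING_LEVELS:
        if minutes_in_session >= threshold:
            level = name
    return level
```

The design choice here is gradual escalation rather than a single cutoff: like sunburn, the feedback begins gently and becomes harder to ignore, giving the user a chance to "cover up" before the hard stop.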
Design out tempting trivial interactions and make tech less obtrusive and more shareable. When I was growing up, the only phone in the house was a landline that sat in the hallway, silent until its bells rang loudly for an incoming call. When it rang, we paid attention, with whoever was closest answering its cry, as automatically as a parent tends to a newborn babe. Someone was making an effort to contact us, so we attended. These days, all of our apps, services, and even mobile sites seem to want to grab our attention continually, with notifications mushrooming. While users can moderate notifications via both device and app settings, there is scope for more nuanced (and less cumbersome) ways to reduce trivial distractions. So, for example, instead of notifying whenever someone retweets, why not learn what the user wants, perhaps bringing it to their attention only when many people have liked what they have posted?
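The retweet example above can be sketched as a simple threshold filter: suppress per-event notifications and surface a post only once enough people have engaged with it. This is a minimal sketch under stated assumptions; the names (`Post`, `register_like`, `LIKE_THRESHOLD`) are purely illustrative, and a real system would learn the threshold per user rather than hard-code it.

```python
# Illustrative notification filter: stay silent on individual events and
# notify only once a post has attracted substantial attention.
from dataclasses import dataclass

LIKE_THRESHOLD = 25  # hypothetical; a real system would learn this per user


@dataclass
class Post:
    likes: int = 0
    notified: bool = False  # notify at most once per post


def should_notify(post: Post) -> bool:
    """Notify only when a post has crossed the attention threshold."""
    return (not post.notified) and post.likes >= LIKE_THRESHOLD


def register_like(post: Post) -> bool:
    """Record a like; return True only if this event should raise a notification."""
    post.likes += 1
    if should_notify(post):
        post.notified = True
        return True
    return False
```

With this shape, thirty likes produce exactly one notification instead of thirty, trading immediacy for calm.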
Smartwatches have been heralded as a less obtrusive form of mobile, although there is some evidence that they drive people to interact more, being more at hand than a phone in a pocket or a bag. But one of the nice things about watches that might have been overlooked in the new digital era is that they facilitate physical social interaction. Think about the times you have asked a stranger for the time, or been asked yourself (Figure 3a). Inspired by this, in our work we have looked at making smartwatch displays that benefit both observer and wearer (Figure 3b). Our thinking was to use the watch to draw us together, rather than to promote retreat into automaton states.
|Figure 3a. Some technology, like conventional watches, keeps us human.|
|Figure 3b. Using smartwatches to act as a public display. The display is designed to be visible by and useful for the onlooker (in this case, Tim's companion) rather than the watch wearer (Tim). The onlooker here can remind Tim of the upcoming meeting [3].|
Amplify what it is to be human. Every morning, I do two things that really remind me that I am flesh and bones—far more than the robot I might become as I am drawn further into the dark screen of my mobile (Figure 4). As I cycle through the beautiful coastal dawn and then later slice through the pool, I use the bike and the water as media to express and experience in ways that go far deeper than any current computing device or service can enable. In a similar vein, Kia Höök has written beautifully of her experiences with horseback riding, encouraging us to create relationships with technology that are more like that bond, allowing us to become less cyborg or Cyberman in form and more a natural blending of human and tech—or, in her words, more centaur-like [4].
|Figure 4. Being alive—how can we learn from highly physical activities to design technology that amplifies what we are, rather than deadening us with the digital?|
Get a new perspective from people who are still more human than robot. For a decade or so, our team has had the privilege of working with communities and individuals who are only just getting their hands on digital technology. Many of us will have a mobile phone, perhaps a laptop or a tablet, possibly a smart TV, home networking, and so on. But for those who have been called emergent users, the mobile smartphone is their first exposure to digital devices and services.
These users inhabit contexts and have a range of constraints and abilities that are quite distinct from "conventional" users like you and me. Typically, they possess lower levels of educational attainment (impacting, for instance, textual literacy) and have limited income. For them, a reliable power grid is not guaranteed, and personal living space can be constrained.
Our aim has been to explore fresh perspectives to help imagine alternative technological futures. As an example, consider AI voice assistants like Alexa and Google Home. We have been experimenting with putting adapted versions of these devices onto the streets of Dharavi, a large informal settlement in downtown Mumbai, India (Figure 5). In particular, we have been comparing the performance of boxes powered only by AI with those whose answers come from remote humans via the cloud.
|Figure 5. Experimenting with AI and human-powered speech assistants in Dharavi [5].|
We've learned a lot from these deployments. One thing that stands out is that the AI struggles with many questions. Sometimes this is because of speech-recognition issues or lack of local context; however, other times the failure is because the AI is an alien. Not human, not of us.
In a blog post, Richard Harper points to Wittgenstein's famous remark, "If a lion could talk, we could not understand him." The argument is that even if a lion and a human could converse, because they come from completely different worlds there could be no meaningful conversation between them [6].
If the current hype over AI recedes, maybe we will see more clearly that we will never have deep, meaningful interactions with machines, even if we can converse with them. However, I am scared that something else might happen: that the only way we and AI will be able to truly understand each other is if we become like them—if we become robots, shaped by our continuous interactions on our mobiles.
To end, let's picture two future robots. One is the familiar humanoid-like device. I caution that worrying about what these technologies will or won't do for and to humanity is a distraction. The other image is of a young baby, head down, prodding at a tablet computer (if you can't picture it, simply search online for "baby mobile phone"). Rather than being adorable and cute, this sight should wake us from our own digitally induced, click-by-click hypnosis to act before the next generation becomes what we have always feared, but in ways we couldn't imagine.
If you want to read more suggestions for designing for humans rather than robots, expanded examples appear in the book I co-authored with Simon Robinson and Gary Marsden, There's Not an App for That (Morgan Kaufmann).
1. LaMotte, S. Smartphone addiction could be changing your brain. CNN. Dec. 1, 2017; https://edition.cnn.com/2017/11/30/health/smartphone-addiction-study/index.html
2. Hayes, M. How to quit your tech: A beginner's guide to divorcing your phone. The Guardian. Jan. 13, 2018; https://www.theguardian.com/technology/2018/jan/13/how-to-quit-your-tech-phone-digital-detox
3. Pearson, J., Robinson, S., and Jones, M. It's about time: Smartwatches as public displays. Proc. of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, New York, 2015, 1257–1266.
4. Höök, K. Transferring qualities from horseback riding to design. Proc. of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries. ACM, New York, 2010, 226–235.
5. Pearson, J., Robinson, S., Reitmaier, T., Jones, M., Ahire, S., Joshi, A., Sahoo, D., Maravi, N., and Bhikne, B. StreetWise: Smart speakers vs. human help in public slum settings. Proc. of the 2019 CHI Conference on Human Factors in Computing Systems. ACM, New York, 2019, Paper 96.
6. Harper, R. Dialogues with computers? Profharper blog. July 9, 2013; https://profharper.wordpress.com/category/dialogues-with-computers/
Matt Jones is the author of two books and many research articles that have helped shape the field of mobile HCI and UX (Mobile Interaction Design with Gary Marsden and There's Not an App for That with Simon Robinson and Gary Marsden). [email protected]
Copyright held by author. Publication rights licensed to ACM.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2019 ACM, Inc.