Authors: Uday Gajendar
Posted: Thu, March 07, 2013 - 9:51:11
We're now living in an era of "super-smart" devices that learn what we prefer (Nest), correlate data to anticipate our needs (Google Now), and quantify our habits for behavior change (Nike FuelBand). What is the designer's role in designing for anticipation, supporting human expectations across shifting contexts and devices? Will designers necessarily become "anticipation engineers" or "adaptive magicians," shaping a beneficial, human-centered dialogue with the smart objects surrounding us? Whoa. Let's step back for a second…
This all started for me recently while arranging to see an open house with my realtor via our mobile devices. Using my Apple iPhone, we exchanged Google Mail messages containing the specific address, which I looked up on Google Maps. I also own a Google Nexus 7 with Google Now, which sat at home charging, connected to my WiFi network. It did not come with me to the open house. Yet upon returning from the viewing, I picked up the Nexus and noticed that a Google Now card had mapped out a route to the open house, complete with traffic and weather details. How could this be?
I felt a mixture of astonishment, gratitude, and yet…creepiness. I never told my Nexus where I was going nor took it with me. Yet it presumed and anticipated my needs, through a confluence of data mappings across activities with Google Mail and Maps, tied to my single Google account. Not sure I like that. Or can I grow to love it?
This brought to mind Arthur C. Clarke's famous observation that any sufficiently advanced technology is indistinguishable from magic. Is this what Steve Jobs meant when he called the mobile, intimate devices we carry on ourselves (and keep in our homes, cars, and schools) "magical"? Maybe so, but there are some serious human-centered design implications to consider.
What happens when devices that we own become too smart for their own good? Always on, listening, cross-mapping, correlating, recommending, and anticipating what's next. This really seems to take "interaction design" and "user experience" to another level, whereby the operations are silent, invisible, and serendipitous, not explicit clicks and taps on a specific UI to perform a task.
Now we must design for insinuated dialogues of potential, not just the daily dialog boxes of accept/reject. These linger in the background, waiting to arise when the relevant correlation of data appears. This also suggests that designers will become arbiters of the ethical management of interactions and anticipations. This goes beyond the privacy and security concerns housed within "terms of service" agreements or IT regulations. Instead, the question is whether users trust their devices (and the network-enabled adaptive software behind them) to protect, defend, and serve their interests. Trust that the device will not "rat you out" for some perceptually or contextually illicit activity (knowing that you visited a doctor who sells medicinal marijuana, for example). Trust that the device can truly be a friend, partner, and ally: a "digital familiar" that travels along with you as a personal aide, one that doesn't snitch, gossip, or betray your intentions, and that supports your goals.
Organic, contextual, super-smart computing is on the rise, computing that truly engages the anticipated, presumed needs of the user. This is way beyond the old "Clippy" disaster, but it perhaps points toward something like HAL or the Terminator. In the pursuit of magical smarts, designers may be the front line of defense, ensuring a trusting, healthy balance between useful intelligence and presumptuous interpretation.