I am not the only one who has found comfort in the Retina display: the sharper screen grants the farsighted a few more hours of screen time before headache sets in. Apple, which trademarked the term to describe displays with pixels too small to be distinguished by the average human eye, is not the only manufacturer envisioning a high-resolution future. 4K, or Ultra High Definition (UHD), televisions are rapidly gaining traction in the U.S., with research firm IHS predicting that their market share will grow from 10 percent to 34 percent by 2019. I suspect, however, that the rate of adoption may be even faster.
The reason is simple: Use a Retina display or watch 4K content on a UHD set for a few days and everything else starts to seem, well, chunky. It’s rough, out of focus, as if you had mistakenly picked up a pair of eyeglasses with an old prescription. One person I know was convinced that the new 27-inch monitor he had bought at Costco was defective, so he exchanged it for another. The second exhibited the same problem, so he called me for a diagnosis. The problem? There wasn’t one. The monitor, a high-quality IPS model, was running properly at its full 1080p resolution. He had become so accustomed to the crisp text on his MacBook that he perceived the big pixels on the new monitor as blurry.
The anthropologist Grant McCracken defined the Diderot effect as the consumption spiral that results when a new possession makes others look shabby in comparison. A parallel process—let’s call it high-def Diderot—is happening with Retina displays, 4K and UHD televisions, and the like. Compared with a razor-sharp image, anything else seems like a bad copy. Add in High Dynamic Range technology, which manipulates contrast levels to add detail to images, and it is clear that we are crossing a threshold: on-screen images can now equal, if not outstrip, the reality they represent. For proof, head to the closest McDonald’s or Dunkin’ Donuts. Both chains are using high-resolution LCD displays as menu boards. Golden-brown French fries sparkle with salt, ham bounces in slow motion, and hamburgers glisten in a way that the food in your bag will not—without the use of psychedelics.
This would be merely an interesting anecdote if not for the rise in energy use. 4K displays increase the number of pixels on the screen by a factor of four over 1080p, and each one of those pixels requires calculations. To put it in technical language, quadruple the pixel count and you have quadrupled the calculations—and potentially the energy use. For example, the U.S. Energy Guide for a Vizio 55-inch UHD television estimates $27 of energy use per year. That’s a lot more than the energy usage of the most efficient large TV on the market, a Samsung 55-inch 1080p television with a quarter of the pixels, which uses less than $8 per year. Why the discrepancy? The well-known U.S. Energy Star program made matters worse by carving out a separate category for UHD televisions, which allows high-resolution displays to use more energy than their high-definition counterparts. A consumer comparing a UHD and an HD set might find that both are Energy Star–rated, but the UHD set could use half again as much energy. Indeed, a systematic study by the Natural Resources Defense Council (NRDC) found that UHD televisions, on average, used 30 percent more energy than 1080p sets.
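The pixel arithmetic above can be sketched in a few lines. This is purely illustrative; the resolutions are the standard 1080p and 4K UHD dimensions, and the function name is my own.

```python
# Illustrative sketch of the pixel arithmetic: UHD has four times
# the pixels of 1080p, so four times the per-frame calculations.

def pixel_count(width: int, height: int) -> int:
    """Total pixels for a given display resolution."""
    return width * height

hd = pixel_count(1920, 1080)   # 1080p ("Full HD"): 2,073,600 pixels
uhd = pixel_count(3840, 2160)  # 4K / UHD: 8,294,400 pixels

print(uhd // hd)  # → 4
```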
When you’re spending $600 or more on a TV, an extra $19 a year for energy may seem irrelevant. As one comment on an online article about the NRDC study put it, “People are bitching about 200 watts?” But considered from a systems perspective, the difference is significant. The NRDC study found that if all existing televisions larger than 36 inches in the U.S. were replaced with UHD sets running at today’s efficiency levels, the annual increase in energy would be “equivalent to three times the annual electricity use of all the homes in San Francisco.” And this calculation does not include the creep of big screens into every corner of life. My university recently installed them in most buildings on campus to replace the printed faculty directories and the chaotic mishmash of flyers layered upon the walls by students, faculty, and others—a characteristic feature of most university buildings. As far as I can tell, the screens run 24 hours a day, seven days a week. I have not noticed any slow-mo breakfast meats, but I have noticed they are excellent for delivering marketing messages. The screens are controlled by the university’s centralized office of communications.
The NRDC study concludes with a plea to “arm users with information about how their behavior can have an impact on energy consumption.” But perhaps we should think more broadly about the systems consequences of high-resolution interfaces, including the energy they consume on an ongoing basis and the material resources required to produce them. When the first Retina iPhone was released, programmers rushed to update their apps to take advantage of the new display. Owners of basic iPhones found that the memory consumed by those pixels choked their devices, creating a compelling reason to upgrade to a new Retina phone. The raw material needed to make millions of new high-resolution devices will be extracted from the Earth because consumers prefer more pixels. I am fully aware of this—I am typing this column on a gloriously sharp 4K display that arrived in my office only last week.
The clear trend is for more and more computing. We drive cars that proactively avoid collisions; our homes bristle with speakers that listen for our Amazon orders and music requests; and garage-door openers, water heaters, dishwashers, washing machines, and light bulbs have Bluetooth and Wi-Fi connections. Relying on energy efficiency is simply not enough, but that is the policy solution currently in place in the U.S. NRDC’s researchers applauded the designers of televisions that turned energy-saving settings on by default. But there is much more for those who design human-computer interaction (HCI) to do. One partial solution that would benefit both industry and the environment would be to bake the lifetime cost of energy into the upfront cost of an appliance such as a TV, and then put financial mechanisms in place so that the money flows to investment in renewable energy, such as wind and solar. Given a five-year lifespan, the cost of the television that uses $27 of energy per year would rise by $135, while the cost of the most efficient model would go up by only $40. Manufacturers would be incentivized to keep prices low by designing for efficiency and could position the additional cost as evidence of a meaningful commitment to sustainability—a valuable proposition for some consumers in today’s market. Note that this does not need to be a government regulation; the commitment of one or two big manufacturers such as Samsung, Vizio, or LG would be enough to make a significant impact. Perhaps it would shame competitors into action. But this proposal requires those of us who care about sustainability and HCI to think from a systems perspective, confidently expand our claim of authority, and pick up the phone.
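The lifetime-cost proposal above reduces to simple multiplication. A minimal sketch, assuming the five-year lifespan and the $27 and $8 annual figures from the column (the function name and rounding to whole dollars are my own assumptions):

```python
# Sketch of folding lifetime energy cost into a TV's sticker price.
# Annual dollar figures come from the Energy Guide examples in the text.

LIFESPAN_YEARS = 5

def lifetime_energy_cost(annual_energy_dollars: int,
                         years: int = LIFESPAN_YEARS) -> int:
    """Energy cost over the appliance's assumed lifespan."""
    return annual_energy_dollars * years

uhd_surcharge = lifetime_energy_cost(27)      # $135 added to the UHD set
efficient_surcharge = lifetime_energy_cost(8) # $40 added to the 1080p set

print(uhd_surcharge, efficient_surcharge)  # → 135 40
```

The gap between the two surcharges is what gives manufacturers the incentive to design for efficiency.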
While changing an on-screen menu to default to more energy-efficient choices seems squarely within the realm of HCI, dealing with the systems consequences of energy use has, for too long, been someone else’s job. The problem is that there isn’t someone else and it’s unlikely there ever will be. HCI is a broad enough field to engage thoughtfully with policy and marketing strategy. The evidence is clear: More efficient consumer electronics will not reduce energy use, at least not in the radical ways that are needed to stave off climate change. It is time to think more broadly about how those in HCI can design more sustainable outcomes at the systems level.
2. NRDC. The Big Picture: Ultra High-Definition Televisions Could Add $1 Billion to Viewers’ Annual Electric Bills. 2015; https://www.nrdc.org/sites/default/files/uhd-tv-energy-use-report.pdf
Jonathan Bean is an assistant professor of markets, innovation, and design at Bucknell University. His research deals with domestic consumption, technology, and taste. firstname.lastname@example.org
Copyright held by author
The Digital Library is published by the Association for Computing Machinery. Copyright © 2017 ACM, Inc.