Wearables Shouldn’t Be An Exercise In Screen Miniaturization

Too many wearables today have screens that look like miniaturized smartphones.

Just as smartphones shouldn’t be PC interfaces shrunk down to a 4-5” screen, smartwatches shouldn’t look like smartphones shrunk to 1”. Nor is it a matter of responsive web design (RWD), which merely resizes web content to fit the screen.

Samsung's Gear 2 looks like a tiny smartphone screen.

Instead, wearables call for a different design philosophy – one with DNA in the mobile revolution, but extending mobile thinking even further.

Let’s start with the concept of mobile moments. As my colleagues write in The Mobile Mind Shift, mobile moments are those points in time and space when someone pulls out a mobile device to get what he or she wants immediately, in context. In the case of wearables, the wearer often won’t need to pull out a device – it’s affixed to her wrist, clothing, or eyeglasses. But she might need to lift her wrist, as a visitor to Disney World must do with MagicBand.

Now we’re getting closer to what wearables should be. But there are additional dimensions to wearables that obviate the need for pixel-dense screens:

  • Time- and location-sensitive. This applies to both smartphones and smartwatches; the emphasis is on proactive notification. When Google Now looks at your calendar for you, then checks traffic with Waze, and alerts you that “you need to leave 15 minutes earlier than scheduled because traffic is bad and your next appointment is 10 miles away,” that’s a time- and location-sensitive application. Similarly, a doctor’s smartwatch can geofence her so that she receives a high level of notifications when she is in the hospital, a medium level when at her private practice, and a low/urgent-only level of notifications when she’s at home.
  • Urgent (related to time, location, or condition). Another class of wearable activities allows you to take action urgently – far faster than you could by pulling out your smartphone. The Wink feature on Google Glass – “faster than the camera button or voice action and it even works when the display is off” – allows nearly instantaneous reactions to real-world events. Motorola Solutions’ future-concept gun holster sensor would alert the entire police force if any given gun was unholstered from an officer.
  • Temporally informed, but intelligent. Other wearable moments take time into account, but use intelligence to determine when to alert you. The (screen-free) Jawbone UP24 collects data on a user’s fitness and sleep habits passively and invisibly – but then offers notifications and nudges to change behavior during the day; for instance, “you’re behind schedule on steps so far today, so go take a walk.”
  • Invisible experiences. Yet other use cases for wearables extend beyond conscious moments altogether. They’re invisible experiences generated simply by wearing the device, often in a B2B2C context in which a company sells (or gives) a wearable to its customers (as Disney does with MagicBand). In the reference application for casino experiences from Salesforce Wear and Bionym, VIPs wearing the band can enjoy all sorts of perks (having someone bring you your favorite drink) just by walking into their favored casino with the Nymi band as their loyalty card.
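The geofenced-notification pattern in the first bullet can be sketched in a few lines. This is a minimal, hypothetical illustration – the place names, coordinates, radii, and notification levels are all invented for the doctor example, not drawn from any real product:

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Geofence:
    name: str
    lat: float
    lon: float
    radius_km: float
    level: str  # "high", "medium", or "urgent-only"

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical fences for the doctor example: high-priority alerts at the
# hospital, medium at the private practice, urgent-only everywhere else.
FENCES = [
    Geofence("hospital", 40.7420, -73.9760, 0.5, "high"),
    Geofence("practice", 40.7300, -73.9900, 0.3, "medium"),
]

def notification_level(lat, lon, default="urgent-only"):
    """Return the notification level for the wearer's current position."""
    for fence in FENCES:
        if haversine_km(lat, lon, fence.lat, fence.lon) <= fence.radius_km:
            return fence.level
    return default
```

The point of the sketch is that the wearable never asks the wearer anything: a background location check quietly selects the notification policy, which is exactly the kind of screen-free, context-driven design the bullets above describe.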

Naturally, at this early stage of the wearables market, smartwatches still look like tiny smartphones. My colleague Moira Dorsey writes that early automobiles looked like horse-drawn carriages without horses. Designers didn’t yet have mental models to depart from the past.

David Rose, author of Enchanted Objects, emphasizes design simplicity. The cover of his book shows an umbrella that simply glows when it’s going to rain. The GlowCap glows when a senior needs to take his prescription medicine. Wearables have the opportunity to break out from what Rose calls “Terminal World” – the tyranny of screens – making them more relevant and effective.

When designing wearables pilots for employees or customers, I&O pros should keep in mind that smartwatches aren’t a “fourth screen” but rather a new way of designing computing experiences.

J. P. Gownder is a vice president and principal analyst at Forrester Research serving Infrastructure & Operations Professionals. Follow him on Twitter at @jgownder

User Centricity and Wearable-First

The Mobile First design paradigm is all about embracing a user-centric view of the world: an interface must be built around the use cases its intended users will go through in interacting with your device. For wearables, this is even more true. The limited tactile interface necessitates not just a narrower set of use cases (demanding deeper-cutting prioritization) but also more creative, intuitive interactions. It will be very interesting to watch the Wearable-First paradigm develop.
