Humans: Takes One To Know One


We try to make good design and development choices for users by understanding their context – the circumstances or setting surrounding an event. But we’re quickly transitioning from a paradigm of people sitting in front of PCs and clicking around to one where we navigate back and forth between many devices across apps, times, and locations.


Our industry leverages user research and multiple data points to create systems that serve people better. But now, we have to contend with a whole new dynamic of cognizance. Inanimate objects observe us, our devices track us, and lenses monitor us. The amount of data collected on each individual grows exponentially as we discover newer and more nuanced ways to keep tabs on what everyone is doing. But something is missing from the rattle and din of data – the “why?”

Technology and SEO expert Aaron Wall says, “I think many people are focused on data to an extreme degree … for many players sustained success will likely require better leveraging soft skills rather than hard data. As much as we may want to, it is hard to create software which understands human emotions and what motivates people. As an industry I think we’ve likely become overly reliant on tools.”

In order to provide a truly exceptional user experience, it’s critical that we don’t take these data points at face value – we have to remember the human factors at work.

In the future, context will be reported by tools and interpreted by humans.

Humans are charming creatures. In a world where we are more trackable than ever, we still manage to defy logic with our emotions, our inexplicable affiliations, and our quirks.

And while these tendencies are what make us so loveable, they are also the easiest to forget about when building tools. We assume that we can build more advanced tools that will respond to human behavior, but technology tools make decisions based on encoded logic. Humans make decisions based on whatever emotions arise while being steeped in the influence of their environment. So how do you create logic systems that serve the sometimes illogical? You always send in a human interpreter.

Healthcare, in particular, is ripe for this kind of data inundation. But it’s also a space so heavily affected by context that those overwhelming data points must always be checked, rechecked, and interpreted by humans.

When I worked in healthcare technology, there was a large hospital system adopting iPad Minis at breakneck speed right after their release to the market. We were thrilled to be working with early adopters who were open to the potential of the new technology. But when we visited the hospital to conduct user interviews, we discovered the real reason for the surge: an iPad Mini fit perfectly into the pocket of a lab coat. “iPhones fall out and a laptop is too cumbersome,” one doctor told me, “and I can hold this in one hand while I’m with my patients.” The other doctors in the hospital, having heard about his discovery, purchased their own lab-coat-ready iPad Minis. It had nothing to do with early adoption or the technology, and everything to do with an environmental constraint they saw as unchangeable. You can’t modify the size of a lab coat pocket, they reasoned, so buy your technology accordingly.

The point is, we never would have uncovered this insight, or many others like it, from the numbers our tools spit out, because those numbers couldn’t show us the constraints of the doctors’ environment or the dimensions of their pockets. And while we may have reams of data flowing in from our tracking and analytics tools, a human eye will always be necessary for interpreting the nuances and understanding the value of the idiosyncratic. That is something so exceptionally human that we’ll never be able to outsource it, because when it comes to people’s behavior, context is everything.

Olivia Hayes