The rise of empathy-enabling technology
When AI joins us at work, people will focus on tasks in which the brain beats the machine. This means that the most valuable human work will require cognitive skills that are difficult to automate: creative thinking, learning, flexible thinking, and empathy. As a result, amid all the tech hype, the greatest competitive advantage can now be gained through a better understanding of human beings.
Companies have responded to this development. Today, more and more organizations are declaring themselves human-centric and customer-focused. Empathy is touted as the main ingredient of service design, and teamwork as the golden standard of effective operation.
But how good are we really at understanding humans, supporting high-quality interaction and promoting empathy?
Companies are certainly interested in things like customer feedback and employee satisfaction. However, quite often both are assessed through surveys, meaning that we rely on highly unreliable human memory as a source of data. Work is more and more often performed in teams, but corporate culture is plagued by meeting formats and collaboration technologies that do not support the mechanisms of functional interaction. And lastly, we may consider empathy an important work skill, yet create structures (such as hierarchies, strict work roles, and competition) that inhibit it.
In short, we’re not as good as we could be.
One way to improve our human-centered approach is to bravely dive into the murky waters of emotions. If we want to be more empathetic in, for instance, service design, and increase our understanding of customer satisfaction, emotions are highly significant signals to pay attention to. Enquiring about emotions, however, leads to a jungle of concepts and characterizations that ultimately mean different things to different people.
This is where very, very cool technology comes in.
With the rapid development of sensor technology and machine vision, we can now measure biosignals related to human emotions in real time, in real environments. Take customer experience: we don’t need to ask about what a service felt like, we can see it for ourselves. Or, what if information about the dynamically changing emotional state of your team mate could be visible during your interaction on Slack? How about discussion forums that increase empathy and inhibit hate speech and bullying?
Recently, our research group at the University of Helsinki decided to turn these what ifs, coulds, and mights into a collaborative research project. Reaktor, one of the most forward-thinking companies in Finland, was the first one to join in.
Together, we’ve done playful preliminary testing of how biosignal measurement could reveal new aspects of, or amplify, the emotional experience of gameplay in VR. We started by having Mikko Olkkonen, a brave Reaktorian, wear all the sensors we could muster and play several games at Reaktor’s VR lab, Holodeck. We measured the following signals:
ECG (electrocardiogram). Heart rate and heart rate variability are related to arousal, which is high during, for instance, excitement or fear, and low during boredom.
EEG (electroencephalogram). Even though you can’t measure basic emotions from the brain’s electrical activity, you can measure activity related to concentration, relaxation, an experience of flow, or cognitive strain.
Respiration. Breathing patterns are related to arousal.
Skin conductance. When you sweat, the conductivity of the skin increases. This signal is quite fast and reflects quick changes in arousal, such as caused by startles, increased stress, or increased excitement.
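As a playful illustration of what the ECG-based measures above mean in practice, here is a minimal sketch, assuming RR intervals (the milliseconds between successive heartbeats) have already been extracted from the raw ECG. The function names and example values are ours, for illustration only, and are not part of the project’s actual pipeline.

```python
import numpy as np

def heart_rate_bpm(rr_ms):
    """Mean heart rate in beats per minute from RR intervals (ms)."""
    return float(60_000.0 / np.mean(rr_ms))

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR-interval differences (ms).
    A common short-term heart rate variability measure; higher values
    indicate more beat-to-beat variability."""
    diffs = np.diff(rr_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

# Made-up example: a calm stretch (long, variable intervals)
# versus an aroused one (short, regular intervals)
calm = np.array([850, 870, 840, 880, 860], dtype=float)
aroused = np.array([600, 605, 598, 602, 600], dtype=float)

print(heart_rate_bpm(calm))          # slower heart rate while calm
print(heart_rate_bpm(aroused))       # faster heart rate while aroused
print(rmssd(calm) > rmssd(aroused))  # calm state shows more variability
```

Low heart rate combined with high RMSSD suggests a calm state; during excitement or fear, heart rate climbs and beat-to-beat variability typically drops.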
Our guinea pig found the experiment very intriguing.
“When you’re immersed into the game, it’s difficult to pay attention to your own feelings. It was surprisingly easy to forget all the sensors I was wearing. It was pretty revealing to check out my reactions afterwards, identify peak moments during the game and see what touched me and what did not. I could definitely use this kind of data to improve my gameplay – might also be interesting for watchers of, for example, e-sports”, Mikko says.
Reaktor was especially interested in exploring questions related to bettering digital experiences. Could data on customers’ emotional reactions be used in user interface design and testing, or in game design? Could we improve communication between the offices in Tokyo, New York, Amsterdam, and Helsinki with systems that are better at supporting empathy, e.g. by conveying emotional information?
The short test with Mikko is just one example of what emotion measurement might entail and of the contexts in which data on emotions might be interesting. The technology that will mostly be used in our research project is machine vision, which, of course, requires no electrodes.
Scientific questions that we hope to pursue include the following:
- How do emotional states influence the ability of individuals to cooperate?
- Does synchronization of physiological signals between individuals predict the quality of interaction?
- Could this synchronization be induced or deepened in order to improve interaction or more quickly achieve a cooperative state?
- Can digital collaboration be improved with the help of technology, for instance by showing emotion-related physiological data?
- Is VR an environment that more easily fosters synchronization of physiological signals and thereby more functional cooperation?
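The second and third questions hinge on how “synchronization of physiological signals” is quantified. One simple first-pass measure is the Pearson correlation between two people’s signals, sketched below with made-up data; this is only one of several possible operationalizations, and the names and signals here are our own illustration, not the project’s method.

```python
import numpy as np

def sync_score(signal_a, signal_b):
    """Pearson correlation between two equally sampled physiological
    signals (e.g. the skin conductance of two teammates): -1 means
    opposed, 0 unrelated, 1 perfectly in sync."""
    a = np.asarray(signal_a, dtype=float)
    b = np.asarray(signal_b, dtype=float)
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
shared = np.sin(t)  # a common slow arousal wave both teammates follow
teammate_1 = shared + 0.2 * rng.standard_normal(500)
teammate_2 = shared + 0.2 * rng.standard_normal(500)
stranger = rng.standard_normal(500)  # unrelated signal

print(sync_score(teammate_1, teammate_2))  # close to 1: synchronized
print(sync_score(teammate_1, stranger))    # near 0: no synchrony
```

A running version of such a score, computed over short time windows, is the kind of emotion-related signal that question four imagines surfacing in a digital collaboration tool.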
In summary, in striving for human-centered business, companies need to up their game in understanding humans and in promoting empathy and fruitful interaction in human encounters. Luckily, new technology is emerging that allows us to see aspects of human experience that were previously hidden.
The current project aims, through research, to help companies make the best use of this tech: to understand customer experiences more accurately, to help their people work together more fruitfully, and to become more authentically and intelligently human-centered.
The tech is already there. All that’s needed is a little science.