Emotional Intelligence in the Age of Robots

Article submitted by guest author Joni Roylance.

There is an interesting shift occurring in our world right now. It is the kind of shift that rubs elbows with the invention of electricity, the television, the internet, and even the cell phone. In other words, those of us alive to witness this moment in time will one day have to explain to those who come after us what life used to be like.

We are in the Fourth Industrial Revolution, or, put more simply, a technological upheaval at such speed and scale that it is going to dramatically change the way we work, the way we live, and how (and with whom, or what) we connect.

So, what role does Emotional Intelligence play in a world that is increasingly automated and artificially intelligent?

The real answer is no one knows for sure—not Google, not Amazon or Microsoft, not any organization that is truly honest about how new and rapidly evolving these technologies are. Researchers are just barely completing studies that reveal insights about the impact of screens on human development and social behaviors. Most experts don’t even agree on the definition of intelligence yet!

However, there are some known factors worth considering. As much as we should be asking what the technology can and can't (or should and shouldn't) do, we simultaneously need to think critically about what those capabilities, duties, and applications mean for humans, and how we can prepare now for the new realities of what it means to work and be human in the age of robots.

The reality of the future of work is that the skills needed most will be the ones a machine, a piece of software, or an algorithm simply cannot perform: skills that are emotional and relational in nature[1]. That's good news for most of us, especially those of us already attuned to the value and impact of EQ. Unfortunately, that awareness is not widespread, and we are becoming increasingly bad at EQ thanks to our preference for digital over live connection. In fact, “face to face interaction has dropped to third behind texting and IM/FB messaging in the so called ‘iGeneration,’ or those born from 1990-1999”[2] and, as a result, “‘digital natives’ […] are already having a harder time reading social cues.”[3] So, as practitioners, now is the time to up our game: to create tools and trainings, and to promote awareness of the value proposition of Emotional Intelligence and its vital role in the next era of humanity and work.

Here are three ways in which EQ will be more fully utilized in the AI revolution (at least for now):

  1. Handling Complex Emotional Scenarios
     Chatbots are one of the most popular entry points into Cognitive Solutions. They are cheap, can be built and launched in a matter of weeks, and they can relieve humans of repetitive, mundane work (on a 24/7 basis, no less). A popular application of these tools is servicing basic customer questions and needs, and for straightforward inquiries they are a fantastic, even preferred, solution. However, research shows that when a customer is truly dissatisfied or upset with their experience, their preferred channel for resolution is to connect with a human[4], presumably because a human can actively listen to their problem, empathize, and find the fastest path to the best solution. At least at present, even if a robot employs affective computing[5] techniques, humans do not yet believe a chatbot can fundamentally understand or relate to human problems, so escalated service issues are still best handled by flesh and blood, and EQ (a minimal sketch of this handoff pattern follows this list).
  2. Designing Loveable Cognitive Experiences
     Humans of today are impatient. We are an instant-gratification culture, and our digital prowess and global access make us pretty intolerant of less-than-ideal experiences. We are no more patient when adopting cognitive technologies like a chatbot: when a bot doesn't do what we expect it to, we generally give up on it within one to three attempts, and about 80-90% of downloaded apps are deleted after a single engagement[6]. This makes the case for applying experience design to the development, build, and deployment of cognitive tools. It is only through subjective, qualitative human insights that experiences can be enhanced from functional to delightful, from perfunctory to memorable. Connecting with humans to collect such valuable data is a human activity, requiring the ability to be curious, creative, and contextually aware.
  3. Securing Human Trust
     Lastly, humans are not rational beings. Even when given research and facts that point us to a right answer, we will “go with our gut,” ignore logical conclusions, and make emotion-based decisions (even when we think we are being logical)[7]. The same will be true for technology, especially in high-stakes scenarios. It does not matter how foolproof a medical algorithm is: if it says that my child is unlikely to live through, say, a cancer diagnosis, I absolutely do not care how fact-based or research-backed that algorithm is. I would never give up on faith and hope that my daughter could beat the computer, and I would expect medical staff to act the same. When the stakes are high, humans are not likely to believe machines even when the machines are more reliably right (which is not to say without a margin of error). So, if you want to deploy cognitive tools in a space such as hiring, where there are sensitivities around hiring bias and diversity, it will still be expected that somewhere in the process a human is validating or quality-checking the decisions of the tools, with an increasing demand for what is known as “Explainable AI”[8] (the second sketch after this list illustrates this kind of checkpoint).
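For readers who want to see what the handoff in the first point can look like in practice, here is a minimal sketch in Python. Everything in it is an assumption made for illustration: the angry-word list, the sentiment threshold, and the three-attempt limit are placeholders, not any vendor's chatbot API. The point is simply that the bot keeps the easy, repetitive questions and hands the emotionally loaded (or repeatedly failed) conversations to a person.

```python
# Hypothetical sketch of a "bot first, human for the hard cases" handoff.
# negative_sentiment, the threshold, and the attempt limit are illustrative only.

MAX_BOT_ATTEMPTS = 3          # users rarely retry a failing bot more than a few times
ESCALATION_SENTIMENT = -0.5   # assumed cutoff for "this customer is clearly upset"


def negative_sentiment(message: str) -> float:
    """Toy sentiment score in [-1, 0]; a real system would use an NLP model."""
    angry_words = {"furious", "unacceptable", "terrible", "refund", "complaint"}
    hits = sum(word in message.lower() for word in angry_words)
    return -min(1.0, hits / 2)


def handle_message(message: str, failed_attempts: int) -> str:
    """Decide whether the bot keeps the conversation or a person takes over."""
    if negative_sentiment(message) <= ESCALATION_SENTIMENT:
        return "escalate: customer sounds upset, route to a human agent"
    if failed_attempts >= MAX_BOT_ATTEMPTS:
        return "escalate: bot has failed too often, route to a human agent"
    return "bot: answer from the FAQ / scripted flow"


if __name__ == "__main__":
    print(handle_message("How do I reset my password?", failed_attempts=0))
    print(handle_message("This is unacceptable, I want a refund!", failed_attempts=1))
```

Notice that the code can only decide *when* to hand off; the listening, empathizing, and problem-solving that happen next are exactly the EQ work described above.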
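And for the third point, here is an equally hypothetical sketch of a human-in-the-loop checkpoint for an automated screening tool. The ScreeningResult fields, the 0.9 confidence threshold, and the review queue are invented for illustration and do not describe any real hiring system; what matters is that low-confidence decisions are routed to a person, and that every result carries human-readable factors, the "explainable" part.

```python
# Hypothetical human-in-the-loop checkpoint for an automated screening tool.
# The fields, threshold, and routing below are assumptions for illustration.

from dataclasses import dataclass


@dataclass
class ScreeningResult:
    candidate_id: str
    score: float        # model's confidence that the candidate is a fit
    top_factors: list   # human-readable factors behind the score


def needs_human_review(result: ScreeningResult, threshold: float = 0.9) -> bool:
    """Route low-confidence decisions to a human reviewer."""
    return result.score < threshold


def process(results):
    for r in results:
        if needs_human_review(r):
            print(f"{r.candidate_id}: queued for human review, factors: {r.top_factors}")
        else:
            print(f"{r.candidate_id}: auto-advanced (still subject to human spot-checks)")


if __name__ == "__main__":
    process([
        ScreeningResult("A-101", 0.95, ["relevant experience", "skills match"]),
        ScreeningResult("A-102", 0.60, ["short tenure", "missing certification"]),
    ])
```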

So the good news is that there is still plenty of work for humans to do. The opportunity, as you surely know, is the current lack of awareness and of a strong skill base in the workplace around core Emotional Intelligence Competencies: skills that were valuable ten years ago, but are imperative for the next ten.

[1] https://www.weforum.org/agenda/2017/02/employers-are-going-soft-the-skills-companies-are-looking-for

[2] https://medium.com/musings-of-a-writer/social-media-the-death-of-real-world-interaction-5e2f33cfd8ee

[3] http://www.nytimes.com/2010/05/02/fashion/02BEST.html

[4] https://pr.liveperson.com/index.php?s=43&item=496

[5] https://en.wikipedia.org/wiki/Affective_computing

[6] https://www.digitaltrends.com/mobile/16-percent-of-mobile-userstry-out-a-buggy-app-more-than-twice/

[7] http://bigthink.com/experts-corner/decisions-are-emotional-not-logical-the-neuroscience-behind-decision-making

[8] https://en.wikipedia.org/wiki/Explainable_Artificial_Intelligence

 
