Most of us have had the experience of sending a text message or email that sounded insensitive or angry when we intended no such thing.
Unfortunately, the lack of social cues in such messages makes them much easier to misinterpret. Depending on the communication, this can lead to misunderstandings, hurt feelings, or worse. It is a shortcoming that Bellevue, Wash.-based mpathic would like to correct with empathic AI.
Drawing on knowledge and data sets collected over the past decade, mpathic has set itself the goal of promoting human connection and understanding in the workplace.
To that end, the company has created plugins built on its cloud-based Empathy-as-a-Service, or EaaS, which help people communicate with one another through real-time text corrections. Texts and emails can be checked and changes suggested before you click “Send.” By integrating these functions into platforms such as Slack and Gmail, mpathic hopes to bring more empathy to corporate communication.
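The article does not describe mpathic’s actual API, but as a rough illustration of what a “check before Send” plugin could look like, here is a minimal TypeScript sketch. The endpoint URL, request body, and response fields are hypothetical assumptions for illustration only, not mpathic’s real interface.

```typescript
// Hypothetical sketch only: the endpoint, request body, and response shape
// below are illustrative assumptions, not mpathic's actual API.

interface EmpathySuggestion {
  original: string;   // the draft text as written
  suggestion: string; // a softer, more open rewrite
  reason: string;     // why the change is recommended
}

// Ask a (hypothetical) empathy service to review a draft before it is sent.
async function checkDraft(draft: string): Promise<EmpathySuggestion | null> {
  const response = await fetch("https://api.example.com/v1/empathy/suggest", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text: draft }),
  });
  if (!response.ok) return null; // fail open: never block the user from sending
  return (await response.json()) as EmpathySuggestion;
}

// A Slack- or Gmail-style plugin could call this when "Send" is clicked and
// surface the suggestion to the user instead of sending immediately.
checkDraft("Why does Nic always plan these meetings at the last minute?")
  .then((s) => {
    if (s) console.log(`Consider instead: ${s.suggestion}`);
  });
```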
mpathic CEO and co-founder Grin Lord. (mpathic Photo)
“We realized that all of this can be conveyed using an AI empathy engine, much like a Grammarly for empathy,” said co-founder Grin Lord. “We have seen amazing developments in AI that now enable us to do this in real time. This is the first time in human history that we have had dynamic, real-time empathy correction.”
In an example from a recent pitch, the service suggested replacing an inflammatory message such as “Why does Nic always schedule these meetings at the last minute? Am I right?” with a more open question: “How do you feel about the change to the meeting?”
Based on years of research into human interaction, mpathic takes a distinctive approach to guiding users. Lord, who has a PhD in psychology, initially built mpathic’s data set on insights she gained in the early 2000s at Harborview Medical Center in Seattle, the only Level I trauma center in Washington state.
During this time, Lord was part of a group researching empathic listening. Drivers charged with DUI were frequently brought to Harborview after car accidents. Instead of giving the drivers brochures, telling them what to do, or shaming them, the researchers listened to them for perhaps 15 or 20 minutes, following specific protocols. In a randomized controlled trial, they saw a measurable decrease in alcohol consumption among these drivers that lasted up to three years, as well as a 48 percent reduction in hospital readmissions. Not only did this help the individuals recover, it also resulted in significant cost savings and increased public safety.
Since then, Lord has been involved in other startups, including Lyssn, a platform for assessing the empathy and engagement of behavioral medicine practitioners during clinical sessions.
Before launching mpathic, the team founded Empathy Rocks, which uses empathic AI to build human connection through a gamified platform. The platform enables practitioners to improve their empathic listening skills while earning continuing education credits.
But during the early stages of Empathy Rocks’ seed funding, Lord and co-founder Nic Bertagnolli realized they already had a viable product in that platform’s underlying empathy engine. Pivoting, they launched mpathic to make the engine lighter and more widely available.
With the development of its “Grammarly for empathy” and an API, mpathic wants to do more than just promote good working relationships between employees. As companies become more global and draw employees from other parts of the country and the world, mpathic aims to give HR departments a tool that can ease employee onboarding. Because different regions have different ideas and attitudes about civil and sensitive behavior, mpathic can be used to help new employees integrate into their teams more quickly.
Lord is quick to point out that mpathic not only suggests text corrections but also makes other kinds of behavioral suggestions. In this way, users build an understanding of empathic communication and behavior through context, use, and repetition.
“We are actually making very behavioral corrections,” said Lord. “So it may not even be a replacement of a word or a transformation of the text. Instead, the AI may suggest making a call or calling a meeting, because certain things don’t need to be in an email.”
Although mpathic emerged from Empathy Rocks, the gamified training platform continues to offer empathic listening training while collecting new data used to train mpathic’s EaaS. The platform was created by the team’s empathy designer, Dr. Jolley Paige, who identified the many factors to consider at a time when AI bias is such a pressing issue.
“We thought about gender, age, culture, where you are in the country, but also about different abilities,” said Jolley. “So if someone has a language processing disorder, how would that affect their interaction with this game?”
While some people may have concerns about using AI to alter human behavior, many companies see value in such an approach. “Some of our early corporate partners are considering adding mpathic to their Slack, Gmail, or whatever, largely because they are interested in this idea of quickly onboarding cross-cultural and global teams,” said Lord. “I think it can be useful in unifying the language of mission values for a company.”
Last month, mpathic was one of 14 startups that presented at the PIE Demo Day. PIE (Portland Incubator Experiment) is led by General Manager Rick Turoczy and aims to offer founders – often first-time entrepreneurs – access to mentoring and networks.
Empathy Rocks and mpathic purposely source and curate their data to include underrepresented voices and are part of All Tech is Human as well as other communities committed to ethical AI development.
Empathic AI is part of a much broader field of computer science originally known as affective computing and more recently referred to as emotion AI, or artificial emotional intelligence. Originating at the MIT Media Lab and other research institutions around 25 years ago, emotion AI encompasses systems that can read, interpret, and interact with human emotions. Because emotions, and empathy in particular, are central to human existence, this work has the potential to make our technologies simpler, more humane, and more responsible in how they deal with people at home and at work.