Leveraging Machine Learning to Improve Emotional Intelligence in Healthcare


MedCity News

The next frontier after AI (artificial intelligence), as we know, is teaching machines to sense, feel, and respond to human emotions, or what is commonly referred to as emotional intelligence.

Few will dispute that AI can simplify healthcare. But isn’t the humanization of healthcare becoming a victim of transactional AI? The recent deactivation or decommissioning of Pepper and Fabio shows how difficult it is to program empathy into robots, even humanoid ones. Pepper was tested on a range of tasks, from helping autistic students to caring for the elderly, but in each case the cost-benefit analysis did not go in Pepper’s favor; the production hiatus appears to come down to cost-effectiveness relative to value. The Fabios of the AI world, meanwhile, seemed to have difficulty translating empathic gestures into context, and as a result the “creep” factor became rampant. So much so that in the UK, where Fabio was first tested as a greeter in an upscale wine shop, customers went out of their way to play a “game of hide and seek” to avoid Fabio!

The exciting (some say frustrating) thing about emotional or “affective” machine learning is that machines are being developed with remarkable capabilities to analyze and continuously monitor our hidden emotional responses, not just our behavior and buying patterns. Love it or hate it, we are now addicted to our devices. And in the new virtual, always-on world, it is difficult to completely disconnect from smartphones, computers, televisions, live streams, and even cameras in clinics or stores while we embrace virtual normalcy.

These ubiquitous devices continuously record our smiles and frowns and map our innermost emotions. In healthcare, this data, when appropriately collected and used, can help us better understand the minds of healthcare consumers and provide a clearer, real-time picture of their problems. AI can help us decipher body language, for example to determine whether a consumer has fully understood their aftercare instructions in a doctor’s office; in addition, it can help us identify which patients are most receptive to follow-up. This avoids wasted outreach to consumers unwilling to engage and enriches the connection with those who are.
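The follow-up prioritization described above could, in principle, be sketched as a simple scoring rule. The cue names, weights, and threshold below are purely illustrative assumptions, not any vendor’s actual model:

```python
# Hypothetical sketch: ranking patients for follow-up by an "engagement"
# score derived from observed behavioral cues. All cue names, weights,
# and thresholds are illustrative assumptions, not a real API.

def engagement_score(cues: dict) -> float:
    """Combine simple behavioral cues into a 0-1 engagement score."""
    weights = {"made_eye_contact": 0.4, "asked_questions": 0.4, "nodded_along": 0.2}
    return sum(w for cue, w in weights.items() if cues.get(cue))

def prioritize_followup(patients: list, threshold: float = 0.5) -> list:
    """Return IDs of patients whose cues suggest receptiveness to follow-up."""
    return [p["id"] for p in patients if engagement_score(p["cues"]) >= threshold]

patients = [
    {"id": "p1", "cues": {"made_eye_contact": True, "asked_questions": True}},
    {"id": "p2", "cues": {"nodded_along": True}},
]
print(prioritize_followup(patients))  # ['p1']
```

A production system would learn such weights from data rather than hard-code them; the sketch only shows the shape of the decision.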

New technologies known as “emotional AI” are part of the broader field of AI. Emotional AI today refers to the many ways machines can interpret our thoughts and emotions and assign values to a smile, a frown, or a puzzled eyebrow. Machines can already analyze mountains of data within seconds, and for the past two decades they have been taught to read emotions and images. These readings can be correlated with positive experiences (like brand fulfillment) or negative experiences (like fear, stress, disappointment, or anger).
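The idea of assigning values to a smile or a frown can be made concrete with a toy valence table. The labels and numbers here are assumptions for demonstration, not the output of any real emotion-recognition model:

```python
# Illustrative sketch only: mapping detected facial expressions to
# valence values, as emotional AI systems conceptually do. The labels
# and the numeric values are assumptions, not a real model's output.

VALENCE = {"smile": 0.8, "neutral": 0.0, "puzzled_brow": -0.2, "frown": -0.6}

def session_valence(expressions: list) -> float:
    """Average valence across a sequence of detected expressions."""
    if not expressions:
        return 0.0
    return sum(VALENCE.get(e, 0.0) for e in expressions) / len(expressions)

print(session_valence(["smile", "smile", "frown"]))  # ≈ 0.33
```

Real systems infer such scores from pixels or audio; the table simply shows what “assigning a value” to an expression means downstream.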

In our business as healthcare professionals, we know that AI and EI are already in use. They are valuable tools for health research, as they can correlate subconscious reactions with actual buying behavior, satisfaction and, in the future, even with net promoter scores, or at least with the likelihood of recommendation. In call centers, emotional AI can give caregivers and performance navigators useful feedback on the customer’s mindset. These sentiment indicators can help nurses and navigators adapt their tone and pitch to the calling consumer. When we combine emotional AI data with speech analytics software, we get a blueprint for adjusting health product features, changing delivery, and increasing customer satisfaction in real time.
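A call-center feedback loop of the kind described could be sketched as a rule that fuses a text-sentiment score with simple voice features. The feature names and cutoffs below are hypothetical, chosen only to illustrate the fusion:

```python
# Hedged sketch: fusing a text-sentiment score with voice features into
# real-time guidance for a nurse navigator. Feature names, ranges, and
# cutoffs are hypothetical assumptions, not from any real product.

def coach_hint(text_sentiment: float, pitch_variance: float,
               speech_rate_wpm: float) -> str:
    """Suggest a tone adjustment from caller sentiment and voice features."""
    if text_sentiment < -0.3 and speech_rate_wpm > 170:
        return "Caller sounds upset and rushed: slow down, acknowledge frustration."
    if text_sentiment < -0.3:
        return "Negative sentiment: soften tone, confirm understanding."
    if pitch_variance < 0.1:
        return "Flat affect detected: check engagement with an open question."
    return "Sentiment stable: continue current approach."

print(coach_hint(-0.5, 0.4, 180))
```

In practice the sentiment and voice features would come from speech analytics software; the rule only shows how the two signals might be combined into a single prompt.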

Emotional AI has wide applications in mental health, remote monitoring (through voice and other biometric data, e.g. blood pressure and heart rate), and telemedicine. In mental health, for example, emotional AI can help detect and predict different degrees of depression in patients. Smart cars will soon be able to alert drivers to their state of mind, flagging low mood or fatigue. Some people with autism cannot easily tell you how they are feeling; emotional AI, however, can capture facial expressions or elevated pulse rates and create an emotional profile of the person’s physical and mental state. In workplaces that still require shift work and long hours, such as manufacturing and retail, healthcare companies that employ emotional AI can help employers track frustration or dwindling motivation among employees and back employee assistance programs (EAPs) with proactive advice and guidance.
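The “emotional profile” built from pulse readings and facial expressions could be sketched as a small aggregation over a stream of readings. The thresholds and labels are assumptions for demonstration only:

```python
# Illustrative sketch: summarizing pulse readings and detected
# expressions into a coarse emotional profile, as described above.
# Thresholds, labels, and field names are assumptions, not clinical values.

from statistics import mean

def emotional_profile(pulse_bpm: list, expressions: list) -> dict:
    """Summarize biometric and expression data into coarse flags."""
    avg = mean(pulse_bpm)
    return {
        "avg_pulse": avg,
        "elevated_pulse": avg > 100,  # crude proxy for stress/agitation
        "negative_expressions": sum(e in {"frown", "grimace"} for e in expressions),
        "flag_for_review": avg > 100 and "frown" in expressions,
    }

profile = emotional_profile([112, 108, 115], ["neutral", "frown"])
print(profile["flag_for_review"])  # True
```

A clinical system would use validated thresholds and many more signals; the point is that the profile is an aggregation over time, not a single snapshot.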

This brave new world should probably be thought of not as intimidation but as useful intermediation: not machines controlling people, but people getting better at being people, and healthcare professionals able to drive targeted initiatives specific to each setting and outcome.

This is another appeal to those who can use EI … with a shot of what I call Covidacity, aka Positive and Big Bold Goals Despite the Pandemic.

Photo: wildpixel, Getty Images
