Posted by Richard Willett - Memes and headline comments by David Icke Posted on 3 April 2024

Robots That Know If We’re Sad Or Happy Are Coming

By Masha Borak

Many of us are already talking to our smart devices as if they are humans. In the future, smart devices may be able to talk back in a similar way, imitating the "umms" and "ahhs," the laughs and the sighs, alongside other vocal signals that indicate human emotions, such as tone, rhythm and timbre.

New York-based startup Hume AI has debuted a new "emotionally intelligent" AI voice interface that can be built into applications ranging from customer service to healthcare to virtual and augmented reality.

The beta version of the product named Empathic Voice Interface (EVI) was released after securing a US$50 million series B round of financing led by EQT Ventures. Union Square Ventures, Nat Friedman & Daniel Gross, Metaplanet, Northwell Holdings, Comcast Ventures and LG Technology Ventures also participated in the funding.

The conversational AI is the first to be trained to understand when users are finished speaking, predict their preferences and generate vocal responses optimized for user satisfaction over time, Hume AI says in a release.

“The main limitation of current AI systems is that they’re guided by superficial human ratings and instructions, which are error-prone and fail to tap into AI’s vast potential to come up with new ways to make people happy,” the company’s founder and former Google scientist Alan Cowen says. “By building AI that learns directly from proxies of human happiness, we’re effectively teaching it to reconstruct human preferences from first principles and then update that knowledge with every new person it talks to and every new application it’s embedded in.”

The voice model was trained on data from millions of human interactions and built on a multimodal generative AI that integrates large language models (LLMs) with expression measures. Hume calls this an empathic large language model (eLLM), and it helps the product adjust its words and voice tone based on the context and the user's emotional expressions.
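Hume has not published the internals of its eLLM, but the idea of conditioning a language model's output on measured vocal expression can be illustrated with a toy sketch. Everything below is an illustrative assumption: the expression labels, thresholds, and cue strings are invented for this example and are not Hume AI's actual API.

```python
# Hypothetical sketch: condition a reply on vocal-expression scores.
# The score dict would come from a separate prosody/expression model;
# here it is hand-supplied. All names and cues are illustrative.

def dominant_expression(scores: dict) -> str:
    """Return the strongest measured expression label."""
    return max(scores, key=scores.get)

def style_response(base_reply: str, scores: dict) -> str:
    """Prepend a delivery cue so a downstream voice model could
    adapt its tone to the user's apparent emotional state."""
    mood = dominant_expression(scores)
    cues = {
        "sadness": "[speak softly, slower pace] ",
        "joy": "[speak brightly, upbeat pace] ",
        "neutral": "",
    }
    return cues.get(mood, "") + base_reply

# Example: a user who sounds sad gets a gentler delivery cue.
reply = style_response(
    "Your order has shipped.",
    {"sadness": 0.7, "joy": 0.2, "neutral": 0.1},
)
print(reply)  # [speak softly, slower pace] Your order has shipped.
```

A production system would of course feed such signals into the generation step itself rather than bolting a cue onto a finished reply, but the sketch shows the basic loop the article describes: measure expression, then adjust both words and tone.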

Hume AI is not the only company experimenting with bringing emotion into technology.

