Hume AI: $50 Million Secured To Build AI Optimized For Well-Being

By Amit Chowdhry ● Updated April 25, 2024

Hume AI – a company and research lab building artificial intelligence optimized for human well-being – announced it had raised a $50 million Series B funding round. EQT Ventures led this round. Union Square Ventures, Nat Friedman & Daniel Gross, Metaplanet, Northwell Holdings, Comcast Ventures, and LG Technology Ventures also joined the round.

The funding will support the development of Hume’s new flagship product: an emotionally intelligent voice interface that can be built into any application.

Hume AI was launched by Dr. Alan Cowen, a former Google researcher and scientist best known for pioneering semantic space theory – a computational approach to understanding emotional experience and expression that has revealed nuances of the voice, face, and gesture now understood to be central to human communication worldwide.

The company operates at the intersection of artificial intelligence, human behavior, and health and well-being. It created an advanced API toolkit for measuring human emotional expression that is already used in industries such as robotics, customer service, healthcare, wellness, and user research.
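To make the idea of "measuring emotional expression" concrete, here is a minimal sketch of how an application might consume the kind of per-expression confidence scores such a toolkit could return. The payload shape, field names, and scores below are invented for illustration and do not represent Hume's actual API schema.

```python
# Hypothetical sketch: ranking expression scores from an
# emotion-measurement API response. The sample payload is invented
# for illustration; it is not Hume AI's actual response format.

def top_expressions(scores: dict, k: int = 3) -> list:
    """Return the k highest-scoring expression labels, best first."""
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Invented example response: expression label -> confidence score.
sample = {"joy": 0.81, "interest": 0.64, "calmness": 0.32, "doubt": 0.07}
print(top_expressions(sample))  # ['joy', 'interest', 'calmness']
```

An application in, say, customer service could use a ranking like this to route or adapt responses based on the caller's dominant expression.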

In connection with the funding, Hume AI released a beta version of its flagship product – an Empathic Voice Interface (EVI). This emotionally intelligent conversational AI is the first to be trained on data from millions of human interactions to understand when users are finished speaking, predict their preferences, and generate vocal responses optimized for user satisfaction over time. These capabilities will be available to developers with just a few lines of code and can be built into any application.

AI voice products can transform interactions with technology. However, their stilted and mechanical responses hinder truly immersive conversational experiences. Hume's EVI aims to provide the basis for engaging voice-first experiences that emulate the natural speech patterns of human conversation.

Hume’s EVI was built with a new form of multimodal generative AI that integrates large language models (LLMs) with expression measures, which Hume refers to as an empathic large language model (eLLM). The company’s eLLM enables EVI to adjust its words and tone of voice based on the context and the user’s emotional expressions.

EVI accurately detects when a user is ending their conversational turn so it can start speaking, stops speaking when the user interrupts it, and generates rapid responses in real time with latency under 700 ms – allowing for fluid, near-human conversation. With a single API call, developers can integrate EVI into any application to create state-of-the-art voice AI experiences.
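The turn-taking behavior described here – stop speaking the instant the user barges in, respond when their turn ends – can be sketched as a tiny state machine. This is a conceptual illustration of the interaction pattern only, not Hume's implementation or API; the class and method names are invented.

```python
# Minimal, hypothetical illustration of voice-interface turn-taking:
# the assistant yields immediately on user interruption ("barge-in")
# and responds once end-of-turn is detected. Not Hume's actual code.

class VoiceTurnManager:
    def __init__(self):
        self.assistant_speaking = False

    def begin_assistant_reply(self):
        # Assistant starts playing a generated vocal response.
        self.assistant_speaking = True

    def on_user_audio(self) -> str:
        # Incoming user speech: if the assistant is mid-reply,
        # this is an interruption, so cut the reply off at once.
        self.assistant_speaking = False
        return "listening"

    def on_user_turn_end(self) -> str:
        # End-of-turn detected: generate and speak a response.
        self.assistant_speaking = True
        return "responding"

mgr = VoiceTurnManager()
mgr.begin_assistant_reply()
print(mgr.on_user_audio())     # listening  (assistant was cut off)
print(mgr.on_user_turn_end())  # responding
```

In a real system the hard part is the detection itself – deciding from audio and context that a turn has actually ended within a sub-700 ms budget – which is what EVI's training on millions of human interactions is meant to solve.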

KEY QUOTES:

“Hume’s empathic models are the crucial missing ingredient we’ve been looking for in the AI space. We believe that Hume is building the foundational technology needed to create AI that truly understands our wants and needs, and are particularly excited by Hume’s plan to deploy it as a universal interface.”

  • Ted Persson, Partner at EQT Ventures who led the investment

“The main limitation of current AI systems is that they’re guided by superficial human ratings and instructions, which are error-prone and fail to tap into AI’s vast potential to come up with new ways to make people happy. By building AI that learns directly from proxies of human happiness, we’re effectively teaching it to reconstruct human preferences from first principles and then update that knowledge with every new person it talks to and every new application it’s embedded in.”

  • Hume AI founder Alan Cowen

“What sets Hume AI apart is the scientific rigor and unprecedented data quality underpinning their technologies. Hume AI’s toolkit supports an exceptionally wide range of applications, from customer service to improving the accuracy of medical diagnoses and patient care, as Hume AI’s collaborations with Softbank, Lawyer.com, and researchers at Harvard and Mt. Sinai have demonstrated.”

  • Andy Weissman, managing partner at Union Square Ventures

“Alan Cowen’s research has transformed our understanding of the rich languages of emotional expression in the voice, face, body, and gesture. His work has opened up entire fields of inquiry into understanding the emotional richness of the voice and the subtleties of facial expression.”

  • Dacher Keltner, a leading emotion scientist