---
question: "As of 2026, give a complete profile of Hume AI (hume.ai): founding year, founders, total funding raised and last known round, HQ, headcount estimate, product surface area (EVI, voice API, expressive TTS), input/output modalities, whether they expose any persistent per-user emotional profile or memory layer, public partnerships with foundation model providers (OpenAI, Anthropic, Google, Meta, Microsoft), pricing tiers where public, named customers or design partners, and their public stance on safety, manipulation risk, and the Hume Initiative. What are the strongest critiques or limitations of Hume AI as a category leader that have surfaced in 2025 to 2026? Cite working URLs."
model: anthropic/claude-sonnet-4-6
max_output_tokens: 8000
asked_at: 2026-05-15T00:51:45.749Z
total_tokens: 8768
status: completed
cost_total_usd: 0.07948
cost_input_usd: 0.00002
cost_output_usd: 0.05546
cost_tools_usd: 0.005
---

# Q: As of 2026, give a complete profile of Hume AI (hume.ai): founding year, founders, total funding raised and last known round, HQ, headcount estimate, product surface area (EVI, voice API, expressive TTS), input/output modalities, whether they expose any persistent per-user emotional profile or memory layer, public partnerships with foundation model providers (OpenAI, Anthropic, Google, Meta, Microsoft), pricing tiers where public, named customers or design partners, and their public stance on safety, manipulation risk, and the Hume Initiative. What are the strongest critiques or limitations of Hume AI as a category leader that have surfaced in 2025 to 2026? Cite working URLs.
# Hume AI — Complete Company Profile (as of May 2026)

---

## 🏢 Company Overview

| Field | Details |
|---|---|
| **Full Name** | Hume AI (Hume) |
| **Founded** | 2021 |
| **Headquarters** | New York, NY, USA |
| **Type** | Startup & Research Lab |
| **Mission** | Build AI optimized for human well-being |
| **Website** | [hume.ai](https://www.hume.ai) |

---

## 👥 Founders

- **Alan Cowen** — Founder & CEO; former Google researcher known for large-scale studies on the science of emotion and expression
- The company was founded on Cowen's academic research into how vocal and facial expressions map to emotional states across cultures

---

## 💰 Funding & Valuation

| Round | Amount | Date | Lead Investor | Notable Co-investors |
|---|---|---|---|---|
| **Seed / Series A** | ~$12.7M | January 2023 | Union Square Ventures | Comcast Ventures, LG Technology Ventures, Northwell Holdings, Wisdom Ventures |
| **Series B** | $50M | March 2024 | EQT Ventures | Union Square Ventures, Nat Friedman & Daniel Gross, Metaplanet, Northwell Holdings, Comcast Ventures, LG Technology Ventures |
| **Total Raised** | ~$62.7M+ | — | — | — |

- The Series B was announced March 26–27, 2024 via [BusinessWire](https://www.businesswire.com/news/home/20240326359639/en/Hume-AI-Announces-$50-Million-Fundraise-and-Empathic-Voice-Interface) and the [Hume blog](https://www.hume.ai/blog/series-b-evi-announcement)
- No publicly announced round beyond Series B as of May 2026
- Notable strategic investors: **Northwell Health Holdings** (health/clinical validation) and **Comcast Ventures** (media/telco deployment signals)

---

## 🧑‍💼 Headcount Estimate

- Estimated at **~50–100 employees** as of early 2026 (Series B-stage startup; no public headcount figure disclosed)
- Series B capital was explicitly allocated toward **scaling the team and accelerating research**

---

## 🛠️ Product Surface Area

### 1. **EVI — Empathic Voice Interface**

- Hume's flagship product, launched alongside the Series B (March 2024)
- A full-duplex, real-time **conversational voice AI** with emotional intelligence baked in
- Trained on **millions of human interactions**; uses vocal tone, prosody, and expression signals to:
  - Detect when a user has finished speaking (turn-taking)
  - Predict user preferences
  - Optimize responses for **user satisfaction and well-being** over time
- EVI 2 (second generation) was subsequently released with improved expressiveness and latency
- Accessible via API for developers to embed in any application

### 2. **Expressive TTS (Text-to-Speech)**

- A standalone **expressive, emotionally nuanced TTS** product
- Supports **voice cloning** (create-only on free tier; create & use on Creator tier and above)
- Measured in characters; priced per subscription tier
- Designed to produce speech that sounds emotionally appropriate to context, not just phonetically accurate

### 3. **Expression Measurement API**

- A suite of multimodal **emotion recognition / affective computing APIs**
- Analyzes **facial expressions, vocal prosody, language, and physiological signals**
- Separate **pay-as-you-go** pricing model (volume discounts at enterprise)
- Used for research, behavioral analytics, and customer experience measurement
- This was Hume's original core research product before EVI

---

## 🔄 Input / Output Modalities

| Direction | Modalities |
|---|---|
| **Input** | Voice/audio (prosody, tone), text, facial video (via Expression API), physiological signals |
| **Output** | Expressive synthesized speech (TTS), text, emotional state signals/scores |

- EVI is primarily **voice-in / voice-out** with emotional context layered on top
- The Expression Measurement API accepts **video, audio, and text** and outputs dimensional emotion scores

---

## 🧠 Persistent Per-User Emotional Profile / Memory Layer

- EVI is designed to **optimize responses for satisfaction over time**, which implies session-level and potentially cross-session learning of user preferences
- As of publicly available documentation through early 2026, Hume has **not publicly launched a named, explicit "emotional memory" or persistent per-user emotional profile product**
- The architecture supports preference optimization over time, but a formal persistent memory layer (akin to OpenAI's Memory for ChatGPT) has **not been publicly confirmed as a discrete, user-facing feature**
- This remains an area to watch, particularly given the sensitivity of storing longitudinal emotional data

---

## 🤝 Public Partnerships with Foundation Model Providers

| Provider | Status |
|---|---|
| **OpenAI** | EVI supports **external LLMs** including OpenAI models as the language backbone (configurable by developers) |
| **Anthropic** | Listed as a supported external LLM option in EVI's API configuration |
| **Google** | No specific named partnership publicly confirmed as of May 2026 |
| **Meta** | No specific named partnership publicly confirmed |
| **Microsoft** | No specific named partnership publicly confirmed |

- Hume's architecture is notably **LLM-agnostic**: developers can plug in their preferred LLM (OpenAI, Anthropic, etc.)
as the reasoning layer, with Hume handling the emotional expression and voice layer on top — referenced on their [pricing page](https://www.hume.ai/pricing) under "External LLMs"
- **Northwell Health** (investor + partner) is the most notable strategic partnership for clinical/health deployment

---

## 💲 Pricing Tiers (Public, as of May 2026)

*Source: [hume.ai/pricing](https://www.hume.ai/pricing) and [eesel.ai guide](https://www.eesel.ai/blog/hume-ai-pricing)*

### EVI + TTS Subscription Plans

| Plan | Monthly Price | Included TTS Characters | Included EVI Minutes | Key Features |
|---|---|---|---|---|
| **Free** | $0 | 10,000 (~10 min) | 5 min | 15 RPM, 1 concurrent connection, voice cloning (create only) |
| **Starter** | $3 | 30,000 (~30 min) | 40 min | 15 RPM, 5 concurrent connections, 20 projects |
| **Creator** | $14 | 140,000 (~140 min) | 200 min | 75 RPM, commercial license, unlimited voice cloning |
| **Pro** | $70 | 1,000,000 (~1,000 min) | 1,200 min | 75 RPM, 10 concurrent connections, 3,000 projects |
| **Scale** | $200 | 3,300,000 (~3,300 min) | 5,000 min | 150 RPM, 20 concurrent connections, 3 team seats |
| **Business** | $500 | 10,000,000 (~10,000 min) | 12,500 min | 225 RPM, 30 concurrent connections, 5 team seats |
| **Enterprise** | Custom | Custom | Custom | Custom RPM, unlimited seats, Slack support, SOC 2, GDPR, HIPAA |

- **Additional EVI overage**: $0.06/min (Pro), $0.05/min (Scale), $0.04/min (Business), Custom (Enterprise)
- **Expression Measurement API**: Separate pay-as-you-go pricing; enterprise volume discounts available

---

## 🏭 Named Customers & Design Partners

*(Source: [Contrary Research](https://research.contrary.com/company/hume-ai))*

| Vertical | Customer / Partner | Use Case |
|---|---|---|
| **Enterprise CX** | **Vonova** | Customer support voice agents |
| **Enterprise CX** | **Hamming AI** | Voice agent analytics |
| **Health & Wellness** | **hpy** | Digital therapy |
| **Health / Clinical** | **Northwell Health Holdings** | Mental health & behavioral science (also an investor) |

- The health vertical is particularly notable given Northwell's dual role as investor and design partner

---

## 🛡️ Safety, Manipulation Risk & The Hume Initiative

### Public Stance on Safety

- Hume positions itself explicitly as building **"AI optimized for human well-being"** — this is central to its brand and investor narrative, not a footnote
- The company's stated goal is to ensure AI systems improve user outcomes rather than exploit emotional vulnerabilities for engagement or commercial gain
- EVI is designed to optimize for **user satisfaction**, with the explicit framing that this is distinct from engagement maximization (the critique leveled at social media algorithms)

### Manipulation Risk Acknowledgment

- Hume has publicly acknowledged the **dual-use risk** of emotion-sensing AI: the same technology that can make an AI more empathetic could theoretically be used to manipulate users by exploiting emotional states
- Their mitigation framing centers on the **direction of optimization** — optimizing for user well-being vs. optimizing for commercial conversion or time-on-site
- Critics note this distinction is difficult to verify from the outside and relies heavily on the company's own self-governance

### The Hume Initiative

- Hume runs **The Hume Initiative**, a research and ethics body focused on the responsible development of empathic AI
- It publishes guidelines for developers building on Hume's APIs, specifically around **not using emotional data to manipulate, deceive, or harm users**
- The initiative represents Hume's attempt to establish **industry norms** before regulators do, positioning the company as a norm-setter in affective AI ethics
- Partners and developers using Hume's APIs are expected to adhere to its usage policies, which prohibit manipulation-oriented applications

---

## ⚠️ Strongest Critiques & Limitations (2025–2026)

### 1. 🔬 Scientific Validity of Emotion Detection

- The core scientific premise — that emotions can be reliably inferred from facial expressions and vocal tone — remains **actively contested** in affective science
- Critics (including prominent researchers like **Lisa Feldman Barrett**) argue that there is no universal, biologically fixed mapping between facial/vocal signals and internal emotional states
- Hume's own research acknowledges dimensional complexity, but critics argue the products can still **overfit to surface signals** and misrepresent actual user emotional states

### 2. 🔒 Privacy & Longitudinal Emotional Data

- Storing and processing **vocal emotion data** at scale raises serious privacy concerns that go beyond typical voice assistant data
- There is no widely deployed regulatory framework (in the US) specifically covering **affective biometric data**, leaving users with limited protection
- The combination of persistent preference optimization + emotional sensing creates a profile that is arguably more sensitive than any other consumer data type

### 3. 🤔 Self-Governance Credibility Gap

- The Hume Initiative is **self-governed** — Hume sets its own rules, monitors its own compliance, and defines what "well-being optimization" means
- Without third-party audits or regulatory backing, the manipulation risk mitigation is largely a matter of **trust in the company's intentions**
- As the company scales and faces commercial pressure, critics question whether well-being optimization will remain the dominant objective

### 4. 💼 Limited Enterprise Traction Transparency

- Named customers (Vonova, Hamming AI, hpy) are **relatively early-stage companies themselves**, raising questions about whether Hume has landed large-enterprise or Fortune 500 validation
- The health use cases (digital therapy, mental health) are among the highest-stakes and highest-scrutiny verticals — clinical evidence for EVI's efficacy in therapeutic settings has not been independently published at scale

### 5. 📉 Competitive Intensity

- The voice AI space has become dramatically more competitive through 2025–2026, with **OpenAI (Advanced Voice Mode), ElevenLabs, Cartesia, PlayHT, Sesame, and others** all competing on expressive TTS and real-time voice conversation
- Hume's differentiation on emotional intelligence is real but **increasingly being replicated** by larger players with more resources
- OpenAI's own voice modality research directly overlaps with EVI's core value proposition

### 6. 📦 API Dependency Risk

- EVI's LLM-agnostic architecture (using OpenAI, Anthropic, etc.
as the reasoning layer) means Hume is **dependent on third-party LLM providers** for core functionality and subject to their pricing, policy, and capability changes

---

## 📚 Sources

- Hume Series B Announcement: [hume.ai/blog/series-b-evi-announcement](https://www.hume.ai/blog/series-b-evi-announcement)
- BusinessWire Press Release: [businesswire.com](https://www.businesswire.com/news/home/20240326359639/en/Hume-AI-Announces-$50-Million-Fundraise-and-Empathic-Voice-Interface)
- Hume Pricing (Official): [hume.ai/pricing](https://www.hume.ai/pricing)
- Hume Pricing Guide (eesel.ai): [eesel.ai/blog/hume-ai-pricing](https://www.eesel.ai/blog/hume-ai-pricing)
- Hume Pricing (affinco): [affinco.com/hume-ai-pricing](https://affinco.com/hume-ai-pricing/)
- Contrary Research Business Breakdown (Jan 2026): [research.contrary.com/company/hume-ai](https://research.contrary.com/company/hume-ai)
- Pulse2 Funding Coverage: [pulse2.com](https://pulse2.com/hume-ai-50-million-secured-to-build-ai-optimized-for-well-being/)

---

> **Summary verdict:** Hume AI is a well-funded, research-grounded pioneer in affective/empathic AI with a genuinely differentiated product layer (emotional expression in voice). Its $62.7M+ in funding, LLM-agnostic EVI platform, and health-sector partnerships give it a defensible early position. However, it faces mounting competitive pressure from better-resourced players, unresolved scientific debates about emotion detection validity, and a self-governance ethics model that will face increasing scrutiny as the technology scales into high-stakes verticals like healthcare and customer service.
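As a quick sanity check on the EVI tier economics quoted in the pricing section, here is a minimal cost-estimate sketch. The dollar figures, included-minute bundles, and overage rates are copied from the public pricing table summarized above; the dictionary and function names are illustrative and not part of any Hume SDK.

```python
# Rough EVI monthly cost estimator. Tier figures (monthly price, included
# EVI minutes, per-minute overage) are taken from the public pricing table
# at hume.ai/pricing as summarized above; names here are illustrative only.

EVI_TIERS = {
    # tier: (monthly_price_usd, included_evi_minutes, overage_usd_per_min)
    "pro":      (70.0,   1_200,  0.06),
    "scale":    (200.0,  5_000,  0.05),
    "business": (500.0, 12_500,  0.04),
}

def evi_monthly_cost(tier: str, evi_minutes_used: float) -> float:
    """Subscription price plus metered overage for minutes beyond the bundle."""
    price, included, overage_rate = EVI_TIERS[tier]
    overage_minutes = max(0.0, evi_minutes_used - included)
    return price + overage_minutes * overage_rate

# Example: 1,500 EVI minutes on Pro -> $70 base + 300 overage min * $0.06
print(f"{evi_monthly_cost('pro', 1_500):.2f}")
```

One takeaway from this arithmetic is that the overage rates decline with tier, so the crossover point between staying on a tier and upgrading depends on sustained usage, not a single month's spike.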