The new Enterprise SaaS Emerging Tech Research report from PitchBook identifies this field as "emotion AI" and projects significant growth for the technology in the near future.
The logic goes something like this: if companies deploy AI assistants to aid executives and staff, and AI chatbots serve as front-line salespeople and customer care representatives, how can an AI function effectively if it can't distinguish between an irate "What do you mean by that?" and a perplexed "What do you mean by that?"
Emotion AI is positioned as the more advanced offspring of sentiment analysis, the pre-AI technology that strives to extract human emotion from text.
*Image source: Elegant Themes*
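To make the distinction concrete, here is a minimal, hypothetical sketch of the difference: the text classifier uses the real Hugging Face `transformers` sentiment pipeline, while `score_prosody` and the fusion rule are made-up stand-ins for whatever multimodal model an emotion AI vendor might actually ship.

```python
# Minimal sketch: text-only sentiment vs. a hypothetical multimodal "emotion AI" read.
# The transformers pipeline is a real API; score_prosody() is a made-up placeholder.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # downloads a small default model on first run

text = "What do you mean by that?"
print(sentiment(text))  # text alone: the same words score the same every time


def score_prosody(audio_path: str) -> dict:
    """Hypothetical acoustic model: returns arousal/valence from tone of voice."""
    # A real system would analyze the raw audio here, not the transcript.
    return {"arousal": 0.9, "valence": -0.7}  # loud and negative -> likely anger


def classify_emotion(text: str, audio_path: str) -> str:
    """Toy fusion rule: combine text sentiment with vocal tone."""
    tone = score_prosody(audio_path)
    if tone["arousal"] > 0.7 and tone["valence"] < 0:
        return "angry"
    return "confused" if "?" in text else "neutral"


print(classify_emotion(text, "caller.wav"))  # same words, a different read once tone is added
```

The point of the toy fusion rule is simply that identical transcripts can carry very different emotions, which is exactly the gap emotion AI claims to close.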
"A key component of emotion AI's hardware side are cameras and microphones. These may be found separately at a physical location or on a phone or laptop. Beyond these gadgets, wearable hardware will probably offer another way to use emotion AI, Hernandez tells TechCrunch. (So this could be the reason if the chatbot for customer support requests access to the camera.)
To achieve that, an increasing number of startups are being established. According to PitchBook estimates, these include Uniphore (which has raised $610 million in total, including a $400 million round led by NEA in 2022), as well as MorphCast, Voicesense, Superceed, Siena AI, audEERING, and Opsis, all of which have raised modest sums from various VCs.
Naturally, the approach to emotion AI is very Silicon Valley: employ technology to solve a problem that technology itself created.