Emotion Recognition AI "Project M" Unveiled at Conference, Rivaling Industry Leaders in Accuracy
Product News Dec 04, 2018
Min Lee, founder and CEO of IPMD, Inc., recently unveiled an artificial intelligence platform that recognizes human emotions with unprecedented accuracy. At the experiMental Conference Showcase at UCSF, the UC Berkeley alum pronounced it a milestone that surpasses efforts by technology giants such as Microsoft and Google, as well as research institutions like MIT.
The system, called Project M, combines highly sophisticated training data with a machine learning approach that departs from the computer vision algorithms normally focused on facial muscle movements. Preparation for its launch totaled more than 48,000 hours of work by IPMD's Project M team, enabling the AI to correctly identify facial expressions and correlate them with emotional and mental states.
In a brief presentation before an audience of entrepreneurs, industry executives, and investors, Lee gave a live demonstration of Project M's capabilities against competing systems, using an image of a woman visibly in tears as the basis of comparison. He first revealed Microsoft's diagnosis of the woman's emotional state: 66.3% neutral, 21% sad, and, ironically, 8.6% happy. Project M, to the audience's amazement, produced far more reasonable results: 84.6% sad, 5.5% disgust, 3.4% anger, 3.2% fear, and 1.1% contempt.
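For readers curious how such per-emotion outputs are typically structured, here is a minimal sketch using the scores reported in the demo. The dictionary layout and the `top_emotion` helper are illustrative assumptions, not IPMD's or Microsoft's actual API; emotion-recognition services generally return a score per label, and the predicted emotion is simply the label with the highest score:

```python
# Emotion-score vectors as reported in the demo (percentages).
# The data layout and function below are hypothetical, for illustration only.
microsoft_scores = {"neutral": 66.3, "sad": 21.0, "happy": 8.6}
project_m_scores = {"sad": 84.6, "disgust": 5.5, "anger": 3.4,
                    "fear": 3.2, "contempt": 1.1}

def top_emotion(scores):
    """Return the label with the highest reported score."""
    return max(scores, key=scores.get)

print(top_emotion(microsoft_scores))  # neutral
print(top_emotion(project_m_scores))  # sad
```

Reading the two vectors this way makes the contrast in the demo concrete: one system's top label was "neutral" for a visibly crying subject, while the other's was "sad" by a wide margin.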
“That’s what the best current AI can do,” Lee said of Microsoft’s results. “Industries and academia have invested hundreds of millions of dollars to develop such a system. Unfortunately, no one has achieved their goals until today.” Out of 500 comparisons between Project M and current industry leaders on similar scenarios, Lee stated that Project M prevailed in all 500 cases.
“This is the perfect screening of human emotion at this point,” Lee claimed. “We hope to [use this breakthrough] to change this world and eliminate depression and mental diseases.”