Machine learning models that can detect our faces and movements are now part of daily life, powering smartphone features like face unlocking and Animoji. What those models can't do is tell how we feel from our faces. That's where EmoNet comes in.
Researchers from the University of Colorado and Duke University have developed a neural network that can accurately classify images into 11 emotion categories. To train the model, the researchers used 2,187 videos that had been clearly classified into 27 distinct emotion categories, including anxiety, surprise, and sadness.
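For readers curious about how a model like this might be put together, here is a minimal transfer-learning sketch in PyTorch. It assumes the common approach of taking an ImageNet-pretrained CNN (AlexNet here) and retraining only its final layer as an emotion-classification head; the layer choice, hyperparameters, and training step are illustrative assumptions, not the researchers' actual code.

```python
# Illustrative sketch only, not the EmoNet authors' code: fine-tune the
# last layer of a pretrained AlexNet to predict emotion categories.
import torch
import torch.nn as nn
from torchvision import models

NUM_EMOTIONS = 27  # emotion categories in the training videos

# Start from an ImageNet-pretrained AlexNet and freeze its weights.
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
for param in model.parameters():
    param.requires_grad = False

# Replace only the final classification layer so it outputs emotions
# instead of the original 1,000 ImageNet classes.
model.classifier[6] = nn.Linear(4096, NUM_EMOTIONS)

optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def train_step(frames, labels):
    """One gradient step on a batch of video frames and emotion labels.

    frames: float tensor of shape (N, 3, 224, 224), normalized images
    labels: long tensor of shape (N,), emotion-category indices
    """
    optimizer.zero_grad()
    logits = model(frames)
    loss = criterion(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Freezing the backbone and training only the new head keeps the sketch cheap to run and mirrors the general idea of reusing visual features learned on one task for another.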