Researchers at the University of North Carolina at Chapel Hill and the University of Maryland at College Park are working on a new data-driven model and algorithm to identify the perceived emotions of individuals based on their walking styles.
Exploiting gait features to classify emotional state
From RGB videos of an individual walking, the team extracted the subject's walking gait as a sequence of 3D poses. The aim was to exploit these gait features to classify the person's emotional state into one of four emotions: happy, sad, angry, or neutral. The researchers' perceived emotion recognition approach uses deep features learned via a long short-term memory (LSTM) network trained on labeled emotion datasets.
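The paper does not spell out the network internals, but the idea of running an LSTM over a gait, i.e. a sequence of 3D poses, to produce a four-way emotion prediction can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the joint count, hidden size, and random weights are all assumptions for demonstration.

```python
import numpy as np

# Hypothetical dimensions: 16 joints x 3 coordinates per pose frame,
# 4 emotion classes (happy, sad, angry, neutral).
N_JOINTS, HIDDEN, N_CLASSES = 16, 32, 4
INPUT_DIM = N_JOINTS * 3

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Single LSTM cell with the four gates stacked into one weight matrix."""
    def __init__(self, input_dim, hidden_dim):
        scale = 1.0 / np.sqrt(hidden_dim)
        self.W = rng.uniform(-scale, scale, (4 * hidden_dim, input_dim + hidden_dim))
        self.b = np.zeros(4 * hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        g = np.tanh(g)
        c_new = f * c + i * g          # update cell state
        h_new = o * np.tanh(c_new)     # emit hidden state
        return h_new, c_new

def classify_gait(pose_sequence, cell, W_out, b_out):
    """Run the LSTM over a gait (sequence of flattened 3D poses) and
    turn the final hidden state into a probability over four emotions."""
    h = np.zeros(cell.hidden_dim)
    c = np.zeros(cell.hidden_dim)
    for pose in pose_sequence:
        h, c = cell.step(pose, h, c)
    logits = W_out @ h + b_out
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()             # softmax

cell = LSTMCell(INPUT_DIM, HIDDEN)
W_out = rng.uniform(-0.1, 0.1, (N_CLASSES, HIDDEN))
b_out = np.zeros(N_CLASSES)

gait = rng.normal(size=(48, INPUT_DIM))  # a made-up 48-frame walking clip
probs = classify_gait(gait, cell, W_out, b_out)
```

In practice the weights would be learned from the labeled emotion datasets rather than drawn at random; the sketch only shows the data flow from pose sequence to emotion distribution.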
Moreover, the team combined these deep features with affective features computed from the gaits using posture and movement cues. The combined features are classified with a random forest classifier, an ensemble machine-learning algorithm. The team showed that its mapping between the combined feature space and the perceived emotional state achieves 80.07% accuracy in identifying the perceived emotions. In addition to classifying discrete emotion categories, the algorithm also predicts the values of perceived valence and arousal from gaits.
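The combination step can be sketched with scikit-learn. The affective features below (posture spread, movement magnitude, movement variability) are illustrative stand-ins for the paper's posture and movement cues, and the "deep features" and labels are synthetic placeholders, not the authors' data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def affective_features(pose_sequence):
    """Toy posture/movement cues summarized over one gait clip."""
    velocities = np.diff(pose_sequence, axis=0)
    return np.array([
        pose_sequence.std(),        # posture spread (proxy for expansiveness)
        np.abs(velocities).mean(),  # average movement magnitude (proxy for speed)
        velocities.std(),           # movement variability
    ])

# Synthetic corpus: one row per walking video. Deep features stand in for
# the LSTM output; gaits are random 48-frame, 16-joint pose sequences.
n_videos, deep_dim = 200, 32
deep_feats = rng.normal(size=(n_videos, deep_dim))
gaits = rng.normal(size=(n_videos, 48, 48))
aff_feats = np.stack([affective_features(g) for g in gaits])

X = np.hstack([deep_feats, aff_feats])        # the combined feature space
y = rng.integers(0, len(EMOTIONS), n_videos)  # placeholder emotion labels

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
pred = clf.predict(X[:5])
```

With real features and labels, accuracy would be measured on a held-out split; the random labels here only exercise the pipeline shape.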
The researchers also presented an “EWalk (Emotion Walk)” dataset that consists of videos of walking individuals with gaits and labeled emotions. To the best of their knowledge, this is the first gait-based model that identifies perceived emotions from videos of walking individuals.