AI identifies human emotion based on walking style

Researchers are working on a new data-driven model and algorithm to identify the perceived emotions of individuals based on their walking styles

Phil Siarri

--

Image: person walking in the woods
“Forest” by Simon Lehmann (Pixabay)

Researchers at the University of North Carolina at Chapel Hill and the University of Maryland at College Park are working on a new data-driven model and algorithm to identify the perceived emotions of individuals based on their walking styles.

Exploiting gait features to classify emotional state

From RGB videos of an individual walking, the team extracted the person's walking gait as a series of 3D poses. The aim was to use these gait features to classify the person's emotional state into one of four categories: happy, sad, angry, or neutral. The researchers' perceived-emotion recognition approach relies on deep features learned via a long short-term memory (LSTM) network trained on labeled emotion datasets.

A representation of the novel algorithm that identifies the perceived emotions of individuals based on their walking styles. Given an RGB video of an individual walking (top), the researchers extracted the person's walking gait as a series of 3D poses (bottom). They used a combination of deep features learned via an LSTM and affective features computed from posture and movement cues, then classified the result into basic emotions (e.g., happy, sad) using a Random Forest classifier. Credit: Randhavane et al., Fair Use.
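The article does not spell out the network's exact architecture, but the core idea, feeding a sequence of 3D poses to an LSTM and keeping its final hidden state as a learned "deep" gait feature, can be sketched roughly as follows (a minimal illustration in PyTorch; the 16-joint skeleton, the layer sizes, and the `GaitLSTM` name are assumptions for illustration, not the authors' implementation):

```python
import torch
import torch.nn as nn

class GaitLSTM(nn.Module):
    """Toy LSTM that turns a sequence of 3D poses into a deep feature vector.

    Assumes each frame is a flattened skeleton of 16 joints x 3 coordinates;
    the real model's architecture and sizes are not described in this article.
    """

    def __init__(self, num_joints=16, hidden_size=128, feature_size=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=num_joints * 3,
                            hidden_size=hidden_size,
                            batch_first=True)
        self.proj = nn.Linear(hidden_size, feature_size)

    def forward(self, poses):
        # poses: (batch, frames, num_joints * 3), one flattened 3D pose per frame
        _, (h_n, _) = self.lstm(poses)       # h_n: (1, batch, hidden_size)
        return self.proj(h_n.squeeze(0))     # deep gait features: (batch, feature_size)

# Example: a batch of 2 walking clips, 75 frames each, 16 joints per frame.
dummy_poses = torch.randn(2, 75, 16 * 3)
deep_features = GaitLSTM()(dummy_poses)
print(deep_features.shape)  # torch.Size([2, 64])
```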

Moreover, the team combined these deep features with affective features computed from the gaits using posture and movement cues. The combined features are classified using a Random Forest classifier (an ensemble learning algorithm). The team showed that its mapping between the combined feature space and the perceived emotional state achieves 80.07% accuracy in identifying the perceived emotions. In addition to classifying discrete categories of emotions, the algorithm also predicts the values of perceived valence and arousal from gaits.
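As a rough sketch of that last step, combining the learned deep features with hand-crafted affective features and mapping them to an emotion label with a Random Forest might look like this (using scikit-learn on synthetic placeholder data; the feature dimensions, the split, and the training procedure are assumptions, not the paper's exact pipeline):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder features: deep features from the LSTM plus hand-crafted
# affective features (posture and movement cues). Sizes are assumptions.
n_gaits = 248
deep_features = rng.normal(size=(n_gaits, 64))
affective_features = rng.normal(size=(n_gaits, 29))
combined = np.hstack([deep_features, affective_features])

# Perceived-emotion labels from the user study: happy, sad, angry, neutral.
labels = rng.integers(0, 4, size=n_gaits)

X_train, X_test, y_train, y_test = train_test_split(
    combined, labels, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
```

Random Forests are a common choice for modest-sized, mixed feature sets like this because they handle heterogeneous features without rescaling and are relatively robust to overfitting.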

A visualization of the motion-captured gaits of four individuals with their classified emotion labels. Videos of 248 motion-captured gaits were shown to participants in a web-based user study to generate labels; the researchers used that data for training and validation. Credit: Randhavane et al., Fair Use.

The researchers also presented an “EWalk (Emotion Walk)” dataset that consists of videos of walking individuals with gaits and labeled emotions. To the best of their knowledge, this is the first gait-based model that identifies perceived emotions from videos of walking individuals.

