Conference article

An Approach for Emotion Recognition using Purely Segment-Level Acoustic Features

Hao Zhang
School of Engineering, The University of Tokyo, Japan

Shin’ichi Warisawa
School of Engineering, The University of Tokyo, Japan

Ichiro Yamada
School of Engineering, The University of Tokyo, Japan


Published in: KEER2014, Proceedings of the 5th Kansei Engineering and Emotion Research International Conference, Linköping, Sweden, June 11-13, 2014

Linköping Electronic Conference Proceedings 100:4, pp. 39-49


Published: 2014-06-11

ISBN: 978-91-7519-276-5

ISSN: 1650-3686 (print), 1650-3740 (online)

Abstract

This paper proposes a purely segment-level approach to emotion recognition that entirely abandons utterance-level features and instead focuses on extracting emotional information from a number of selected segments within each utterance. We designed two segment selection approaches, miSATIR and crSATIR, based on information theory and on correlation coefficients respectively, to choose the utterance segments from which features are extracted, realizing the purely segment-level concept of the model. After determining an appropriate time interval for the segments, we built a recognition model using only the speech frames from these selected segments. Tests on a 50-person emotional speech database designed specifically for this research showed a significant improvement in average accuracy (more than 20%) over existing approaches that use information from the entire utterance. Results on speech signals elicited with the International Affective Picture System (IAPS) database further showed that the proposed method can be used for emotion strength analysis.
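
As a rough illustration of the idea summarized above, the following Python sketch shows one plausible way to rank utterance segments by how informative their acoustic features are about the emotion label, once by a mutual-information criterion and once by correlation coefficients, and then keep only the top-scoring segments. The function names, scoring formulas, and the top_k parameter are assumptions made for illustration; the paper's actual miSATIR and crSATIR criteria are defined in the full text.

    # Illustrative sketch only (not the authors' code): segment selection by a
    # mutual-information score and by a correlation-coefficient score, followed
    # by purely segment-level feature selection.
    import numpy as np
    from sklearn.feature_selection import mutual_info_classif

    def score_segments_mi(X_segments, y_segments):
        """Mutual-information-style score for each segment.

        X_segments : (n_segments, n_features) per-segment acoustic features
                     (e.g. pitch, energy, MFCC statistics).
        y_segments : (n_segments,) emotion label of the parent utterance,
                     repeated for each of its segments.
        """
        # MI of each acoustic feature with the emotion label (per feature).
        mi_per_feature = mutual_info_classif(X_segments, y_segments, random_state=0)
        # Score a segment by its z-scored feature magnitudes, weighted by
        # how informative each feature is.
        Xz = (X_segments - X_segments.mean(0)) / (X_segments.std(0) + 1e-8)
        return np.abs(Xz) @ mi_per_feature

    def score_segments_corr(X_segments, y_numeric):
        """Correlation-coefficient-style score, using a numeric emotion code
        (e.g. an arousal or strength level) instead of a class label."""
        Xz = (X_segments - X_segments.mean(0)) / (X_segments.std(0) + 1e-8)
        yz = (y_numeric - y_numeric.mean()) / (y_numeric.std() + 1e-8)
        r = Xz.T @ yz / len(yz)          # per-feature Pearson correlation
        return np.abs(Xz) @ np.abs(r)

    def select_top_segments(X_segments, scores, top_k=5):
        """Keep only the top_k highest-scoring segments of an utterance."""
        idx = np.argsort(scores)[::-1][:top_k]
        return X_segments[idx], idx

In such a pipeline, the classifier would then be trained and evaluated on the retained segment-level feature vectors only, with no utterance-level statistics, which is the sense in which the approach is "purely segment-level".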


References

No references available
