[PDF] Using speech acoustics to drive facial motion

H Yehia, T Kuratate… - Proc. the 14th International Congress of Phonetic …, 1999 - internationalphoneticassociation.org
Abstract
This paper describes and evaluates a method to estimate facial motion during speech from the speech acoustics. It is a statistical method based on simultaneous measurements of facial motion and speech acoustics. Experiments were carried out for one American English and one Japanese speaker. Facial motion is characterized by the 3D positions of markers placed on the face and tracked at 60 frames/s. The speech acoustics is characterized by LSP parameters. The method rests on two points: (i) using appropriate constraints, the vocal-tract shape can be estimated from the speech acoustics; and (ii) most facial motion is a consequence of vocal-tract motion. Marker positions and LSP parameters were collected during several utterances and used to train artificial neural networks, which were then evaluated with test data. In the results obtained, approximately 85% of the facial-motion variance was determined from the speech acoustics.
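The mapping the abstract describes (LSP acoustic parameters in, 3D marker positions out, learned by an artificial neural network and scored by variance explained on held-out data) can be illustrated with a toy sketch. This is not the authors' implementation: the frame counts, LSP dimension, marker count, network size, and the synthetic data are all assumptions made for illustration.

```python
# Toy sketch (NOT the paper's implementation): learn a mapping from
# LSP-like acoustic frames to 3D facial-marker coordinates with a small
# feedforward network, then report variance explained on held-out frames.
import numpy as np

rng = np.random.default_rng(0)

N_FRAMES, N_LSP, N_MARKERS = 600, 10, 6   # hypothetical sizes, not from the paper
D_OUT = 3 * N_MARKERS                     # x, y, z per marker

# Synthetic stand-in data: marker motion as a smooth function of the acoustics.
W_true = rng.normal(scale=0.3, size=(N_LSP, D_OUT))
X = rng.normal(size=(N_FRAMES, N_LSP))                      # "LSP" frames
Y = np.tanh(X @ W_true) + 0.05 * rng.normal(size=(N_FRAMES, D_OUT))

# Train on most frames, hold out the rest (the paper trains on some
# utterances and evaluates on unseen test data).
X_tr, X_te, Y_tr, Y_te = X[:500], X[500:], Y[:500], Y[500:]

# One-hidden-layer network trained with full-batch gradient descent on MSE.
H, lr = 32, 0.05
W1 = rng.normal(scale=0.3, size=(N_LSP, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.3, size=(H, D_OUT)); b2 = np.zeros(D_OUT)
for _ in range(3000):
    A = np.tanh(X_tr @ W1 + b1)           # hidden activations
    P = A @ W2 + b2                       # predicted marker coordinates
    G = 2.0 * (P - Y_tr) / len(X_tr)      # dMSE/dP
    gW2, gb2 = A.T @ G, G.sum(0)
    GA = (G @ W2.T) * (1.0 - A**2)        # backprop through tanh
    gW1, gb1 = X_tr.T @ GA, GA.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# Fraction of facial-motion variance recovered on held-out frames
# (the paper reports roughly 85% for real speech data).
P_te = np.tanh(X_te @ W1 + b1) @ W2 + b2
var_explained = 1.0 - np.sum((Y_te - P_te)**2) / np.sum((Y_te - Y_te.mean(0))**2)
print(f"held-out variance explained: {var_explained:.2f}")
```

On real data the inputs would be LSP vectors extracted per analysis frame and the outputs the tracked 60 frames/s marker trajectories; the variance-explained score is the same evaluation the abstract quotes.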