
The recognition and simulation of conversational facial expressions at various levels of photorealism

Subject Classification: Security and Dependability, Operating, Communication and Distributed Systems
Funding: from 2006 to 2009
Project Identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 18510149

Report Year: 2010

Summary of Project Results

The research project used a unique combination of psychophysical and computer graphics methodologies to develop a systematic description of the facial motions that are necessary and sufficient to generate realistic, natural-looking, conversational facial animations. The project consisted of two main lines of research. In the first, subtle manipulations of both computer-generated and real video recordings of facial expressions determined the perceptual components of facial expressions. The second line examined the generalizability of these components to different levels of realism, from highly abstracted faces (e.g., cartoon faces) through detailed-but-stylized faces (e.g., medical illustration) to highly realistic computer animations.

During the first two years the project focused on refining the techniques and methodology, as well as producing initial insights into facial expression components. During the final year, the project was integrated more closely with the other sub-projects in the cluster grant. All tasks planned for the project have been completed, and the results have either been submitted or are in preparation for publication.

Furthermore, a new, unified research methodology was developed in collaboration with Dr. Fleming. This methodology - a combination of semantic differentials, similarity ratings, factor analysis, and multidimensional scaling - allows one to quantitatively determine the mapping between the perceptual, semantic, and aesthetic nature of a scene, object, or concept on the one hand and the physical dimensions and algorithmic parameters used to create that item on the other. The methodology was validated and published using material properties. In the final phase of the project, the unified methodology was extended and applied to determine the fundamental perceptual and semantic space of facial expressions, as well as to provide insights into the specific characteristics necessary for recognizable facial expressions.
Future work will focus on producing exact mathematical characterizations of the dynamic properties of facial expressions (e.g., parametric descriptions of the motion trajectories of critical facial features); a new DFG project focusing on this is about to begin. Applications to the encoding of facial animation as well as to improved animation generation techniques are planned.
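The multidimensional-scaling step of the unified methodology can be illustrated with a minimal sketch. This is not the project's actual code: it uses classical (Torgerson) multidimensional scaling implemented with NumPy, applied to a purely illustrative dissimilarity matrix of the kind that pairwise similarity ratings of facial expressions would yield; the expression labels and values are hypothetical.

```python
# Sketch only: classical multidimensional scaling (Torgerson's method)
# on a toy dissimilarity matrix, as might be derived from similarity ratings.
import numpy as np

def classical_mds(D, k=2):
    """Embed n items into k dimensions from a symmetric n x n dissimilarity matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered squared distances
    vals, vecs = np.linalg.eigh(B)           # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:k]         # keep the k largest
    scale = np.sqrt(np.clip(vals[idx], 0.0, None))  # guard tiny negative values
    return vecs[:, idx] * scale              # n x k coordinates

# Hypothetical dissimilarities among four expressions
# (e.g., "happy", "sad", "agree", "disagree"); values are illustrative only.
D = np.array([[0.0, 0.9, 0.4, 0.8],
              [0.9, 0.0, 0.8, 0.3],
              [0.4, 0.8, 0.0, 0.7],
              [0.8, 0.3, 0.7, 0.0]])

X = classical_mds(D, k=2)
# Distances between rows of X approximate the original dissimilarities,
# giving a low-dimensional "perceptual space" for the rated items.
print(X.shape)
```

The recovered coordinates can then be related to the physical or algorithmic parameters used to generate the stimuli, which is the mapping the methodology quantifies.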

Project-Related Publications (Selection)

  • (2006). Processing of identity and emotion in faces: a psychophysical, physiological and computational perspective. Progress in Brain Research, 156, 321-343
    A. Schwaninger, C. Wallraven, D. W. Cunningham and S. Chiller-Glaus
  • (2006). The Evaluation of Stylized Facial Expressions. Proceedings of the 3rd Symposium on Applied Perception in Graphics and Visualization (APGV06), 85-92. (Eds.) Spencer, S. N. ACM Press, New York, NY, USA
    C. Wallraven, J. Fischer, D. W. Cunningham, D. Bartz and H. H. Bülthoff
  • (2007). Perceptual reparameterization of material properties. Proceedings of the International Symposium on Computational Aesthetics, pp. 89-96
    D. W. Cunningham, R. W. Fleming, C. Wallraven, and W. Strasser
  • (2007). Psychophysical investigation of facial expressions using computer animated faces. Proceedings of the 4th Symposium on Applied Perception in Graphics and Visualization (APGV'07), 11-18, ACM Press, New York, NY, USA
    R. T. Griesser, D. W. Cunningham, C. Wallraven and H. H. Bülthoff
  • (2007). The Evaluation of Real-World and Computer-Generated Stylized Facial Expressions. ACM Transactions on Applied Perception, 4(3), 1-24
    C. Wallraven, J. Fischer, D. W. Cunningham, D. Bartz and H. H. Bülthoff
  • (2008). The contribution of different facial regions to the recognition of conversational expressions. Journal of Vision, 8, 1-23
    M. Nusseck, D. W. Cunningham, C. Wallraven, and H. H. Bülthoff
  • (2008). Evaluating the Perceptual Realism of Animated Facial Expressions. ACM Transactions on Applied Perception, 4, 1-20
    C. Wallraven, M. Breidt, D. W. Cunningham and H. H. Bülthoff
  • (2008). State-of-the-Art of the Role of Perception for Computer Graphics. Proceedings of the 29th Annual Conference Eurographics (EG 2008), 65-86. (Eds.) Brown, P., Blackwell, Oxford, United Kingdom
    D. Bartz, D. W. Cunningham, J. Fischer and C. Wallraven
  • (2009). Dynamic information for the recognition of conversational expressions. Journal of Vision, 9, 1-17
    D. W. Cunningham and C. Wallraven
  • (2009). Going beyond universal expressions: investigating the visual perception of dynamic facial expressions. Perception 38(ECVP Abstract Supplement), 83
    K.C. Kaulard, C. Wallraven, D. W. Cunningham and H. H. Bülthoff
  • (2009). Motion and form interact in expression recognition: Insights from computer animated faces. Perception 38(ECVP Abstract Supplement), 163
    D.W. Cunningham, and C. Wallraven
  • (2009). The interaction between motion and form in expression recognition. Proceedings of the 6th Symposium on Applied Perception in Graphics and Visualization (APGV2009), 41-44. (Eds.) Mania, K., B. E. Riecke, S. N. Spencer, B. Bodenheimer, C. O'Sullivan, ACM Press, New York, NY, USA
    D.W. Cunningham and C. Wallraven
