
Video-realistic facial animation with natural facial expressions for interactive services

Subject area: Electronic semiconductors, components and circuits, integrated systems, sensor technology, theoretical electrical engineering
Funding: Funded from 2009 to 2014
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 158317137

Year of creation: 2014

Summary of project results

This project presents an image-based talking head that provides a higher level of realism than 3D model-based talking heads. We first improved the image-based reference talking head system. Based on the optimized system, the image-based talking head was then extended with facial expression synthesis and head motion synthesis, making the talking head even more realistic and lifelike. In this DFG project, we thus improved the basic facial animation system and extended it with realistic facial expressions and flexible head motion. Combined with a dialog system, the developed expressive talking head, integrated with eye animation, opens up a wide range of applications such as web-based customer service, e-education, and e-care. The expressive talking head provides a highly personalized and believable interface for human-machine communication, and such intuitive and efficient interfaces are expected to be used extensively in the near future.

The sources of the expressive database and all animations produced in the project are listed below. The animations and recorded videos can be found on our web server.

• Basic animations: http://www.tnt.uni-hannover.de/project/facialanimation/demo/mouth
• Facial expression synthesis: http://www.tnt.uni-hannover.de/project/facialanimation/demo/emotion
• Head motion synthesis: http://www.tnt.uni-hannover.de/project/facialanimation/demo/headmotion
• Expressive database, made available to the facial animation community for scientific purposes (recorded clips): http://www.tnt.uni-hannover.de/project/facialanimation/demo/database
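The project's actual synthesis methods are documented in the cited publications. Purely as a rough, non-authoritative illustration of the general unit-selection idea often used in image-based visual speech synthesis, the following Python sketch greedily picks one recorded mouth image per target viseme by trading off a label mismatch (target) cost against a visual smoothness (concatenation) cost. All names, the cost weights, and the greedy search are illustrative assumptions, not the project's implementation; a full system would search over all candidates (e.g. with dynamic programming) and blend the selected mouth images into a background video.

    # Illustrative sketch only: greedy unit selection of mouth images for a
    # sequence of target visemes. Names and weights are assumptions.
    from dataclasses import dataclass

    @dataclass
    class MouthSample:
        viseme: str          # viseme label of the recorded mouth image
        features: tuple      # appearance descriptor, e.g. PCA coefficients

    def target_cost(sample: MouthSample, viseme: str) -> float:
        # Penalize a mismatch between the requested viseme and the sample label.
        return 0.0 if sample.viseme == viseme else 1.0

    def concat_cost(a: MouthSample, b: MouthSample) -> float:
        # Penalize visual jumps between consecutive frames (Euclidean distance
        # in the appearance feature space).
        return sum((x - y) ** 2 for x, y in zip(a.features, b.features)) ** 0.5

    def select_mouth_sequence(database, target_visemes, w_target=1.0, w_concat=0.5):
        """Greedily select one mouth image per target viseme."""
        sequence, previous = [], None
        for viseme in target_visemes:
            best = min(
                database,
                key=lambda s: w_target * target_cost(s, viseme)
                              + (w_concat * concat_cost(previous, s) if previous else 0.0),
            )
            sequence.append(best)
            previous = best
        return sequence

    if __name__ == "__main__":
        db = [MouthSample("A", (0.1, 0.2)), MouthSample("O", (0.8, 0.1)),
              MouthSample("M", (0.0, 0.0)), MouthSample("A", (0.15, 0.25))]
        print([s.viseme for s in select_mouth_sequence(db, ["M", "A", "O"])])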

Project-related publications (selection)

  • “Image-based talking head: Analysis and synthesis,” in DAGA 2010, 36th International Conference on Acoustics, Mar. 2010, pp. 87–88
    K. Liu and J. Ostermann
  • “Verfahren, Einrichtung und Computerprogramm zur Erzeugung einer fotorealistischen Gesichtsanimation”, 2011. DE 10 2011 107 295 A1 2013.01.10
    K. Liu and J. Ostermann
  • “Evaluation of an image-based talking head with realistic facial expression and head motion,” in Proceedings of CASA (Computer Animation and Social Agents) workshop on Emotion-based Interaction, May 2011
    K. Liu and J. Ostermann
  • “Evaluation of an Image-based Talking Head with Realistic Facial Expression and Head Motion”, Journal on Multimodal User Interfaces, Special issue: Emotion-based Interaction, Springer Verlag, October 2011
    K. Liu and J. Ostermann
  • “Realistic and Expressive Talking Head: Implementation and Evaluation”, Dissertation, Leibniz Universität Hannover, June 2011
    K. Liu
  • “Realistic facial expression synthesis for an image-based talking head,” in IEEE Conference on Multimedia and Expo, ICME2011, Jul. 2011, p. 6
    K. Liu and J. Ostermann
  • “Realistic head motion synthesis for an image-based talking head,” in FG 2011, The 9th IEEE Conference on Automatic Face and Gesture Recognition, Mar. 2011, p. 6
    K. Liu and J. Ostermann
  • “Performance of image registration and its extensions for interpolation of facial motion,” in PSIVT 2013 Workshops, Oct. 2013, pp. 216–227
    S. Graßhof and J. Ostermann
 
 
