Research Output
Real-time Facial Animation for 3D Stylized Character with Emotion Dynamics
  Our aim is to improve the efficiency and effectiveness of animation production techniques. We present two real-time solutions that drive character expressions in a geometrically consistent and perceptually valid way. Our first solution combines keyframe animation techniques with machine learning models: we propose a 3D emotion transfer network that uses a 2D human image to generate stylized 3D rig parameters. Our second solution combines blendshape-based motion capture animation techniques with machine learning models: we propose a blendshape adaptation network that generates character rig parameter motions with geometric consistency and temporal stability. We demonstrate the effectiveness of our system by comparing it to a commercial product, Faceware. Results reveal that ratings of the recognition, intensity, and attractiveness of expressions depicted for animated characters via our systems are statistically higher than those for Faceware. Our results may be integrated into the animation pipeline, helping animators create expressions more rapidly and precisely.

  • Date:

    27 October 2023

  • Publication Status:

    Published

  • DOI:

    10.1145/3581783.3613803

  • Funders:

    EC European Commission; National Natural Science Foundation of China

Citation

Pan, Y., Zhang, R., Wang, J., Ding, Y., & Mitchell, K. (2023). Real-time Facial Animation for 3D Stylized Character with Emotion Dynamics. In MM '23: Proceedings of the 31st ACM International Conference on Multimedia (pp. 6851-6859). https://doi.org/10.1145/3581783.3613803

Keywords

motion capture, virtual characters, emotion