Character animation technology has many applications. Characters in games need to move naturally and interact with the game world. Realistic characters are also often required in virtual training environments, such as job interview training systems or crowd and crisis management training. Animation technology is also important in the medical domain, where it can be used to measure various aspects of motion, such as balance, range of motion of limbs, and the usage of the muscles that produce the motion. Our research in character animation develops new technologies for interpreting motion, for animating virtual characters while taking their physical surroundings into account, and for applying animation technology in applied games.
The first line of research focuses on automatically generating motion for virtual characters. Most games and simulations use a database of motion capture clips that determines what kinds of movements characters can make. Although motion capture recordings look natural because they are based on real human motion, they have several limitations. For one, once a motion has been recorded, it is not trivial to edit it afterwards. Also, covering the wide range of motions that humans can perform generally requires huge databases to obtain a variety of natural-looking behaviour. Our goal is to develop animation systems that handle a variety of motion capture data in a smart way, leading to new animation tools that allow even non-expert animators to produce plausible motion with minimal effort.
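To illustrate the kind of operation such animation systems build on, a common building block is blending between recorded poses: interpolating each joint's rotation between two motion capture frames. The sketch below is a minimal illustration (not the group's actual software); the pose representation and joint names are assumptions for the example.

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    # Take the shorter arc: flip one quaternion if the dot product is negative.
    if dot < 0.0:
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:
        # Nearly parallel: fall back to normalized linear interpolation.
        out = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        norm = math.sqrt(sum(c * c for c in out))
        return tuple(c / norm for c in out)
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def blend_poses(pose_a, pose_b, t):
    """Blend two skeleton poses (dicts: joint name -> quaternion), joint by joint."""
    return {joint: slerp(pose_a[joint], pose_b[joint], t) for joint in pose_a}

# Toy example: identity vs. a 90-degree elbow rotation about the x-axis.
pose_a = {"elbow": (1.0, 0.0, 0.0, 0.0)}
pose_b = {"elbow": (math.cos(math.pi / 4), math.sin(math.pi / 4), 0.0, 0.0)}
halfway = blend_poses(pose_a, pose_b, 0.5)  # roughly a 45-degree rotation
```

Even this simple blend shows why editing recorded motion is non-trivial: interpolated poses can violate physical constraints such as foot contact, which is one reason smarter, context-aware tools are needed.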
Motion synthesis technology provides a large number of techniques to realistically animate virtual characters. We also believe in the potential of anatomy-based, character-specific animation. This requires a deeper understanding and more accurate modelling of the origins of human motion. For instance, alterations in the underlying anatomical model, such as musculoskeletal injuries or muscle fatigue, strongly influence the motion of virtual characters. In our second line of research, we therefore develop technologies that merge physics-based animation, soft-body deformation, and musculoskeletal simulation. In particular, we are interested in applying these technologies in real-time virtual environments.
Finally, we are interested not only in simulating the body motion of virtual characters, but also in other ways in which the anatomy and biological processes of the human body influence the motion and appearance of characters. For instance, we focus on automatically generating biology-driven human skin textures, and on simulating emotional phenomena such as crying or blushing.
Our research is currently supported by the EU Horizon 2020 RAGE project (http://rageproject.eu/, grant no. 644187), the Game Research Focus Area Seed Funding, and the COMMIT project. The software we have built can be used in other projects; please contact Arjan Egges, Frank van der Stappen, or Zerrin Yumak for further information.
|Forough Madehkhaksar and Arjan Egges: Effect of dual task type on gait and dynamic stability during stair negotiation at different inclinations. Gait & Posture 43: 114-119 (2016).|
|Zhiping Luo, Remco Veltkamp, and Arjan Egges: As-Rigid-As-Possible Character Deformation Using Point Handles. 11th International Symposium on Visual Computing, ISVC 2015, Las Vegas, NV, USA. Springer International Publishing, Switzerland (2015).|
|Thomas Geijtenbeek, Michiel van de Panne, and Frank van der Stappen: Flexible muscle-based locomotion for bipedal creatures. ACM Transactions on Graphics 32(6): 206:1-206:11 (2013).|
|Sybren Stüvel, Nadia Magnenat-Thalmann, Daniel Thalmann, Arjan Egges, and Frank van der Stappen: Hierarchical structures for collision checking between virtual characters. Computer Animation and Virtual Worlds 25(3-4): 333-342 (2014).|
|Ben van Basten and Arjan Egges: Motion Transplantation Techniques: A Survey. IEEE Computer Graphics and Applications 32(3): 16-23 (2012).|