
Motion Synthesis for Virtual Characters

Seminar an der Universität des Saarlandes, Fachrichtung Informatik, LSF 116960

Below is a tentative list of topics, which may be extended over the coming weeks. Do not hesitate to contact us if the list does not match your interests; we will be glad to discuss whether your area of interest could be included.


Topic 1: Data-driven synthesis based on Motion Graphs

Reference:

Kovar, Lucas, Michael Gleicher, and Frédéric Pighin. "Motion graphs." ACM SIGGRAPH 2008 classes. ACM, 2008.

Background:

Arikan, Okan, David A. Forsyth, and James F. O'Brien. "Motion synthesis from annotations." ACM Transactions on Graphics (TOG). Vol. 22. No. 3. ACM, 2003.

Lee, Jehee, et al. "Interactive control of avatars animated with human motion data." ACM Transactions on Graphics (ToG). Vol. 21. No. 3. ACM, 2002.

Sung, Mankyu, Lucas Kovar, and Michael Gleicher. "Fast and accurate goal-directed motion synthesis for crowds." Proceedings of the 2005 ACM SIGGRAPH/Eurographics symposium on Computer animation. ACM, 2005.

Topic 2: Data-driven synthesis based on Motion Fields

Reference:

Lee, Yongjoon, et al. "Motion fields for interactive character locomotion." ACM Transactions on Graphics (TOG). Vol. 29. No. 6. ACM, 2010.

Background:

Arikan, Okan, David A. Forsyth, and James F. O'Brien. "Motion synthesis from annotations." ACM Transactions on Graphics (TOG). Vol. 22. No. 3. ACM, 2003.

Lee, Jehee, et al. "Interactive control of avatars animated with human motion data." ACM Transactions on Graphics (ToG). Vol. 21. No. 3. ACM, 2002.

Topic 3: Data-driven approaches based on Statistical Models

Reference:

Min, Jianyuan, and Jinxiang Chai. "Motion graphs++: a compact generative model for semantic motion analysis and synthesis." ACM Transactions on Graphics (TOG) 31.6 (2012): 153.


Background:

Min, Jianyuan, Yen-Lin Chen, and Jinxiang Chai. "Interactive generation of human animation with deformable motion models." ACM Transactions on Graphics (TOG) 29.1 (2009): 9.

Topic 4: Data-driven approaches based on Convolutional Neural Networks

Reference:

Holden, Daniel, Jun Saito, and Taku Komura. "A deep learning framework for character motion synthesis and editing." ACM Transactions on Graphics (TOG) 35.4 (2016): 138.


Background:

Holden, Daniel, Taku Komura, and Jun Saito. "Phase-functioned neural networks for character control." ACM Transactions on Graphics (TOG) 36.4 (2017): 42.

Topic 5: Interactive data-driven synthesis based on Fully Connected Neural Networks

Reference:

Holden, Daniel, Taku Komura, and Jun Saito. "Phase-functioned neural networks for character control." ACM Transactions on Graphics (TOG) 36.4 (2017): 42.


Background:

Lee, Kyungho, Seyoung Lee, and Jehee Lee. "Interactive character animation by learning multi-objective control." SIGGRAPH Asia 2018 Technical Papers. ACM, 2018.

Zhang, He, et al. "Mode-adaptive neural networks for quadruped motion control." ACM Transactions on Graphics (TOG) 37.4 (2018): 145.

Topic 6: Motion synthesis based on generative Neural Networks

Reference:

Taylor, Graham W., and Geoffrey E. Hinton. "Factored conditional restricted Boltzmann machines for modeling motion style." Proceedings of the 26th annual international conference on machine learning. ACM, 2009.

Heydari, Muhamad Javad, and Saeed Shiry Ghidary. "Cross-modal motion regeneration using Multimodal Deep Belief Network." Journal of Visual Communication and Image Representation 58 (2019): 245-260.

Topic 7: Classical procedural motion synthesis techniques

Reference:

Coros, Stelian, Philippe Beaudoin, and Michiel Van de Panne. "Generalized biped walking control." ACM Transactions on Graphics (TOG). Vol. 29. No. 4. ACM, 2010.

Tsai, Yao-Yang, et al. "Real-time physics-based 3D biped character animation using an inverted pendulum model." IEEE Transactions on Visualization and Computer Graphics 16.2 (2010): 325-337.

Background:

Yin, KangKang, Kevin Loken, and Michiel Van de Panne. "Simbicon: Simple biped locomotion control." ACM Transactions on Graphics (TOG). Vol. 26. No. 3. ACM, 2007.

Topic 8: Training locomotion controllers using reinforcement learning

Reference:

Xue Bin Peng, Glen Berseth, Michiel van de Panne (2016): "Terrain-Adaptive Locomotion Skills Using Deep Reinforcement Learning". ACM Transactions on Graphics (TOG) [Proc. SIGGRAPH 2016].

Yu, Wenhao, Greg Turk, and C. Karen Liu. "Learning symmetric and low-energy locomotion." ACM Transactions on Graphics (TOG) 37.4 (2018): 144.


Background:

Xue Bin Peng, Glen Berseth, Michiel van de Panne (2015): "Dynamic Terrain Traversal Skills Using Reinforcement Learning". ACM Transactions on Graphics (TOG) [Proc. SIGGRAPH 2015].

Peng, Xue Bin, et al. "DeepLoco: Dynamic locomotion skills using hierarchical deep reinforcement learning." ACM Transactions on Graphics (TOG) 36.4 (2017): 41.

Xue Bin Peng, Michiel van de Panne (2017): "Learning Locomotion Skills Using DeepRL: Does the Choice of Action Space Matter?". ACM SIGGRAPH / Eurographics Symposium on Computer Animation 2017.

Topic 9: Training physical model controllers using sampling-based optimization

Reference:

Liu, Libin, Michiel Van De Panne, and KangKang Yin. "Guided learning of control graphs for physics-based characters." ACM Transactions on Graphics (TOG) 35.3 (2016): 29.

Rajamäki, Joose, and Perttu Hämäläinen. "Augmenting sampling based controllers with machine learning." Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation. ACM, 2017.

Topic 10: Imitating motion capture data with a physical model using reinforcement learning

Reference:

Peng, Xue Bin, et al. "DeepMimic: Example-Guided Deep Reinforcement Learning of Physics-Based Character Skills." arXiv preprint arXiv:1804.02717 (2018).

Merel, Josh, et al. "Learning human behaviors from motion capture by adversarial imitation." arXiv preprint arXiv:1707.02201 (2017).


Background:

John Schulman, Filip Wolski, Prafulla Dhariwal, Alec Radford, Oleg Klimov: "Proximal Policy Optimization Algorithms". arXiv:1707.06347 (2017)

Topic 11: Motion editing using space-time constraints

Reference:

Schulz, Christian, Christoph von Tycowicz, Hans-Peter Seidel, and Klaus Hildebrandt. "Animating articulated characters using wiggly splines." Proceedings of the 14th ACM SIGGRAPH/Eurographics Symposium on Computer Animation. ACM, 2015: 101-109.

Gleicher, Michael. "Retargeting motion to new characters." Proceedings of the 25th annual conference on Computer graphics and interactive techniques. ACM, 1998.

Popović, Zoran, and Andrew Witkin. "Physically based motion transformation." Proceedings of the 26th annual conference on Computer graphics and interactive techniques. ACM Press/Addison-Wesley Publishing Co., 1999.

Topic 12: Motion synthesis as a non-linear optimization problem

Reference:

De Lasa, Martin, Igor Mordatch, and Aaron Hertzmann. "Feature-based locomotion controllers." ACM Transactions on Graphics (TOG). Vol. 29. No. 4. ACM, 2010.

Muico, Uldarico, et al. "Contact-aware nonlinear control of dynamic characters." ACM Transactions on Graphics (TOG). Vol. 28. No. 3. ACM, 2009.

Topic 13: Motion Editing using Neural Networks

Reference:

Mason, Ian, et al. "Few-shot Learning of Homogeneous Human Locomotion Styles." Computer Graphics Forum. Vol. 37. No. 7. 2018.

Villegas, Ruben, et al. "Neural kinematic networks for unsupervised motion retargetting." IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Vol. 3. 2018.

Jang, Hanyoung, et al. "A Deep Learning Approach for Motion Retargeting." (2018).

Topic 14: Motion Style Transfer

Reference:

Holden, Daniel, Ikhsanul Habibie, Ikuo Kusajima, and Taku Komura. "Fast Neural Style Transfer for Motion Data." IEEE Computer Graphics and Applications 37.4 (2017): 42-49.


Background:

Hsu, Eugene, Kari Pulli, and Jovan Popović. "Style translation for human motion." ACM Transactions on Graphics (TOG). Vol. 24. ACM, 2005: 1082-1089.

Xia, Shihong, Congyi Wang, Jinxiang Chai, and Jessica Hodgins. "Realtime style transfer for unlabeled heterogeneous human motion." ACM Transactions on Graphics (TOG) 34.4 (2015): 119.

Topic 15: Biologically inspired procedural motion synthesis

Reference:

Wang, Jack M., Samuel R. Hamner, Scott L. Delp, and Vladlen Koltun. "Optimizing Locomotion Controllers Using Biologically-Based Actuators and Objectives." ACM Transactions on Graphics (TOG) 31.4 (2012): 25.

Topic 16: Motion Capture

Reference:

Chen, Chen, Roozbeh Jafari, and Nasser Kehtarnavaz. "A survey of depth and inertial sensor fusion for human action recognition." Multimedia Tools and Applications 76.3 (2017): 4405-4425.

Elhayek, Ahmed, et al. "Fully Automatic Multi-person Human Motion Capture for VR Applications." International Conference on Virtual Reality and Augmented Reality. Springer, Cham, 2018.

Topic 17: Motion recognition and segmentation using neural networks

Reference:

Aristidou, Andreas, et al. "Deep motifs and motion signatures." ACM Transactions on Graphics (TOG) 37.6 (2018): 187.

Topic 18: Unsupervised motion capture segmentation

Reference:

F. Zhou, F. de la Torre, and J. K. Hodgins. "Hierarchical Aligned cluster analysis for temporal segmentation of human motion" IEEE Transactions on Pattern Analysis & Machine Intelligence (PAMI), 2010.


Background:

Anna Vögele, Björn Krüger and Reinhard Klein. "Efficient Unsupervised Temporal Segmentation of Human Motion". In Proceedings of ACM Symposium of Computer Animation 2014.

Topic 19: Data-driven synthesis based on Statistical Motion Graphs

Reference:

Lee, Jehee, et al. "Interactive control of avatars animated with human motion data." ACM Transactions on Graphics (ToG). Vol. 21. No. 3. ACM, 2002.


Background:

Arikan, Okan, David A. Forsyth, and James F. O'Brien. "Motion synthesis from annotations." ACM Transactions on Graphics (TOG). Vol. 22. No. 3. ACM, 2003.