Local Motion Phases Technique Boosts Basketball Animation Richness and Realism

Synced
Published in SyncedReview · Jun 17, 2020

Researchers from the University of Edinburgh School of Informatics and video game company Electronic Arts have proposed a novel framework that learns fast and dynamic character interactions. Trained on an unstructured basketball motion capture database, the model can animate multiple contacts between a player and the ball and other players and the environment.

The team’s modular and stable framework for data-driven character animation covers data processing, network training and runtime control, and was developed using Unity, TensorFlow and PyTorch. The approach can perform complex and realistic animations of bipeds or quadrupeds engaged in sports and beyond.

Movement types synthesized by the animation system

Enabling characters to perform a wide variety of dynamic, fast-paced and quickly changing movements is a key challenge in character animation. The new research proposes a deep learning framework that can interactively synthesize such animations in high quality from unstructured motion data, without requiring any manual labelling.

The researchers propose a local motion phases concept that can produce basketball-related motion skills such as dribbling, shooting, catching, avoidance and sharp turns. Character and object interactions are all generated under a unified framework.

Typically, a start and end reference pose is defined in order to align character movements and obtain a single global phase parameter. However, such frameworks can produce mismatched poses when new movements are added, and for some actions it is difficult to define a global phase parameter at all.
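For intuition, a global phase of this kind is a single value per frame that cycles between chosen reference events, such as successive contacts of the same foot. The minimal sketch below (hypothetical helper name and data, not code from the paper) interpolates that value linearly between contact frames, which also hints at why one shared value struggles once different body parts move out of sync or a motion has no clean cycle.

```python
import numpy as np

# Minimal sketch, assuming foot-contact frames have already been detected:
# a global motion phase assigns every frame a single value that runs from
# 0 to 2*pi between consecutive reference events of the whole character.
def global_phase(contact_frames, num_frames):
    phase = np.zeros(num_frames)
    for start, end in zip(contact_frames[:-1], contact_frames[1:]):
        # Linearly interpolate the phase over one cycle of the motion.
        phase[start:end] = np.linspace(0.0, 2.0 * np.pi, end - start, endpoint=False)
    return phase

# Example: left-foot contacts detected at frames 0, 30 and 62 of a 90-frame clip.
phase = global_phase([0, 30, 62], 90)
```

Because every bone shares this one value, the alignment breaks down for movements where the arms, legs and ball follow different rhythms.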

To design and train a neural character controller, the researchers used a large database of one-on-one basketball play in which one player catches and dribbles the ball while avoiding another player who tries to defend and intercept. The resulting controller produces realistic offensive play and switches to defence when ball possession changes hands.

Local motion phases instead align character movements at the bone level in an asynchronous manner. The researchers fit a sinusoidal function to each bone's local signal and use the optimized amplitude and phase parameters to compute the motion phase vectors. The technique works in a fully automatic fashion and does not require the animator to define handcrafted rules for each type of motion.
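To make the idea concrete, the sketch below shows one way such a sinusoid fit can be computed. It assumes a smoothed per-bone contact signal and uses SciPy's curve_fit; the signal, function names and initialization are illustrative assumptions rather than the paper's actual implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def sinusoid(t, amplitude, frequency, shift, offset):
    return amplitude * np.sin(frequency * t + shift) + offset

def local_phase(signal, fps=60.0):
    """Fit a sinusoid to one bone's windowed signal (e.g. a smoothed contact
    indicator); the fitted frequency and shift give a per-frame local phase."""
    t = np.arange(len(signal)) / fps
    # A reasonable initial frequency guess helps the optimizer converge.
    (amplitude, frequency, shift, offset), _ = curve_fit(
        sinusoid, t, signal, p0=[1.0, 2.0 * np.pi, 0.0, 0.5], maxfev=10000)
    phase = (frequency * t + shift) % (2.0 * np.pi)
    return phase, abs(amplitude)

# Example on a synthetic contact-like signal for a single bone.
t = np.arange(120) / 60.0
signal = 0.5 * np.sin(2.0 * np.pi * t + 0.7) + 0.5
phase, amplitude = local_phase(signal)
```

Running this per bone yields asynchronous, bone-level phase values rather than one clock shared by the whole body.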

In experiments, the local motion phase method was compared with the PFNN, MANN and LSTM approaches, and produced better results in terms of body movement quality, foot skating and response time. After training on the basketball motion capture data, the system can synthesize faster and more complex interactions between the character and the ball or another character in real time.
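As an example of how such comparisons are often quantified, the sketch below computes a common foot-skating score: horizontal foot displacement accumulated over frames where the foot is near the ground, weighted by foot height. The threshold and weighting here are generic assumptions for illustration, not necessarily the values used in the paper.

```python
import numpy as np

def foot_skate(foot_positions, height_threshold=0.033, fps=60.0):
    """foot_positions: (num_frames, 3) world positions of one foot, with y up.
    Returns average horizontal skating distance per second on contact frames."""
    pos = np.asarray(foot_positions, dtype=float)
    # Per-frame horizontal (x, z) displacement of the foot.
    horiz_disp = np.linalg.norm(np.diff(pos[:, [0, 2]], axis=0), axis=1)
    height = pos[1:, 1]
    # Frames where the foot is close enough to the ground to count as planted,
    # weighted so that lower foot heights contribute more to the score.
    contact = height < height_threshold
    weight = np.clip(2.0 - 2.0 ** (height / height_threshold), 0.0, 1.0)
    return float((horiz_disp * weight * contact).sum()) * fps / max(int(contact.sum()), 1)

# Example: a foot held 1 cm above the ground while drifting 5 cm sideways.
frames = np.zeros((60, 3))
frames[:, 0] = np.linspace(0.0, 0.05, 60)
frames[:, 1] = 0.01
print(foot_skate(frames))
```

Lower scores indicate less sliding while the foot should be planted, which is one of the qualities where the local motion phase controller compares favourably.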

The researchers say the technique is useful for animating contact-rich, complex interactions for real-time applications such as computer games and can also be applied to VR. The paper and novel animation system have garnered keen interest on the Internet, with a number of people expressing their eagerness to invest in the project.

The paper Local Motion Phases for Learning Multi-Contact Character Movements is on GitHub.

Author: Xuehan Wang | Editor: Michael Sarazen; Fangyu Cai
