Title: Physics-Based Motion Tracking of Contact-Rich Interacting Characters
Authors: Zhang, Xiaotang; Chang, Ziyi; Men, Qianhui; Shum, Hubert P. H.; Masia, Belen; Thies, Justus
Date: 2026-04-17
ISSN: 1467-8659
Handle: https://diglib.eg.org/handle/10.1111/cgf70336
DOI: https://doi.org/10.1111/cgf.70336
License: CC-BY-4.0
Pages: 9
Keywords: animation system; physical simulation; motion tracking
CCS Concepts: Computing methodologies → Physical simulation; Computing methodologies → Motion capture; Computing methodologies → Motion processing

Abstract: Motion tracking has been an important technique for imitating human-like movement from large-scale datasets in physics-based motion synthesis. However, existing approaches track either a single character or a particular type of interaction, limiting their ability to handle contact-rich interactions. Directly extending single-character tracking approaches suffers from instability caused by forces transferred through contacts, and contact-rich interactions require levels of control that place greater demands on model capacity. To this end, we propose a robust tracking method based on progressive neural networks (PNN), in which multiple experts specialize in skills of varying difficulty. Our method automatically assigns training samples to experts without manual scheduling. Both qualitative and quantitative results show that our method delivers more stable motion tracking in densely interactive movements while enabling more efficient model training.
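The abstract's core recipe (progressive experts that specialize by difficulty, with training samples routed automatically to the best-fitting expert) can be illustrated on a toy problem. Everything below is a hypothetical sketch, not the paper's architecture or training procedure: the "experts" are linear regressors, the two "skill tiers" are synthetic linear dynamics, and the routing rule simply assigns each sample to the expert with the lower current error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for two skill tiers of motion data (hypothetical data,
# not the paper's): each tier maps a state x to a target y = x @ D.
X_easy = rng.normal(size=(128, 4))
Y_easy = X_easy @ np.diag([1.0, 0.5, 0.5, 1.0])
X_hard = rng.normal(size=(128, 4))
Y_hard = X_hard @ np.diag([-1.5, 2.0, -0.5, 1.0])

def fit_linear(X, Y, W, steps=300, lr=0.05):
    """Plain gradient descent on mean-squared error."""
    for _ in range(steps):
        W = W - lr * 2.0 * X.T @ (X @ W - Y) / len(X)
    return W

# Expert 1: trained on the easy tier, then frozen (the PNN idea of keeping
# earlier columns fixed while new columns are added).
W1 = fit_linear(X_easy, Y_easy, np.zeros((4, 4)))
W1_frozen = W1.copy()

# Expert 2: trained on the hard tier; in PNN fashion it also receives the
# frozen expert's output as a lateral feature, so its input is [x, x @ W1].
X_aug = np.concatenate([X_hard, X_hard @ W1], axis=1)
W2 = fit_linear(X_aug, Y_hard, np.zeros((8, 4)))

# Automatic assignment: each sample is routed to whichever expert currently
# tracks it with lower error, with no manual curriculum or schedule.
X_all = np.concatenate([X_easy, X_hard])
Y_all = np.concatenate([Y_easy, Y_hard])
err1 = ((X_all @ W1 - Y_all) ** 2).sum(axis=1)
A_all = np.concatenate([X_all, X_all @ W1], axis=1)
err2 = ((A_all @ W2 - Y_all) ** 2).sum(axis=1)
assign = np.where(err1 <= err2, 0, 1)
```

In this sketch the freezing keeps the first expert's easy-tier skill intact while the second expert trains, and the loss-based routing ends up sending easy-tier samples to expert 1 and hard-tier samples to expert 2 without any hand-written schedule.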