Title: Neural Motion Compression with Frequency-adaptive Fourier Feature Network
Authors: Tojo, Kenji; Chen, Yifei; Umetani, Nobuyuki
Editors: Pelechano, Nuria; Vanderhaeghe, David
Date issued: 2022-04-22 (2022)
ISBN: 978-3-03868-169-4
ISSN: 1017-4656
DOI: https://doi.org/10.2312/egs.20221033
Handle: https://diglib.eg.org:443/handle/10.2312/egs20221033
Pages: 61-64 (4 pages)
License: Attribution 4.0 International License
CCS Concepts: Computing methodologies --> Animation; Neural networks

Abstract: We present a neural-network-based compression method to alleviate the storage cost of motion capture data. Human motions, such as locomotion, often consist of periodic movements. We leverage this periodicity by applying Fourier features to a multilayer perceptron network. Our novel algorithm finds a set of Fourier feature frequencies based on the discrete cosine transformation (DCT) of the motion. During training, we incrementally add a dominant frequency of the DCT to the current set of Fourier feature frequencies until a given quality threshold is satisfied. We conducted an experiment on the CMU motion dataset, and the results suggest that our method achieves an overall high compression ratio while maintaining quality.
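The incremental frequency-selection procedure described in the abstract can be illustrated with a small sketch. This is an assumption-based illustration, not the authors' implementation: the names fourier_features, fit_and_error, adaptive_frequencies, quality_threshold, and max_freqs are hypothetical, and the Fourier-feature MLP training step is replaced by a simple least-squares fit on the features for brevity.

```python
# Minimal sketch (assumptions, not the paper's code) of frequency-adaptive
# Fourier feature selection for a single 1-D motion channel.
import numpy as np
from scipy.fft import dct

def fourier_features(t, freqs):
    """Map normalized time t in [0, 1] to [sin(2*pi*f*t), cos(2*pi*f*t)] per frequency."""
    phases = 2.0 * np.pi * np.outer(t, freqs)          # shape: (frames, len(freqs))
    return np.concatenate([np.sin(phases), np.cos(phases)], axis=1)

def fit_and_error(t, x, freqs):
    """Stand-in for training the Fourier-feature MLP: a linear fit on the features."""
    F = np.column_stack([np.ones_like(t), fourier_features(t, freqs)])
    w, *_ = np.linalg.lstsq(F, x, rcond=None)
    return np.max(np.abs(F @ w - x))                   # worst-case reconstruction error

def adaptive_frequencies(x, quality_threshold=1e-2, max_freqs=32):
    """Incrementally add the most dominant remaining DCT frequency of the motion
    until the reconstruction error falls below the quality threshold."""
    n = len(x)
    t = np.linspace(0.0, 1.0, n)
    spectrum = np.abs(dct(x, norm="ortho"))            # DCT magnitudes of the motion signal
    order = np.argsort(spectrum)[::-1]                 # DCT bins sorted by dominance
    freqs = []
    for k in order[:max_freqs]:
        freqs.append(k / 2.0)                          # DCT-II bin k ~ k/2 cycles over the clip
        if fit_and_error(t, x, np.array(freqs)) < quality_threshold:
            break
    return np.array(freqs)

if __name__ == "__main__":
    # Synthetic periodic "joint angle" signal with 3 and 7 cycles over the clip.
    frames = np.linspace(0.0, 1.0, 240)
    joint_angle = np.sin(2 * np.pi * 3 * frames) + 0.3 * np.cos(2 * np.pi * 7 * frames)
    print(adaptive_frequencies(joint_angle))
```

In this sketch the per-channel DCT spectrum only ranks candidate frequencies; the stopping criterion comes from the reconstruction error against the chosen quality threshold, mirroring the abstract's "add dominant frequencies until the threshold is satisfied" loop.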