Real-time Locomotion Controller using an Inverted-Pendulum-based Abstract Model

Authors: Hwang, Jaepyung; Kim, Jongmin; Suh, Il Hong; Kwon, Taesoo
Editors: Gutierrez, Diego and Sheffer, Alla
Date: 2018-04-14 (2018)
ISSN: 1467-8659
DOI: 10.1111/cgf.13361 (https://doi.org/10.1111/cgf.13361)
Handle: https://diglib.eg.org:443/handle/10.1111/cgf13361
Pages: 287-296
CCS Concepts: Computing methodologies; Animation

Abstract: In this paper, we propose a novel motion controller for the online generation of natural character locomotion that adapts to new situations such as changing user control or applying external forces. This controller continuously estimates the next footstep while walking and running, and automatically switches the stepping strategy based on situational changes. To develop the controller, we devise a new physical model called an inverted-pendulum-based abstract model (IPAM). The proposed abstract model represents high-dimensional character motions, inheriting the naturalness of captured motions by estimating the appropriate footstep location, speed and switching time at every frame. The estimation is achieved by a deep-learning-based regressor that extracts important features in captured motions. To validate the proposed controller, we train the model using captured motions of a human stopping, walking, and running in a limited space. Then, the motion controller generates human-like locomotion with continuously varying speeds, transitions between walking and running, and collision response strategies in a cluttered space in real time.
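The abstract does not spell out the IPAM equations or the regressor architecture, but the underlying idea of estimating the next footstep from an inverted-pendulum state is well established. Below is a minimal, illustrative Python sketch of the classic capture-point footstep estimate for a linear inverted pendulum model (LIPM); this is not the paper's IPAM, and all names and the `step_gain` parameter are hypothetical stand-ins for what the paper learns with its deep-learning regressor.

```python
import math

# Minimal sketch (not the paper's IPAM): estimate the next footstep for a
# point-mass linear inverted pendulum (LIPM). All names and parameters are
# illustrative assumptions; the paper instead predicts footstep location,
# speed, and switching time with a regressor trained on captured motions.

GRAVITY = 9.81  # m/s^2

def next_footstep(com_pos, com_vel, com_height, step_gain=1.0):
    """Capture-point-style footstep estimate for a planar (x, z) LIPM.

    com_pos:    current center-of-mass position (x, z) in metres
    com_vel:    current center-of-mass velocity (vx, vz) in m/s
    com_height: constant pendulum height in metres
    step_gain:  hypothetical tuning factor standing in for the learned model
    """
    omega = math.sqrt(GRAVITY / com_height)  # LIPM natural frequency
    # Instantaneous capture point: stepping here brings the pendulum to rest.
    return tuple(p + step_gain * v / omega for p, v in zip(com_pos, com_vel))

# Example: CoM at the origin moving forward at 1.4 m/s, 1.0 m pendulum height.
print(next_footstep((0.0, 0.0), (1.4, 0.0), 1.0))  # ~ (0.447, 0.0)
```

In the paper's pipeline, a fixed rule of this kind would be replaced by per-frame regression over features extracted from motion capture, which is what lets the controller switch stepping strategies online as the user input or external forces change.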