Dense Crowd Motion Prediction through Density and Trend Maps

Authors: Wang, Tingting; Fu, Qiang; Wang, Minggang; Bi, Huikun; Deng, Qixin; Deng, Zhigang
Editors: Chen, Renjie; Ritschel, Tobias; Whiting, Emily
Date issued: 2024-10-13
ISBN: 978-3-03868-250-9
DOI: https://doi.org/10.2312/pg.20241295
URI: https://diglib.eg.org/handle/10.2312/pg20241295
Pages: 9

Abstract: In this paper, we propose a novel density/trend-map-based method to predict both group behavior and individual pedestrian motion from video input. Existing motion prediction methods represent pedestrian motion as a set of spatial-temporal trajectories; however, beyond such a per-pedestrian representation, many crowd applications also need a high-level representation of crowd motion. Our method leverages density maps and trend maps to represent the spatial-temporal states of dense crowds. Based on these representations, we propose a crowd density map net that extracts a density map from a video clip, and a crowd prediction net that utilizes the historical states of a video clip to predict density maps and trend maps for future frames. Moreover, since crowd motion consists of the motion of individual pedestrians in a group, we also leverage the predicted crowd motion as a cue to improve the accuracy of traditional trajectory-based motion prediction methods. Through a series of experiments and comparisons with state-of-the-art motion prediction methods, we demonstrate the effectiveness and robustness of our method.

License: Attribution 4.0 International License
CCS Concepts: Computing methodologies → Neural networks; Computing methodologies → Tracking
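To make the density/trend representation concrete, the following is a minimal sketch of how a density map (Gaussian splats at pedestrian positions) and a trend map (per-cell mean displacement between two frames) could be built from trajectory data. The function names, grid size, and Gaussian kernel are illustrative assumptions, not the paper's exact formulation, which learns these maps from video with neural networks.

```python
import numpy as np

def density_map(positions, grid=(32, 32), extent=10.0, sigma=0.5):
    """Splat a Gaussian kernel at each pedestrian position onto a grid.

    positions : (N, 2) array of (x, y) coordinates in [0, extent).
    Returns an (H, W) density map. Kernel choice and grid size are
    illustrative, not the paper's learned formulation.
    """
    h, w = grid
    ys, xs = np.meshgrid(np.linspace(0, extent, h),
                         np.linspace(0, extent, w), indexing="ij")
    dmap = np.zeros(grid)
    for px, py in positions:
        dmap += np.exp(-((xs - px) ** 2 + (ys - py) ** 2) / (2 * sigma ** 2))
    return dmap

def trend_map(pos_t0, pos_t1, grid=(32, 32), extent=10.0):
    """Per-cell mean displacement between two consecutive frames,
    serving as a crude stand-in for the paper's trend map."""
    h, w = grid
    trend = np.zeros((h, w, 2))
    count = np.zeros((h, w))
    for p0, p1 in zip(pos_t0, pos_t1):
        # Bin each pedestrian by its position in the earlier frame.
        i = min(int(p0[1] / extent * h), h - 1)
        j = min(int(p0[0] / extent * w), w - 1)
        trend[i, j] += p1 - p0
        count[i, j] += 1
    occupied = count > 0
    trend[occupied] /= count[occupied][:, None]  # average displacement
    return trend
```

A trajectory predictor could then sample the predicted trend map at a pedestrian's location and use it as an extra motion cue, in the spirit of the paper's combination of group-level and per-pedestrian prediction.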