Efficient acoustic perception for virtual AI agents

Date
2021
Publisher
ACM
Abstract
We model acoustic perception for AI agents efficiently in complex scenes containing many sound events. The key idea is to employ perceptual parameters that capture how each sound event propagates through the scene to the agent's location, which naturally conforms virtual perception to human perception. We propose a simplified auditory masking model that limits localization capability in the presence of distracting sounds. We show that anisotropic reflections, as well as the initial sound, serve as useful localization cues. Our system is simple, fast, and modular, and produces natural results in our tests, letting agents navigate through passageways and portals by sound alone and anticipate or track occluded but audible targets. Source code is provided.
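The masking idea in the abstract can be illustrated with a minimal sketch: a target sound is treated as localizable only if its loudness stands out from the combined loudness of distractors by some margin. All names, the decibel summation, and the 3 dB threshold below are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical sketch of a loudness-based masking test for localizability.
# Per-event loudness values are assumed given in dB at the agent's location.
import math

def total_loudness_db(levels_db):
    """Combine incoherent sound levels (dB) by summing their powers."""
    power = sum(10 ** (l / 10.0) for l in levels_db)
    return 10.0 * math.log10(power) if power > 0 else float("-inf")

def is_localizable(target_db, distractor_levels_db, threshold_db=3.0):
    """Assume a target is localizable only if it exceeds the combined
    distractor loudness by at least `threshold_db` (an assumed margin)."""
    masker_db = total_loudness_db(distractor_levels_db)
    return target_db - masker_db >= threshold_db

print(is_localizable(70.0, [60.0, 55.0]))  # stands out from maskers -> True
print(is_localizable(58.0, [60.0, 55.0]))  # masked by distractors -> False
```

Power-domain summation (rather than adding dB values directly) is the standard way to combine incoherent sources; the localizability threshold itself would come from the paper's perceptual model.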
Citation
@inproceedings{10.1145:3480139,
  booktitle = {Proceedings of the ACM on Computer Graphics and Interactive Techniques},
  editor    = {Narain, Rahul and Neff, Michael and Zordan, Victor},
  title     = {{Efficient acoustic perception for virtual AI agents}},
  author    = {Chemistruck, Mike and Allen, Andrew and Snyder, John and Raghuvanshi, Nikunj},
  year      = {2021},
  publisher = {ACM},
  ISSN      = {2577-6193},
  DOI       = {10.1145/3480139}
}