dc.contributor.author: Chemistruck, Mike
dc.contributor.author: Allen, Andrew
dc.contributor.author: Snyder, John
dc.contributor.author: Raghuvanshi, Nikunj
dc.contributor.editor: Narain, Rahul and Neff, Michael and Zordan, Victor
dc.date.accessioned: 2022-02-07T13:32:37Z
dc.date.available: 2022-02-07T13:32:37Z
dc.date.issued: 2021
dc.identifier.issn: 2577-6193
dc.identifier.uri: https://doi.org/10.1145/3480139
dc.identifier.uri: https://diglib.eg.org:443/handle/10.1145/3480139
dc.description.abstract: We model acoustic perception in AI agents efficiently within complex scenes containing many sound events. The key idea is to employ perceptual parameters that capture how each sound event propagates through the scene to the agent's location, naturally conforming virtual perception to human perception. We propose a simplified auditory masking model that limits localization capability in the presence of distracting sounds, and show that anisotropic reflections as well as the initial sound serve as useful localization cues. Our system is simple, fast, and modular, and produces natural results in our tests, letting agents navigate through passageways and portals by sound alone and anticipate or track occluded but audible targets. Source code is provided.
dc.publisher: ACM
dc.subject: Computing methodologies
dc.subject: Physical simulation
dc.subject: Virtual reality
dc.subject: Applied computing
dc.subject: Sound and music computing
dc.subject: acoustics
dc.subject: perception
dc.subject: masking
dc.subject: localization
dc.subject: sound propagation
dc.subject: virtual agents
dc.subject: game AI
dc.subject: NPC AI
dc.title: Efficient acoustic perception for virtual AI agents
dc.description.seriesinformation: Proceedings of the ACM on Computer Graphics and Interactive Techniques
dc.description.sectionheaders: papers
dc.description.volume: 4
dc.description.number: 3
dc.identifier.doi: 10.1145/3480139
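
To give a rough feel for the "simplified auditory masking" idea the abstract mentions, here is a minimal Python sketch. It assumes a purely energetic masking rule, and every name in it (SoundEvent, loudness_db, masking_threshold_db, localizable_events) is illustrative rather than taken from the paper's model; the authors' released source code should be consulted for the actual formulation.

import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SoundEvent:
    # Hypothetical per-event perceptual parameters at the agent's location.
    name: str
    loudness_db: float                      # loudness of the initial (direct) sound, in dB
    direction: Tuple[float, float, float]   # unit vector toward the apparent source

def db_to_power(db: float) -> float:
    # Convert a dB level to linear power so competing sounds can be energy-summed.
    return 10.0 ** (db / 10.0)

def power_to_db(power: float) -> float:
    return 10.0 * math.log10(max(power, 1e-12))

def localizable_events(events: List[SoundEvent],
                       masking_threshold_db: float = 6.0) -> List[SoundEvent]:
    """Return the subset of events the agent can localize.

    Assumed energetic-masking rule: an event's direction is lost when the
    summed power of all competing events exceeds the event's own level by
    more than masking_threshold_db.
    """
    audible = []
    for event in events:
        competing = sum(db_to_power(other.loudness_db)
                        for other in events if other is not event)
        if event.loudness_db >= power_to_db(competing) - masking_threshold_db:
            audible.append(event)
    return audible

# Example: a loud generator masks a quiet footstep, so the agent can
# localize only the generator.
events = [
    SoundEvent("generator", 70.0, (1.0, 0.0, 0.0)),
    SoundEvent("footstep", 40.0, (0.0, 0.0, 1.0)),
]
print([e.name for e in localizable_events(events)])  # ['generator']

In this sketch the masking check is symmetric and purely loudness-based; the paper additionally exploits directional cues (the initial sound and anisotropic reflections), which a real implementation would fold into the localization decision.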

