Computer Graphics & Visual Computing (CGVC) 2016
ISBN 978-3-03868-022-2
https://diglib.eg.org:443/handle/10.2312/2630862

PED: Pedestrian Environment Designer
https://diglib.eg.org:443/handle/10.2312/cgvc20161304
McIlveen, James; Maddock, Steve; Heywood, Peter; Richmond, Paul
Editors: Cagatay Turkay and Tao Ruan Wan
Pedestrian simulations have many uses, from pedestrian planning for architectural design through to games and entertainment. However, it is still challenging to author such simulations efficiently, especially for non-technical users. Direct pedestrian control is usually laborious, and, while indirect, environment-level control is often faster, it currently lacks the tools needed to create complex environments easily and without extensive prior technical knowledge. This paper describes an indirect, environment-level control system in which pedestrians' behaviour can be specified efficiently and then interactively tuned. With the Pedestrian Environment Designer (PED) interface, authors define environments using tools similar to those found in raster graphics editing software such as Photoshop™. Users paint on two-dimensional bitmap layers to control the behaviour of pedestrians in a three-dimensional simulation. The layers are then compiled to produce a live, agent-based pedestrian simulation using the FLAME GPU framework. Entrances and exits can be inserted, collision boundaries defined, and areas of attraction and avoidance added. The system also offers dynamic simulation updates at runtime, giving authors immediate feedback and enabling them to simulate scenarios with dynamic elements such as barriers, or dynamic circumstances such as temporary areas of avoidance. As a result, authors are able to create complex crowd simulations more effectively and with minimal training.
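The painted-layer idea above can be illustrated with a minimal sketch. This is not PED's actual implementation (PED compiles layers into a FLAME GPU agent-based simulation); it is a hypothetical illustration of how a painted "avoidance" bitmap layer could steer an agent on a grid, with all names invented here.

```python
# Hypothetical sketch of an author-painted avoidance layer steering an agent.
# Higher pixel values mean "avoid this cell"; an agent steps toward the
# lowest-valued neighbouring cell. (Illustrative only; PED itself compiles
# such layers into a FLAME GPU simulation.)

def steer(layer, x, y):
    """Return the (dx, dy) step toward the lowest-valued neighbour cell."""
    h, w = len(layer), len(layer[0])
    best, best_val = (0, 0), layer[y][x]
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < w and 0 <= ny < h and layer[ny][nx] < best_val:
            best, best_val = (dx, dy), layer[ny][nx]
    return best

# A 3x3 layer where the author has painted high avoidance on the right column.
layer = [
    [0, 5, 9],
    [0, 5, 9],
    [0, 5, 9],
]
step = steer(layer, 1, 1)  # agent in the middle moves away from the painted area
```

Repainting the layer at runtime and re-sampling it each step mirrors the dynamic-update behaviour the abstract describes.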
Fire and Gas Detection Mapping using Volumetric Rendering
https://diglib.eg.org:443/handle/10.2312/cgvc20161306
Cotterill, Cameron; Davison, Tyrone; O'Connor, Simon J.; Orr, David; Charles, Fred; Tang, Wen
The software presented here is an interactive, real-time tool for simulating fire and gas detection mapping, using volumetric rendering based on the layouts of fire and gas detectors within 3D virtual environments.
A Calibrated Olfactory Display for High Fidelity Virtual Environments
https://diglib.eg.org:443/handle/10.2312/cgvc20161305
Dhokia, Amar; Doukakis, Efstratious; Asadipour, Ali; Harvey, Carlo; Bashford-Rogers, Thomas; Debattista, Kurt; Waterfield, Brian; Chalmers, Alan
Olfactory displays provide a means to reproduce olfactory stimuli for use in virtual environments. Many of the designs produced by researchers strive to deliver stimuli to users quickly and focus on improving usability and portability, yet concentrate less on providing high levels of accuracy to improve the fidelity of odour delivery. This paper provides guidance for building a reproducible, low-cost olfactory display that can deliver odours to users in a virtual environment at the accurate concentration levels typical of everyday interactions, including ranges below parts per million and into parts per billion. The paper examines the construction of the olfactometer and its calibration to ensure the concentration accuracy of the device. An analysis is provided of the recovery rates of a specific compound after excitation, with insight into how this result can be generalised to the recovery rate of any volatile organic compound, given knowledge of that compound's vapour pressure.
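The vapour-pressure relationship mentioned at the end of the abstract can be made concrete. The sketch below is not taken from the paper; it is a standard back-of-the-envelope calculation: air saturated over a compound carries it at roughly (vapour pressure / ambient pressure) by volume, so the dilution needed to hit a ppb-level target follows directly. Function names and the example figures are illustrative.

```python
# Illustrative sketch: how a compound's vapour pressure bounds the headspace
# concentration, and the dilution needed to reach a ppb-level target.
# (Standard ideal-gas approximation, not the paper's calibration procedure.)

def saturated_ppm(vapour_pressure_kpa, ambient_pressure_kpa=101.325):
    """Approximate headspace concentration (ppm by volume) of a VOC
    saturated at its vapour pressure, at the given ambient pressure."""
    return vapour_pressure_kpa / ambient_pressure_kpa * 1e6

def dilution_ratio(vapour_pressure_kpa, target_ppb):
    """Factor by which saturated odour air must be diluted with clean air
    to reach a target concentration in parts per billion."""
    saturated_ppb = saturated_ppm(vapour_pressure_kpa) * 1000.0
    return saturated_ppb / target_ppb

# Example: a compound with a 1 kPa vapour pressure, targeting 50 ppb,
# needs roughly a 200,000-fold dilution.
ratio = dilution_ratio(1.0, 50.0)
```

This is why knowing a compound's vapour pressure lets the same calibration generalise across volatile organic compounds, as the abstract notes.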
Tactile Mesh Saliency: A Brief Synopsis
https://diglib.eg.org:443/handle/10.2312/cgvc20161302
Lau, Manfred; Dev, Kapil
This work has previously been published [LDS 16], and this extended abstract provides a synopsis for further discussion at the UK CGVC 2016 conference. We introduce the concept of tactile mesh saliency, where tactile salient points on a virtual mesh are those that a human is more likely to grasp, press, or touch if the mesh were a real-world object. We solve the problem of taking a 3D mesh as input and computing the tactile saliency of every mesh vertex. The key to solving this problem is a new formulation that combines deep learning and learning-to-rank methods to compute a tactile saliency measure. Finally, we discuss possibilities for future work.
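The learning-to-rank component mentioned above rests on a simple idea: train on relative judgements ("vertex a is touched more than vertex b") rather than absolute saliency values. The sketch below shows a generic pairwise logistic ranking loss of the kind such formulations build on; it is a minimal illustration, not the paper's actual model or loss.

```python
# Minimal illustration of a pairwise (logistic) ranking loss, the generic
# building block behind learning-to-rank formulations. Not the paper's model.
import math

def pairwise_rank_loss(score_a, score_b):
    """Loss for the constraint 'a should rank above b' (e.g. vertex a is
    more tactilely salient than vertex b). Small when score_a >> score_b,
    large when the order is inverted."""
    return math.log(1.0 + math.exp(score_b - score_a))

# A correctly ordered pair of predicted saliency scores incurs less loss
# than the same pair inverted, so gradient descent pushes scores toward
# agreement with the human ordering.
good = pairwise_rank_loss(2.0, 0.0)
bad = pairwise_rank_loss(0.0, 2.0)
```

Summing such terms over many human-labelled vertex pairs yields a differentiable objective that a deep network can be trained against.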