Title: FrictGAN: Frictional Signal Generation from Fabric Texture Images using Generative Adversarial Network
Authors: Cai, Shaoyu; Ban, Yuki; Narumi, Takuji; Zhu, Kening
Editors: Argelaguet, Ferran; McMahan, Ryan; Sugimoto, Maki
Date: 2020-12-01
ISBN: 978-3-03868-111-3
ISSN: 1727-530X
DOI: https://doi.org/10.2312/egve.20201254
URL: https://diglib.eg.org:443/handle/10.2312/egve20201254
Pages: 11-15
Keywords: Computing methodologies; Generative adversarial network; Human centered computing; Virtual reality

Abstract: Electrostatic tactile displays can render the tactile feeling of different textured surfaces by generating frictional force through voltage modulation as a finger slides across the display surface. However, it is challenging to prepare and fine-tune the appropriate frictional signals for haptic design and texture simulation. We present FrictGAN, a deep-learning-based framework that synthesizes frictional signals for electrostatic tactile displays from fabric texture images. Leveraging GANs (Generative Adversarial Networks), FrictGAN generates displacement-series data of frictional coefficients for the electrostatic tactile display to simulate the tactile feedback of fabric materials. Our preliminary experimental results showed that FrictGAN achieves considerable performance on frictional signal generation from input images of fabric textures.