Title: PixelSNE: Pixel-Aligned Stochastic Neighbor Embedding for Efficient 2D Visualization with Screen-Resolution Precision
Authors: Minjeong Kim, Minsuk Choi, Sunwoong Lee, Jian Tang, Haesun Park, Jaegul Choo
Editors: Jeffrey Heer, Heike Leitte, Timo Ropinski
Date: 2018-06-02
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.13418
Handle: https://diglib.eg.org:443/handle/10.1111/cgf13418
Pages: 267-276

Abstract: Embedding and visualizing large-scale high-dimensional data in a two-dimensional space is an important problem, because such visualization can reveal deep insights into complex data. However, most existing embedding approaches run at an excessively high precision, even when users only want a quick overview of a large-scale dataset, ignoring the fact that the outputs are ultimately rendered onto a fixed-range, pixel-based screen space. Motivated by this observation, and by directly incorporating the properties of screen space into the embedding algorithm, we propose Pixel-Aligned Stochastic Neighbor Embedding (PixelSNE), a highly efficient, screen-resolution-driven 2D embedding method that accelerates Barnes-Hut tree-based t-distributed stochastic neighbor embedding (BH-SNE), a state-of-the-art 2D embedding method. Our experimental results show a significantly faster running time for PixelSNE compared to BH-SNE across various datasets while maintaining comparable embedding quality.
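
To make the screen-resolution observation concrete, the following minimal Python sketch maps continuous 2D embedding coordinates onto a fixed-range pixel grid. This is only an illustration of the idea that screen output bounds the useful precision; it is not the PixelSNE algorithm itself, and the function name `snap_to_pixel_grid` and the chosen resolution are assumptions for this example.

```python
import numpy as np

def snap_to_pixel_grid(Y, width=1024, height=768):
    """Map continuous 2D embedding coordinates onto a fixed-range pixel grid.

    Y: (n, 2) array of embedding coordinates (e.g. t-SNE output).
    Returns integer pixel coordinates in [0, width) x [0, height).
    Illustrative only; this is not the PixelSNE algorithm.
    """
    Y = np.asarray(Y, dtype=float)
    mins = Y.min(axis=0)
    spans = Y.max(axis=0) - mins
    spans[spans == 0] = 1.0              # guard against degenerate (constant) axes
    unit = (Y - mins) / spans            # normalize each axis to [0, 1]
    pixels = np.floor(unit * [width - 1, height - 1]).astype(int)
    return pixels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Y = rng.normal(size=(1000, 2))       # stand-in for a 2D embedding
    P = snap_to_pixel_grid(Y)
    print(P.min(axis=0), P.max(axis=0))  # coordinates bounded by the screen resolution
```

Because any visually distinguishable structure is limited by this pixel grid, computing embedding updates far beyond screen-resolution precision yields no visible benefit, which is the observation the abstract uses to motivate PixelSNE's acceleration of BH-SNE.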