
dc.contributor.author: Ko, Hyung-Kwon
dc.contributor.author: Jo, Jaemin
dc.contributor.author: Seo, Jinwook
dc.contributor.editor: Kerren, Andreas and Garth, Christoph and Marai, G. Elisabeta
dc.date.accessioned: 2020-05-24T13:52:06Z
dc.date.available: 2020-05-24T13:52:06Z
dc.date.issued: 2020
dc.identifier.isbn: 978-3-03868-106-9
dc.identifier.uri: https://doi.org/10.2312/evs.20201061
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/evs20201061
dc.description.abstract: We present a progressive algorithm for Uniform Manifold Approximation and Projection (UMAP), called Progressive UMAP. Based on the theory of Riemannian geometry and algebraic topology, UMAP is an emerging dimensionality reduction technique that offers better versatility and stability than t-SNE. Although UMAP is also more efficient than t-SNE, it still suffers from an initial delay of a few minutes before producing the first projection, which limits its use in interactive data exploration. To tackle this problem, we make the sequential computations in UMAP progressive, which allows people to incrementally append batches of data points to the projection at their desired pace. In our experiment with the Fashion MNIST dataset, we found that Progressive UMAP could generate the first approximate projection within a few seconds while sufficiently capturing the important structures of the high-dimensional dataset.
dc.publisher: The Eurographics Association
dc.rights: Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: Human centered computing
dc.subject: Visual analytics
dc.title: Progressive Uniform Manifold Approximation and Projection
dc.description.seriesinformation: EuroVis 2020 - Short Papers
dc.description.sectionheaders: Representation, Perception, and ML
dc.identifier.doi: 10.2312/evs.20201061
dc.identifier.pages: 133-137
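
The abstract describes incrementally appending batches of points to an existing projection. As a rough illustration of that batch-wise workflow, the sketch below uses the standard fit/transform API of the umap-learn library: fit on a small initial batch to get a fast first projection, then place later batches into the existing embedding. Note that this is not the paper's Progressive UMAP algorithm, which makes UMAP's internal computations themselves progressive; the data here is a synthetic stand-in for the 784-dimensional Fashion MNIST vectors.

# Minimal sketch of batch-wise projection with umap-learn (pip install umap-learn).
# This approximates the workflow in the abstract, NOT the paper's algorithm.
import numpy as np
import umap

rng = np.random.default_rng(0)
data = rng.normal(size=(10_000, 784)).astype(np.float32)  # stand-in for Fashion MNIST

# Fit on a small initial batch so the first projection appears quickly.
initial, rest = data[:1_000], data[1_000:]
reducer = umap.UMAP(n_neighbors=15, min_dist=0.1, random_state=42)
embedding = reducer.fit_transform(initial)

# Append the remaining points one batch at a time; transform() places new
# points into the existing embedding without refitting from scratch.
for start in range(0, len(rest), 1_000):
    batch = rest[start:start + 1_000]
    embedding = np.vstack([embedding, reducer.transform(batch)])
    # ...a visualization could be re-rendered here after each batch...

print(embedding.shape)  # (10000, 2)

One caveat on this design: transform() keeps the fitted model fixed, so early batches fully determine the layout, whereas the paper's method updates the projection as points arrive.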

