Temporally Coherent and Spatially Accurate Video Matting

dc.contributor.author: Shahrian, Ehsan
dc.contributor.author: Price, Brian
dc.contributor.author: Cohen, Scott
dc.contributor.author: Rajan, Deepu
dc.contributor.editor: B. Levy and J. Kautz
dc.date.accessioned: 2015-03-03T12:29:32Z
dc.date.available: 2015-03-03T12:29:32Z
dc.date.issued: 2014
dc.description.abstract: Image and video matting are still challenging problems in areas with low foreground-background contrast. Video matting also has the challenge of ensuring temporally coherent mattes, because the human visual system is highly sensitive to temporal jitter and flickering. On the other hand, video provides the opportunity to use information from other frames to improve the matte accuracy on a given frame. In this paper, we present a new video matting approach that improves the temporal coherence while maintaining high spatial accuracy in the computed mattes. We build sample sets of temporal and local samples that cover all the color distributions of the object and background over all previous frames. This helps guarantee spatial accuracy and temporal coherence by ensuring that proper samples are found even when distantly located in space or time. An explicit energy term encourages temporal consistency in the mattes derived from the selected samples. In addition, we use localized texture features to improve spatial accuracy in low-contrast regions where color distributions overlap. The proposed method results in better spatial accuracy and temporal coherence than existing video matting methods.
dc.description.seriesinformation: Computer Graphics Forum
dc.identifier.issn: 1467-8659
dc.identifier.uri: https://doi.org/10.1111/cgf.12297
dc.publisher: The Eurographics Association and John Wiley and Sons Ltd.
dc.title: Temporally Coherent and Spatially Accurate Video Matting
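The abstract describes a sampling-based matting method, which rests on the standard compositing model I = alpha*F + (1 - alpha)*B. As background, here is a minimal sketch of how an alpha value can be estimated from one candidate foreground/background color pair by projecting the observed pixel onto the line between them. The function name and this particular projection formula are a generic illustration of sampling-based alpha estimation, not the paper's specific sample-selection algorithm.

```python
import numpy as np

def estimate_alpha(pixel, fg_sample, bg_sample):
    """Estimate alpha under the compositing model I = a*F + (1-a)*B
    by projecting the observed color onto the F-B line segment."""
    f = np.asarray(fg_sample, dtype=float)
    b = np.asarray(bg_sample, dtype=float)
    i = np.asarray(pixel, dtype=float)
    denom = np.dot(f - b, f - b)
    if denom < 1e-12:  # degenerate pair: foreground and background colors coincide
        return 0.0
    alpha = np.dot(i - b, f - b) / denom
    return float(np.clip(alpha, 0.0, 1.0))

# A gray pixel between pure black background and pure white foreground
# yields a mid-range alpha (128/255, roughly 0.502):
print(estimate_alpha([128, 128, 128], [255, 255, 255], [0, 0, 0]))
```

Sampling-based matting methods evaluate many such candidate pairs per pixel and keep the pair whose composite best reconstructs the observed color; the paper's contribution is in drawing those samples across frames and adding a temporal-consistency energy term.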