
dc.contributor.author: Li, Ming
dc.date.accessioned: 2016-01-20T15:22:06Z
dc.date.available: 2016-01-20T15:22:06Z
dc.date.issued: 2005
dc.identifier.uri: http://diglib.eg.org/handle/10.2312/14682
dc.description.abstract: This thesis discusses fast novel view synthesis from multiple images taken from different viewpoints. We propose several new algorithms that take advantage of modern graphics hardware to create novel views. Although different approaches are explored, one geometry representation, the visual hull, is employed throughout our work. First, the visual hull plays an auxiliary role and assists in the reconstruction of depth maps that are utilized for novel view synthesis. Then we treat the visual hull as the principal geometry representation of scene objects. A hardware-accelerated approach is presented to reconstruct and render visual hulls directly from a set of silhouette images. The reconstruction is embedded in the rendering process and accomplished with an alpha map trimming technique. We then combine this technique with hardware-accelerated CSG reconstruction to improve the rendering quality of visual hulls. Finally, photometric information is exploited to overcome an inherent limitation of the visual hull. All algorithms are implemented on a distributed system. Novel views are generated at interactive or real-time frame rates. (en_US)
dc.language.iso: en (en_US)
dc.title: Towards Real-Time Novel View Synthesis Using Visual Hulls (en_US)
dc.type: Thesis (en_US)
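The abstract only summarizes the approach. As an illustration of the basic idea a visual hull rests on (a 3D point belongs to the hull only if it projects inside every silhouette image), here is a minimal CPU-side sketch. The Camera struct, the projection-matrix layout, and the insideVisualHull helper are hypothetical names introduced for this example; the thesis itself reconstructs and renders the hull on graphics hardware via alpha map trimming and CSG, not with this point-wise test.

```cpp
// Minimal sketch of the silhouette-intersection test behind visual hulls.
// Hypothetical types and names; not the thesis' hardware-accelerated method.
#include <array>
#include <cstdint>
#include <vector>

struct Camera {
    std::array<double, 12> P;          // 3x4 projection matrix, row-major
    int width = 0, height = 0;
    std::vector<uint8_t> silhouette;   // width*height; 1 = foreground, 0 = background
};

// Returns true if the 3D point X lies inside the silhouette of every view,
// i.e. inside the visual hull defined by the given cameras.
bool insideVisualHull(const std::array<double, 3>& X,
                      const std::vector<Camera>& cams) {
    for (const Camera& c : cams) {
        const auto& P = c.P;
        double x = P[0] * X[0] + P[1] * X[1] + P[2]  * X[2] + P[3];
        double y = P[4] * X[0] + P[5] * X[1] + P[6]  * X[2] + P[7];
        double w = P[8] * X[0] + P[9] * X[1] + P[10] * X[2] + P[11];
        if (w <= 0.0) return false;                       // behind the camera
        int u = static_cast<int>(x / w);
        int v = static_cast<int>(y / w);
        if (u < 0 || u >= c.width || v < 0 || v >= c.height) return false;
        if (c.silhouette[v * c.width + u] == 0) return false;  // outside this silhouette
    }
    return true;  // foreground in all views
}
```

Running this test over a voxel grid yields a discrete visual hull; the thesis instead folds the reconstruction into the rendering pass so that novel views reach interactive or real-time rates.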

