Perception-Aware Computational Fabrication: Increasing The Apparent Gamut of Digital Fabrication

dc.contributor.author: Piovarci, Michal
dc.date.accessioned: 2020-12-29T07:29:03Z
dc.date.available: 2020-12-29T07:29:03Z
dc.date.issued: 2020-10-19
dc.description.abstract: Haptic and visual feedback are important for assessing an object's quality and affordance. One of the benefits of additive manufacturing is that it enables the creation of objects with personalized tactile and visual properties. This personalization is realized by the ability to deposit functionally graded materials at microscopic resolution. However, faithfully reproducing real-world objects on a 3D printer is a challenging endeavor. The large number of available materials and the freedom in material deposition make exploring the space of printable objects difficult. Furthermore, current 3D printers can perfectly capture only a small fraction of real-world objects, which makes high-quality reproduction challenging. Interestingly, much like the manufacturing hardware, our senses of touch and sight have inherent limitations imposed by biological constraints. In this work, we propose to leverage the limitations of human perception to increase the apparent gamut of a 3D printer by combining numerical optimization with perceptual insights. Instead of optimizing for exact replicas, we search for perceptually equivalent solutions. This not only simplifies the optimization but also yields prints that better resemble the target behavior. To guide the optimization towards the desired behavior, we design perceptual error metrics. Recovering such a metric normally requires conducting costly experiments. We tackle this problem by proposing a likelihood-based optimization that automatically recovers a metric relating perception to physical properties. To minimize fabrication during optimization, we map new designs to predicted percepts via numerical models. As with many complex design tasks, the governing physics is either computationally expensive to simulate or lacks predictive models altogether. We address this issue with perception-aware coarsening, which focuses computation on perceptually relevant phenomena. Additionally, we propose a data-driven fabrication-in-the-loop model that implicitly handles fabrication constraints. We demonstrate the capabilities of our approach in the contexts of haptic and appearance reproduction. We show its application to designing objects with prescribed compliance and to mimicking the haptics of drawing tools. Furthermore, we propose a system for manufacturing objects with spatially varying gloss.
dc.identifier.uri: http://nbn-resolving.de/urn:nbn:ch:rero-006-118877
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/2632992
dc.language.iso: en
dc.publisher: Università della Svizzera italiana
dc.relation.ispartofseries: 20201106112851-SK
dc.subject: Computational fabrication
dc.subject: Perception
dc.subject: Haptics reproduction
dc.subject: Appearance reproduction
dc.subject: Compliance
dc.subject: Digital drawing
dc.subject: Stylus
dc.subject: Contact simulation
dc.subject: Fabrication-in-the-loop
dc.subject: Co-optimization
dc.subject: Spatially-varying reflectance manufacturing
dc.subject: Gloss printing
dc.title: Perception-Aware Computational Fabrication: Increasing The Apparent Gamut of Digital Fabrication
dc.type: Thesis
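
The abstract mentions a likelihood-based optimization that recovers a perceptual metric relating physical properties to perception from experimental data. As a rough illustration of that general idea only, and not of the thesis' actual formulation, the sketch below fits an assumed power-law perceptual scale to synthetic pairwise-comparison judgments by maximum likelihood under a Thurstone-style Gaussian noise model; the power-law form, the perceptual_scale function, and the toy data are all hypothetical.

```python
# Minimal sketch, assuming a Thurstone Case V-style model: recover a perceptual
# scale from pairwise comparisons by maximum likelihood. The power-law form and
# the toy data are illustrative assumptions, not the thesis' method.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Each trial: two stimuli with known physical values x_i, x_j (e.g., measured
# compliance); y = 1 if the observer judged stimulus j to feel more compliant.
trials = [
    (0.2, 0.8, 1), (0.5, 0.3, 0), (0.1, 0.9, 1),
    (0.7, 0.4, 0), (0.4, 0.6, 0), (0.3, 0.6, 1),
]

def perceptual_scale(x, a, gamma):
    """Assumed power-law mapping from physical property to perceived magnitude."""
    return a * np.power(x, gamma)

def neg_log_likelihood(params):
    a, gamma = params
    nll = 0.0
    for x_i, x_j, y in trials:
        # Probability that j is judged stronger than i under unit-variance
        # Gaussian noise on the perceptual axis (probit link).
        p = norm.cdf(perceptual_scale(x_j, a, gamma) - perceptual_scale(x_i, a, gamma))
        p = np.clip(p, 1e-9, 1.0 - 1e-9)
        nll -= y * np.log(p) + (1 - y) * np.log(1.0 - p)
    return nll

# Fit the two scale parameters; a real experiment would use far more trials.
result = minimize(neg_log_likelihood, x0=[1.0, 1.0], method="Nelder-Mead")
a_hat, gamma_hat = result.x
print(f"Recovered perceptual scale: f(x) ~ {a_hat:.2f} * x^{gamma_hat:.2f}")
```

The probit link and unit noise variance follow standard Thurstonian scaling; once such a scale is recovered, the perceptual difference between a target and a candidate print could serve as the error metric driving the optimization.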
Files

Original bundle
Name: Perception-Aware Computational Fabrication.pdf
Size: 68.72 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.79 KB
Format: Item-specific license agreed upon to submission