Title: ProTrans: Projecting In-Place Translations for Printed Text

Authors: Grünwald, Dustin; Imoto, Yusuke; Taninaka, Isabella Mika; Sato, Kosuke; Iwai, Daisuke; Jorge, Joaquim A.; Sakata, Nobuchika

Date: 2025-11-26
Year: 2025
ISBN: 978-3-03868-278-3
ISSN: 1727-530X
DOI: https://doi.org/10.2312/egve.20251352
URI: https://diglib.eg.org/handle/10.2312/egve20251352
Pages: 5

Abstract: Reading unknown words in non-native languages can hinder comprehension or slow reading by requiring dictionary consultation. Existing solutions for faster lookup only work digitally or, in the case of printed text, require a separate display. To enable a more seamless reading experience in the latter scenario, we present ProTrans (Projected Translation), a projector-camera system that detects words users point to and projects their translations onto nearby surfaces. We compare three projection targets (paper, finger, and hand) and gather initial user feedback. Results indicate that projecting onto the back of the hand balances legibility and viewing comfort, supporting the feasibility of skin-based projection for translation tasks.

License: Attribution 4.0 International License

CCS Concepts: Human-centered computing → Mixed / augmented reality; Interaction techniques; Computing methodologies → Machine translation; Computer vision