Title: Blurry (Sticky) Finger: Proprioceptive Pointing and Selection of Distant Objects for Optical See-through based Augmented Reality
Authors: Yu, Ja Eun; Kim, Gerard J.
Editors: Dirk Reiners, Daisuke Iwai, and Frank Steinicke
Year: 2016
ISBN: 978-3-03868-012-3
ISSN: 1727-530X
DOI: https://doi.org/10.2312/egve.20161434
Handle: https://diglib.eg.org:443/handle/10.2312/egve20161434
Pages: 49-56
Classification: H.5.2 [Information Interfaces and Presentation]: User Interfaces: Input devices and strategies

Abstract: Most AR interaction techniques have focused on direct interaction with nearby objects within one's reach (e.g., using the hands). Interacting with distant objects, especially real ones, has received much less attention. The most prevalent method uses a hand-held device to control a cursor that indirectly designates a target object on the AR display. This approach may be neither natural nor efficient with optical see-through glasses because of their multi-focus problem. In this paper, we propose the "Blurry (Sticky) Finger," in which the user aims and points at a distant object with the finger while focusing only on the target with both eyes open (thus avoiding the multi-focus problem) and relying on the proprioceptive sense. We demonstrate and validate our claim through an experiment comparing three distant pointing/selection methods: (1) an indirect cursor-based method using a 3D air mouse, (2) proprioceptive finger aiming (Blurry Finger) with a cursor, and (3) proprioceptive finger aiming without a cursor. In the experiment, Blurry Finger showed superior performance for selecting relatively small objects and exhibited low sensitivity to target object size. It also showed clear advantages in the initial object selection, where the hand/finger starts from a rest position. Blurry Finger was also rated the most intuitive and natural.
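To give a concrete picture of how cursor-less, finger-aimed selection of a distant object could be realized, here is a minimal, hypothetical sketch; it is not the paper's implementation. It assumes the pointing ray runs from a tracked eye position through the tracked fingertip, and that the candidate target whose center falls within a small angular threshold of that ray is selected. All names, coordinates, and the threshold value are illustrative, and the angular snap is only one plausible reading of the "sticky" behavior hinted at in the title.

```python
import numpy as np

def select_distant_target(eye_pos, fingertip_pos, target_centers, max_angle_deg=3.0):
    """Pick the target whose center is angularly closest to the eye-to-fingertip ray.

    Hypothetical sketch of proprioceptive finger aiming: the user looks at the
    distant target and lines the fingertip up with it, so the ray from the eye
    through the fingertip approximately passes through the target.

    eye_pos, fingertip_pos : (3,) arrays in the same world coordinate frame.
    target_centers         : (N, 3) array of candidate target positions.
    Returns the index of the selected target, or None if nothing is close enough.
    """
    ray_dir = fingertip_pos - eye_pos
    ray_dir = ray_dir / np.linalg.norm(ray_dir)

    to_targets = target_centers - eye_pos
    to_targets = to_targets / np.linalg.norm(to_targets, axis=1, keepdims=True)

    # Angle between the pointing ray and the direction to each target center.
    cos_angles = np.clip(to_targets @ ray_dir, -1.0, 1.0)
    angles_deg = np.degrees(np.arccos(cos_angles))

    best = int(np.argmin(angles_deg))
    return best if angles_deg[best] <= max_angle_deg else None


if __name__ == "__main__":
    eye = np.array([0.0, 1.6, 0.0])            # approximate eye position (meters)
    finger = np.array([0.05, 1.58, 0.41])      # fingertip raised toward a target
    targets = np.array([[0.5, 0.9, 4.0],       # distant candidate objects
                        [0.6, 1.4, 5.0],
                        [-1.0, 1.2, 4.5]])
    print(select_distant_target(eye, finger, targets))  # selects index 1
```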