Various touch-based interaction techniques have been developed to make interactions on mobile
devices more effective, efficient, and intuitive. Finger orientation, in particular, has attracted considerable attention because it intuitively adds three degrees of freedom (DOF) beyond the two-dimensional (2D) touch point. The mapping of finger orientation can be classified as either absolute or relative, each suited to different interaction applications. However, only absolute orientation has been explored in prior work. Relative angles can be computed from two estimated absolute orientations, but higher accuracy can be expected when the relative rotation is predicted directly from the input images. Consequently, in this paper, we propose to estimate complete 3D relative finger angles from two fingerprint images, which carry more information and higher image resolution than capacitive images. For algorithm training and evaluation, we constructed a dataset consisting of fingerprint images and their corresponding ground-truth 3D relative finger rotation angles. Experimental results on this dataset show that our method outperforms previous approaches based on absolute finger angle models. Furthermore, extensive experiments were conducted to explore the impact of image resolution, finger type, and rotation range on performance. A user study was also conducted to examine the efficiency and precision of using 3D relative finger orientation in a 3D object rotation task.
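As a point of reference for the absolute-versus-relative distinction above, the sketch below shows how a relative rotation would be obtained by composing two independently estimated absolute orientations, which is the baseline strategy that direct prediction is contrasted with. This is an illustration only: the Euler angle values, the axis convention, and the use of SciPy are assumptions made for the example, not details taken from the paper.

```python
# Illustrative sketch (not the paper's implementation): deriving a relative
# rotation by composing two independently estimated absolute orientations.
import numpy as np
from scipy.spatial.transform import Rotation as R

# Hypothetical absolute finger orientations estimated at two touch frames,
# given as (x, y, z) Euler angles in degrees.
r1 = R.from_euler("xyz", [20.0, -5.0, 30.0], degrees=True)  # frame 1
r2 = R.from_euler("xyz", [25.0, 10.0, 55.0], degrees=True)  # frame 2

# Relative rotation taking the frame-1 orientation to the frame-2 orientation,
# expressed in the device frame: R_rel = R2 * R1^{-1}.
r_rel = r2 * r1.inv()
print(r_rel.as_euler("xyz", degrees=True))
```

Because r_rel inherits the estimation error of both r1 and r2, regressing the relative angles directly from the fingerprint image pair can avoid this error accumulation, which is the motivation stated in the abstract.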
Mon 6 Nov, 11:00 - 12:30 (Eastern Time, US & Canada)

11:00 (22 min) Talk: 3D Finger Rotation Estimation from Fingerprint Images. Authors: Yongjie Duan (Tsinghua University), Jinyang Yu (BNRist, Department of Automation, Tsinghua University), Jianjiang Feng (Tsinghua University), Ke He (Department of Automation, BNRist, Tsinghua University), Jiwen Lu (Tsinghua University), Jie Zhou (Department of Automation, BNRist, Tsinghua University)

11:22 (22 min) Talk: BlendMR: A Computational Method to Create Ambient Mixed Reality Interfaces (Best Paper). Authors: Violet Yinuo Han (Carnegie Mellon University), Hyunsung Cho (Carnegie Mellon University), Kiyosu Maeda (The University of Tokyo), Alexandra Ion (Carnegie Mellon University), David Lindlbauer (Carnegie Mellon University)

11:45 (22 min) Talk: CADTrack: Instructions and Support for Orientation Disambiguation of Near-Symmetrical Objects. Authors: João Marcelo Evangelista Belo (Aarhus University), Jon Wissing (Aarhus University), Tiare Feuchtner, Kaj Grønbæk (Aarhus University)

12:07 (22 min) Talk: UbiSurface: A Robotic Touch Surface for Supporting Mid-air Planar Interactions in Room-Scale VR. Authors: Ryota Gomi (Tohoku University), Kazuki Takashima, Yuki Onishi (Singapore Management University), Kazuyuki Fujita, Yoshifumi Kitamura (Tohoku University)