Publications

Conference (International) ReflecTrace: Touchless Hover Interaction on Commodity Smartphones via Corneal Reflection

Yudai Nakamura (Keio University), Kaori Ikematsu, Naoto Takayanagi (Keio University), Kunihiro Kato (Tokyo University of Technology), Toshiya Isomoto, Yuta Sugiura (Keio University)

The 31st International Conference on Intelligent User Interfaces (ACM IUI 2026)

2026.3.22

We propose an approach to detect finger hover inputs on a smartphone screen using corneal reflection images captured by the device’s built-in front camera. This method requires no external sensors or hardware, enabling hover input detection in the near-screen space that is not directly visible to the camera. Using a convolutional neural network (CNN), we estimate the two-dimensional position of a hovering finger and classify it into a predefined screen grid. Experimental results show that our model achieves approximately 95% accuracy for coarse grids and maintains over 88% accuracy for finer divisions. Furthermore, our system demonstrates real-time processing capability with an end-to-end latency of approximately 22 ms on a standard smartphone. These findings highlight the practical feasibility of camera-only hover sensing and suggest a wide range of applications, such as touchless interaction when touch is undesirable, pre-touch UI adaptation, and accessibility support on commodity mobile devices.
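The final step described above, mapping an estimated two-dimensional hover position onto a predefined screen grid, can be sketched as follows. This is a minimal illustration only; the grid resolution, screen dimensions, and function name are illustrative assumptions, not values or code from the paper.

```python
# Hedged sketch: assigning an estimated 2-D hover position to a screen grid
# cell. All parameter values here are illustrative, not from the paper.

def hover_cell(x, y, screen_w, screen_h, cols, rows):
    """Return the row-major index of the grid cell containing (x, y)."""
    col = min(int(x / screen_w * cols), cols - 1)  # clamp at right edge
    row = min(int(y / screen_h * rows), rows - 1)  # clamp at bottom edge
    return row * cols + col

# Example: a 3x3 grid on a 1080x2340 screen; the screen center falls in
# the middle cell (index 4).
print(hover_cell(540, 1170, 1080, 2340, 3, 3))  # → 4
```

Finer grid divisions (more rows and columns) trade classification accuracy for spatial resolution, matching the coarse-versus-fine accuracy results reported above.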

Paper: ReflecTrace: Touchless Hover Interaction on Commodity Smartphones via Corneal Reflection (external site)