EyeTrAES: Fine-grained, Low-Latency Eye Tracking via Adaptive Event Slicing

Editor: 广东客   |   Category: CV   |   February 27, 2025

Note: We are not able to review papers.

PubDate: Sep 2024

Teams: Indian Institute of Technology Kharagpur; Singapore Management University

Writers: Argha Sen, Nuwan Bandara, Ila Gokarn, Thivya Kandappu, Archan Misra

PDF: EyeTrAES: Fine-grained, Low-Latency Eye Tracking via Adaptive Event Slicing

Abstract

Eye-tracking technology has gained significant attention in recent years due to its wide range of applications in human-computer interaction, virtual and augmented reality, and wearable health. Traditional RGB camera-based eye-tracking systems often struggle with poor temporal resolution and computational constraints, limiting their effectiveness in capturing rapid eye movements. To address these limitations, we propose EyeTrAES, a novel approach using neuromorphic event cameras for high-fidelity tracking of natural pupillary movement that shows significant kinematic variance. One of EyeTrAES's highlights is the use of a novel adaptive windowing/slicing algorithm that ensures just the right amount of descriptive asynchronous event data accumulation within an event frame, across a wide range of eye movement patterns. EyeTrAES then applies lightweight image processing functions over accumulated event frames from just a single eye to perform pupil segmentation and tracking. We show that these methods boost pupil tracking fidelity by 6+%, achieving IoU ≈ 92%, while incurring at least 3x lower latency than competing pure event-based eye tracking alternatives [38]. We additionally demonstrate that the microscopic pupillary motion captured by EyeTrAES exhibits distinctive variations across individuals and can thus serve as a biometric fingerprint. For robust user authentication, we train a lightweight per-user Random Forest classifier using a novel feature vector of short-term pupillary kinematics, comprising a sliding window of pupil (location, velocity, acceleration) triples. Experimental studies with two different datasets demonstrate that the EyeTrAES-based authentication technique can simultaneously achieve high authentication accuracy (≈0.82) and low processing latency (≈12 ms), and significantly outperform multiple state-of-the-art competitive baselines.
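The two core ideas in the abstract — adaptive event slicing and the sliding-window kinematic feature vector — can be sketched roughly as follows. This is an illustrative simplification only, not the authors' published algorithm: the event-count threshold, the window-duration cap, and the use of finite differences for velocity/acceleration are all assumptions.

```python
import numpy as np

def adaptive_slices(ts_us, target_count=500, max_dt_us=20_000):
    """Greedily cut an asynchronous event stream (timestamps in microseconds)
    into frames: a slice closes once it holds target_count events (fast eye
    motion -> short windows) or once max_dt_us has elapsed (slow motion ->
    window duration is capped). Thresholds here are illustrative guesses."""
    slices, start = [], 0
    for i, t in enumerate(ts_us):
        if (i - start + 1) >= target_count or (t - ts_us[start]) >= max_dt_us:
            slices.append((start, i + 1))
            start = i + 1
    if start < len(ts_us):          # flush any partial tail window
        slices.append((start, len(ts_us)))
    return slices

def kinematic_features(xy, dt):
    """Turn a sliding window of pupil-centre positions (shape (N, 2), sampled
    every dt seconds) into per-sample (location, velocity, acceleration)
    triples, flattened into one feature vector for a per-user classifier.
    Finite-difference derivatives are an assumption of this sketch."""
    v = np.gradient(xy, dt, axis=0)   # velocity  (first derivative)
    a = np.gradient(v, dt, axis=0)    # acceleration (second derivative)
    return np.hstack([xy, v, a]).ravel()
```

A per-user Random Forest classifier would then be trained on such feature vectors; the choice of classifier library is outside the scope of this sketch.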

Article link: https://paper.nweon.com/16225



Copyright: 广州映维网络有限公司 © 2025