
EchoFlex: hand gesture recognition using ultrasound imaging


Title: EchoFlex: hand gesture recognition using ultrasound imaging

Team: University of Bristol

Authors: Jess McIntosh, Asier Marzo, Mike Fraser and Carol Phillips

Publication date: October 2017

Abstract

The research team led by Professor Mike Fraser, Asier Marzo and Jess McIntosh from the Bristol Interaction Group (BIG) at the University of Bristol, together with University Hospitals Bristol NHS Foundation Trust (UH Bristol), presented their paper [8-11 May] at one of the world’s most important conferences on human-computer interaction, ACM CHI 2017, held in Denver, USA.

Computers are growing in number and wearable computers, such as smartwatches, are gaining popularity. Devices around the home, such as WiFi light bulbs and smart thermostats, are also on the increase. However, current technology limits the capability to interact with these devices.

Hand gestures have been suggested as an intuitive and easy way of interacting with and controlling smart devices in different surroundings. For instance, a gesture could be used to dim the lights in the living room, or to open or close a window. Hand gesture recognition can be achieved in many ways, but the placement of a sensor is a major restriction and often rules out certain techniques. However, with smartwatches becoming the leading wearable device, sensors can be placed in the watch itself to sense hand movement.

The research team propose that ultrasonic imaging of the forearm could be used to recognise hand gestures. Ultrasonic imaging is already used in medicine, for example in pregnancy scans and to image muscle and tendon movement, and the researchers saw its potential as a way of understanding hand movement.
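As a rough illustration of what such a recognition pipeline could look like, the sketch below trains a simple classifier on downsampled forearm ultrasound frames. It is not the authors' exact method: the frame size, downsampling factor, number of gestures and choice of classifier are assumptions made purely for illustration, and the data here are synthetic placeholders standing in for real ultrasound recordings.

```python
# Illustrative sketch: classify hand gestures from forearm ultrasound frames.
# Frame size, gesture count and classifier are assumptions, not the paper's setup.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

N_GESTURES = 10           # assumed number of hand gestures to discriminate
FRAME_SHAPE = (128, 128)  # assumed size of a pre-processed ultrasound frame
N_SAMPLES = 500           # synthetic stand-in for recorded, labelled frames

def downsample(frame, factor=8):
    """Average-pool an ultrasound frame into a small feature grid."""
    h, w = frame.shape
    return (frame[:h - h % factor, :w - w % factor]
            .reshape(h // factor, factor, w // factor, factor)
            .mean(axis=(1, 3)))

# Synthetic placeholder data: real input would be ultrasound frames of the
# forearm, each labelled with the gesture held while the frame was captured.
rng = np.random.default_rng(0)
frames = rng.random((N_SAMPLES, *FRAME_SHAPE))
labels = rng.integers(0, N_GESTURES, size=N_SAMPLES)

# Downsample and flatten each frame into a feature vector.
features = np.stack([downsample(f).ravel() for f in frames])

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

# A small multilayer perceptron mapping frame features to gesture labels.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

With real labelled ultrasound data in place of the synthetic arrays, the same structure — per-frame feature extraction followed by a supervised classifier — would let the watch-mounted sensor map muscle deformation patterns to discrete gestures.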
