CS SEMINAR

Object recognition in HCI with Radar, Vision and Touch

Speaker
Professor Aaron Quigley, University of St Andrews, Scotland

18 Apr 2019 Thursday, 02:00 PM to 03:00 PM

MR3, COM2-02-26

Abstract:

The exploration of novel sensing to facilitate new interaction modalities is an active research topic in Human-Computer Interaction. Across the breadth of HCI we can see the development of new forms of interaction underpinned by the appropriation or adaptation of sensing techniques based on the measurement of sound, light, electric fields, radio waves, biosignals, etc. In this talk I will delve into three forms of sensing for object detection and interaction: radar, blurred images and touch.

RadarCat (UIST 2016, Interactions 2018, IMWUT 2018) is a small, versatile system for material and object classification which enables new forms of everyday proximate interaction with digital devices. RadarCat exploits the raw radar signals, which are unique when different materials and objects are placed on the sensor; using machine learning techniques, these objects can be accurately recognized. An object's thickness, its state (e.g., a filled or empty mug) and different body parts can also be recognized. This gives rise to research and applications in context-aware computing, tangible interaction (with tokens and objects), industrial automation (e.g., recycling) and laboratory process control (e.g., traceability). AquaCat, meanwhile, is a low-cost radar-based system capable of discriminating between a range of liquids and powders. Further, in Solinteraction we explore two research questions with radar as a platform for sensing tangible interaction: counting, ordering and identifying objects, and tracking their orientation, movement and distance. We detail the design space and practical use cases for such interaction, which allows us to identify a series of design patterns that go beyond static interaction to continuous and dynamic interaction with radar.
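
As a rough illustration of the classification step described above (not the authors' implementation), a material classifier of this kind might be sketched in Python as follows; the feature layout, class labels and data here are stand-in assumptions.

    # Hypothetical sketch: classify materials from pre-extracted radar
    # feature vectors with a random forest, in the spirit of RadarCat.
    # Feature layout, shapes and labels are illustrative assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    # Stand-in data: one row per radar "snapshot" of an object placed on
    # the sensor, e.g. amplitudes across range bins/channels, flattened.
    n_samples, n_features = 600, 256
    X = rng.normal(size=(n_samples, n_features))
    y = rng.choice(["wood", "glass", "steel", "empty_mug", "full_mug"],
                   size=n_samples)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)

    print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))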

Beyond radar, SpeCam (MobileHCI '17) is a lightweight surface color and material sensing approach for mobile devices which uses only the front-facing camera and the display as a multi-spectral light source. We leverage the natural use of mobile devices (placing them face-down) to detect the material underneath and thereby infer the location or placement of the device. SpeCam can then be used to support "discreet computing" with micro-interactions, avoiding the numerous distractions users face daily with today's mobile devices. Our two-part study shows that SpeCam can i) recognize colors in the HSB space that are 10 degrees apart near the three dominant colors, and 4 degrees apart otherwise, and ii) recognize 30 types of surface materials with 99% accuracy. These findings are further supported by a spectroscopy study. Finally, we suggest a series of applications based on simple mobile micro-interactions suitable for using the phone when placed face-down with blurred images.
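
To make the multi-spectral idea concrete, here is a minimal hypothetical sketch: flash the display in a sequence of known colors, record the front camera's mean RGB response to each, and match the stacked responses against stored material signatures. The capture_mean_rgb() function is a placeholder for real display and camera control, not SpeCam's actual API.

    # Hypothetical sketch of SpeCam-style sensing. The display acts as a
    # multi-spectral light source; the front camera measures the surface's
    # response to each illuminant. Real display/camera control is stubbed.
    import numpy as np

    ILLUMINANTS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)]
    _rng = np.random.default_rng()

    def capture_mean_rgb(display_rgb):
        """Stand-in: show display_rgb full-screen, grab a front-camera
        frame of the (blurred) surface, and return its mean RGB."""
        return _rng.uniform(0, 255, size=3)  # fake measurement

    def measure_signature():
        # One mean-RGB response per illuminant, concatenated (4 x 3 dims).
        return np.concatenate([capture_mean_rgb(c) for c in ILLUMINANTS])

    def classify(signature, library):
        # Nearest-neighbour match over stored per-material signatures.
        return min(library, key=lambda m: np.linalg.norm(signature - library[m]))

    library = {"desk_wood": measure_signature(),
               "sofa_fabric": measure_signature()}
    print(classify(measure_signature(), library))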

Finally, with touch, I will show a sensing technique for detecting finger movements on the nose, using EOG sensors embedded in the frame of a pair of eyeglasses. Eyeglasses wearers can use their fingers to exert different types of movement on the nose, such as flicking, pushing or rubbing. These subtle gestures support "discreet computing": they can be used to control a wearable computer without calling attention to the user in public. We present two user studies in which we test recognition accuracy for these movements. I will conclude this talk with some speculations around how touch, radar and vision processing might be used to realise "blended reality" interactions in AR and beyond.
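
A gesture recognizer of the kind described could look roughly like the following sketch: window the EOG signal around each gesture, extract simple per-channel statistical features, and cross-validate a classifier. The channel count, window size and data are illustrative assumptions, not taken from the studies.

    # Hypothetical sketch of recognizing nose gestures (flick, push, rub)
    # from windowed EOG signals with simple features and an SVM.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)

    def features(window):
        # window: (n_samples, n_channels) EOG segment around one gesture.
        # Per-channel mean, spread, and total absolute signal change.
        return np.concatenate([window.mean(axis=0), window.std(axis=0),
                               np.abs(np.diff(window, axis=0)).sum(axis=0)])

    # Stand-in data: 150 gestures, 200-sample windows, 2 EOG channels.
    windows = rng.normal(size=(150, 200, 2))
    labels = rng.choice(["flick", "push", "rub"], size=150)
    X = np.array([features(w) for w in windows])

    clf = SVC(kernel="rbf")
    print("cross-validated accuracy:",
          cross_val_score(clf, X, labels, cv=5).mean())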

I will also use this talk to answer questions on the upcoming Blended Reality Summer School, May 13, 2019 to May 17, 2019 at the Keio-NUS CUTE Center, National University of Singapore. Applications for this will open soon. http://cutecenter.nus.edu.sg/BRSummerSchool-2019.html


Biography:

Professor Aaron Quigley is the Chair of Human Computer Interaction and Director of Impact in the School of Computer Science at the University of St Andrews in Scotland. He is currently on sabbatical in the CUTE center in the National University of Singapore. https://aaronquigley.org/biography/


Links to relevant papers can be found here:
https://aaronquigley.org/2019/03/nus-seminar-object-recognition-in-hci-with-radar-vision-and-touch/