Now showing items 1-20 of 20

    • AdaM : adapting multi-user interfaces for collaborative environments in real-time 

      Park, Seonwook; Gebhardt, Christoph; Rädle, Roman; Feit, Anna; Vrzakova, Hana; Dayama, Niraj; Yeo, Hui Shyong; Klokmose, Clemens; Quigley, Aaron John; Oulasvirta, Antti; Hilliges, Otmar (ACM, 2018-04-21) - Conference item
      Developing cross-device multi-user interfaces (UIs) is a challenging problem. There are numerous ways in which content and interactivity can be distributed. However, good solutions must consider multiple users, their roles, ...
    • Augmented sports for learning using wearable head-worn and wrist-worn devices 

      Yeo, Hui Shyong; Koike, Hideki; Quigley, Aaron John (IEEE Computer Society, 2019-03-24) - Conference item
      Novices can learn sports in a variety of ways ranging from guidance from an instructor to watching video tutorials. In each case, subsequent and repeated self-directed practice sessions are an essential step. However, ...
    • Automated data gathering and training tool for personalized "Itchy Nose" 

      Lee, Juyoung; Yeo, Hui Shyong; Starner, Thad; Quigley, Aaron John; Kunze, Kai; Woo, Woontack (ACM, 2018-02-07) - Conference item
      In "Itchy Nose" we proposed a sensing technique for detecting finger movements on the nose for supporting subtle and discreet interaction. It uses the electrooculography sensors embedded in the frame of a pair of eyeglasses ...
    • Counterpoint : exploring mixed-scale gesture interaction for AR applications 

      Ens, Barrett; Quigley, Aaron; Yeo, Hui Shyong; Irani, Pourang; Piumsomboon, Thammathip; Billinghurst, Mark (Association for Computing Machinery, Inc, 2018-04-20) - Conference item
      This paper presents ongoing work on a design exploration for mixed-scale gestures, which interleave microgestures with larger gestures for computer interaction. We describe three prototype applications that show various ...
    • Enabling single-handed interaction in mobile and wearable computing 

      Yeo, Hui Shyong (ACM, 2018-10-11) - Conference item
      Mobile and wearable computing are increasingly pervasive as people carry and use personal devices in everyday life. Screen sizes of such devices are becoming larger and smaller to accommodate both intimate and practical ...
    • Exploring mixed-scale gesture interaction 

      Ens, Barrett; Quigley, Aaron John; Yeo, Hui Shyong; Irani, Pourang; Piumsomboon, Thammathip; Billinghurst, Mark (ACM, 2017-11-27) - Conference item
      This paper presents ongoing work toward a design exploration for combining microgestures with other types of gestures within the greater lexicon of gestures for computer interaction. We describe three prototype applications ...
    • Exploring tangible interactions with radar sensing 

      Yeo, Hui Shyong; Minami, Ryosuke; Rodriguez, Kirill; Shaker, George; Quigley, Aaron John (2018-12-27) - Journal article
      Research has explored miniature radar as a promising sensing technique for the recognition of gestures, objects, users’ presence and activity. However, within Human-Computer Interaction (HCI), its use remains underexplored, ...
    • Investigating tilt-based gesture keyboard entry for single-handed text entry on large devices 

      Yeo, Hui Shyong; Phang, Xiao-Shen; Castellucci, Steven J.; Kristensson, Per Ola; Quigley, Aaron John (ACM, 2017-05-02) - Conference item
      The popularity of mobile devices with large screens is making single-handed interaction difficult. We propose and evaluate a novel design point around a tilt-based text entry technique which supports single-handed usage. ...
    • Itchy Nose : discreet gesture interaction using EOG sensors in smart eyewear 

      Lee, Juyoung; Yeo, Hui Shyong; Dhuliawala, Murtaza; Akano, Jedidiah; Shimizu, Junichi; Starner, Thad; Quigley, Aaron John; Woo, Woontack; Kunze, Kai (ACM, 2017-09-11) - Conference item
      We propose a sensing technique for detecting finger movements on the nose, using EOG sensors embedded in the frame of a pair of eyeglasses. Eyeglasses wearers can use their fingers to exert different types of movement on ...
    • Multi-scale gestural interaction for augmented reality 

      Ens, Barrett; Quigley, Aaron John; Yeo, Hui Shyong; Irani, Pourang; Billinghurst, Mark (ACM, 2017-11-27) - Conference item
      We present a multi-scale gestural interface for augmented reality applications. With virtual objects, gestural interactions such as pointing and grasping can be convenient and intuitive, however they are imprecise, socially ...
    • RadarCat : Radar Categorization for input & interaction 

      Yeo, Hui Shyong; Flamich, Gergely; Schrempf, Patrick Maurice; Harris-Birtill, David Cameron Christopher; Quigley, Aaron John (ACM, 2016-10-16) - Conference item
      In RadarCat we present a small, versatile radar-based system for material and object classification which enables new forms of everyday proximate interaction with digital devices. We demonstrate that we can train and ...
    • RotoSwype : word-gesture typing using a ring 

      Gupta, Aakar; Ji, Chen; Yeo, Hui Shyong; Quigley, Aaron John; Vogel, Daniel (ACM, 2019-05-02) - Conference item
      We propose RotoSwype, a technique for word-gesture typing using the orientation of a ring worn on the index finger. RotoSwype enables one-handed text-input without encumbering the hand with a device, a desirable quality ...
    • Sidetap & slingshot gestures on unmodified smartwatches 

      Yeo, Hui Shyong; Lee, Juyoung; Bianchi, Andrea; Quigley, Aaron John (ACM, 2016-10-16) - Conference item
      We present a technique for detecting gestures on the edge of an unmodified smartwatch. We demonstrate two exemplary gestures, i) Sidetap - tapping on any side and ii) Slingshot - pressing on the edge and then releasing ...
    • SpeCam: sensing surface color and material with the front-facing camera of mobile device 

      Yeo, Hui Shyong; Lee, Juyoung; Bianchi, Andrea; Harris-Birtill, David; Quigley, Aaron John (ACM, 2017-09-04) - Conference item
      SpeCam is a lightweight surface color and material sensing approach for mobile devices which only uses the front-facing camera and the display as a multi-spectral light source. We leverage the natural use of mobile devices ...
    • SWAG demo : smart watch assisted gesture interaction for mixed reality head-mounted displays 

      Kim, Hyung-il; Lee, Juyoung; Yeo, Hui Shyong; Quigley, Aaron John; Woo, Woontack (IEEE Computer Society, 2018-12-16) - Conference item
      In this demonstration, we will show a prototype system with a sensor fusion approach to robustly track 6 degrees of freedom of hand movement and support intuitive hand gesture interaction and 3D object manipulation for Mixed ...
    • Tangible UI by object and material classification with radar 

      Yeo, Hui Shyong; Ens, Barrett; Quigley, Aaron John (ACM, 2017-11-27) - Conference item
      Radar signals penetrate, scatter, absorb and reflect energy into proximate objects, and ground-penetrating and aerial radar systems are well established. We describe a highly accurate system based on a combination of a ...
    • TiTAN: exploring midair text entry using freehand input 

      Yeo, Hui Shyong; Phang, Xiao-Shen; Ha, Taejin; Woo, Woontack; Quigley, Aaron John (ACM, 2017-05-06) - Conference item
      TiTAN is a spatial user interface that enables freehand, midair text entry with a distant display while only requiring a low-cost depth sensor. Our system aims to leverage one’s familiarity with the QWERTY layout. It allows ...
    • WatchMI: applications of watch movement input on unmodified smartwatches 

      Yeo, Hui Shyong; Lee, Juyoung; Bianchi, Andrea; Quigley, Aaron John (ACM, 2016-09-06) - Conference item
      In this demo, we show that it is possible to enhance touch interaction on an unmodified smartwatch to support continuous pressure touch, twist and pan gestures, by only analyzing the real-time data of an Inertial Measurement ...
    • WatchMI: pressure touch, twist and pan gesture input on unmodified smartwatches 

      Yeo, Hui Shyong; Lee, Juyoung; Bianchi, Andrea; Quigley, Aaron John (ACM, 2016-09-07) - Conference item
      The screen size of a smartwatch provides limited space to enable expressive multi-touch input, resulting in a markedly difficult and limited experience. We present WatchMI: Watch Movement Input that enhances touch interaction ...
    • Workshop on object recognition for input and mobile interaction 

      Yeo, Hui Shyong; Laput, Gierad; Gillian, Nicholas; Quigley, Aaron John (ACM, 2017-09-04) - Conference item
      Today we can see an increasing number of Object Recognition systems of very different sizes, portability, embedability and form factors which are starting to become part of the ubiquitous, tangible, mobile and wearable ...