  • AdaM : adapting multi-user interfaces for collaborative environments in real-time 

    Park, Seonwook; Gebhardt, Christoph; Rädle, Roman; Feit, Anna; Vrzakova, Hana; Dayama, Niraj; Yeo, Hui Shyong; Klokmose, Clemens; Quigley, Aaron John; Oulasvirta, Antti; Hilliges, Otmar (ACM, 2018-04-21) - Conference item
    Developing cross-device multi-user interfaces (UIs) is a challenging problem. There are numerous ways in which content and interactivity can be distributed. However, good solutions must consider multiple users, their roles, ...
  • Automated data gathering and training tool for personalized "Itchy Nose" 

    Lee, Juyoung; Yeo, Hui Shyong; Starner, Thad; Quigley, Aaron John; Kunze, Kai; Woo, Woontack (ACM, 2018-02-07) - Conference item
    In "Itchy Nose" we proposed a sensing technique for detecting finger movements on the nose for supporting subtle and discreet interaction. It uses the electrooculography sensors embedded in the frame of a pair of eyeglasses ...
  • Exploring mixed-scale gesture interaction 

    Ens, Barrett; Quigley, Aaron John; Yeo, Hui Shyong; Irani, Pourang; Piumsomboon, Thammathip; Billinghurst, Mark (ACM, 2017-11-27) - Conference item
    This paper presents ongoing work toward a design exploration for combining microgestures with other types of gestures within the greater lexicon of gestures for computer interaction. We describe three prototype applications ...
  • Investigating tilt-based gesture keyboard entry for single-handed text entry on large devices 

    Yeo, Hui Shyong; Phang, Xiao-Shen; Castellucci, Steven J.; Kristensson, Per Ola; Quigley, Aaron John (ACM, 2017-05-02) - Conference item
    The popularity of mobile devices with large screens is making single-handed interaction difficult. We propose and evaluate a novel design point around a tilt-based text entry technique which supports single-handed usage. ...
  • Itchy Nose : discreet gesture interaction using EOG sensors in smart eyewear 

    Lee, Juyoung; Yeo, Hui Shyong; Dhuliawala, Murtaza; Akano, Jedidiah; Shimizu, Junichi; Starner, Thad; Quigley, Aaron John; Woo, Woontack; Kunze, Kai (ACM, 2017-09-11) - Conference item
    We propose a sensing technique for detecting finger movements on the nose, using EOG sensors embedded in the frame of a pair of eyeglasses. Eyeglasses wearers can use their fingers to exert different types of movement on ...
  • Multi-scale gestural interaction for augmented reality 

    Ens, Barrett; Quigley, Aaron John; Yeo, Hui Shyong; Irani, Pourang; Billinghurst, Mark (ACM, 2017-11-27) - Conference item
    We present a multi-scale gestural interface for augmented reality applications. With virtual objects, gestural interactions such as pointing and grasping can be convenient and intuitive; however, they are imprecise, socially ...
  • RadarCat : Radar Categorization for input & interaction 

    Yeo, Hui Shyong; Flamich, Gergely; Schrempf, Patrick; Harris-Birtill, David Cameron Christopher; Quigley, Aaron John (ACM, 2016-10-16) - Conference item
    In RadarCat we present a small, versatile radar-based system for material and object classification which enables new forms of everyday proximate interaction with digital devices. We demonstrate that we can train and ...
  • Sidetap & slingshot gestures on unmodified smartwatches 

    Yeo, Hui Shyong; Lee, Juyoung; Bianchi, Andrea; Quigley, Aaron John (ACM, 2016-10-16) - Conference item
    We present a technique for detecting gestures on the edge of an unmodified smartwatch. We demonstrate two exemplary gestures, i) Sidetap - tapping on any side and ii) Slingshot - pressing on the edge and then releasing ...
  • SpeCam: sensing surface color and material with the front-facing camera of a mobile device 

    Yeo, Hui Shyong; Lee, Juyoung; Bianchi, Andrea; Harris-Birtill, David; Quigley, Aaron John (ACM, 2017-09-04) - Conference item
    SpeCam is a lightweight surface color and material sensing approach for mobile devices which only uses the front-facing camera and the display as a multi-spectral light source. We leverage the natural use of mobile devices ...
  • Tangible UI by object and material classification with radar 

    Yeo, Hui Shyong; Ens, Barrett; Quigley, Aaron John (ACM, 2017-11-27) - Conference item
    Radar signals penetrate, scatter, absorb and reflect energy into proximate objects, and ground-penetrating and aerial radar systems are well established. We describe a highly accurate system based on a combination of a ...
  • TiTAN: exploring midair text entry using freehand input 

    Yeo, Hui Shyong; Phang, Xiao-Shen; Ha, Taejin; Woo, Woontack; Quigley, Aaron John (ACM, 2017-05-06) - Conference item
    TiTAN is a spatial user interface that enables freehand, midair text entry with a distant display while only requiring a low-cost depth sensor. Our system aims to leverage one’s familiarity with the QWERTY layout. It allows ...
  • WatchMI: applications of watch movement input on unmodified smartwatches 

    Yeo, Hui Shyong; Lee, Juyoung; Bianchi, Andrea; Quigley, Aaron John (ACM, 2016-09-06) - Conference item
    In this demo, we show that it is possible to enhance touch interaction on an unmodified smartwatch to support continuous pressure touch, twist and pan gestures, by only analyzing the real-time data of Inertial Measurement ...
  • WatchMI: pressure touch, twist and pan gesture input on unmodified smartwatches 

    Yeo, Hui Shyong; Lee, Juyoung; Bianchi, Andrea; Quigley, Aaron John (ACM, 2016-09-07) - Conference item
    The screen size of a smartwatch provides limited space to enable expressive multi-touch input, resulting in a markedly difficult and limited experience. We present WatchMI: Watch Movement Input that enhances touch interaction ...
  • Workshop on object recognition for input and mobile interaction 

    Yeo, Hui Shyong; Laput, Gierad; Gillian, Nicholas; Quigley, Aaron John (ACM, 2017-09-04) - Conference item
    Today we can see an increasing number of Object Recognition systems with very different sizes, portability, embeddability and form factors which are starting to become part of the ubiquitous, tangible, mobile and wearable ...