Biologically inspired vision for human-robot interaction
Human-robot interaction is an interdisciplinary research area that is becoming increasingly relevant as robots enter our homes, workplaces, and schools. To navigate safely among us, robots must be able to understand human behavior, to communicate, and to interpret instructions from humans, either by recognizing their speech or by understanding their body movements and gestures. We present a biologically inspired vision system for human-robot interaction that integrates several components: visual saliency, stereo vision, face and hand detection, and gesture recognition. Visual saliency is computed using color, motion, and disparity. Both the stereo vision and gesture recognition components are based on keypoints coded by means of cortical V1 simple, complex, and end-stopped cells. Hand and face detection is achieved using a linear SVM classifier. The system was tested on a child-sized robot.
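The abstract mentions that visual saliency is computed from color, motion, and disparity but gives no implementation details. As a rough illustration of that kind of feature-map fusion, here is a minimal numpy sketch; the map names, weights, and normalization scheme are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def normalize(m):
    # Rescale a feature map to [0, 1]; a flat map becomes all zeros.
    rng = m.max() - m.min()
    return (m - m.min()) / rng if rng > 0 else np.zeros_like(m)

def saliency(color, motion, disparity, weights=(1.0, 1.0, 1.0)):
    # Fuse the three conspicuity maps (hypothetical weighted sum)
    # into a single saliency map, renormalized to [0, 1].
    maps = [normalize(m) for m in (color, motion, disparity)]
    combined = sum(w * m for w, m in zip(weights, maps))
    return normalize(combined)

# Toy 4x4 feature maps with random values (placeholder inputs).
gen = np.random.default_rng(0)
c, m, d = gen.random((3, 4, 4))
s = saliency(c, m, d)
print(s.shape, float(s.min()), float(s.max()))
```

In a real system the three maps would come from dedicated color-contrast, optical-flow, and stereo-disparity channels; the uniform weights here are only a placeholder.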
Saleiro, M., Farrajota, M., Terzić, K., Krishna, S., Rodrigues, J. M. F. & du Buf, J. M. H. 2015, 'Biologically inspired vision for human-robot interaction', in M. Antona & C. Stephanidis (eds), Universal Access in Human-Computer Interaction. Access to Interaction: 9th International Conference, UAHCI 2015, Held as Part of HCI International 2015, Los Angeles, CA, USA, August 2-7, 2015, Proceedings, Part II. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 9176, Springer, Cham, pp. 505-517. 9th International Conference on Universal Access in Human-Computer Interaction, UAHCI 2015, held as part of the 17th International Conference on Human-Computer Interaction, HCI International 2015, Los Angeles, United States, 2/08/15. https://doi.org/10.1007/978-3-319-20681-3_48
Universal Access in Human-Computer Interaction. Access to Interaction
© 2015, Springer International Publishing Switzerland. This work has been made available online in accordance with the publisher's policies. This is the author-created, accepted manuscript following peer review and, as such, may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.1007/978-3-319-20681-3_48
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.