Show simple item record

Item metadata

dc.contributor.author: Wu, Chi-Jui
dc.contributor.author: Quigley, Aaron John
dc.contributor.author: Harris-Birtill, David Cameron Christopher
dc.date.accessioned: 2016-12-12T12:30:40Z
dc.date.available: 2016-12-12T12:30:40Z
dc.date.issued: 2017-02
dc.identifier.citation: Wu, C-J, Quigley, A J & Harris-Birtill, D C C 2017, 'Out of sight: a toolkit for tracking occluded human joint positions', Personal and Ubiquitous Computing, vol. 21, no. 1, pp. 125-135. https://doi.org/10.1007/s00779-016-0997-6
dc.identifier.issn: 1617-4909
dc.identifier.other: PURE: 248040132
dc.identifier.other: PURE UUID: 2d9d2fe7-0c3c-44f7-89f1-13ffd06cd106
dc.identifier.other: Scopus: 85001105653
dc.identifier.other: ORCID: /0000-0002-5274-6889/work/34040078
dc.identifier.other: ORCID: /0000-0002-0740-3668/work/43832023
dc.identifier.other: WOS: 000393760800012
dc.identifier.uri: https://hdl.handle.net/10023/9961
dc.description.abstract: Real-time identification and tracking of the joint positions of people can be achieved with off-the-shelf sensing technologies such as the Microsoft Kinect, or other camera-based systems with computer vision. However, tracking is constrained by the system's field of view of people. When a person is occluded from the camera view, their position can no longer be followed. Out of Sight addresses the occlusion problem in depth-sensing tracking systems. Our new tracking infrastructure provides human skeleton joint positions during occlusion, by combining the field of view of multiple Kinects using geometric calibration and affine transformation. We verified the technique's accuracy through a system evaluation consisting of 20 participants in stationary position and in motion, with two Kinects positioned parallel, 45°, and 90° apart. Results show that our skeleton matching is accurate to within 16.1 cm (s.d. = 5.8 cm), which is within a person's personal space. In a realistic scenario study, groups of two people quickly occlude each other, and occlusion is resolved for 85% of the participants. A RESTful API was developed to allow distributed access to occlusion-free skeleton joint positions. As a further contribution, we provide the system as open source.
dc.format.extent: 11
dc.language.iso: eng
dc.relation.ispartof: Personal and Ubiquitous Computing
dc.rights: © The Author(s) 2016. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
dc.subject: Kinect
dc.subject: Occlusion
dc.subject: Toolkit
dc.subject: QA75 Electronic computers. Computer science
dc.subject: TA Engineering (General). Civil engineering (General)
dc.subject: DAS
dc.subject: BDC
dc.subject: R2C
dc.subject.lcc: QA75
dc.subject.lcc: TA
dc.title: Out of sight: a toolkit for tracking occluded human joint positions
dc.type: Journal article
dc.description.version: Publisher PDF
dc.contributor.institution: University of St Andrews. School of Computer Science
dc.identifier.doi: https://doi.org/10.1007/s00779-016-0997-6
dc.description.status: Peer reviewed
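The abstract describes combining the fields of view of multiple Kinects via geometric calibration and an affine transformation, so that a joint occluded from one sensor can be reported from another sensor's view. The sketch below illustrates that idea only in outline: the function names, joint coordinates, and calibration points are hypothetical and are not the toolkit's actual API. It estimates a 3D affine transform from point correspondences by least squares, then uses it to express a joint seen by a secondary Kinect in the primary Kinect's coordinate frame.

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares 3D affine transform (A, t) such that dst ~= src @ A.T + t.

    Requires at least four non-coplanar point correspondences."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    # Augment source points with ones so rotation/scale and translation
    # are solved jointly: [src | 1] @ M = dst, with M stacking A.T over t.
    ones = np.ones((len(src), 1))
    M, *_ = np.linalg.lstsq(np.hstack([src, ones]), dst, rcond=None)
    return M[:3].T, M[3]

def to_primary_frame(joints, A, t):
    """Map joint positions from the secondary Kinect into the primary frame."""
    return np.asarray(joints, float) @ A.T + t

# Hypothetical calibration correspondences: four non-coplanar joint
# positions (in metres) observed simultaneously by both sensors.
secondary = np.array([[0.0, 0.0, 2.0],
                      [0.5, 0.0, 2.0],
                      [0.0, 1.0, 2.5],
                      [0.5, 1.0, 3.0]])
theta = np.deg2rad(45)  # e.g. the two Kinects placed 45 degrees apart
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0,           1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t_true = np.array([0.2, 0.0, -0.5])
primary = secondary @ R.T + t_true  # same joints in the primary frame

A, t = estimate_affine(secondary, primary)

# A joint occluded from the primary Kinect but visible to the secondary
# one can now be reported in the primary coordinate frame.
occluded_joint = np.array([[0.25, 0.8, 2.4]])
recovered = to_primary_frame(occluded_joint, A, t)
```

With exact correspondences, as here, the least-squares solve recovers the rotation and translation exactly; with noisy Kinect measurements it returns the best-fit affine map, which is what makes a calibration step with more than the minimum number of correspondences worthwhile.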

