Learnt quasi-transitive similarity for retrieval from large collections of faces
We are interested in identity-based retrieval of face sets from large unlabelled collections acquired in uncontrolled environments. Given a baseline algorithm for measuring the similarity of two face sets, the meta-algorithm introduced in this paper seeks to leverage the structure of the data corpus to make the best use of the available baseline. In particular, we show how partial transitivity of inter-personal similarity can be exploited to improve the retrieval of particularly challenging sets which poorly match the query under the baseline measure. We: (i) describe the use of proxy sets as a means of computing the similarity between two sets, (ii) introduce transitivity meta-features based on the similarity of salient modes of appearance variation between sets, (iii) show how quasi-transitivity can be learnt from such features without any labelling or manual intervention, and (iv) demonstrate the effectiveness of the proposed methodology through experiments on the notoriously challenging YouTube database.
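The proxy-set idea sketched above can be illustrated with a toy example. The function below is a minimal sketch, not the paper's method: the paper learns quasi-transitivity from meta-features, whereas here the fusion rule (a bridge through a proxy is only as strong as its weaker link, i.e. the min of the two baseline scores) is a hand-picked assumption, and `baseline_sim` stands in for any set-to-set similarity measure.

```python
def quasi_transitive_similarity(baseline_sim, query, target, corpus):
    """Boost a weak direct match by routing through proxy sets.

    Illustrative only: the direct baseline score is combined with the
    best "bridge" through any set p in the corpus, where a bridge is
    scored as min(sim(query, p), sim(p, target)). Using query or
    target itself as the proxy simply reproduces the direct score,
    so no filtering is needed.
    """
    direct = baseline_sim(query, target)
    bridged = max(
        (min(baseline_sim(query, p), baseline_sim(p, target))
         for p in corpus),
        default=direct,
    )
    return max(direct, bridged)


# Toy usage: "sets" are points on a line, similarity decays with distance.
sim = lambda a, b: 1.0 / (1.0 + abs(a - b))
score = quasi_transitive_similarity(sim, 0.0, 2.0, [0.0, 1.0, 2.0])
# The direct score is 1/3; bridging through the proxy at 1.0 gives 0.5.
```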
Arandelovic, O 2016, 'Learnt quasi-transitive similarity for retrieval from large collections of faces', in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, IEEE Computer Society, pp. 4883-4892, IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, United States, 26/06/16. https://doi.org/10.1109/CVPR.2016.528
© 2016, IEEE. This work is made available online in accordance with the publisher's policies. This is the author-created, accepted-version manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at ieeexplore.ieee.org via https://doi.org/10.1109/CVPR.2016.528
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.