Achieving stable subspace clustering by post-processing generic clustering results
We propose an effective subspace selection scheme as a post-processing step to improve results obtained by sparse subspace clustering (SSC). Our method starts with the computation of stable subspaces using a novel random sampling scheme. The preliminary subspaces thus constructed are used to identify data points that were initially clustered incorrectly, and to reassign them to more suitable clusters based on their goodness-of-fit to the preliminary model. To improve the robustness of the algorithm, we use a dominant nearest subspace classification scheme that controls the level of sensitivity to reassignment. We demonstrate that our algorithm is convergent and superior to the direct application of a generic alternative such as principal component analysis. On several popular motion segmentation and face clustering datasets pervasively used in the sparse subspace clustering literature, the proposed method is shown to greatly reduce the incidence of clustering errors while introducing negligible disturbance to the data points that were already correctly clustered.
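The reassignment step described above can be illustrated with a minimal sketch: fit a low-dimensional basis to each preliminary cluster (here via per-cluster PCA/SVD) and move each point to the subspace with the smallest projection residual. The function names and the fixed subspace dimension `dim` are our own assumptions for illustration; this is not the paper's exact dominant nearest subspace scheme.

```python
import numpy as np

def fit_subspace_bases(X, labels, dim):
    # Fit a rank-`dim` orthonormal basis to each cluster via SVD
    # (equivalent to PCA on the cluster's columns, assumed zero-mean).
    bases = {}
    for k in np.unique(labels):
        Xk = X[:, labels == k]
        U, _, _ = np.linalg.svd(Xk, full_matrices=False)
        bases[k] = U[:, :dim]
    return bases

def reassign_by_nearest_subspace(X, bases):
    # Assign each column of X to the subspace with the smallest
    # projection residual ||x - U U^T x||.
    keys = sorted(bases)
    residuals = np.stack(
        [np.linalg.norm(X - bases[k] @ (bases[k].T @ X), axis=0) for k in keys]
    )
    return np.array([keys[i] for i in residuals.argmin(axis=0)])
```

For example, if one point lying on the first of two one-dimensional subspaces is initially mislabelled, the basis fitted to the (mostly correct) clusters still approximates the true subspaces, so the residual test moves the point back to its correct cluster.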
Pham, D-S, Arandjelovic, O & Venkatesh, S 2016, 'Achieving stable subspace clustering by post-processing generic clustering results', in 2016 International Joint Conference on Neural Networks (IJCNN), vol. 2016-October, 7727496, IEEE, pp. 2390-2396, IEEE World Congress on Computational Intelligence, Vancouver, Canada, 24-29 July. DOI: 10.1109/IJCNN.2016.7727496
© 2016, IEEE. This work has been made available online in accordance with the publisher's policies. This is the author-created, accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at ieeexplore.ieee.org / http://dx.doi.org/10.1109/IJCNN.2016.7727496
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.