Haeusser P, Plapp J, Golkov V, Aljalbout E, Cremers D (2019). Associative Deep Clustering: Training a Classification Network with No Labels.
Publication Type: Conference contribution
Publication year: 2019
Publisher: Springer Verlag
Book Volume: 11269 LNCS
Pages Range: 18-32
Conference Proceedings Title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Event location: Stuttgart, DEU
ISBN: 9783030129385
DOI: 10.1007/978-3-030-12939-2_2
We propose a novel end-to-end clustering training schedule for neural networks that is direct, i.e., the output is a probability distribution over cluster memberships. A neural network maps images to embeddings. We introduce centroid variables that have the same shape as image embeddings. These variables are jointly optimized with the network's parameters. This is achieved by a cost function that associates the centroid variables with embeddings of input images. Finally, an additional layer maps embeddings to logits, allowing for the direct estimation of the respective cluster membership. Unlike other methods, this does not require any additional classifier to be trained on the embeddings in a separate step. The proposed approach achieves state-of-the-art results in unsupervised classification, and we provide an extensive ablation study to demonstrate its capabilities.
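The pipeline described in the abstract (embedder, centroid variables of the same shape as embeddings, an association cost, and a final logits layer for direct cluster membership) can be illustrated with a minimal NumPy sketch. All dimensions, the linear "network", and the particular soft-assignment cost below are illustrative stand-ins, not the paper's actual architecture or associative loss:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (hypothetical): 8 images, 32 input features, 16-d embeddings, 3 clusters.
n, f, d, k = 8, 32, 16, 3

# "Network": a single linear map standing in for the CNN embedder.
W = rng.normal(size=(f, d)) * 0.1
images = rng.normal(size=(n, f))
embeddings = images @ W                       # (n, d)

# Centroid variables with the same shape as image embeddings;
# in the paper these are optimized jointly with the network parameters.
centroids = rng.normal(size=(k, d)) * 0.1     # (k, d)

# Illustrative association cost: squared distances between embeddings and
# centroids, weighted by a soft assignment (a stand-in for the associative loss).
dists = ((embeddings[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)  # (n, k)
soft = np.exp(-dists) / np.exp(-dists).sum(axis=1, keepdims=True)
assoc_loss = (soft * dists).sum() / n         # scalar to be minimized jointly

# Additional layer mapping embeddings to logits: cluster membership is read
# off directly, with no separately trained classifier.
V = rng.normal(size=(d, k)) * 0.1
logits = embeddings @ V
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # (n, k)
```

Each row of `probs` is a distribution over the `k` cluster memberships, which is what "direct" clustering output means here.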
APA:
Haeusser, P., Plapp, J., Golkov, V., Aljalbout, E., &amp; Cremers, D. (2019). Associative Deep Clustering: Training a Classification Network with No Labels. In A. Bruhn, T. Brox, &amp; M. Fritz (Eds.), Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (pp. 18-32). Stuttgart, DEU: Springer Verlag. https://doi.org/10.1007/978-3-030-12939-2_2
MLA:
Haeusser, Philip, et al. "Associative Deep Clustering: Training a Classification Network with No Labels." Proceedings of the 40th German Conference on Pattern Recognition, GCPR 2018, Stuttgart, DEU, edited by Andrés Bruhn, Thomas Brox, and Mario Fritz, Springer Verlag, 2019, pp. 18-32.