Weber S, Demmel N, Cremers D (2021): Multidirectional Conjugate Gradients for Scalable Bundle Adjustment
Publication Type: Conference contribution
Publication year: 2021
Publisher: Springer Science and Business Media Deutschland GmbH
Book Volume: 13024 LNCS
Pages Range: 712-724
Conference Proceedings Title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Event location: Virtual, Online
ISBN: 9783030926588
DOI: 10.1007/978-3-030-92659-5_46
Abstract: We revisit the problem of large-scale bundle adjustment and propose a technique called Multidirectional Conjugate Gradients that accelerates the solution of the normal equation by up to 61%. The key idea is that we enlarge the search space of classical preconditioned conjugate gradients to include multiple search directions. As a consequence, the resulting algorithm requires fewer iterations, leading to a significant speedup of large-scale reconstruction, in particular for denser problems where traditional approaches notoriously struggle. We provide a number of experimental ablation studies revealing the robustness to variations in the hyper-parameters and the speedup as a function of problem density.
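The abstract describes the idea only at a high level: instead of a single preconditioned search direction per iteration, the solver for the normal equations works with several directions and minimizes the quadratic over their span, which trades a small per-iteration Gram solve for fewer iterations. The following is a minimal illustrative sketch in the spirit of multi-preconditioned conjugate gradients, not the paper's implementation; the function names, the toy problem, and the two block-Jacobi preconditioners (stand-ins for camera/landmark blocks) are assumptions made for this example.

```python
import numpy as np


def multi_preconditioned_cg(A, b, preconds, tol=1e-8, max_iter=200):
    """Solve the SPD system A x = b using several preconditioned search
    directions per iteration (multi-preconditioned CG style sketch).

    preconds: list of callables, each mapping a residual r to M^{-1} r
    for one SPD preconditioner M.
    """
    x = np.zeros(b.shape[0])
    r = b - A @ x
    # One preconditioned residual per preconditioner -> a block of directions.
    P = np.column_stack([M(r) for M in preconds])
    for _ in range(max_iter):
        AP = A @ P
        # Minimize the quadratic over span(P): a small Gram system replaces
        # the scalar step length; lstsq tolerates nearly dependent directions.
        alpha = np.linalg.lstsq(P.T @ AP, P.T @ r, rcond=None)[0]
        x += P @ alpha
        r -= AP @ alpha
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        Z = np.column_stack([M(r) for M in preconds])
        # A-orthogonalize the new block against the previous one only
        # (the usual short recurrence of (multi-)preconditioned CG).
        beta = np.linalg.lstsq(P.T @ AP, AP.T @ Z, rcond=None)[0]
        P = Z - P @ beta
    return x


if __name__ == "__main__":
    # Toy "normal equations" J^T J x = J^T res from a random, well-conditioned J.
    rng = np.random.default_rng(0)
    J = rng.standard_normal((300, 60)) + 3.0 * np.eye(300, 60)
    A, b = J.T @ J, J.T @ rng.standard_normal(300)

    # Two hypothetical block-Jacobi preconditioners on disjoint variable blocks.
    def block_jacobi(lo, hi):
        Ainv = np.linalg.inv(A[lo:hi, lo:hi])
        def apply(r):
            z = r.copy()
            z[lo:hi] = Ainv @ r[lo:hi]
            return z
        return apply

    x = multi_preconditioned_cg(A, b, [block_jacobi(0, 30), block_jacobi(30, 60)])
    print("relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```

The enlarged per-iteration search space is what reduces the iteration count; the paper's ablations study exactly this trade-off as a function of problem density.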
APA:
Weber, S., Demmel, N., & Cremers, D. (2021). Multidirectional Conjugate Gradients for Scalable Bundle Adjustment. In Christian Bauckhage, Juergen Gall, Alexander Schwing (Eds.), Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (pp. 712-724). Virtual, Online: Springer Science and Business Media Deutschland GmbH.
MLA:
Weber, Simon, Nikolaus Demmel, and Daniel Cremers. "Multidirectional Conjugate Gradients for Scalable Bundle Adjustment." Proceedings of the 43rd DAGM German Conference on Pattern Recognition, DAGM GCPR 2021, Virtual, Online. Ed. Christian Bauckhage, Juergen Gall, and Alexander Schwing. Springer Science and Business Media Deutschland GmbH, 2021. 712-724.