Meyer N, Scherer DD, Plinge A, Mutschler C, Hartmann M (2023)
Quantum Natural Policy Gradients: Towards Sample-Efficient Reinforcement Learning
Publication Type: Conference contribution
Publication year: 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Book Volume: 2
Pages Range: 36-41
Conference Proceedings Title: Proceedings - 2023 IEEE International Conference on Quantum Computing and Engineering, QCE 2023
ISBN: 9798350343236
DOI: 10.1109/QCE57702.2023.10181
Abstract:
Reinforcement learning is a growing field in AI with great potential. Intelligent behavior is learned automatically through trial and error in interaction with the environment. However, this learning process is often costly. Using variational quantum circuits as function approximators can potentially reduce this cost. To this end, we propose the quantum natural policy gradient (QNPG) algorithm, a second-order, gradient-based routine that takes advantage of an efficient approximation of the quantum Fisher information matrix. We experimentally demonstrate that QNPG outperforms first-order, gradient-based training on different Contextual Bandits environments in terms of convergence speed and stability, and moreover reduces the sample complexity. Furthermore, we provide evidence for the practical feasibility of our approach by training on a 12-qubit hardware device.
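To make the update rule concrete, below is a minimal classical sketch of one natural policy gradient step on a contextual bandit, illustrating only the second-order structure theta <- theta + alpha * F^{-1} * grad J described in the abstract. The softmax policy, reward table, sample count, and hyperparameters are hypothetical; the paper's variational-quantum-circuit policy and its efficient quantum Fisher information approximation are not reproduced here.

# Minimal classical sketch of a natural policy gradient step on a contextual
# bandit. Illustrates the second-order update structure from the abstract,
# NOT the authors' QNPG algorithm: the softmax policy, reward table, and all
# hyperparameters below are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)

n_contexts, n_actions = 4, 2
theta = rng.normal(scale=0.1, size=(n_contexts, n_actions))  # policy parameters
alpha, damping = 0.1, 1e-3                                    # step size, Tikhonov damping

def policy(theta, s):
    """Softmax action probabilities for context s."""
    z = theta[s] - theta[s].max()
    p = np.exp(z)
    return p / p.sum()

def grad_log_pi(theta, s, a):
    """Gradient of log pi(a|s) with respect to the flattened parameters."""
    g = np.zeros_like(theta)
    g[s] = -policy(theta, s)
    g[s, a] += 1.0
    return g.reshape(-1)

# Hypothetical reward table for the bandit (one row per context).
reward = rng.uniform(size=(n_contexts, n_actions))

# Monte-Carlo estimates of the vanilla policy gradient and of the (classical)
# Fisher information matrix F = E[grad log pi * grad log pi^T].
n_samples = 512
grad_J = np.zeros(theta.size)
F = np.zeros((theta.size, theta.size))
for _ in range(n_samples):
    s = rng.integers(n_contexts)
    a = rng.choice(n_actions, p=policy(theta, s))
    g = grad_log_pi(theta, s, a)
    grad_J += reward[s, a] * g
    F += np.outer(g, g)
grad_J /= n_samples
F = F / n_samples + damping * np.eye(theta.size)

# Natural gradient step: theta <- theta + alpha * F^{-1} * grad_J
natural_grad = np.linalg.solve(F, grad_J)
theta = (theta.reshape(-1) + alpha * natural_grad).reshape(theta.shape)
print("updated parameters:\n", theta)

The damping term added to the estimated Fisher matrix is a standard regularization that keeps the linear solve well-conditioned; it stands in for whatever conditioning the full QNPG procedure applies to its quantum Fisher information approximation.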
APA:
Meyer, N., Scherer, D.D., Plinge, A., Mutschler, C., & Hartmann, M. (2023). Quantum Natural Policy Gradients: Towards Sample-Efficient Reinforcement Learning. In Hausi Müller, Yuri Alexeev, Andrea Delgado, Greg Byrd (Eds.), Proceedings - 2023 IEEE International Conference on Quantum Computing and Engineering, QCE 2023 (pp. 36-41). Bellevue, WA, US: Institute of Electrical and Electronics Engineers Inc.
MLA:
Meyer, Nico, et al. "Quantum Natural Policy Gradients: Towards Sample-Efficient Reinforcement Learning." Proceedings of the 4th IEEE International Conference on Quantum Computing and Engineering, QCE 2023, Bellevue, WA, Ed. Hausi Müller, Yuri Alexeev, Andrea Delgado, Greg Byrd, Institute of Electrical and Electronics Engineers Inc., 2023. 36-41.