DSL-based SNN Accelerator Design using Chisel

Plagwitz P, Hannig F, Teich J, Keszöcze O (2024)


Publication Language: English

Publication Type: Conference Contribution

Publication year: 2024

Conference Proceedings Title: 2024 27th Euromicro Conference on Digital System Design (DSD)

Event location: Paris, FR

Abstract

Spiking Neural Networks (SNNs) are a promising class of algorithms for hardware acceleration, even outperforming traditional neural networks in some cases. However, existing SNN accelerator approaches do not perform exhaustive explorations of possible network parameters, including neuron models and spike codings; instead, they often focus on a single network setting and a given fixed hardware architecture for its implementation. Chisel is a hardware construction language that allows the modeling of high-level abstractions from the Register-Transfer Level (RTL) and above. It promises a more transparent and predictable approach than compiler-based methodologies, like High-Level Synthesis (HLS). In this paper, we propose a novel multi-layer Domain-Specific Language (DSL) for SNN accelerator design based on Chisel, allowing for design space explorations that vary neuron models, spike codings, reset behaviors, and even accelerator topologies. Moreover, we propose an SNN accelerator generation framework using this DSL, which covers training to deployment. We explore and evaluate implementations and provide results regarding execution time, Field-Programmable Gate Array (FPGA) resource usage, power consumption, and accuracy.
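The abstract describes a DSL that spans a design space over neuron models, spike codings, and reset behaviors. As a purely illustrative sketch (the names below are hypothetical and do not reflect the paper's actual DSL or Chisel generators), such a parameter space can be modeled in Scala and enumerated exhaustively for exploration:

```scala
// Hypothetical design-space description inspired by the parameters named
// in the abstract; all identifiers are illustrative, not the paper's API.
enum NeuronModel { case LIF, IF, Izhikevich }
enum SpikeCoding { case Rate, Temporal }
enum ResetBehavior { case ToZero, Subtract }

case class AcceleratorConfig(
  model: NeuronModel,
  coding: SpikeCoding,
  reset: ResetBehavior
)

// Enumerate the full cross product of parameter choices; each config
// would parameterize one generated accelerator variant.
val designSpace: Seq[AcceleratorConfig] =
  for {
    m <- NeuronModel.values.toSeq
    c <- SpikeCoding.values.toSeq
    r <- ResetBehavior.values.toSeq
  } yield AcceleratorConfig(m, c, r)
```

In a Chisel-based flow, each such configuration would drive hardware generation at elaboration time, which is what makes exhaustive exploration of network and architecture variants tractable.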

How to cite

APA:

Plagwitz, P., Hannig, F., Teich, J., & Keszöcze, O. (2024). DSL-based SNN Accelerator Design using Chisel. In 2024 27th Euromicro Conference on Digital System Design (DSD). Paris, FR.

MLA:

Plagwitz, Patrick, et al. "DSL-based SNN Accelerator Design using Chisel." Proceedings of the 27th Euromicro Conference on Digital System Design (DSD), Paris, 2024.
