A Neuromodulable Current-Mode Silicon Neuron for Robust and Adaptive Neuromorphic Systems
Neuromorphic engineering uses mixed-signal analog and digital circuits to directly emulate the computational principles of biological brains. Such electronic systems offer a high degree of adaptability, robustness, and energy efficiency across a wide range of tasks, from edge computing to robotics. Within this context, we investigate a key feature of biological neurons: their ability to carry out robust and reliable computation by adapting their input responses and spiking patterns to context through neuromodulation. Achieving analogous levels of robustness and adaptation in neuromorphic circuits through modulatory mechanisms is a largely unexplored path. We present a novel current-mode neuron design that supports robust neuromodulation with minimal model complexity and is compatible with standard CMOS technologies. We first introduce a mathematical model of the circuit and provide tools to analyze and tune the neuron's behavior; we then demonstrate, both theoretically and experimentally, the circuit's biologically plausible neuromodulation and adaptation capabilities over a wide range of parameters. All theoretical predictions were verified in experiments on a low-power 180 nm CMOS implementation of the proposed neuron circuit. Due to its underlying analog feedback structure, the proposed adaptive neuromodulable neuron exhibits a high degree of robustness, flexibility, and scalability across operating ranges of currents and temperatures, making it a strong candidate for real-world neuromorphic applications.
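The abstract does not give the circuit equations, but the core idea of neuromodulation, a context signal rescaling a neuron's input response and spiking pattern, can be illustrated with a minimal software sketch. The leaky integrate-and-fire model, time constants, and the single `gain` parameter below are illustrative assumptions, not the paper's current-mode circuit:

```python
import numpy as np

def lif_neuron(i_in, gain=1.0, tau=20e-3, v_th=1.0, dt=1e-3):
    """Leaky integrate-and-fire neuron whose input drive is scaled by a
    neuromodulatory `gain` factor (a stand-in for the paper's modulation
    mechanism). Returns the time steps at which the neuron spiked."""
    v = 0.0
    spikes = []
    for t, i in enumerate(i_in):
        # Leaky integration toward the (modulated) input current
        v += (dt / tau) * (-v + gain * i)
        if v >= v_th:
            spikes.append(t)
            v = 0.0  # reset after spike
    return spikes

# The same constant input drives more spikes when the modulatory gain
# is raised -- the neuron's firing rate adapts to the context signal.
drive = np.full(1000, 1.5)
low_mod = lif_neuron(drive, gain=1.0)
high_mod = lif_neuron(drive, gain=2.0)
```

Here raising `gain` increases the steady-state drive toward which the membrane integrates, so threshold crossings (and hence spikes) occur more often for the same input, the simplest form of the rate adaptation the paper studies in hardware.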
Forward citations
Cited by 3 Pith papers
- Multi-Timescale Conductance Spiking Networks: A Sparse, Gradient-Trainable Framework with Rich Firing Dynamics for Enhanced Temporal Processing
  Multi-timescale conductance spiking networks deliver a gradient-trainable, sparse neuron model with diverse firing regimes that outperforms LIF and AdLIF baselines on Mackey-Glass regression.
- Energy-Efficient Implementation of Spiking Recurrent Cells on FPGA
  An FPGA implementation of SRC-based SNNs reaches 96.31% MNIST accuracy at 1.74 ms per digit and drops to 0.45 mJ per digit with 4-bit weights and shorter traces while retaining richer dynamics than LIF models.
- Energy-Efficient Implementation of Spiking Recurrent Cells on FPGA
  Simplified Spiking Recurrent Cells enable FPGA SNNs to reach 92-96% MNIST accuracy at 0.45-1.74 mJ per classification while retaining richer dynamics than basic LIF models.