This is a topic close to my heart. Could you expand a bit on why you think SNN will succeed now? There is a long history of work with them in the neuromorphic engineering community for over two decades now. Thanks!

One major benefit of Spiking Neural Networks is power consumption. A ‘normal’ neural network runs on big GPUs or CPUs that draw hundreds of watts of power; an SNN of the same network size uses just a few nanowatts.

I stopped reading here. The author has posited that SNN are 12 orders of magnitude more power efficient than typical neural networks.

A line of zeros that long requires either a citation or a correction.

Not a problem at all, thanks for your response! Here is the citation:

Rozenberg, M. J., O. Schneegans, and P. Stoliar. “An ultra-compact leaky-integrate-and-fire model for building spiking neural networks.” Scientific reports 9.1 (2019): 1-7.

Ok, this is a hardware implementation wherein each “neuron” is made out of two transistors and a thyristor. That is dramatically different than the software implementation I thought we were talking about! Carry on, forget my comment! :)


I don’t know AI or EE at all, so this might be a dumb question, but: I read the paper and it looks like they’re manually encoding the weights and connections as circuit components. Is that correct? If so, wouldn’t most of the energy savings be from using an analog computer?

That is correct. With ‘normal’ computation everything has to be exact and as precise as possible, but an ANN may only reach, say, 98% accuracy anyway. Precision in the individual calculations is therefore a lower priority, which is why they can compute with analog signals. Normal CPUs and GPUs are general-purpose, so to compute an integration the CPU needs many cycles (many transistors doing work). In an analog circuit this is ‘easily’ done with a few passive components.
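To make the integration concrete: the leaky-integrate-and-fire dynamics that the analog circuit implements with passive components can be sketched in software with a simple Euler loop. This is a minimal illustrative sketch, not the circuit model from the Rozenberg et al. paper; all parameter values here are made up for demonstration.

```python
def simulate_lif(input_current, dt=1e-4, tau=0.02, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Euler-integrate dv/dt = (v_rest - v + I) / tau; spike at threshold.

    Parameters are illustrative (hypothetical), not from any cited hardware.
    """
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        v += dt * (v_rest - v + i_in) / tau  # leaky integration step
        if v >= v_threshold:
            spike_times.append(step * dt)    # record spike time
            v = v_reset                      # reset membrane potential
    return spike_times

# A constant suprathreshold input drive produces a regular spike train.
spikes = simulate_lif([2.0] * 2000)  # 0.2 s of constant input
print(len(spikes), "spikes")
```

Note the contrast the parent comment is making: this digital loop burns a multiply-add per timestep per neuron, whereas an RC circuit performs the same leaky integration continuously "for free".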

The submission probably needs hardware tag.