Explanation of the ANN to SNN Conversion Mechanism #203
-
Hi, I am converting an ANN to an SNN and would like to understand the conversion mechanism. Is it just a matter of replacing each ReLU with a spiking layer? If so, it would be necessary to tune the conversion manually by experimenting with parameters such as the firing threshold, the reset behaviour, and the number of spikes per timestep. In other words, does Sinabs apply some mechanism of its own to adjust these parameters, or does it simply add a spiking layer with the parameters we specify?
Replies: 1 comment
-
You are correct: ReLU layers are replaced by spiking layers. Other than that, the parameters and the architecture of the network remain untouched. There is no automatic normalization of the weights, so you generally have to scale either the weights or the firing thresholds manually. See here for more details.
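To illustrate why manual scaling matters, here is a minimal, self-contained sketch (not Sinabs source code; all names are illustrative) of replacing a ReLU with an integrate-and-fire (IAF) neuron that uses a subtract-on-spike reset. Over many timesteps the firing rate approximates `relu(x) / threshold`, so changing the threshold or the incoming weights directly rescales the output rate:

```python
def relu(x):
    return max(x, 0.0)

class IAFNeuron:
    """Toy integrate-and-fire neuron with subtract-on-spike reset."""

    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.v = 0.0  # membrane potential

    def step(self, x):
        """Integrate one timestep of input; return 1.0 on a spike, else 0.0."""
        self.v += x
        if self.v >= self.threshold:
            # Subtracting (rather than zeroing) keeps the residual charge,
            # which makes the long-run rate track the input more closely.
            self.v -= self.threshold
            return 1.0
        return 0.0

def spike_rate(x, threshold=1.0, timesteps=100):
    """Average firing rate when the same input is presented every timestep."""
    neuron = IAFNeuron(threshold)
    spikes = sum(neuron.step(x) for _ in range(timesteps))
    return spikes / timesteps
```

Because the rate behaves like `relu(x) / threshold`, halving the threshold (or, equivalently, doubling the weights feeding into the layer) roughly doubles the firing rate. Since no normalization is applied during conversion, that scaling is exactly what you have to tune by hand.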