-
Quick question regarding this tutorial:

```python
sinabs_model = from_model(
    ann, input_shape=input_shape, add_spiking_output=True, synops=False, num_timesteps=num_timesteps
)
```

What I don't understand is: where does the weight conversion from the ANN to the SNN actually happen?
Replies: 4 comments
-
Hi @cowolff, thank you for bringing up this question. Your understanding of how the conversion from ANN to SNN works is pretty good. Let me try to summarize what's happening:

First, the original model is copied, including all layers, weights, and all other parameters. After that, the activation layers (e.g. ReLU) are replaced by spiking layers. So the weights are copied to the new model 1:1 - their shape and values stay exactly the same. This means that you might have to scale weights or firing thresholds to make sure that the spiking activity is in the desired range. You can read more about this here: https://sinabs.readthedocs.io/en/v1.2.1/tutorials/weight_scaling.html
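The two steps described above (copy everything 1:1, then swap the activations) can be sketched with a toy stand-in model. This is illustrative only, not the sinabs implementation; the layer dicts and the `scale` factor are made-up examples:

```python
import copy

# Toy "model": layers with weights and an activation tag (illustrative only).
ann = [
    {"type": "linear", "weights": [[0.5, -1.2], [0.8, 0.3]]},
    {"type": "relu"},
]

# Step 1: copy everything, weights included, 1:1.
snn = copy.deepcopy(ann)

# Step 2: replace activation layers by spiking layers (here just a tag swap).
for layer in snn:
    if layer["type"] == "relu":
        layer["type"] = "spiking"

# The conversion itself leaves weight shapes and values exactly the same.
assert snn[0]["weights"] == ann[0]["weights"]

# Optional step 3 (see the weight-scaling tutorial linked above): rescale
# weights so the spiking activity lands in the desired range.
scale = 2.0  # hypothetical scaling factor
snn[0]["weights"] = [[w * scale for w in row] for row in snn[0]["weights"]]
```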
-
Thanks for your reply! But I am still wondering where this copying of the weights happens in the code of the library? :)
-
In this line, `from_model` calls the `replace_module` function (defined here), which first makes a `deepcopy` of the network and then replaces the `ReLU` layers.
-
I converted this issue to a discussion. Since there seem to be no further questions, I will close it.
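For anyone landing here later, the deepcopy-then-replace pattern can be sketched in plain PyTorch. This is a minimal sketch, not the sinabs code; `SpikingStub` and `convert` are hypothetical names standing in for the real spiking layer and for `replace_module`:

```python
import copy

import torch
import torch.nn as nn


class SpikingStub(nn.Module):
    """Placeholder standing in for a real spiking activation layer."""

    def forward(self, x):
        # Crude stand-in: emit a "spike" wherever the input is positive.
        return (x > 0).float()


def convert(model: nn.Module) -> nn.Module:
    # Step 1: deep-copy the whole network, so all weights come over 1:1.
    new_model = copy.deepcopy(model)

    # Step 2: recursively swap every ReLU for a spiking layer.
    def _replace(module: nn.Module) -> None:
        for name, child in module.named_children():
            if isinstance(child, nn.ReLU):
                setattr(module, name, SpikingStub())
            else:
                _replace(child)

    _replace(new_model)
    return new_model


ann = nn.Sequential(nn.Linear(4, 3), nn.ReLU(), nn.Linear(3, 2), nn.ReLU())
snn = convert(ann)

# The Linear weights are identical; only the ReLUs were replaced.
assert torch.equal(ann[0].weight, snn[0].weight)
```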