Explanation of the ANN to SNN Conversion Mechanism #203

Closed Answered by bauerfe
arkapravag22 asked this question in Q&A
You are correct. ReLU layers are replaced by spiking layers. Other than that, the parameters and the architecture of the network remain untouched.
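To illustrate the idea, here is a minimal, hypothetical sketch of the conversion step (not the library's actual API): every ReLU in a layer list is swapped for an integrate-and-fire neuron, while the linear layers and their weights are left exactly as they are. The `Linear`, `ReLU`, `IAF`, and `convert` names are illustrative assumptions.

```python
# Hedged sketch of ANN -> SNN conversion: swap ReLU for a spiking
# (integrate-and-fire) layer; weights and architecture stay untouched.
# All class and function names here are illustrative, not a real API.

class Linear:
    def __init__(self, weight, bias):
        self.weight, self.bias = weight, bias

    def __call__(self, x):
        return [sum(w * xi for w, xi in zip(row, x)) + b
                for row, b in zip(self.weight, self.bias)]

class ReLU:
    def __call__(self, x):
        return [max(0.0, v) for v in x]

class IAF:
    """Integrate-and-fire neuron: accumulates input into a membrane
    potential and emits a spike (1.0) whenever it crosses the threshold,
    subtracting the threshold afterwards."""
    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.v = None

    def __call__(self, x):
        if self.v is None:
            self.v = [0.0] * len(x)
        out = []
        for i, xi in enumerate(x):
            self.v[i] += xi
            if self.v[i] >= self.threshold:
                out.append(1.0)
                self.v[i] -= self.threshold
            else:
                out.append(0.0)
        return out

def convert(ann_layers, threshold=1.0):
    # Only the activation layers change; Linear layers (weights, biases)
    # are reused as-is.
    return [IAF(threshold) if isinstance(layer, ReLU) else layer
            for layer in ann_layers]

ann = [Linear([[0.5, 0.5]], [0.0]), ReLU()]
snn = convert(ann)

# Feed a constant input for several time steps; the spike rate of the
# SNN approximates the ANN's ReLU activation for that input.
x = [1.0, 1.0]
T = 10
spikes = 0.0
for _ in range(T):
    out = x
    for layer in snn:
        out = layer(out)
    spikes += out[0]
rate = spikes / T  # ANN output is ReLU(0.5 + 0.5) = 1.0
```

Run over enough time steps, the firing rate of each spiking layer approximates the corresponding ReLU activation, which is why the rest of the network can stay unchanged.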

There is no automatic normalization of the weights, so you generally have to scale either the weights or the firing thresholds manually. See here for more details.
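One common way to do this scaling manually is data-based normalization: record the activations of each ReLU layer on a calibration set and raise that layer's firing threshold to the maximum observed activation (equivalently, one can divide the weights instead). The sketch below assumes this procedure; the function names and the calibration data are illustrative, not part of the library.

```python
# Hedged sketch: manual threshold scaling, since the conversion does not
# normalize weights automatically. The calibration procedure and names
# are illustrative assumptions.

def max_activation(layer_outputs):
    """Largest activation this layer produced over a calibration set."""
    return max(max(out) for out in layer_outputs)

def scaled_threshold(base_threshold, layer_outputs):
    # Setting the firing threshold to the maximum observed activation
    # keeps per-step spike rates in [0, 1], the same effect as dividing
    # the layer's weights by that maximum.
    return base_threshold * max_activation(layer_outputs)

# Recorded ReLU outputs for one layer over three calibration samples.
calibration_outputs = [[0.2, 1.5], [0.4, 3.0], [0.1, 0.9]]
threshold = scaled_threshold(1.0, calibration_outputs)  # 3.0
```

Without such scaling, activations larger than the threshold saturate (the neuron cannot fire more than once per time step), which is the main source of accuracy loss after conversion.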
