SNN converted to lava has different outputs #106
-
Can you try these two things:
-
@bamsumit, `synapse_kwargs = dict(weight_norm=False, pre_hook_fx=quantize_8bit)`. Thanks and Rgds,
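For context, a minimal sketch of how these keyword arguments would typically be applied to slayer blocks during training, following the weight-quantization pattern from the lava-dl tutorials. The `quantize_8bit` helper, the `1 << 6` scale, and the layer sizes / neuron parameters below are assumptions for illustration, not the exact setup from this thread:

```python
import lava.lib.dl.slayer as slayer


def quantize_8bit(x, scale=(1 << 6), descale=False):
    # Assumed helper, modeled on the lava-dl tutorials: snap weights onto an
    # 8-bit grid so training already sees the values the fixed-point lava
    # runtime will use. The clamp range and scale are assumptions.
    if descale:
        return slayer.utils.quantize(x, step=2 / scale).clamp(-256 / scale, 255 / scale) * scale
    return slayer.utils.quantize(x, step=2 / scale).clamp(-256 / scale, 255 / scale)


# Placeholder neuron parameters; use the values from your own training run.
neuron_params = {
    'threshold': 1.25,
    'current_decay': 0.25,
    'voltage_decay': 0.03,
}

# Disable weight normalization and quantize weights on every forward pass.
synapse_kwargs = dict(weight_norm=False, pre_hook_fx=quantize_8bit)

blocks = [
    slayer.block.cuba.Dense(neuron_params, 28 * 28, 128, **synapse_kwargs),
    slayer.block.cuba.Dense(neuron_params, 128, 10, **synapse_kwargs),
]
```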
-
@bamsumit could you please tell me how to dequantize the voltage state read from the CUBA LIF neurons? During training I set the threshold of those output-layer neurons to 2048, but I did not apply any kind of quantization (like `quantize_8bit`) to those voltages during training in slayer. Thanks and Rgds,
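In case it helps, a hedged sketch of one way to read the output-layer voltage from the lava network and map it back toward the training scale. It assumes the network was loaded with `netx.hdf5.Network`, that the netx blocks expose their neuron process as `.neuron`, and that the fixed-point voltage carries the same scale as the quantized weights and threshold (taken to be `1 << 6` here); none of these values are confirmed in this thread:

```python
from lava.lib.dl import netx
from lava.magma.core.run_conditions import RunSteps
from lava.magma.core.run_configs import Loihi1SimCfg

# Rebuild the trained network from the exported HDF5 file.
net = netx.hdf5.Network(net_config='net_lava_dl.net')

# An input process feeding spikes into net.inp is assumed to be connected
# here; it is omitted to keep the sketch short.
net.run(condition=RunSteps(num_steps=16),
        run_cfg=Loihi1SimCfg(select_tag='fixed_pt', select_sub_proc_model=True))

v_fixed = net.layers[-1].neuron.v.get()  # raw fixed-point voltage of the output layer
net.stop()

# Assumed de-quantization: divide by the same scale applied to the weights and
# threshold during export, so the values become comparable to the
# floating-point voltages observed in slayer during training.
scale = 1 << 6
v_float = v_fixed / scale
print(v_float)
```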
-
Hi, I have trained an SNN to classify the MNIST dataset. The SNN is built with lava-dl and works well. But when I convert the SNN to lava, I get different outputs:
The outputs are:
'./net_ladl.pt' and './net_lava_dl.net' are attached:
weights.zip
I have installed lava 0.4.0. To use the latest lava-dl, I first installed lava-dl 0.2.0 via Anaconda and then overwrote the code in `~/anaconda3/envs/lava-env/lib/python3.10/site-packages/lava/lib/dl` with the latest code from GitHub.
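For reference, a hedged sketch of the lava-dl → lava conversion path described above: exporting the trained slayer model to HDF5 and rebuilding it as lava processes with netx. The `Network` class below is only a stand-in for the MNIST model used for training (its architecture, neuron parameters, and checkpoint handling are assumptions), but the export/load pattern follows the lava-dl tutorials:

```python
import h5py
import torch
import lava.lib.dl.slayer as slayer
from lava.lib.dl import netx


class Network(torch.nn.Module):
    """Stand-in for the trained MNIST model (assumed architecture)."""
    def __init__(self):
        super().__init__()
        neuron_params = {'threshold': 1.25, 'current_decay': 0.25,
                         'voltage_decay': 0.03}
        self.blocks = torch.nn.ModuleList([
            slayer.block.cuba.Dense(neuron_params, 28 * 28, 128),
            slayer.block.cuba.Dense(neuron_params, 128, 10),
        ])

    def forward(self, spike):
        for block in self.blocks:
            spike = block(spike)
        return spike

    def export_hdf5(self, filename):
        # Write every block into the HDF5 layout that netx expects.
        h = h5py.File(filename, 'w')
        layer = h.create_group('layer')
        for i, block in enumerate(self.blocks):
            block.export_hdf5(layer.create_group(f'{i}'))


# --- lava-dl side: restore the checkpoint and export to HDF5 ----------------
model = Network()
model.load_state_dict(torch.load('net_ladl.pt'))
model.export_hdf5('net_lava_dl.net')

# --- lava side: rebuild the same network as lava processes ------------------
net = netx.hdf5.Network(net_config='net_lava_dl.net')
print(net)  # lists the reconstructed layers
```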