Hi, I have encountered a problem while working with nn-tools for GAP8 (the AI-deck, to be more specific). Please see the details below.
The problem:
Outputs from the network are very different between running it in TensorFlow (tf-lite after quantization) and on the actual AI-deck. My neural network takes two inputs: a stack of three grey-scale images and a stack of three state arrays [velocities and quaternions]. Despite making the inputs exactly the same (0s for input_1 and 180s for input_2) for both the AI-deck and tflite, the outputs are very different (as shown below):
Tflite:
[[-28 28 123]]
AI-deck:
171 16 61
I could understand a small discrepancy between the two, but here the results are completely different. I am using GAP-sdk V4.12.0.
What I tried
The network uses a transpose in the middle, and I tried playing with it (removing it, etc.) to see if I could make the outputs match - no luck
I tried disabling and re-enabling the 'adjust' option for nn-tools (while taking care to make the input shape appropriate) - no luck
I tried adding a softmax at the end of my network - no luck
I tried defining the outputs as different types - no luck
I have attached a link (below) to a minimal example for the AI-deck, with instructions on how to run it should you wish to check it (minimal_network_code.c is the main entry point), as well as a minimal working example for running the tflite network from Python. I have also included my network as a quantized tflite file (inside the GAP8 network test repository, at model/test_model.tflite).
Python script to check tflite performance:
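The full script is in the repository linked below; roughly, it does the following (a minimal sketch using the standard tf.lite.Interpreter API — the model path follows the repository layout, and matching the state input by the substring "input_2" in its tensor name is an assumption):

```python
import numpy as np
import tensorflow as tf

# Load the quantized model (path as in the linked repository).
interpreter = tf.lite.Interpreter(model_path="model/test_model.tflite")
interpreter.allocate_tensors()

# Fill every input with a constant: 180 for the state stack (input_2),
# 0 for the image stack (input_1). Shapes and dtypes are taken from the
# model itself so they match the quantized graph. Note: a constant of 180
# only fits directly if that input is uint8; for an int8 input the raw
# value would first need converting with the input's (scale, zero_point).
for det in interpreter.get_input_details():
    value = 180 if "input_2" in det["name"] else 0
    data = np.full(det["shape"], value, dtype=det["dtype"])
    interpreter.set_tensor(det["index"], data)

interpreter.invoke()

out = interpreter.get_output_details()[0]
print(interpreter.get_tensor(out["index"]))   # gives [[-28  28 123]] for me
print("output quantization (scale, zero_point):", out["quantization"])
```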
AI-deck side code:
https://github.com/pladosz/GAP8_network_test/
I suspect it is some sort of problem with the output being interpreted as the wrong type, but I am not sure.
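(For what it is worth, simply reinterpreting the AI-deck values as signed 8-bit does not make them match either, so a plain signed/unsigned mix-up alone would not seem to explain it — quick check below:)

```python
import numpy as np

aideck = np.array([171, 16, 61], dtype=np.uint8)   # values printed by the AI-deck
print(aideck.view(np.int8))                        # [-85  16  61] if the bytes were really int8
print(np.array([-28, 28, 123], dtype=np.int8))     # tflite output, for comparison
```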
Many Thanks,