snake model on STM32 #8
```
Model: "sequential"
 Layer (type)                Output Shape              Param #
=================================================================
 dense (Dense)               (None, 32)                224
 dense_1 (Dense)             (None, 32)                1056
 dense_2 (Dense)             (None, 4)                 132
=================================================================
Total params: 1412 (5.52 KB)
Trainable params: 1412 (5.52 KB)
Non-trainable params: 0 (0.00 Byte)

Traceback (most recent call last):
  File "snake4m.py", line 59, in <module>
  File "snake4m.py", line 17, in weights_to_cpp
ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (3,) + inhomogeneous part.
```

Hi again,
```cpp
int activations[] = {SIGMOID, SIGMOID, SIGMOID}; // Activations for each layer excluding the input layer
```
```cpp
void activate(int l, double* z, int activation) {
```

I implemented SOFTMAX. It is not verified yet.
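For reference, here is a numerically stable softmax sketched in Python (the C++ `activate` would do the same over `z` with `exp` from `math.h`); subtracting the maximum before exponentiating prevents `exp` from overflowing without changing the result:

```python
import math

def softmax(z):
    # Subtract the max before exponentiating so exp() cannot overflow;
    # mathematically the result is identical.
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

out = softmax([1.0, 2.0, 3.0, 4.0])
print(out)  # four probabilities that sum to 1.0
```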
The snake model works fine on the STM32! The SOFTMAX is verified. It uses 56% of memory. If you need more info, please tell me.
A complete description of the project can be found here: link
Hi,
I want to use this model
```python
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(6,)),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dense(4, activation='softmax')
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001), loss='mse')
```
to play the snake game on an STM32. You mention that only Dense is implemented. Can Input be implemented as a separate layer, or must it be included in Dense?
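For what it's worth, Keras's `Input` carries no parameters; its shape is absorbed into the first Dense kernel, so on the MCU side the "input layer" is just an array of 6 values. The parameter counts in the summary above confirm this (each Dense has `inputs * units + units` params, with nothing left over for Input):

```python
# Dense params = n_in * n_out (kernel) + n_out (bias); Input contributes none.
layer_sizes = [6, 32, 32, 4]  # input, hidden, hidden, output

params = [n_in * n_out + n_out
          for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]
print(params)       # [224, 1056, 132] -- matches dense, dense_1, dense_2 above
print(sum(params))  # 1412, as reported by model.summary()
```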