snake model on STM32 #8

Open
sv1bds opened this issue Oct 24, 2023 · 5 comments


sv1bds commented Oct 24, 2023

Hi,
I want to use this model:

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(6,)),
        tf.keras.layers.Dense(32, activation='relu'),
        tf.keras.layers.Dense(32, activation='relu'),
        tf.keras.layers.Dense(4, activation='softmax')
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001), loss='mse')

to play the snake game on an STM32. You mention that only Dense is implemented. Can Input be implemented as a separate layer, or must it be folded into the first Dense layer?
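In case it helps, the same network can also be written without a separate Input layer by giving the first Dense layer an input_shape (just a sketch of the equivalent definition, not taken from the repo):

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation='relu', input_shape=(6,)),  # input shape folded into the first Dense
        tf.keras.layers.Dense(32, activation='relu'),
        tf.keras.layers.Dense(4, activation='softmax')
    ])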


sv1bds commented Oct 24, 2023

Model: "sequential"


Layer (type) Output Shape Param #

=================================================================

dense (Dense) (None, 32) 224

dense_1 (Dense) (None, 32) 1056

dense_2 (Dense) (None, 4) 132

=================================================================

Total params: 1412 (5.52 KB)

Trainable params: 1412 (5.52 KB)

Non-trainable params: 0 (0.00 Byte)


Traceback (most recent call last):
  File "snake4m.py", line 59, in <module>
    weights_to_cpp(model)
  File "snake4m.py", line 17, in weights_to_cpp
    for i in np.array(weights):
ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (3,) + inhomogeneous part.

Hi again,
When I try to parse my model, it stops with the message above.
I replaced for i in np.array(weights): with plain for i in weights: (and did the same for the biases), and it seems to work OK.
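For context, a minimal sketch of why np.array() fails there and what the fixed loops iterate over (the variable names are only illustrative, not the exact ones in weights_to_cpp):

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation='relu', input_shape=(6,)),
        tf.keras.layers.Dense(32, activation='relu'),
        tf.keras.layers.Dense(4, activation='softmax')
    ])

    # Each Dense layer exposes [kernel, bias]; the kernels have shapes
    # (6, 32), (32, 32) and (32, 4), so np.array(weights) cannot stack
    # them and raises the "inhomogeneous shape" ValueError.
    weights = [layer.get_weights()[0] for layer in model.layers]
    biases = [layer.get_weights()[1] for layer in model.layers]

    for i in weights:      # was: for i in np.array(weights):
        print(i.shape)
    for b in biases:       # same change applied for the biases
        print(b.shape)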
Thanks


sv1bds commented Oct 24, 2023

int[] activations = {SIGMOID, SIGMOID, SIGMOID} //Activations for each layer excluding the input layer

Shouldn't it be:

int activations[] = {SIGMOID, SIGMOID, SIGMOID}; //Activations for each layer excluding the input layer


sv1bds commented Oct 24, 2023

void activate(int l, double* z, int activation) {
    double sum = 0.0;
    if (activation == SOFTMAX) {
        /* softmax needs the sum of exponentials over all l outputs first */
        for (int j = 0; j < l; j++) {
            sum += exp(z[j]);
        }
    }
    for (int i = 0; i < l; i++) {
        if (activation == SIGMOID) {
            z[i] = 1 / (1 + exp(-z[i]));
        } else if (activation == TANH) {
            z[i] = tanh(z[i]);
        } else if (activation == EXPONENTIAL) {
            z[i] = exp(z[i]);
        } else if (activation == SWISH) {
            z[i] = z[i] / (1 + exp(-z[i]));
        } else if (activation == RELU) {
            z[i] = fmax(0, z[i]);
        } else if (activation == SOFTMAX) {
            z[i] = exp(z[i]) / sum;   /* exp(z[i]) / sum_j exp(z[j]) */
        }
    }
}

I implemented SOFTMAX, but it is not verified yet.
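For reference, the formula above is exp(z[i]) divided by the sum of exp(z[j]) over all outputs, so the activated values should sum to 1; a quick NumPy check of the same arithmetic (just the math, not run on the board):

    import numpy as np

    z = np.array([0.5, -1.2, 2.0, 0.1])        # example logits for the 4 outputs
    softmax = np.exp(z) / np.sum(np.exp(z))    # same formula as the C code above
    print(softmax, softmax.sum())              # probabilities; sum is 1 up to rounding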


sv1bds commented Oct 24, 2023

The snake model works fine on the STM32!!! The SOFTMAX is verified. It uses 56% of memory. If you need more info, please tell me.
Thanks


sv1bds commented Oct 31, 2023

A complete description of the project can be found here: link
