Preserve node naming/namespacing #47
Comments
Hi! I'm not sure how to implement that. Note that the `'block'` name scope never makes it into the weight names:

```python
import tensorflow as tf

with tf.name_scope('block'):
    dense = tf.keras.layers.Dense(10, input_shape=(2, ))
    inputs = tf.keras.Input((1, 2))
    outputs = dense(inputs)
    model = tf.keras.Model(inputs, outputs)

for w in model.weights:
    print(w.name)
```

Output:

```
dense/kernel:0
dense/bias:0
```

Any ideas?
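For contrast, the layer's own `name` argument does propagate into its weight names; a quick check with plain `tf.keras` (nothing Nobuco-specific, `block_dense` is just an arbitrary name):

```python
import tensorflow as tf

# The layer name becomes the prefix of its weight names,
# while a surrounding tf.name_scope is ignored by Keras naming:
inputs = tf.keras.Input((1, 2))
outputs = tf.keras.layers.Dense(10, name='block_dense')(inputs)
model = tf.keras.Model(inputs, outputs)

for w in model.weights:
    print(w.name)
# block_dense/kernel:0
# block_dense/bias:0
```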
It's super weird how this works in TF. Usually, `tf.name_scope` prefixes the names of everything created under it, but Keras layer and weight naming ignores it. Then I stumbled upon this: keras-team/tf-keras#269. TL;DR: it works if you enable a certain internal flag (see the sketch below). I'm still trying to find the proper place to apply the name scopes in Nobuco.
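For reference, the flag referenced in that issue appears to be `tf.keras.__internal__.apply_name_scope_on_model_declaration`; it is an internal API, so treat this sketch as an assumption rather than a confirmed recipe:

```python
import tensorflow as tf

# Assumption: this is the switch keras-team/tf-keras#269 refers to.
# It is an internal API (TF >= 2.9) and may change without notice.
tf.keras.__internal__.apply_name_scope_on_model_declaration(True)

inputs = tf.keras.Input((1, 2))
with tf.name_scope('block'):
    outputs = tf.keras.layers.Dense(10)(inputs)
model = tf.keras.Model(inputs, outputs)

for w in model.weights:
    print(w.name)
# Expected, if the assumption holds: the 'block' scope now appears,
# e.g. block/dense/kernel:0 and block/dense/bias:0
```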
I managed to hack it out, but it's far from ideal: microblink@8b5a142
Hey, could you give me an example script? I tried the patch, and I see no effect on my machine. My other concern is that disabling […]
Hello,
Thanks for developing this great tool! I have a question regarding layer/node naming in the converted Keras model.
For context, I have a use case where I post-process the converted graph and expect the nodes to conform to a custom naming scheme (more specifically, to be "namespaced" or to contain specific keywords). In the current Nobuco setup, module names in the Keras graph are autogenerated and thus not connected to PyTorch module names.
Do you think it would make sense to support some sort of naming/namespacing here?
For 1-to-1 mappings, the PT module name could be used as the Keras layer name, and for 1-to-N cases I'd propose using the PT module name as a prefix to the autogenerated name (see the sketch below).
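To make that concrete, here is a hypothetical helper illustrating the proposed scheme (`keras_layer_name` and its parameters are made-up names, not Nobuco API):

```python
from typing import Optional

def keras_layer_name(pt_module_name: str, autogen_name: Optional[str] = None) -> str:
    """Derive a Keras layer name from a PyTorch module name.

    1-to-1 mapping: reuse the PT module name, with dots swapped for '/'
    so the dotted PT hierarchy reads as a namespace.
    1-to-N mapping: use the PT module name as a prefix to the
    autogenerated name of each produced Keras op.
    """
    base = pt_module_name.replace('.', '/')
    return base if autogen_name is None else f'{base}/{autogen_name}'

# A PT module converted to a single Keras layer (1-to-1):
print(keras_layer_name('backbone.layer1.conv'))             # backbone/layer1/conv
# The same module lowered to several Keras ops (1-to-N):
print(keras_layer_name('backbone.layer1.conv', 'dense_2'))  # backbone/layer1/conv/dense_2
```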
If you like this idea, I can offer some help with the implementation. Thanks in advance!