TF2 - Implement tf.saved_model.load #63
I'm using the alternate solution below:
Thanks for your response. Are you using this function, or something like it?
Correct.
Thank you for your direction. I'm just not sure I understand how it works. Are there any other examples, or further explanation of how to do this? I note that when I export my TF-built model, I see this warning:
Should I still be able to build/convert my model in the suggested way? Also, I would still like an answer to my first question: are there plans to implement TF2 functions?
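For reference, the kind of conversion being suggested in this thread — collapsing a TF2 model back into a TF1-style frozen graph that graph-importing APIs can consume — can be sketched roughly as follows. This is a hedged illustration, not the commenter's actual code: the `TinyModel` module is made up, and `convert_variables_to_constants_v2` lives in an internal TensorFlow module whose location may shift between versions.

```python
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

class TinyModel(tf.Module):
    """Hypothetical stand-in for the exported model."""
    def __init__(self):
        self.w = tf.Variable(tf.ones([3, 2]))

    @tf.function(input_signature=[tf.TensorSpec([None, 3], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)

model = TinyModel()

# Get a concrete (traced) function for the forward pass.
concrete = model.__call__.get_concrete_function()

# Fold the variables into constants, yielding a TF1-style frozen graph.
frozen = convert_variables_to_constants_v2(concrete)
graph_def = frozen.graph.as_graph_def()

# Serialize the frozen GraphDef so graph-importing APIs can read it.
tf.io.write_graph(graph_def, ".", "frozen_graph.pb", as_text=False)
```

After freezing, the graph contains no variable ops — only constants and compute ops — which is what the older frozen-inference-graph workflow expects.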
Hello, as per the subject, are there plans to implement TF2 functions?
For example:
This is part of the code I would like to implement in C# from Python.
Some of this I can do, but now that TF2 is the standard (no frozen inference graph), it is difficult to keep building models the old way.
I just want to know if there are any plans, as I can't find anything in the documentation that matches this use case from Python.
Many thanks for your work so far.
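For context, the TF2 loading pattern the issue title asks for looks like this on the Python side. This is a minimal sketch; the `Adder` module is hypothetical and exists only to produce a SavedModel to load.

```python
import tempfile
import tensorflow as tf

class Adder(tf.Module):
    """Hypothetical model used only to produce a SavedModel."""
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def add_one(self, x):
        return x + 1.0

export_dir = tempfile.mkdtemp()  # throwaway export location
tf.saved_model.save(Adder(), export_dir)

# TF2 style: no frozen inference graph, no Session — load the object
# graph and call its restored tf.functions directly.
loaded = tf.saved_model.load(export_dir)
result = loaded.add_one(tf.constant([1.0, 2.0]))
```

This is the call a C# binding would need to mirror: restoring the saved object graph and its concrete functions, rather than importing a frozen GraphDef as in TF1.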