[Question] about nasnet-a used as subnetwork in adanet #56
Comments
@tianxh1212 : With the following code and settings you should be able to get the same results: https://github.com/tensorflow/adanet/blob/master/adanet/examples/nasnet.py#L181
@cweill
Solution: I resolved the question above by setting:
@SmallyolkLiu: Have a look at our research code that uses NASNet in AdaNet. It shows you how you can get it working on Google Cloud MLE.
Hi @cweill. Thanks for all the great work. Did you apply any data augmentation? In your research code, I see that you apply basic augmentation (flip + crop) to the input images. Did you do the same for the results reported in the blog post? Thanks!
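For reference, the basic flip + crop augmentation discussed above is the standard CIFAR-10 recipe: a random horizontal flip followed by padding and a random crop back to the original size. Here is a hedged NumPy sketch of that recipe (illustrative only, not the actual code from the research repository):

```python
import numpy as np

def flip_crop_augment(image, pad=4, rng=None):
    """Standard flip + crop augmentation for a single HWC image.

    With probability 0.5 the image is flipped left-right, then it is
    zero-padded by `pad` pixels on each side and a random crop of the
    original size is taken. `pad=4` is the value commonly used for
    32x32 CIFAR-10 images (an assumption here, not taken from the repo).
    """
    if rng is None:
        rng = np.random.default_rng()
    h, w, _ = image.shape
    if rng.random() < 0.5:  # random horizontal flip
        image = image[:, ::-1, :]
    # Pad spatial dimensions only, then crop back to (h, w).
    padded = np.pad(image, ((pad, pad), (pad, pad), (0, 0)), mode="constant")
    top = rng.integers(0, 2 * pad + 1)
    left = rng.integers(0, 2 * pad + 1)
    return padded[top:top + h, left:left + w, :]
```

The output always has the same shape as the input, so the augmentation can be applied inside an input pipeline without changing the model's expected input dimensions.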
@tl-yang: Please see the details in our recent paper: https://arxiv.org/abs/1903.06236
@cweill
Hi Weill,
I saw an article on the Google AI blog: https://ai.googleblog.com/2018/10/introducing-adanet-fast-and-flexible.html
The article mentions using NASNet-A as the subnetwork and, after 8 AdaNet iterations, reaching an error rate of 2.3% on CIFAR-10 with fewer parameters at the same time. I would like to ask two questions.
1. Do you use the entire NASNet-A architecture (for example, N=6 and F=32) as the subnetwork, use the normal cell as the subnetwork, or something else?
2. How are the subnetworks combined with each other?
Thanks!
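Regarding question 2: in the AdaNet framework, the ensemble's output is a mixture-weighted sum of the individual subnetworks' outputs (logits), with the mixture weights learned during training. A minimal NumPy sketch of that combination step (the weights below are illustrative constants, not learned values):

```python
import numpy as np

def ensemble_logits(subnetwork_logits, mixture_weights):
    """Combine subnetwork outputs as a mixture-weighted sum.

    subnetwork_logits: list of arrays, each of shape (batch, num_classes),
        one per subnetwork in the ensemble.
    mixture_weights: sequence of scalars, one per subnetwork. In AdaNet
        these are learned; here they are passed in for illustration.
    Returns the ensemble logits of shape (batch, num_classes).
    """
    stacked = np.stack(subnetwork_logits)            # (n_subnets, batch, classes)
    w = np.asarray(mixture_weights, dtype=stacked.dtype).reshape(-1, 1, 1)
    return (w * stacked).sum(axis=0)                 # (batch, classes)
```

Each AdaNet iteration adds a candidate subnetwork and re-fits the mixture weights, so the ensemble grows additively rather than by stacking layers inside one network.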