Question about other models #27
Comments
Hi @Anecdote-Lee, We have documented how to run alternative GAN models here: https://github.com/LS4GAN/benchmarking . Please check out that repository -- it has information on how we trained and evaluated models on the CelebA and Anime datasets. As for the CelebA-HQ and AFHQ datasets -- we relied on the results reported by EGSDE.
Good, thank you!
Hello again! @usert5432
Hi @Anecdote-Lee,
The difference in configuration options is purely cosmetic and just a historical artifact. All the datasets use all the domains for the pre-training. The reason for this cosmetic difference is that:
The standard … and can be read by the default …
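For illustration, here is a minimal sketch of what "using all the domains for pre-training" amounts to: a single joint dataset built from both domain folders feeds the pre-training stage. The paths and the `FlatImageFolder` helper below are hypothetical and are not part of the uvcgan2 codebase.

```python
# Minimal sketch (hypothetical helper and paths, not the uvcgan2 API):
# pre-training consumes a single dataset built from BOTH translation domains.
import glob

from PIL import Image
from torch.utils.data import Dataset, ConcatDataset, DataLoader
from torchvision import transforms

class FlatImageFolder(Dataset):
    """Loads every image placed directly under `root` (no class subfolders)."""

    def __init__(self, root, transform=None):
        self.paths     = sorted(glob.glob(f'{root}/*'))
        self.transform = transform

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        image = Image.open(self.paths[idx]).convert('RGB')
        return self.transform(image) if self.transform else image

transform = transforms.Compose([
    transforms.Resize(286),
    transforms.CenterCrop(256),
    transforms.ToTensor(),
])

# Both domains of a hypothetical unpaired dataset feed the same pre-training run.
domain_a = FlatImageFolder('data/my_dataset/trainA', transform)
domain_b = FlatImageFolder('data/my_dataset/trainB', transform)

joint  = ConcatDataset([domain_a, domain_b])
loader = DataLoader(joint, batch_size=32, shuffle=True, num_workers=4)
```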
Unfortunately, I cannot give any concrete suggestions here. I do not think the small dataset size should be a problem by itself, although having more data augmentations may help (as long as you are OK with these augmentations leaking into the generated images: https://arxiv.org/pdf/2006.06676.pdf). From my experience, if your dataset is "good", then the CycleGAN-like models will work with minimal modifications (even with small datasets). If your dataset is "bad", then you would need to do a lot of hyperparameter tuning and data augmentation to make it work. And, unfortunately, the easiest way to find out whether your dataset is "good" or "bad" is to run the training and see if it works.
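As an illustration of the augmentation point above, here is a small, hypothetical torchvision pipeline (not taken from the repository). Geometric augmentations such as crops and flips are generally safe for unpaired translation, while color-type augmentations are the kind that can leak into the generated images, as discussed in the linked paper.

```python
# Hypothetical augmentation stack for a small unpaired dataset (illustrative only).
from torchvision import transforms

train_transform = transforms.Compose([
    transforms.Resize(286),
    transforms.RandomCrop(256),
    transforms.RandomHorizontalFlip(p=0.5),   # geometric; usually safe
    # Color jitter can leak into the generator outputs (tinted / washed-out
    # samples); enable it only if such leakage is acceptable for your task.
    transforms.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]),
])
```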
Thank you for your help and suggestion. It will be very helpful!
Hello! I want to ask another question.
Hi @Anecdote-Lee, Sure.
During the pre-training process we actually train a single joint generator for the A and B domains simultaneously. Then we use that single generator to initialize both the A->B and B->A translations. Here is the corresponding configuration: uvcgan2/scripts/celeba_hq/train_m2f_translation.py, lines 50 to 53 (commit 40b60d2).
Both …
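In other words, both translation generators start from the same jointly pre-trained weights. Below is a rough sketch of that initialization, with hypothetical module and checkpoint names; the actual transfer is driven by the configuration referenced above.

```python
# Rough sketch with hypothetical names (the real transfer is configured in the
# training script referenced above): one jointly pre-trained generator
# checkpoint initializes both translation directions.
import copy

import torch
from torch import nn

def build_generator() -> nn.Module:
    # Stand-in for the actual generator architecture.
    return nn.Sequential(
        nn.Conv2d(3, 64, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Conv2d(64, 3, kernel_size=3, padding=1),
    )

# Single generator pre-trained on images from both domains A and B.
pretrained = build_generator()
pretrained.load_state_dict(torch.load('pretrain/generator.pt', map_location='cpu'))

# Both directions start from identical weights and then diverge during the
# image-to-image translation training.
gen_ab = copy.deepcopy(pretrained)   # A -> B
gen_ba = copy.deepcopy(pretrained)   # B -> A
```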
Thanks for your kind explanation. |
Hello,
I am using this model to train and test on my own datasets.
Like the paper for this code, I want to compare this model with other models such as CycleGAN or other GANs.
Is there any code provided in this GitHub repository to run other models for comparison?
(The reason I thought it might be provided is that I saw a 'model' parameter to select.)