
Release the weights for the 4x RRDB architecture

@Kiteretsu77 released this 24 Mar 04:11 · 44 commits to main since this release

Hello! We have retrained the model with the same hyperparameter setting for RRDB-6B (which has the same structure as Real-ESRGAN Anime6B). In the future, when we have more time, we will tune the perceptual-loss-related hyperparameters further. More information can be found in the model zoo.
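
As a minimal sketch (not the official APISR loading code), the released 4x RRDB-6B generator could be instantiated with BasicSR's `RRDBNet`, the same architecture Real-ESRGAN Anime6B uses. The checkpoint key layout (`model_state_dict`) is an assumption; adjust it to match the actual file.

```python
# Illustrative only: load the released 4x RRDB-6B generator with BasicSR's RRDBNet.
import torch
from basicsr.archs.rrdbnet_arch import RRDBNet

# 6 residual-in-residual dense blocks, 4x upscaling (same layout as Anime6B).
model = RRDBNet(num_in_ch=3, num_out_ch=3, num_feat=64,
                num_block=6, num_grow_ch=32, scale=4)

ckpt = torch.load("4x_APISR_RRDB_GAN_generator.pth", map_location="cpu")
# Assumed key; fall back to treating the file as a raw state dict.
state_dict = ckpt.get("model_state_dict", ckpt)
model.load_state_dict(state_dict)
model.eval()
```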

So why RRDB? Because I can reuse my FAST inference codebase directly without modifying the code too much (lol).

Update: We have cleaned the 4x_APISR_RRDB_GAN_generator weights to reduce the file size.
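
For illustration only, this is the kind of "cleaning" that shrinks a checkpoint: keeping just the generator state dict (dropping any optimizer/discriminator state) and casting to half precision. The exact steps and key names here are assumptions, not the authors' script.

```python
# Illustrative sketch: strip a training checkpoint down to slim generator weights.
import torch

ckpt = torch.load("4x_APISR_RRDB_GAN_generator_full.pth", map_location="cpu")  # hypothetical input name
state_dict = ckpt.get("model_state_dict", ckpt)  # keep only the generator weights

# Casting to fp16 roughly halves the file size (assumed step, may not match the release).
slim = {k: v.half() for k, v in state_dict.items()}
torch.save({"model_state_dict": slim}, "4x_APISR_RRDB_GAN_generator.pth")
```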

Thanks!