
some problem about long text #32

Open
ttslr opened this issue Aug 3, 2020 · 1 comment
ttslr commented Aug 3, 2020

Thanks for your great work!
I ran your code and it works well.

By the way, I use a dynamic "max_len" instead of the fixed 400 when I synthesize speech.
But errors occur when long text is given, since your position_embedding's max length is 1024 (see https://github.com/soobinseo/Transformer-TTS/blob/master/network.py#17 and https://github.com/soobinseo/Transformer-TTS/blob/master/network.py#63).
I think it's better to increase that number so the model works when fed long text.
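For context, the limit comes from the positional-encoding table being pre-built with a fixed `max_len`; any input longer than that cannot index into the table. A minimal sketch of such a sinusoidal table (the function name and sizes here are illustrative, not the repo's exact code):

```python
import numpy as np

def get_sinusoid_table(max_len, d_model):
    """Build a (max_len, d_model) sinusoidal positional-encoding table."""
    pos = np.arange(max_len)[:, None]     # positions: (max_len, 1)
    i = np.arange(d_model)[None, :]       # dimensions: (1, d_model)
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    table = np.zeros((max_len, d_model))
    table[:, 0::2] = np.sin(angles[:, 0::2])  # even dims: sine
    table[:, 1::2] = np.cos(angles[:, 1::2])  # odd dims: cosine
    return table

# Raising max_len (e.g. 1024 -> 4096) makes longer sequences indexable,
# at the cost of a proportionally larger embedding table.
pe = get_sinusoid_table(4096, 256)
```

Since the sinusoid table is deterministic, enlarging it only adds memory for the extra rows and does not require retraining.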

Thanks for your work again. :)

soobinseo (Owner) commented

Thanks for your advice.

I will check it soon.
