
Seeking Empirical Guidance on Optimal Parameter Settings for LivePortrait Lip Sync #378

Open
tonyabracadabra opened this issue Sep 8, 2024 · 1 comment


@tonyabracadabra

Hello,

I’m currently working with the LivePortrait model for synchronizing expressions between videos and am seeking empirical advice on the best parameter settings for achieving accurate lip sync.

Specifically, I have been experimenting with the relative_motion parameter. When set to false, I observed excessive mouth closing, while setting it to true resulted in noticeable jittering.

Given that these settings can be complex and interdependent, could you share any empirical advice or recommendations on how to balance them to achieve the best lip sync results?
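
To make the comparison concrete, below is the kind of simple, LivePortrait-independent jitter check one could run on the mouth landmarks of each output video (how the landmarks are extracted, and the landmark indices used, are placeholders rather than anything from this repo):

```python
# A sketch (not LivePortrait code) for comparing parameter settings numerically.
# Assumes per-frame 2D mouth landmarks of shape (T, N, 2), however they were obtained.
import numpy as np

def jitter_score(mouth_landmarks: np.ndarray) -> float:
    """Mean frame-to-frame landmark displacement; higher means more jitter."""
    deltas = np.diff(mouth_landmarks, axis=0)            # (T-1, N, 2)
    return float(np.linalg.norm(deltas, axis=-1).mean())

def mouth_openness(mouth_landmarks: np.ndarray,
                   upper_idx: int, lower_idx: int) -> np.ndarray:
    """Per-frame vertical gap between one upper-lip and one lower-lip point;
    values pinned near zero would reflect the over-closing behavior."""
    return np.abs(mouth_landmarks[:, lower_idx, 1] - mouth_landmarks[:, upper_idx, 1])

# Placeholder data: 100 frames, 20 mouth landmarks.
lm = np.random.rand(100, 20, 2)
print("jitter:", jitter_score(lm))
print("mean openness:", mouth_openness(lm, upper_idx=3, lower_idx=13).mean())
```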

Any insights or experiences you can share would be greatly appreciated.

Thank you!

@zzzweakman
Collaborator

Hi @tonyabracadabra, you can try modifying this parameter, which represents the strength of the driving force. The Gradio interface lets you adjust it directly, so you can tweak it until the result meets your needs.
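
In case a concrete picture helps, here is a minimal sketch of what scaling a driving-strength multiplier does conceptually; this is plain NumPy, not LivePortrait's actual implementation, and the function name is made up for illustration:

```python
# Conceptual sketch only -- not LivePortrait's implementation.
# A driving-strength multiplier scales how much of the driving motion
# (expressed here as a keypoint delta) is applied to the source.
import numpy as np

def apply_driving_strength(source_kp: np.ndarray,
                           driven_kp: np.ndarray,
                           multiplier: float) -> np.ndarray:
    """0.0 keeps the source pose unchanged, 1.0 applies the full driving motion;
    intermediate values soften extreme mouth poses at the cost of motion range."""
    delta = driven_kp - source_kp
    return source_kp + multiplier * delta

# Placeholder keypoints: 21 points in 3D.
source = np.zeros((21, 3))
driven = np.random.randn(21, 3) * 0.05
softened = apply_driving_strength(source, driven, 0.7)  # damped driving motion
```

Lowering the multiplier generally trades mouth-motion range for stability, so sweeping a few values with the slider and comparing the results is a reasonable way to find the balance you are after.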
