Inochi2D Support #14
The project looks quite interesting, but I noticed that there isn't an easy way to integrate it with the web, which is the frontend implementation used in Open-LLM-VTuber.
The current implementation of facial expressions with Live2D calls the pre-defined facial expressions shipped with the Live2D model. I might need to do some research to emit blendshapes.
Blendshapes emitted via VMC protocol are arbitrary key-value pairs, with most values being scalar floats.
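For reference, here is a minimal sketch of what emitting such key-value pairs looks like, assuming the VMC protocol's standard OSC addresses, the default receive port 39539, ARKit-style blendshape names, and the `python-osc` package; none of this is specific to Open-LLM-VTuber or Inochi2D:

```python
# Minimal sketch of sending blendshapes over the VMC protocol (OSC over UDP).
# Assumptions: python-osc is installed, the receiving app listens on the
# default VMC port 39539, and it understands ARKit-style blendshape names.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 39539)

# Each blendshape is an arbitrary (name, float) pair...
client.send_message("/VMC/Ext/Blend/Val", ["JawOpen", 0.5])
client.send_message("/VMC/Ext/Blend/Val", ["MouthSmileLeft", 0.8])

# ...and /VMC/Ext/Blend/Apply tells the receiver to apply the batch.
client.send_message("/VMC/Ext/Blend/Apply", [])
```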
Yeah, I can see that in your screenshot, but I don't have pre-defined values for facial expressions such as happy or sad. How do I get those? Is it possible to get them by recording in some face-tracking software, or can they be obtained from somewhere else? I'm really new to these things... Also, I don't know how to display them on my web frontend, which is actually the biggest problem for me. I think blendshapes wouldn't be too hard, but I have no idea how to display Inochi2D on the web.
Inochi2D is a bit more modular than Live2D in this way... But Inochi2D itself is more akin to a standard (including a file format) that has example implementations. There is a Rust implementation in the works that could run on the web, but it's still in progress and isn't complete.

Examples of blendshapes can be found in the iOS specification; many folks use iPhones specifically for face tracking: https://developer.apple.com/documentation/arkit/arblendshapelocationjawopen?language=objc

Now, the typical workflow is as follows:
That said, you can simply push out values for Happiness (-1 .. 0 .. 1) and say it's Inochi2D's job to map this shape into puppet animation parameters.
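As a rough sketch of that idea, a single Happiness scalar could be fanned out into blendshape values on whichever side ends up owning the mapping. The blendshape names below are ARKit-style and the mapping itself is made up for illustration; it is not part of any spec:

```python
# Rough sketch: fan a single Happiness scalar in [-1, 1] out into
# ARKit-style blendshape values. The names and the mapping itself
# are illustrative assumptions, not part of any spec.
def happiness_to_blendshapes(happiness: float) -> dict[str, float]:
    happiness = max(-1.0, min(1.0, happiness))
    if happiness >= 0.0:
        # Positive values drive the smile shapes.
        return {
            "MouthSmileLeft": happiness,
            "MouthSmileRight": happiness,
            "MouthFrownLeft": 0.0,
            "MouthFrownRight": 0.0,
        }
    # Negative values drive the frown shapes instead.
    return {
        "MouthSmileLeft": 0.0,
        "MouthSmileRight": 0.0,
        "MouthFrownLeft": -happiness,
        "MouthFrownRight": -happiness,
    }

# Example: a moderately happy expression.
print(happiness_to_blendshapes(0.6))
# {'MouthSmileLeft': 0.6, 'MouthSmileRight': 0.6,
#  'MouthFrownLeft': 0.0, 'MouthFrownRight': 0.0}
```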
Inochi2D is a free (as in freedom) and open-source alternative to Live2D. It is currently under development but is already quite capable of displaying 2D puppets.
Everything they do is open source here:
https://github.com/Inochi2D/