
wasm demos #87

Open
bil-ash opened this issue Sep 30, 2024 · 1 comment

Comments

bil-ash commented Sep 30, 2024

Please create the following in-browser wasm demos:

  1. Stable Diffusion with W8A8 quantization. This is important because the Stable Diffusion demo I have seen uses fp16 weights with transformers.js as the engine and requires WebGPU/WebNN support. OnnxStream with W8A8 quantization (and without the WebGPU/WebNN requirement) should be much lighter.
  2. LLM (llama) with int4/int8 quantization. This would be a feasible alternative to wllama.
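For context on why W8A8 would be lighter: both weights and activations are stored as 8-bit integers, so a tensor takes 1 byte per value instead of 4 (fp32) or 2 (fp16). A minimal sketch of symmetric per-tensor int8 quantization, with hypothetical helper names (OnnxStream's actual scheme may differ):

```python
# Sketch of symmetric int8 quantization, the core of the "W8A8" idea.
# Helper names are illustrative, not OnnxStream's API.

def quantize_int8(values):
    """Map floats to int8 codes plus a per-tensor scale."""
    scale = max(abs(v) for v in values) / 127.0 or 1.0
    codes = [max(-128, min(127, round(v / scale))) for v in values]
    return codes, scale

def dequantize_int8(codes, scale):
    """Recover approximate floats from int8 codes and the scale."""
    return [c * scale for c in codes]

weights = [0.5, -1.0, 0.25, 0.75]
codes, scale = quantize_int8(weights)
approx = dequantize_int8(codes, scale)
# Round-trip error is bounded by scale / 2, i.e. about 0.4% of the
# largest weight here, while storage drops 4x versus fp32.
```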
vitoplantamura (Owner) commented Oct 2, 2024 via email
