Issues: xdit-project/xDiT
#415 How to use quantization methods to load models in a multi-machine environment (opened Dec 27, 2024 by etersin)
#407 Failed to load flux.1-dev with enable_sequential_cpu_offload and use_fp8_t5_encoder (4090) [bug] (opened Dec 24, 2024 by WeiboXu)
#398 Error when running Flux1.0 dev and HunyuanDiT-v1.2-Diffusers with multiple prompts (opened Dec 18, 2024 by henryhe4004)
#392 Reasons for inference speed difference with diffusers on a single GPU without parallelism? (opened Dec 13, 2024 by xyyan0123)
#319 Support advanced attention implementations (FA3, FlashInfer, xformers, etc.) [help wanted] (opened Oct 25, 2024 by feifeibear)
#310 [Bug] Potential risk of getting stuck in PipeFusion [bug] (opened Oct 17, 2024 by HOOLoLo)
#213 RoadMap and Looking for Contributions [help wanted] (opened Aug 22, 2024 by feifeibear, 10 tasks)
#192 [feature] LoRA support for SD and Flux models [help wanted] (opened Aug 14, 2024 by tsubasakong)