Issues: intel/intel-extension-for-pytorch
#786  Pausing Intel® Extension for PyTorch* CPPSDK for XPU devices releasing (opened Feb 18, 2025 by jingxu10)
#783  ComfyUI 0.3.14: bug report when generating an image with the Lumina 2.0 model (opened Feb 8, 2025 by kingsdom-sk)
#781  UR error during ComfyUI's tutorial with Stable Diffusion model on Ubuntu with TigerLake-H GT1 (opened Feb 4, 2025 by ValentinaGalataTNG)
#778  Is intel-extension-for-pytorch==2.5.0 compatible with torch==2.6.0? (opened Jan 29, 2025 by steve-marmalade; a version-check sketch follows below)
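
A minimal sketch related to #778, assuming the usual IPEX pairing rule that each intel-extension-for-pytorch release is built against the matching PyTorch minor version (so 2.5.x pairs with torch 2.5.x rather than 2.6.x). It only prints and compares the installed versions and is not an official compatibility check.

```python
# Hedged sketch: compare installed torch and IPEX versions. Assumes IPEX wheels
# target a specific torch minor release, so the major.minor parts should match
# (e.g. torch 2.5.* with intel-extension-for-pytorch 2.5.*).
import torch
import intel_extension_for_pytorch as ipex

print("torch:", torch.__version__)
print("ipex: ", ipex.__version__)

torch_mm = torch.__version__.split("+")[0].split(".")[:2]
ipex_mm = ipex.__version__.split("+")[0].split(".")[:2]
print("major.minor match:", torch_mm == ipex_mm)
```
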
#776  How to enable and disable flash attention during inference? [labels: CPU, LLM] (opened Jan 25, 2025 by troore; a sketch follows below)
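
On #776, the stock PyTorch way to steer scaled_dot_product_attention between its flash-attention and math backends is the torch.nn.attention.sdpa_kernel context manager (torch >= 2.3). Whether IPEX's CPU LLM kernels honor this selection is an assumption here, not a confirmed answer to the issue.

```python
# Sketch of stock PyTorch SDPA backend selection; IPEX-specific controls may differ.
import torch
import torch.nn.functional as F
from torch.nn.attention import SDPBackend, sdpa_kernel

q = torch.randn(1, 8, 128, 64)
k = torch.randn(1, 8, 128, 64)
v = torch.randn(1, 8, 128, 64)

# Prefer flash attention, falling back to the math backend if it is unavailable.
with sdpa_kernel([SDPBackend.FLASH_ATTENTION, SDPBackend.MATH]):
    out_flash = F.scaled_dot_product_attention(q, k, v)

# Disable flash attention by allowing only the reference math backend.
with sdpa_kernel([SDPBackend.MATH]):
    out_math = F.scaled_dot_product_attention(q, k, v)
```
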
#772  General question about the purpose of this project with regard to non-Intel hardware (opened Jan 16, 2025 by cyberluke)
#768  Performance regression with IPEX 2.3, TORCH 2.6 compared with IPEX 2.1 [labels: ARC, Escalate, Performance] (opened Jan 10, 2025 by gc-fu)
#760  FunctionalTensor is not supported by the IPEX custom kernel when using torch.compile after ipex.llm.optimize (opened Dec 30, 2024 by lostkingdom4; a call-pattern sketch follows below)
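
A minimal sketch of the call pattern named in the #760 title, assuming a Hugging Face causal LM. The model name, dtype, and prompt are illustrative assumptions; the point is only where torch.compile wraps the IPEX-optimized module, not an exact reproduction of the failure.

```python
# Sketch of the reported pattern: ipex.llm.optimize first, torch.compile second.
# "gpt2" and bfloat16 are placeholder choices for illustration only.
import torch
import intel_extension_for_pytorch as ipex
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16).eval()

# ipex.llm.optimize swaps in IPEX custom kernels for LLM inference.
model = ipex.llm.optimize(model, dtype=torch.bfloat16, inplace=True)

# Compiling the optimized model traces through those custom ops; per the issue
# title, the functionalization step (FunctionalTensor) does not support them.
compiled_model = torch.compile(model)

inputs = tokenizer("Hello world", return_tensors="pt")
with torch.no_grad():
    output = compiled_model(**inputs)
print(type(output))
```
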
#753  AOT IPEX wheels for Meteor Lake and above targeting Linux missing from releases [label: iGPU] (opened Dec 20, 2024 by simonlui)