Issues: pytorch/xla
Query regarding using 1 chip (2 cores of TPU v3) for inference
#8359, opened Nov 6, 2024 by deepakkumar2440
Offer user guide instructions to users to leverage various libtpu versions
Labels: documentation, usability
#8355, opened Nov 4, 2024 by miladm
Error when using collective communication via torch_xla.core.xla_model.all_to_all in SPMD mode
#8345, opened Oct 31, 2024 by DarkenStar
Bug: Using sharding in flash attention with segment ids
#8334, opened Oct 29, 2024 by dudulightricks
Provide debugging and troubleshooting tips to Pallas developers
Labels: documentation
#8301, opened Oct 22, 2024 by miladm
Clarify that torch_xla2 is only recommended for inference
#8270, opened Oct 17, 2024 by cloudchrischan
Improve documentation for get_memory_info
Labels: usability
#8245, opened Oct 9, 2024 by miladm
XLA2 does not work with jax 0.4.34 (but did work with jax 0.4.33)
#8240, opened Oct 9, 2024 by Chaosruler972
A process in the process pool was terminated abruptly while the future was running or pending
#8234, opened Oct 8, 2024 by fancy45daddy
A process in the process pool was terminated abruptly while the future was running or pending
#8233, opened Oct 8, 2024 by fancy45daddy
How to use torch.float16 in a diffusers pipeline with PyTorch/XLA
#8223, opened Oct 6, 2024 by fancy45daddy