
[Bug] Wheel naming issue #663

Closed
zhyncs opened this issue Dec 14, 2024 · 11 comments
@zhyncs (Member) commented Dec 14, 2024

Refs: #643, #659, #660, #662

@ur4t @xslingcn

Before

flashinfer-0.1.6+cu124torch2.4-cp310-cp310-linux_x86_64.whl

After

flashinfer-0.1.6-cp38-abi3-linux_x86_64.whl

Expected

flashinfer-0.1.6+cu124torch2.4-cp38-abi3-linux_x86_64.whl
@zhyncs (Member, Author) commented Dec 14, 2024

CUDA and Torch version info missing: +cu124torch2.4

@zhyncs (Member, Author) commented Dec 14, 2024

@ur4t Could you help fix this issue? Thanks.

@xslingcn (Contributor) commented Dec 14, 2024

setup.py gives FLASHINFER_BUILD_VERSION top priority.

flashinfer/setup.py

Lines 47 to 55 in b1b1fb8

def get_version():
    build_version = os.environ.get("FLASHINFER_BUILD_VERSION")
    if build_version is not None:
        return build_version
    package_version = (root / "version.txt").read_text().strip()
    local_version = os.environ.get("FLASHINFER_LOCAL_VERSION")
    if local_version is None:
        return package_version
    return f"{package_version}+{local_version}"

It seems we set this environment variable in the nightly build script, but the CUDA/Torch version is missing from it:
https://github.com/flashinfer-ai/flashinfer-nightly/blob/f28fab619a21573c9e1ec0dba8bec841c890cfed/.github/workflows/nightly-release.yml#L51-L52

https://github.com/flashinfer-ai/flashinfer-nightly/blob/f28fab619a21573c9e1ec0dba8bec841c890cfed/.github/workflows/nightly-release.yml#L62
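The priority logic above can be sketched in isolation (this is a simplified stand-in for the real setup.py, with the package version hard-coded instead of read from version.txt):

```python
import os

# Simplified sketch of get_version()'s priority logic: FLASHINFER_BUILD_VERSION
# wins outright, so the FLASHINFER_LOCAL_VERSION suffix (e.g. cu124torch2.4)
# is silently dropped whenever the build version is set, as in the nightly job.
def get_version(package_version="0.1.6"):
    build_version = os.environ.get("FLASHINFER_BUILD_VERSION")
    if build_version is not None:
        return build_version
    local_version = os.environ.get("FLASHINFER_LOCAL_VERSION")
    if local_version is None:
        return package_version
    return f"{package_version}+{local_version}"

# Stable-release path: only the local version is set.
os.environ.pop("FLASHINFER_BUILD_VERSION", None)
os.environ["FLASHINFER_LOCAL_VERSION"] = "cu124torch2.4"
stable = get_version()   # "0.1.6+cu124torch2.4"

# Nightly path: build version set without the CUDA/Torch tag, which is lost.
os.environ["FLASHINFER_BUILD_VERSION"] = "0.1.6+abc1234"
nightly = get_version()  # "0.1.6+abc1234"
```

This reproduces the reported symptom: the nightly wheel name carries no CUDA/Torch information.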

@zhyncs (Member, Author) commented Dec 14, 2024

Yeah, I think the nightly build worked before #643, #659, #660, and #662.

@xslingcn (Contributor) commented Dec 14, 2024

Not sure how this was handled before we changed packaging, but one fix I can think of now is:

diff --git a/.github/workflows/nightly-release.yml b/.github/workflows/nightly-release.yml
index 0c919e9..9c6c4ae 100644
--- a/.github/workflows/nightly-release.yml
+++ b/.github/workflows/nightly-release.yml
@@ -48,7 +48,10 @@ jobs:
           sed -i 's/+cu/\.cu/g' scripts/run-ci-build-wheel.sh
           sed -i 's|/ci-cache|/opt/dlami/nvme/flashinfer/github|g' scripts/run-ci-build-wheel.sh
           sed -i '/mkdir -p "\$CONDA_pkgs_dirs" "\$XDG_CACHE_HOME"/d' scripts/run-ci-build-wheel.sh
-          version="$(cat version.txt)"+"$(git rev-parse HEAD | cut -c1-7)"
+          CUDA_VERSION=${{ matrix.cuda }}
+          CUDA_MAJOR="${CUDA_VERSION%.*}"
+          CUDA_MINOR="${CUDA_VERSION#*.}"
+          version="$(cat version.txt)+$(git rev-parse HEAD | cut -c1-7)+cu${CUDA_MAJOR}${CUDA_MINOR}torch${{ matrix.torch }}"
           echo "VERSION=$version" >> $GITHUB_ENV
           echo "version=$version" >> $GITHUB_OUTPUT
         id: get_version
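The parameter expansions in the proposed workflow step can be checked on their own; the CUDA/Torch versions and the abc1234 commit hash below are illustrative stand-ins for the matrix values:

```shell
# Sketch of the version-string construction from the diff above.
CUDA_VERSION=12.4
TORCH_VERSION=2.4
CUDA_MAJOR="${CUDA_VERSION%.*}"   # drop the shortest ".*" suffix -> 12
CUDA_MINOR="${CUDA_VERSION#*.}"   # drop the shortest "*." prefix -> 4
version="0.1.6+abc1234+cu${CUDA_MAJOR}${CUDA_MINOR}torch${TORCH_VERSION}"
echo "$version"   # 0.1.6+abc1234+cu124torch2.4
```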

@zhyncs (Member, Author) commented Dec 14, 2024

There are several ways to fix the nightly build. I'd like to know what wheel name will be used when version 0.2.0 is released from the main branch of FlashInfer (not nightly). Ideally, whatever we do should be compatible with both the nightly and stable releases.

@xslingcn (Contributor)

This issue doesn't exist in the stable-release workflow, because we don't set FLASHINFER_BUILD_VERSION there, so the FLASHINFER_LOCAL_VERSION passed in is used:

FLASHINFER_ENABLE_AOT=1 FLASHINFER_LOCAL_VERSION="cu${CUDA_MAJOR}${CUDA_MINOR}torch${FLASHINFER_CI_TORCH_VERSION}" python -m build --no-isolation --wheel
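How the pieces of that command map onto the final wheel filename can be sketched as follows; all concrete values are assumptions matching the examples in this thread:

```python
# Hypothetical sketch of the stable-release wheel naming.
package_version = "0.2.0"
cuda_major, cuda_minor = "12", "4"
torch_version = "2.4"
# FLASHINFER_LOCAL_VERSION becomes the PEP 440 local version segment.
local_version = f"cu{cuda_major}{cuda_minor}torch{torch_version}"
# cp38-abi3: built against the CPython 3.8 stable ABI (limited API).
wheel_name = (
    f"flashinfer-{package_version}+{local_version}"
    f"-cp38-abi3-linux_x86_64.whl"
)
print(wheel_name)  # flashinfer-0.2.0+cu124torch2.4-cp38-abi3-linux_x86_64.whl
```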

@xslingcn (Contributor)

> I want to know the name given when version 0.2.0 was released for the main branch of FlashInfer (not nightly).

It will be something like flashinfer-0.2.0+cu124torch2.4-cp38-abi3-linux_x86_64.whl.

@zhyncs (Member, Author) commented Dec 14, 2024

I see. I'll help with the nightly build issue.

@zhyncs (Member, Author) commented Dec 14, 2024

Fixed with flashinfer-ai/flashinfer-nightly#4.

@zhyncs zhyncs closed this as completed Dec 14, 2024