🧪 Pass sdist into cibuildwheel directly #82
base: main
Conversation
@nitzmahone while working on this I also identified invalid output references in the artifact download steps. Will try to address that now.
Force-pushed from `bf7ba01` to `5e4aee6`.
It wasn't obvious right away, but referencing a job in the `${{ needs }}` context that is not being depended on, and does not even exist, silently returned an empty string for the whole expression passed to the `name:` input of the `download-artifact` action invocations. This subsequently changed the behavior of said action to "download all the artifacts that exist in the workflow". This "download all" behavior has an additional quirk, though: it downloads each artifact into a subdirectory named after the artifact. The "download one specific artifact" mode, on the other hand, does not create that extra level of nesting. This is why it previously felt necessary to unpack the sdist tarballs from two levels deep in the first place. With this patch, references to said sdists will change.
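The difference between the two layouts can be sketched on any local shell; all file and artifact names below are hypothetical stand-ins for the real ones:

```shell
# "Download all artifacts" mode: each artifact lands in a subdirectory
# named after the artifact, so the file is nested one level deeper.
mkdir -p all-mode/cffi-1.17.0.tar.gz
touch all-mode/cffi-1.17.0.tar.gz/cffi-1.17.0.tar.gz

# "Download one named artifact" mode: the file sits directly in the
# download path, with no extra nesting.
mkdir -p single-mode
touch single-mode/cffi-1.17.0.tar.gz

# Hence the old two-level glob needed for the accidental "all" layout:
ls all-mode/cffi*.tar.gz/cffi*.tar.gz
# ...versus the one-level glob once a specific artifact name is requested:
ls single-mode/cffi*.tar.gz
```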
Force-pushed from `26b4f18` to `4314e7b`.
@nitzmahone I think 2bea383 constitutes a release blocker.
```yaml
# @@ -258,14 +260,14 @@ jobs:
        id: fetch_sdist
        uses: actions/download-artifact@v4
        with:
          name: ${{ needs.build_sdist.outputs.artifact_name }}
```
(this was an empty string because there's no job with ID `build_sdist`, causing all the artifacts to be downloaded and unpacked into subdirectories)
Ugh- another bug introduced by moving back and forth between PyYAML and CFFI trying to standardize the workflows between them. GHA doesn't make it easy to debug these kinds of things because so many different layers just silently succeed with glaring issues like this. Their (lack of) expression type system also makes me appreciate Jinja so much more.
Interestingly, in the past, their YAML parser would error out if you were to reference an identifier not listed in the `needs:` key. It used to be rather hard to find the web page that displays the error, too, since the checks wouldn't appear in the PR.
```shell
mkdir cffi
tar zxf ${{ steps.fetch_sdist.outputs.download-path }}/cffi*.tar.gz/cffi*.tar.gz --strip-components=1 -C cffi
```
(with the requested GHA artifact name being non-empty, the new path is actually `${{ steps.fetch_sdist.outputs.download-path }}/cffi*.tar.gz`)
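The unpack step itself can be exercised with a toy tarball; a minimal sketch, with a made-up version number and file contents:

```shell
# Build a toy sdist-like tarball with a single top-level directory.
mkdir -p src/cffi-1.17.0 cffi
echo 'dummy' > src/cffi-1.17.0/setup.py
tar -C src -czf cffi-1.17.0.tar.gz cffi-1.17.0

# --strip-components=1 drops the top-level cffi-1.17.0/ prefix and
# -C cffi extracts into the pre-created target directory.
tar zxf cffi*.tar.gz --strip-components=1 -C cffi
ls cffi   # setup.py
```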
Yeah- the artifact upload/download pathing is painful. It's even worse on Windows, since the GHA context/event properties are usually (but not always) Windows paths with drive letters and backslashes, and there aren't enough expression conversion operators to (reliably) convert those. Using bash on Windows for "glue" instead of cmd/PS helps some, but there were still some things that just plain didn't work properly, which is why I had to fall back to not using the artifact fetch paths in all cases.
Pretty sure bash can handle those. I had to work around this problem in some action already.
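For reference, a POSIX-shell sketch of normalizing a Windows path without external tools (the path itself is a made-up example; on real runners, Git Bash's `cygpath -u` is the sturdier option):

```shell
# Split "C:\Users\..." into the drive letter and the remainder.
win='C:\Users\runner\work\cffi\dist'
drive=${win%%:*}   # "C"
rest=${win#*:}     # "\Users\runner\work\cffi\dist"

# Lowercase the drive letter and flip backslashes to forward slashes.
posix="/$(printf '%s' "$drive" | tr '[:upper:]' '[:lower:]')$(printf '%s' "$rest" | tr '\\' '/')"
echo "$posix"   # /c/Users/runner/work/cffi/dist
```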
```yaml
      tar zxf ${{ steps.fetch_sdist.outputs.download-path }}/cffi*.tar.gz/cffi*.tar.gz --strip-components=1 -C cffi
    with:
      package-dir: >-
```
(`cibuildwheel` unarchives sdist tarballs if supplied in place of a directory)
I'm likely going to move the libffi build to its own cached-artifact job, though, which means we'll still have to give cibuildwheel a full directory with an exploded tarball to get the cached artifact dir into the container envs alongside the sdist; that's why I abandoned just giving cibuildwheel the sdist tarball directly. I looked over the code, and it didn't look like there was a way to build directly from an sdist tarball and send a project directory with build deps at the same time.
Any reason those things shouldn't just be in the sdist? It's supposed to be self-contained.
```shell
mkdir cffi
tar zxf ${{ steps.fetch_sdist.outputs.download-path }}/cffi*.tar.gz/cffi*.tar.gz --strip-components=1 -C cffi
python -m pip install --upgrade "${{ matrix.cibw_version || 'cibuildwheel' }}"
```
(I didn't keep any `${{ matrix.cibw_version }}` references because it's likely some legacy leftover that doesn't actually exist anywhere)
We should leave support in for that in some form; they're necessary every time cibuildwheel doesn't support something we need (e.g., every pre-beta Python, or when cibuildwheel drops support for a target Python before we do). I don't want to have to plumb it back in every time, since we're always testing new Pythons well before cibuildwheel has released support for them, and I haven't gotten around to PRing any of my "bring your own Python" hacks to cibuildwheel.
Can I do that in a follow-up? There's no structure for it left in the matrix even. Pretty sure I could sketch a better solution anyway.
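For what it's worth, the `${{ matrix.cibw_version || 'cibuildwheel' }}` fallback maps naturally onto shell default expansion, so a follow-up could route the pin through an env var instead of the matrix. A sketch, where `CIBW_SPEC` and the pinned version are hypothetical:

```shell
# No pin set: fall back to the unpinned package name.
unset CIBW_SPEC
pkg="${CIBW_SPEC:-cibuildwheel}"
echo "python -m pip install --upgrade \"$pkg\""

# Pin set (e.g. while testing a pre-release Python): the pin wins.
CIBW_SPEC='cibuildwheel==2.19.2'
pkg="${CIBW_SPEC:-cibuildwheel}"
echo "python -m pip install --upgrade \"$pkg\""
```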
```diff
-          name: ${{ steps.build.outputs.artifact_name }}
-          path: dist/*.whl
+          name: ${{ steps.built-artifact-lookup.outputs.artifact_name }}
+          path: wheelhouse/${{ steps.built-artifact-lookup.outputs.artifact_name }}
```
(I replaced a wildcard with the known value, since we actually have it in a var already)
```yaml
- name: configure docker foreign arch support
  uses: docker/setup-qemu-action@v3
  if: ${{ ! contains(matrix.spec, 'x86_64') }}

- name: build/test wheels
  id: build
  uses: pypa/[email protected]
```
(I'm referencing the official action, since it's better integrated into the ecosystem and none of the dynamic detection magic is actually necessary anymore)
```diff
@@ -29,6 +29,8 @@ env:
 jobs:
   python_sdist:
     runs-on: ubuntu-22.04
+    outputs:
+      artifact_name: ${{ steps.build_sdist.outputs.artifact_name }}
```
(this was another missing bit that prevented the wheel jobs from referencing the tarball name)
This patch also brings a copy of `cibuildwheel` in through the GitHub Actions interface instead of PyPI.
Force-pushed from `4314e7b` to `2eda2d1`.
This patch also brings a copy of `cibuildwheel` in through the GitHub Actions interface instead of PyPI. It also fixes referencing the sdist artifact name from the wheel-building jobs.