Support numpy2 #411

Open · wants to merge 12 commits into main

Conversation

@ewu63 (Collaborator) commented Jul 15, 2024

Purpose

Closes #408.

I removed the runtime requirement of numpy<2; instead, we now always build against numpy 2, as suggested by the numpy docs.

Changes

  • Support numpy 2 (see the sketch after this list)
  • Added a test matrix for the Windows CI
  • Added setuptools as a runtime dependency (a missing dependency, unrelated to the numpy 2 fixes)
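
For context, here is a minimal sketch of the kind of source-level change that "support numpy 2" typically involves; it is illustrative only and not code taken from this PR:

```python
# Illustrative only, not code from this PR: NumPy 2.0 removed several
# long-deprecated aliases (e.g. np.float_, np.NaN, np.Inf), so code meant to
# run under both numpy 1.x and 2.x should stick to the canonical names.
import numpy as np

print("running against numpy", np.__version__)

# Canonical spellings that exist in both numpy 1.x and 2.x:
x = np.full(3, np.nan, dtype=np.float64)  # rather than np.float_ / np.NaN
print(x)
```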

Expected time until merged

As long as it takes for CI to pass. While we don't test against numpy 2 here (we could add CI for it on Windows), we will be testing it on conda in the future.

Type of change

  • Bugfix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (non-backwards-compatible fix or feature)
  • Code style update (formatting, renaming)
  • Refactoring (no functional changes, no API changes)
  • Documentation update
  • Maintenance update
  • Other (please describe)

Testing

Checklist

  • I have run flake8 and black to make sure the Python code adheres to PEP-8 and is consistently formatted
  • I have formatted the Fortran code with fprettify or C/C++ code with clang-format as applicable
  • I have run unit and regression tests which pass locally with my changes
  • I have added new tests that prove my fix is effective or that my feature works
  • I have added necessary documentation

@ewu63 ewu63 requested a review from a team as a code owner July 15, 2024 23:52
@ewu63 ewu63 requested review from lamkina and ArshSaja July 15, 2024 23:52
@ewu63 (Collaborator, Author) commented Jul 16, 2024

Hmm, so all the builds are failing due to the numpy lock file. Not sure how we should address this, since the build-time dependency needs to be 2.0 but the runtime version should be the pinned one.

@enrpadilla commented

> Hmm, so all the builds are failing due to the numpy lock file. Not sure how we should address this, since the build-time dependency needs to be 2.0 but the runtime version should be the pinned one.

I think the fix for this would be to create a separate .yml file that gets activated after pyoptsparse has been built. Locally, at least, I was able to (partially) fix the Windows build that uses numpy>=2.0 in the conda environment by creating a separate .yml file that I activated at runtime, and only three tests failed:

The following tests failed:
test_hs015.py:TestHS15.test_optimization_2_CONMIN
test_hs071.py:TestHS71.test_optimization_4_CONMIN
test_rosenbrock.py:TestRosenbrock.test_optimization_3_CONMIN


Passed:  38
Failed:  3
Skipped: 34


Ran 75 tests using 1 processes
Wall clock time:   00:02:8.67

I'm still familiarizing myself with the project, so I don't know why those three tests are failing, but I can push a commit with the changes I've made if you'd like, @ewu63.

@ewu63 (Collaborator, Author) commented Jul 30, 2024

Let me clarify: my comment above was about the Docker-based Linux tests. We use a pip constraint file in our toolchain, and that is creating some issues; it is something for us maintainers (CC @eirikurj) to sort out.

The step you mentioned is indeed necessary, and is already done as part of this PR -- see here. The test is failing due to some regression errors that I have not had time to look into (you can view them by opening the test logs on GHA), and this is unrelated to the build-time issues discussed above. Interestingly, those errors are different from what you're seeing locally.

@enrpadilla commented Jul 30, 2024

Ah, I see. Thanks for that clarification.

As another data point, then: when I initially pulled the branch and ran the tests, I got the same failures shown in the build-windows action log. I wonder if the discrepancy between that and what I'm seeing locally has to do with reusing the same environment and just installing a different version of numpy into it again. From my experience with conda, the package manager sometimes won't upgrade or downgrade dependencies in an environment, so when pyoptsparse is built with numpy>=2.0 and then tested with a lower version, it may leave behind the packages that numpy>=2.0 brought in because the package manager thinks that's okay, which can cause weird things to happen.
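
If it helps, a small sanity check along those lines (purely illustrative, not part of this PR) is to confirm which numpy is actually active in the environment right before running the tests, and that the compiled extensions import against it:

```python
# Illustrative sanity check, not part of this PR: print the numpy that is
# active at test time and confirm pyoptsparse's compiled extensions import
# cleanly against it.
import numpy as np

print("numpy:", np.__version__)

try:
    import pyoptsparse
    print("pyoptsparse:", getattr(pyoptsparse, "__version__", "unknown"))
except ImportError as exc:
    # A mismatch between the numpy used at build time and at run time
    # typically shows up here as an import failure.
    print("pyoptsparse failed to import:", exc)
```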

@sanbales commented

Just a friendly ping to see if there are any updates on this PR. It looks like the assert on line 148 in tests.test_hs015.test_ipopt is a bit optimistic with numpy 2? It now takes 45 iterations instead of 12. Is that a deal breaker? I can submit a PR that changes this quantity depending on the version of numpy if that would help, but I'm not too familiar with what is being tested here.
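
For what it's worth, a version-gated check of the sort described above could look roughly like the sketch below; the helper name and the exact bounds are illustrative, not taken from the actual test:

```python
# Sketch only -- the helper name and bounds are illustrative, not taken from
# tests/test_hs015.py. The idea is to relax the iteration-count assertion
# when running under numpy >= 2, where more iterations were observed.
import numpy as np

NUMPY_MAJOR = int(np.__version__.split(".")[0])


def check_iteration_count(n_iterations):
    """Assert the optimizer converged within a numpy-version-dependent bound."""
    limit = 45 if NUMPY_MAJOR >= 2 else 12  # values quoted in this thread
    assert n_iterations <= limit, f"{n_iterations} iterations exceeds bound {limit}"


check_iteration_count(12)  # example call with a hypothetical iteration count
```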


codecov bot commented Oct 14, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 74.92%. Comparing base (bc021e4) to head (3ba01ef).
Report is 1 commit behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #411   +/-   ##
=======================================
  Coverage   74.92%   74.92%           
=======================================
  Files          22       22           
  Lines        3334     3338    +4     
=======================================
+ Hits         2498     2501    +3     
- Misses        836      837    +1     


@ewu63 (Collaborator, Author) commented Oct 14, 2024

Okay, it looks like I got past the pip constraints file issue (which is frankly a bit hacky, CC @eirikurj). The Windows build (currently the only one testing numpy 2) is failing in a couple of places, which probably requires further investigation. Hopefully there are no f2py shenanigans, but we shall see. Suggested next steps for debugging:

  1. I am not able to reproduce the numpy 2 failures locally on my Linux machine, but maybe we should try this in Docker and see if any of them reproduce.
  2. Determine whether there is a significant change in the behaviour of some optimizers; maybe the heuristics need to be updated.

In any case, I doubt I'll have time to work on this in the next few weeks, so I suggest someone in the lab take a look; maybe @marcomangano or @eirikurj?

Labels: None yet
Projects: None yet
Development: Successfully merging this pull request may close the issue "Support numpy 2.0".
4 participants