
bump to 0.4.1 #176

Merged: 17 commits, Jan 6, 2025
Conversation


@boeschf boeschf commented Dec 18, 2024

bump version from 0.4.0 to 0.4.1

  • remove build dependency on mpi4py
    • this previously caused issues when the build-time and runtime versions of mpi4py did not match
    • the MPI communicator is now passed by converting it to a Fortran handle (the C pointers are not publicly exposed)
    • this also simplifies the code on the C++ side
  • clean up the CMake Python configuration
    • the test virtual environment now uses proper CMake commands and targets: unconditional repetitions of the configure step are removed and a proper dependency chain is established
    • dead CMake code is deleted
    • min-requirements-test.txt is added: tests use minimum Python package versions instead of fixed ones
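The Fortran-handle approach can be sketched in Python. `comm_to_handle` and `FakeComm` are hypothetical names used only for illustration; with mpi4py installed the real call would be `MPI.COMM_WORLD.py2f()`, and the C++ side would turn the integer back into an `MPI_Comm` via `MPI_Comm_f2c`:

```python
# Sketch: pass an MPI communicator across the Python/C++ boundary as its
# Fortran handle. mpi4py's Comm.py2f() returns a plain integer, so no
# C pointer has to be exposed in the Python API.

def comm_to_handle(comm):
    """Convert a communicator to its integer Fortran handle.

    The C++ extension would reconstruct the MPI_Comm with MPI_Comm_f2c.
    """
    return comm.py2f()

# With mpi4py this would be:
#   from mpi4py import MPI
#   handle = comm_to_handle(MPI.COMM_WORLD)
# Here a hypothetical stub lets the sketch run without MPI installed.
class FakeComm:
    def py2f(self):
        return 0  # the actual handle value is implementation-defined

handle = comm_to_handle(FakeComm())
print(isinstance(handle, int))  # prints True: an integer crosses the boundary
```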

Comment on lines 42 to 54
# custom command to generate mpi4py.hpp via python script
set(MPI4PY_HEADER "${CMAKE_CURRENT_BINARY_DIR}/mpi4py.hpp")
add_custom_command(
    OUTPUT ${MPI4PY_HEADER}
    COMMAND ${Python3_EXECUTABLE} ${CMAKE_CURRENT_SOURCE_DIR}/generate_mpi4py_header.py ${MPI4PY_HEADER}
    COMMENT "Generating mpi4py.hpp with mpi4py version information"
)
# we need a custom target to establish the dependency
add_custom_target(generate_mpi4py_header DEPENDS ${MPI4PY_HEADER})
add_dependencies(pyghex_obj generate_mpi4py_header)
# add mpi4py include directories
target_include_directories(pyghex_obj PRIVATE "${PY_MPI4PY}/include")

Collaborator

This could probably be replaced by configure_file and a small extension to the ghex_find_python_module.cmake to retrieve the version.

include(FindPackageHandleStandardArgs)
function(find_python_module module)
    string(TOUPPER ${module} module_upper)

    if(NOT PY_${module_upper})
        if(ARGC GREATER 1 AND ARGV1 STREQUAL "REQUIRED")
            set(${module}_FIND_REQUIRED TRUE)
        endif()

        # A module's location is usually a directory, but for binary modules
        # it's a .so file.
        execute_process(COMMAND "${PYTHON_EXECUTABLE}" "-c"
            "import re, ${module}; print(re.compile('/__init__.py.*').sub('',${module}.__file__))"
            RESULT_VARIABLE _${module}_status
            OUTPUT_VARIABLE _${module}_location
            ERROR_QUIET OUTPUT_STRIP_TRAILING_WHITESPACE)
        execute_process(COMMAND "${PYTHON_EXECUTABLE}" "-c"
            "import ${module}; print(${module}.__version__)"
            OUTPUT_VARIABLE _${module}_version
            ERROR_QUIET OUTPUT_STRIP_TRAILING_WHITESPACE)
        message(STATUS "python-exe = ${PYTHON_EXECUTABLE}")
        message(STATUS "status = ${_${module}_status}")
        message(STATUS "location = ${_${module}_location}")
        message(STATUS "version = ${_${module}_version}")

        if(NOT _${module}_status)
            set(HAVE_${module_upper} ON CACHE INTERNAL "Python module available")
            set(PY_${module_upper} ${_${module}_location} CACHE STRING "Location of Python module ${module}")
            set(PY_${module_upper}_VERSION ${_${module}_version} CACHE STRING "Version of Python module ${module}")
        else()
            set(HAVE_${module_upper} OFF CACHE INTERNAL "Python module available")
        endif()
    endif(NOT PY_${module_upper})

    find_package_handle_standard_args(PY_${module} DEFAULT_MSG PY_${module_upper})
endfunction(find_python_module)
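The configure_file alternative suggested above could look roughly like this; it assumes `find_python_module` is extended to cache `PY_MPI4PY_VERSION` and that a template `mpi4py.hpp.in` is added (both names are hypothetical):

```cmake
# mpi4py.hpp.in (hypothetical template) would contain e.g.:
#   #define MPI4PY_BUILD_VERSION "@PY_MPI4PY_VERSION@"

find_python_module(mpi4py REQUIRED)

# substitute the cached version into the generated header at configure time
configure_file(
    "${CMAKE_CURRENT_SOURCE_DIR}/mpi4py.hpp.in"
    "${CMAKE_CURRENT_BINARY_DIR}/mpi4py.hpp"
    @ONLY)

target_include_directories(pyghex_obj PRIVATE "${CMAKE_CURRENT_BINARY_DIR}")
```

This removes the custom command/target pair entirely, since `configure_file` re-runs automatically when the template changes.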

@@ -0,0 +1,7 @@
iniconfig
Collaborator

The main reason for us to keep frozen versions is to get a reproducible build between CI and local runs. If the default workflow is to generate updated versions, it defeats this purpose. If reproducibility is not a requirement, this file could just define minimum versions?

Collaborator Author

I see. I am torn here. On the one hand I like the idea of having a reproducible test case, but on the other hand I would also like to see if stuff breaks with newer versions. Any thoughts?

Collaborator

I think you would get that also by just providing minimum versions in a requirements-test.txt.
If you want both, the setup in gt4py is to run with frozen versions in PRs, but with updated versions in a daily plan.
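The two pinning styles side by side (package names and versions here are hypothetical):

```
# requirements-test.txt — frozen pins, reproducible between CI and local
pytest==8.0.0
numpy==1.26.4

# min-requirements-test.txt — lower bounds only; pip resolves to newer releases
pytest>=7.0
numpy>=1.21
```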

import warnings
from mpi4py import __version__ as mpi4py_runtime_version
from ghex import MPI4PY_BUILD_VERSION
if MPI4PY_BUILD_VERSION != mpi4py_runtime_version:
    # warn that the build-time and runtime mpi4py versions differ
    warnings.warn(
        f"mpi4py version mismatch: built against {MPI4PY_BUILD_VERSION}, "
        f"running with {mpi4py_runtime_version}"
    )
Collaborator

I like the warning!

Comment on lines 128 to 131
- name: Install python packages
run: |
python3 -m pip install mpi4py --user

Collaborator

maybe it's not needed to build the python bindings in this workflow as it's covered in the other 2? Otherwise, I guess the fix would be to create a venv.
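If the venv route is taken, the workflow step could be sketched as (hypothetical):

```yaml
- name: Install python packages
  run: |
    python3 -m venv .venv
    . .venv/bin/activate
    python -m pip install mpi4py
```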

Collaborator Author

boeschf commented Dec 20, 2024

thanks for the comments! I am starting to think that maybe we should not use MPI4PY for the build at all. The only portable way to cross the Python/C++ boundary I found so far is to go through a conversion to a fortran handle (py2f). What do you think about this?

Collaborator

havogt commented Dec 20, 2024

thanks for the comments! I am starting to think that maybe we should not use MPI4PY for the build at all. The only portable way to cross the Python/C++ boundary I found so far is to go through a conversion to a fortran handle (py2f). What do you think about this?

No opinion. I don't see the current setup (even with a hard error on mismatching mpi4py versions) as a big issue, since we have to compile on install anyway. Switching the mpi4py version later is not an important use-case, I'd say. Any progress on being able to provide a binary distribution would be nice, but has to come from upstream, i.e. mpi4py.

@boeschf boeschf requested a review from havogt December 20, 2024 09:44
@boeschf boeschf merged commit c37725c into ghex-org:master Jan 6, 2025
11 checks passed