Releases: geospace-code/h5fortran-mpi

harmonize API, auto-build HDF5

11 Dec 16:41 · 3704445
  • harmonize API with legacy non-MPI h5fortran (see the sketch after this list)
  • auto-build a recent HDF5-MPI if the installed one is not found or not working
  • require CMake >= 3.19 for robustness
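
A minimal sketch of the harmonized object-oriented API, assuming the module name and the hdf5_file open/write/close methods mirror legacy h5fortran; the argument names are illustrative, not verified against this release:

```fortran
! Sketch only: assumes the harmonized API mirrors legacy h5fortran.
program demo
use h5fortran, only : hdf5_file
implicit none

type(hdf5_file) :: h5

call h5%open('demo.h5', action='w')  ! create/truncate the file for writing
call h5%write('/x', [1., 2., 3.])    ! write a dataset addressed by HDF5 path
call h5%close()

end program
```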

cpu_count: make single target

29 Jul 21:14 · 423f87c

We don't run benchmarks on CI because of their instability and resource demands. If benchmarks are enabled, tests are also enabled.

update benchmark API

28 Jul 15:29 · 5c7b70a

  • correct the benchmark API to match h5fortran-mpi v2.0
  • make cpu_count match the scivision/physical-cpu-count project; use explicit include directories
  • add scheduled CI runs of Windows oneAPI

work with or without MPI

22 Jul 23:54 · 8716b7a

h5fortran-mpi can now serve as a drop-in replacement for the original, non-MPI h5fortran, and can also be used for HDF5-MPI collective operations.
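
For illustration, a hedged sketch of drop-in use: the same object-oriented calls, with serial (legacy h5fortran-style) I/O selected at open time. The mpi= argument name is an assumption; consult the project README for the actual switch.

```fortran
! Sketch only: the mpi= open argument is an assumption, not confirmed here.
program drop_in
use mpi, only : mpi_init, mpi_finalize
use h5fortran, only : hdf5_file
implicit none

type(hdf5_file) :: h5
integer :: ierr

call mpi_init(ierr)  ! MPI must still be linked, even for serial I/O (see the initial-release notes below)

call h5%open('demo.h5', action='w', mpi=.false.)  ! serial I/O, as in legacy h5fortran
call h5%write('/n', 42)
call h5%close()

call mpi_finalize(ierr)

end program
```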

unify with h5fortran

21 Jul 20:30 · b5b0f09

Added numerous features to bring h5fortran-mpi into sync with the original, non-MPI h5fortran.

initial release

13 Jan 21:30 · a27c829

Based on h5fortran, now including HDF5-MPI parallel read and write of datasets and attributes.
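
For illustration, a hedged sketch of a collective write in which each rank contributes one slab of a global dataset. The dset_dims/istart/iend slab arguments and the writeattr method are assumptions modeled on legacy h5fortran, and the h5fortran module name reflects the later API harmonization; check the project docs for the actual signatures.

```fortran
! Sketch only: slab-selection argument names are assumed. Run with 4 MPI ranks.
program collective
use mpi, only : mpi_init, mpi_finalize, mpi_comm_rank, MPI_COMM_WORLD
use h5fortran, only : hdf5_file
implicit none

type(hdf5_file) :: h5
integer :: ierr, irank
real :: x(100)

call mpi_init(ierr)
call mpi_comm_rank(MPI_COMM_WORLD, irank, ierr)
x = real(irank)  ! this rank's 100-element slab of a 400-element global dataset

call h5%open('out.h5', action='w', mpi=.true.)

! dset_dims: global dataset shape; istart/iend: this rank's slab bounds
call h5%write('/x', x, dset_dims=[400], istart=[irank*100 + 1], iend=[(irank+1)*100])
call h5%writeattr('/x', 'units', 'meters')  ! attribute write (method name assumed)

call h5%close()
call mpi_finalize(ierr)

end program
```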

Nearly all of the object-oriented functionality from h5fortran is present in h5fortran-mpi.

A key distinction for user programs: h5fortran-mpi REQUIRES MPI to be linked into the user program, EVEN IF the program doesn't use HDF5-MPI.

At a future point, to fully merge h5fortran and h5fortran-mpi, we might consider a build-time option that removes the need to link MPI for HDF5/h5fortran programs that don't use HDF5-MPI.