
CMake Improvements for MPI #393

Open · pgierz opened this issue Dec 8, 2022 · 10 comments · May be fixed by #621

@pgierz
Member

pgierz commented Dec 8, 2022

Hello,

It seems that the default CMakeLists.txt files have difficulty finding the MPI headers unless these are set via the module file's CPATH variable. While I can directly modify the module file on Albedo, this will not be the case on all machines.

It might therefore be nice if the CMakeLists.txt file (and the subsequent CMake configuration in other folders) were able to cleverly figure out where mpi.h and mpif.h are. The cleanest way (I guess) is to use CMAKE_PREFIX_PATH, which is set by the module file, and then pick up all the relevant include, lib, and so forth based on that (as sketched below).

Unfortunately, I am not a CMake guru and am poking around in the dark. Maybe this is something @hegish can help out with?

Best
Paul
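
For reference, the standard CMake route for locating MPI is find_package(MPI): the bundled FindMPI module looks up the compiler wrappers via find_program (which honors CMAKE_PREFIX_PATH) and also respects the MPI_HOME environment variable. Below is a minimal, self-contained sketch, not FESOM's actual build files; the project name and probe.f90 are made up for illustration:

#!/usr/bin/bash
# Minimal sketch (hypothetical, not FESOM's build setup): probe MPI
# detection with CMake's bundled FindMPI module.
cat > probe.f90 <<'EOF'
program probe
  use mpi
  implicit none
  integer :: ierr
  call MPI_Init(ierr)
  call MPI_Finalize(ierr)
end program probe
EOF
cat > CMakeLists.txt <<'EOF'
cmake_minimum_required(VERSION 3.9)
project(mpi_probe LANGUAGES C Fortran)
# FindMPI locates the wrappers (honoring CMAKE_PREFIX_PATH) and MPI_HOME.
find_package(MPI REQUIRED COMPONENTS C Fortran)
add_executable(probe probe.f90)
# The imported target carries the include dirs and link flags transitively.
target_link_libraries(probe PRIVATE MPI::MPI_Fortran)
EOF
cmake -S . -B build    # optionally: -DCMAKE_PREFIX_PATH=/path/to/openmpi
cmake --build build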

@patrickscholz
Contributor

Somehow I haven't encountered this problem on Albedo so far... What does the environment file you used look like, and which modules did you load?

@pgierz
Member Author

pgierz commented Dec 8, 2022

Hi Patrick, are you using Intel MPI or OpenMPI? The modules set different things.

@pgierz
Member Author

pgierz commented Dec 8, 2022

The compile script is about as simple as I can make it:

#!/usr/bin/bash
# Dummy script generated by esm-tools, to be removed later:
set -e
module purge
module load gcc/12.1.0
module load openmpi/4.1.3
module load udunits/2.2.28
module load netcdf-c/4.8.1-gcc12.1.0
module load netcdf-cxx4/4.3.1-gcc12.1.0
module load netcdf-fortran/4.5.4-gcc12.1.0
module load hdf5/1.12.2-gcc12.1.0
module load intel-oneapi-mkl/2022.1.0-gcc12.1.0
module load cdo/2.0.5
module load nco/5.0.1
module load git/2.35.2
module load python/3.10.4
module list

export LC_ALL=en_US.UTF-8
export MPI_LIB=$($MPIF90 -show |sed -e 's/^[^ ]*//' -e 's/-[I][^ ]*//g')
export LAPACK_LIB='-L/albedo/soft/sw/spack-sw/intel-oneapi-mkl/2022.1.0-7235vfh/mkl/2022.1.0 -lmkl_intel_lp64 -lmkl_core -lmkl_sequential -lm -ldl'
export FFLAGS='-fallow-argument-mismatch'
export taken2from=fesom2_compile
export ENVIRONMENT_SET_BY_ESMTOOLS=TRUE

unset SLURM_MEM_PER_NODE
unset SLURM_MEM_PER_CPU

cd fesom-2.1
mkdir -p build; cd build; cmake ..; make install -j $(nproc --all)
cd ..
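
If the bare cmake .. step above cannot find the MPI headers, one possible workaround is to derive the install prefix from the wrapper and pass it explicitly. This is a hypothetical tweak, assuming the openmpi/4.1.3 module puts mpif90 on PATH:

# Hypothetical variant of the build step: derive the OpenMPI install
# prefix from the mpif90 wrapper and hand it to CMake, so that
# find_package(MPI) can locate mpi.h and mpif.h without relying on CPATH.
MPI_PREFIX=$(dirname "$(dirname "$(command -v mpif90)")")
cd fesom-2.1
mkdir -p build && cd build
cmake -DCMAKE_PREFIX_PATH="$MPI_PREFIX" ..
make install -j "$(nproc --all)"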

@patrickscholz
Contributor

At the moment I'm playing with the Intel compiler; the env file with which I compile FESOM2 on Albedo currently looks like this...

module load intel-oneapi-compilers 
module load intel-oneapi-mkl/2022.1.0
module load intel-oneapi-mpi/2021.6.0
export FC="mpiifort -qmkl" CC=mpiicc CXX=mpiicpc

module load netcdf-fortran/4.5.4-intel-oneapi-mpi2021.6.0-oneapi2022.1.0
module load netcdf-c/4.8.1-intel-oneapi-mpi2021.6.0-oneapi2022.1.0

export NETCDF_Fortran_INCLUDE_DIRECTORIES=/albedo/soft/sw/spack-sw/netcdf-fortran/4.5.4-vmvv2ho/include
export NETCDF_Fortran_LIBRARIES=/albedo/soft/sw/spack-sw/netcdf-fortran/4.5.4-vmvv2ho/lib
export NETCDF_C_LIBRARIES=/albedo/soft/sw/spack-sw/netcdf-c/4.8.1-sp3ulf4/lib
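
A quick way to check whether such variables are needed at all is to inspect what the modules themselves export. Spack-generated modules typically prepend CPATH, LIBRARY_PATH, and CMAKE_PREFIX_PATH, which is what CMake's find_* commands rely on; whether Albedo's modules do so is an assumption here. For example, on an Lmod/Environment Modules setup:

# Inspect what a module actually sets (the include/lib paths above may
# already be exported by the module itself):
module show netcdf-fortran/4.5.4-intel-oneapi-mpi2021.6.0-oneapi2022.1.0
module show intel-oneapi-mpi/2021.6.0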

@pgierz
Member Author

pgierz commented Dec 8, 2022

@mandresm: in the next cleanup I am going to throw out all of these stupid `taken2from` variables; they serve no useful purpose.

@pgierz
Member Author

pgierz commented Dec 8, 2022

export NETCDF_Fortran_INCLUDE_DIRECTORIES=/albedo/soft/sw/spack-sw/netcdf-fortran/4.5.4-vmvv2ho/include

That one can go away; it is set by the module file. I can also set the *_LIBRARIES variables directly in the module if you want.

@patrickscholz
Contributor

The NETCDF_Fortran_INCLUDE_DIRECTORIES thing I saw already; it's a leftover from my tests.
I can't claim that FESOM2 runs well with this env file, but at least it does not complain about not knowing any MPI headers...

@pgierz
Member Author

pgierz commented Dec 8, 2022

Well, the point is: shouldn't the CMake config of FESOM be clever enough to figure all of this out without relying on the user (or the module) to set certain variables? If CMAKE_PREFIX_PATH gets set, everything else should fall into place.

But again, I am unfortunately not (yet) a CMake guru...

@patrickscholz
Contributor

FESOM also compiles without setting these environment variables explicitly. I just put them in now to exclude them as a possible source of the I/O problems I'm hunting for.

@hegish
Collaborator

hegish commented Dec 8, 2022

Hello,

It seems that the default CMakeLists.txt files have difficulties finding the MPI Headers unless these are set via the module file's CPATH variable. While I can directly modify the module file on Albedo, this will not be the case in all machines.

It might therefore be nice if the CMakeLists.txt file (and subsequent Cmake configuration in other folders) was able to cleverly figure out where mpi.h and mpif.h are. The cleanest way (I guess) is to use CMAKE_PREFIX_PATH which is set by the module file, and then includes all the relevant include, lib and so forth based upon that.

Unfortunately, I am not a cmake guru, and am poking around in the dark. Maybe something @hegish can help out with?

Best Paul

I do not advise going the CMAKE_PREFIX_PATH route suggested above. Finding all the MPI-related pieces and using the correct libraries etc. is a bit error-prone when done from CMake. Instead, use the appropriate MPI compiler wrappers for FC, CC, and CXX. This is what they are made for.
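
Concretely, the wrapper route is just a couple of exports before the first configure. A minimal sketch, assuming OpenMPI wrapper names (for Intel MPI, substitute mpiifort/mpiicc/mpiicpc, as in Patrick's env file above):

# Minimal sketch of the wrapper route: CMake honors the FC/CC/CXX
# environment variables on the first configure of a build directory.
export FC=mpif90 CC=mpicc CXX=mpicxx
mkdir -p build && cd build
cmake ..
make install -j "$(nproc --all)"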

@JanStreffing JanStreffing added this to the FESOM 2.6.1 milestone Aug 27, 2024
@JanStreffing JanStreffing linked a pull request Aug 27, 2024 that will close this issue