-
Hi folks, I managed to build my app, which uses an MPI-only library (ExodusII from SNL), on top of AMPI instead of OpenMPI, and here is what I get when I try to run it:
What does "TCharm has not been initialized!" mean? Thanks!
Replies: 3 comments
-
Forgot to add info on my Charm++ build:
-
With AMPI you need to privatize the global variables of all MPI code and libraries to the virtual ranks for correctness when running with multiple ranks per process. HDF5 contains hundreds of mutable global/static variables. In the past we privatized HDF5 using TLSglobals; this requires building our modified version of HDF5 with -tlsglobals. You could also try using pieglobals on an unmodified version of HDF5, but that hasn't been tested, and I'm not sure how pieglobals would interact with AMPI + Charm++ interoperation. In general, AMPI + Charm++ interoperability is experimental and not well tested at this point.
-
Thanks, Sam. That is useful to know. I did find, and have used in the past, the modified HDF5 you mention. While looking into this in more detail, however, I realized that my app does not actually need HDF5: for the functionality we use, we can rely on the serial version of NetCDF within the Exodus library to read/write unstructured meshes and associated data in parallel. So this problem goes away.