Hi,
I am trying to run an example simulation after compilation.
There seems to be an issue with ParMETIS related to MPI (the run aborts with an invalid-communicator error).
I would greatly appreciate it if someone could guide me on how to resolve this issue.
I've attached the error message below.
ucns3d_p will run on 2 MPI tasks.
Authorization required, but no authorization protocol specified
Authorization required, but no authorization protocol specified
i read the bleed BC
RESTARTING 0 0.0000000000000000 0
ParMETIS Initiated
[kucfd-System:909616] *** An error occurred in MPI_Comm_rank
[kucfd-System:909616] *** reported by process [1937178625,0]
[kucfd-System:909616] *** on communicator MPI_COMM_WORLD
[kucfd-System:909616] *** MPI_ERR_COMM: invalid communicator
[kucfd-System:909616] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[kucfd-System:909616] *** and potentially your MPI job)
[kucfd-System:909609] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[kucfd-System:909609] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
I assume there might be an issue with ParMETIS, since this is where the code crashes.
Please check the following:
If you are running a 2D case, do you have the 2D option set in the UCNS3D.DAT file?
Check the file permissions of ucns3d_p (e.g. chmod +x), since the "Authorization required" message might indicate something related to that.
Check which compiler combination you are using: GNU with OpenMPI/MPICH should work, Intel Fortran with Intel MPI should work, and GNU with Cray MPI/HPE MPI should work. However, mixing Intel Fortran with OpenMPI, or GNU Fortran with Intel MPI, might produce these issues.
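The permission and toolchain checks above can be sketched as a small shell snippet. This is only a sketch: the binary path and the MPI wrapper names (mpif90/mpifort/mpirun) are assumptions about a typical build and may differ on your system.

```shell
# Sketch only: the path to the solver binary is an assumption; adjust as needed.
BIN=./ucns3d_p

# 1) Make sure the binary has its execute bit set (a missing execute bit
#    can cause permission-related failures at launch).
if [ -f "$BIN" ] && [ ! -x "$BIN" ]; then
    chmod +x "$BIN"
fi

# 2) Print which compiler each MPI wrapper is built around, to spot
#    mixed toolchains (e.g. Intel Fortran with OpenMPI).
for cmd in mpif90 mpifort mpirun; do
    if command -v "$cmd" >/dev/null 2>&1; then
        printf '== %s ==\n' "$cmd"
        "$cmd" --version 2>/dev/null | head -n 1
    else
        printf '%s: not found on PATH\n' "$cmd"
    fi
done
```

After confirming the permissions and a consistent toolchain, launching on 2 ranks is typically done with something like `mpirun -np 2 ./ucns3d_p`, matching the "2 MPI tasks" line in the log above.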