[DFTB-Plus-User] an error in running DFTB_MPI
jsxz
jsxzzhangchao at 126.com
Thu Oct 2 16:38:10 CEST 2014
Dear Bálint,
Thank you very much for your detailed reply!
Following your guidance, I have compiled the MPI version of DFTB+ successfully and created the binary dftb+. However, when I run it with mpirun, a new error appears:
[zc at lxslc509 testmpi]$ mpirun -np 4 ./dftbmpi+ < dftb_in.hsd
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
librdmacm: couldn't read ABI version.
librdmacm: assuming: 4
--------------------------------------------------------------------------
[[38600,1],0]: A high-performance Open MPI point-to-point messaging module
was unable to find any relevant network interfaces:
Module: OpenFabrics (openib)
Host: lxslc509.ihep.ac.cn
Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
[the same librdmacm warning and Open MPI "no relevant network interfaces" message were printed by each of the remaining ranks]
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image PC Routine Line Source
dftbmpi+ 0000000000828269 Unknown Unknown Unknown
dftbmpi+ 0000000000826B3E Unknown Unknown Unknown
dftbmpi+ 00000000007D0072 Unknown Unknown Unknown
dftbmpi+ 0000000000781DE3 Unknown Unknown Unknown
dftbmpi+ 000000000078699B Unknown Unknown Unknown
libpthread.so.0 0000003CBD80ECA0 Unknown Unknown Unknown
libmpi.so.0 00002B4702F35D0C Unknown Unknown Unknown
dftbmpi+ 000000000077562B Unknown Unknown Unknown
dftbmpi+ 000000000075136D Unknown Unknown Unknown
dftbmpi+ 0000000000409EA1 Unknown Unknown Unknown
dftbmpi+ 0000000000409D21 Unknown Unknown Unknown
dftbmpi+ 0000000000409CC6 Unknown Unknown Unknown
libc.so.6 0000003CBCC1D9C4 Unknown Unknown Unknown
dftbmpi+ 0000000000409BC9 Unknown Unknown Unknown
CMA: unable to get RDMA device list
[the same segmentation-fault traceback was printed by each of the remaining ranks]
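(For readers of the archive: one diagnostic sketch, not part of the original exchange. The openib warning above only says that Open MPI found no InfiniBand interfaces; excluding that transport with Open MPI's standard MCA option silences it and, if the segfault still occurs over TCP/shared memory, rules out the missing RDMA devices as the cause. Binary and input names are taken from the session above.)

```shell
# Exclude the OpenFabrics (openib) BTL so Open MPI falls back to
# TCP/shared memory; "--mca btl ^openib" is a standard Open MPI option.
# If the forrtl SIGSEGV persists, the crash is unrelated to RDMA.
mpirun --mca btl ^openib -np 4 ./dftbmpi+ < dftb_in.hsd
```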
Many thanks for your help!
Best regards,
Xiaobao Zhang