[DFTB-Plus-User] error with ELSI solvers
Bálint Aradi
aradi at uni-bremen.de
Tue Feb 1 18:13:40 CET 2022
Dear Huy,
I have had some trouble reproducing the segfault. I tried your example
with the Conda OpenMPI version as well as with an Intel-compiled 21.2.
Both coped with your example and ran without any problems beyond the
first total energy calculation (after 14 SCC iterations).

Could you perhaps also try the Conda version, just to rule out that
the problem somehow lies in the build process?
Best regards,
Bálint
On 28.01.22 15:57, Huy Pham wrote:
> Dear Bálint,
>
> Thanks for your quick reply!
>
> I was able to solve the PEXSI problem following your suggestion, but
> again there is a "segmentation fault" error after the first SCF.
>
> Adding "ulimit -s unlimited" before running the executable doesn't
> help; the error stays the same. Please let me know if you have
> further suggestions. My input files are attached in case you need
> them.
>
> Thanks,
> Huy
>
> On Fri, Jan 28, 2022 at 12:12 AM Bálint Aradi <aradi at uni-bremen.de> wrote:
>
> Dear Huy,
>
> For the segfault: often a limited stack size is the reason (it hits
> me regularly). Could you check whether that is the case? Run
> "ulimit -s unlimited" in the same shell (or job script) before
> launching dftb+, since the limit only applies to processes started
> afterwards.
>
> As for the PEXSI problem: the error message is generated by the PEXSI
> library within ELSI. I am definitely not a PEXSI expert, but a quick
> look at the PEXSI-related code in ELSI
>
> https://gitlab.com/ElectronicStructureLibrary/elsi-interface/-/blob/master/src/elsi_util.f90#L423
>
> suggests that the product of the number of MPI processes per pole
> (ProcsPerPole), the number of poles (Poles) and the number of points
> used to search for the Fermi level (muPoints) must be greater than
> or equal to the total number of MPI processes.
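>
> To illustrate with made-up numbers (just a sketch, the option names
> being those of the PEXSI solver block in the DFTB+ manual): if you
> ran on, say, 288 MPI tasks, then
>
>    Solver = PEXSI {
>      Poles = 20         # number of poles in the expansion
>      ProcsPerPole = 8   # MPI tasks working on each pole
>      muPoints = 2       # points for the Fermi-level search
>    }
>
> would pass the check, since 8 * 20 * 2 = 320 >= 288, whereas
> ProcsPerPole = 1 would give only 40 and trigger exactly the "Number
> of MPI tasks per pole too small" error you see.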
>
> I hope this helps.
>
> Best regards,
>
> Bálint
>
> On 28.01.22 01:58, Huy Pham wrote:
> > Hi All,
> >
> > I have a problem when running a short MD simulation with DFTB+
> > using different solvers from ELSI. I used the 3ob parameter set;
> > the system is a molecular crystal (C, H, N, O) with 576 atoms (a
> > supercell).
> >
> > The simulation was fine with ELPA, but when I changed the solver
> > to OMM or NTPoly, it crashed after finishing the first SCF cycle.
> > The error is "SIGSEGV, segmentation fault occurred", which is
> > difficult to track down.
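> >
> > Schematically, switching the solver only means changing this block
> > in dftb_in.hsd (a sketch using the solver names from the DFTB+
> > manual; all other settings in my input stay the same):
> >
> >    Hamiltonian = DFTB {
> >      Solver = NTPoly {}   # was ELPA {}; OMM {} crashes the same way
> >      ...
> >    }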
> >
> > When using PEXSI, the error is as follows:
> >
> >    **Error! MPI task 38 in elsi_dm_complex_sparse: Number of MPI tasks per pole too small
> >    **Error! MPI task 270 in elsi_dm_complex_sparse: Number of MPI tasks per pole too small
> >    **Error! MPI task 3 in elsi_dm_complex_sparse: Number of MPI tasks per pole too small
> >    **Error! MPI task 219 in elsi_dm_complex_sparse: Number of MPI tasks per pole too small
> > I couldn't find any related information in the manual. Following
> > the error message, I have tried increasing the number of MPI tasks
> > and am still waiting for that simulation to run. Reducing the
> > number of poles doesn't help either: at some point the error "Too
> > few PEXSI poles" appears instead.
> >
> > Any suggestion would be appreciated!
> >
> > Thanks,
> > Huy
> >
> > --
> > C. Huy Pham, Ph.D.
> > Staff Scientist
> > Lawrence Livermore National Laboratory
> > Physical and Life Sciences Directorate
> > Materials Science Division, L-287
> > 7000 East Ave, Livermore, CA 94550
> > Phone: (925) 422-7881
> >
>
>
> --
> Dr. Bálint Aradi
> Bremen Center for Computational Materials Science, University of Bremen
> http://www.bccms.uni-bremen.de/cms/people/b-aradi/
>
> --
> C. Huy Pham, Ph.D.
> Staff Scientist
> Lawrence Livermore National Laboratory
> Physical and Life Sciences Directorate
> Materials Science Division, L-287
> 7000 East Ave, Livermore, CA 94550
> Phone: (925) 422-7881
>
> _______________________________________________
> DFTB-Plus-User mailing list
> DFTB-Plus-User at mailman.zfn.uni-bremen.de
> https://mailman.zfn.uni-bremen.de/cgi-bin/mailman/listinfo/dftb-plus-user
--
Dr. Bálint Aradi
Bremen Center for Computational Materials Science, University of Bremen
http://www.bccms.uni-bremen.de/cms/people/b-aradi/