[DFTB-Plus-User] error with ELSI solvers

Huy Pham pchuy1906 at gmail.com
Fri Jan 28 15:57:58 CET 2022


Dear Bálint,

Thanks for your quick reply!

I was able to solve the PEXSI problem following your suggestion. However,
after the first SCF cycle there is again a "segmentation fault" error.

Adding "ulimit -s unlimited" before running the exe file doesn't help,
still the same error. Please let me know if you have further suggestions.
Attached are my input files, if you need it.

Thanks,
Huy

On Fri, Jan 28, 2022 at 12:12 AM Bálint Aradi <aradi at uni-bremen.de> wrote:

> Dear Huy,
>
> For the segfault: a limited stack size is often the reason (it hits me
> regularly). Could you check whether that is the case? (ulimit -s unlimited)
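>
> A minimal sketch of what that could look like in the launch script (the
> mpirun line, rank count and binary path below are placeholders, not taken
> from your setup):
>
>   #!/bin/bash
>   # raise the stack size limit for this shell and the processes it starts
>   ulimit -s unlimited
>   # placeholder launch command; adjust the rank count and dftb+ path
>   mpirun -np 256 /path/to/dftb+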
>
> As for the PEXSI problem: the error message is generated by the PEXSI
> library within ELSI. I am definitely not a PEXSI expert, but a quick
> look at the PEXSI code in ELSI
>
>
> https://gitlab.com/ElectronicStructureLibrary/elsi-interface/-/blob/master/src/elsi_util.f90#L423
>
> suggests that the product of the number of processes per pole
> (ProcsPerPole), the number of poles (Poles) and the number of points used
> in the Fermi-level search (muPoints) should be greater than or equal to
> the number of MPI processes.
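>
> As an illustration only (the keyword names are the ones mentioned above,
> the values are placeholders, and the exact block layout should be checked
> against the DFTB+ manual), the PEXSI process distribution would be set
> along these lines in dftb_in.hsd:
>
>   Hamiltonian = DFTB {
>     ...
>     Solver = PEXSI {
>       Poles = 20          # number of poles in the expansion
>       ProcsPerPole = 16   # MPI processes assigned to each pole
>       muPoints = 2        # points used in the chemical potential search
>     }
>   }
>
> With these placeholder values the product is 16 * 20 * 2 = 640, so such a
> run should use at most 640 MPI processes.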
>
> I hope this helps.
>
> Best regards,
>
> Bálint
>
>
>
> On 28.01.22 01:58, Huy Pham wrote:
> > Hi All,
> >
> > I have a problem when running a short MD simulation using DFTB+ with
> > different solvers in ELSI. I used the 3ob parameter set; the system is a
> > molecular crystal (C, H, N, O) with 576 atoms (supercell).
> >
> > The simulation was fine with ELPA, but if I changed the solver to OMM or
> > NTPoly, the simulation crashed after finishing the first SCF cycle. The
> > error is "SIGSEGV, segmentation fault occurred", which is difficult to
> > track down.
> >
> > When using PEXSI, the error is below:
> > **Error! MPI task      38 in elsi_dm_complex_sparse: Number of MPI tasks
> > per pole too small
> > **Error! MPI task     270 in elsi_dm_complex_sparse: Number of MPI tasks
> > per pole too small
> > **Error! MPI task       3 in elsi_dm_complex_sparse: Number of MPI tasks
> > per pole too small
> > **Error! MPI task     219 in elsi_dm_complex_sparse: Number of MPI tasks
> > per pole too small
> > I couldn't find any related information in the manual. Following the
> > error message, I tried increasing the number of MPI tasks and am still
> > waiting for that simulation to run. Reducing the number of poles doesn't
> > help either, since at some point the error "Too few PEXSI poles" appears
> > instead.
> >
> > Any suggestion would be appreciated!
> >
> > Thanks,
> > Huy
> >
> > --
> >
> > C. Huy Pham, Ph.D.
> >
> > Staff Scientist
> >
> > Lawrence Livermore National Laboratory
> >
> > Physical and Life Sciences Directorate
> >
> > Materials Science Division, L-287
> >
> > 7000 East Ave, Livermore, CA 94550
> >
> > Phone: (925) 422-7881
> >
>
>
> --
> Dr. Bálint Aradi
> Bremen Center for Computational Materials Science, University of Bremen
> http://www.bccms.uni-bremen.de/cms/people/b-aradi/
>
> _______________________________________________
> DFTB-Plus-User mailing list
> DFTB-Plus-User at mailman.zfn.uni-bremen.de
> https://mailman.zfn.uni-bremen.de/cgi-bin/mailman/listinfo/dftb-plus-user
>


-- 

C. Huy Pham, Ph.D.

Staff Scientist

Lawrence Livermore National Laboratory

Physical and Life Sciences Directorate

Materials Science Division, L-287

7000 East Ave, Livermore, CA 94550
Phone: (925) 422-7881
[Attachments: dftb.gen (38271 bytes), dftb_in.hsd (875 bytes)]

