[DFTB-Plus-User] Scalapack error in MD run
Bálint Aradi
aradi at uni-bremen.de
Mon Jul 25 11:09:00 CEST 2022
Dear Luca,
It may be a ScaLAPACK issue as well. A few notes on this:
- Depending on the size of your Hamiltonian, 64 cores may be too many
(the DFTB Hamiltonian is very small compared to typical DFT
Hamiltonians), so you may see no speed advantage over a lower number
of cores.
- You could try the ELPA solver, as it does not depend on ScaLAPACK
and may be more stable (see the input sketch after this list).
- Did you try restarting from the crashing geometry? If it then runs
(at least for a few hundred further steps), the problem is likely in
ScaLAPACK or MPI, where something piles up during the iterations and
crashes at some point. In that case, you could try ELPA or a
different MPI framework (the sketch below shows both the solver
selection and the restart).
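For example (an untested sketch on my side: it assumes an MPI-enabled
DFTB+ build with ELSI/ELPA support, the Solver keyword of recent DFTB+
releases, and that the crashed run left its last geometry in the usual
geo_end.gen file), the two suggestions would look like this, with the
rest of your input unchanged:

  Geometry = GenFormat {
    # Restart from the last geometry written by the crashed MD run
    <<< "geo_end.gen"
  }

  Hamiltonian = DFTB {
    # Use the ELPA eigensolver instead of a ScaLAPACK-based one
    Solver = ELPA {}
    # ... your other Hamiltonian settings unchanged ...
  }

Note that this starts a fresh MD run from that geometry (with new
velocities), which is enough to test whether the crash follows the
geometry or the accumulated MPI/ScaLAPACK state.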
Best regards,
Bálint
On 25.07.22 10:13, Luca Babetto wrote:
> Dear Bálint,
>
> As always, thank you for your reply. I checked the geometries and
> everything seems fine: there are no atoms "dangerously" close to one
> another that could cause the issue you mentioned, so I suspect that
> is not the cause.
>
> I have run a few tests with the exact same input file, but (i)
> reducing the number of cores from 20 to 6 and (ii) using the
> shared-memory version of DFTB+. Even though I have only let each run
> for a couple of thousand steps, I have not encountered the same
> issue in either case (the "normal" run would consistently crash
> within a couple of hundred MD steps).
>
> This problem therefore seems to be related to simulations running on
> many cores. For information, we are running DFTB+ on a 64-core
> Threadripper PRO 3995WX workstation under Ubuntu 20.04, launching
> the simulations via Slurm for job scheduling, with DFTB+ installed
> directly from the conda repository (both the OpenMPI and
> shared-memory versions). Let me know if there are more tests I can
> run to help diagnose the problem, or if I should directly open an
> issue on GitHub with this information. We have not encountered this
> problem before in simulations with the mio and "standard" 3ob
> parameters, or with the xTB Hamiltonian.
>
> Kind regards
>
--
Dr. Bálint Aradi
Bremen Center for Computational Materials Science, University of Bremen
http://www.bccms.uni-bremen.de/cms/people/b-aradi/