[DFTB-Plus-User] output statement overflows record

siddheshwar chopra sidhusai at gmail.com
Tue Jul 10 17:21:35 CEST 2018


Dear Márton,
I think the solution for the virtual memory problem is as follows:

ulimit -s unlimited
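A minimal sketch of applying this in the shell (or job script) that launches dftb+. Note that the limit is inherited only by processes started from that shell, so for MPI jobs it typically has to be raised in the scheduler's job script on every node, not just in the login shell:

```shell
# Check the current per-process stack limit (often 8192 kB by default)
ulimit -s
# Remove the stack limit for this shell and any processes it launches
ulimit -s unlimited
# Verify the new setting
ulimit -s
```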
Regards,


On Tue 10 Jul, 2018 7:09 pm Szendrő Márton, <marciarcu at gmail.com> wrote:

> Dear Ben,
>
> Thank you for your suggestion; it did indeed eliminate the error. However,
> when I tried to run with the default solver I got the following error:
>
> forrtl: severe (41): insufficient virtual memory
> Image              PC                Routine            Line     Source
> dftb+              00000000013A0EF0  Unknown            Unknown  Unknown
> dftb+              000000000088364F  initprogram_MP_al  3222     initprogram.f90
> dftb+              00000000008811FA  initprogram_MP_in  3146     initprogram.f90
> dftb+              0000000000857F71  initprogram_MP_in  2124     initprogram.f90
> dftb+              0000000000409A08  MAIN__             31       dftbplus.f90
> dftb+              000000000040919E  Unknown            Unknown  Unknown
> libc.so.6          00000033F021ED1D  Unknown            Unknown  Unknown
> dftb+              00000000004090A9  Unknown            Unknown  Unknown
>
> I ran my 30,000-atom simulation on 140 CPU cores with a total of 420 GB of
> RAM (distributed among 7 nodes). Could you please help me determine whether
> this is a bug or whether I simply ran out of memory? How much memory do I
> need to perform a 10^4-atom simulation? Is there a way to reduce memory
> consumption?
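As a rough back-of-the-envelope check (an editorial estimate, not a figure from this thread), one can compute the size of a single dense Hamiltonian or overlap matrix. The 4-orbitals-per-atom figure is an assumption that holds for s+p elements such as carbon; complex double precision (16 bytes per element) is assumed, as needed for general k-points:

```python
# Hypothetical back-of-the-envelope estimate (not from the thread):
# memory for one dense n_basis x n_basis matrix in a DFTB calculation,
# assuming 4 basis orbitals per atom and complex double storage.

def dense_matrix_gib(n_atoms, orbitals_per_atom=4, bytes_per_element=16):
    """Return the size in GiB of one dense n_basis x n_basis matrix."""
    n_basis = n_atoms * orbitals_per_atom
    return n_basis ** 2 * bytes_per_element / 2 ** 30

# ~30 000 atoms -> 120 000 basis functions -> ~215 GiB per matrix copy.
# With H, S and the eigenvector matrix each held in memory (plus solver
# workspace), a 420 GB total is easily exhausted.
print(f"{dense_matrix_gib(30_000):.0f} GiB per dense matrix")
```

By the same formula, the 1204-atom case that ran successfully needs well under 1 GiB per matrix, which is consistent with it fitting comfortably in memory.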
>
> With sincere thanks,
> Marton Szendro
>
> Ben Hourahine <benjamin.hourahine at strath.ac.uk> wrote (on Mon, 2 Jul
> 2018, 16:46):
>
>> Hello Szendrő,
>>
>> the problem is due to the same cause as
>>
>> https://github.com/dftbplus/dftbplus/issues/13
>> The workaround is to disable the writing out of dftb_pin.hsd, which can
>> be done by setting
>>
>> ParserOptions = {
>>   WriteHSDInput = No
>> }
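As a sketch of where this block would sit, a minimal dftb_in.hsd outline (the Geometry and Hamiltonian entries here are generic placeholders, not taken from this thread):

```
Geometry = GenFormat {
  <<< "geometry.gen"
}

Hamiltonian = DFTB {
  # ... usual Hamiltonian settings ...
}

ParserOptions = {
  WriteHSDInput = No
}
```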
>>
>> If you are simulating very large systems, it's possible that the
>> experimental branch
>>
>> https://github.com/bhourahine/dftbplus/tree/elsi
>>
>> may be of use, as it includes both the ELPA and PEXSI solvers, which have
>> better parallel performance than ScaLAPACK. However, this code is a work
>> in progress, so you might want to wait for it to be merged into the next
>> major release.
>>
>> Regards
>>
>> Ben
>>
>> On 02/07/18 14:11, Szendrő Márton wrote:
>>
>> Dear All,
>>
>> I am trying to calculate the band structure of a very large system (circa
>> 30 000 atoms) on a supercomputer cluster with MPI (ifort 16.0.1, Intel
>> MPI 5.1.2, MKL 2016.1.150).
>> I am wondering whether I have reached some sort of physical limitation
>> here, or whether it is a bug, because I got the following error during
>> the run:
>>
>> forrtl: severe (66): output statement overflows record, unit -132, file
>> /Lustre01/home/szendro/phd/hopg/dftb/rigid/dftb_pin.hsd
>> Image              PC                Routine            Line     Source
>> dftb+_dev          0000000000E7E3A9  Unknown            Unknown  Unknown
>> dftb+_dev          0000000000ECA9A9  Unknown            Unknown  Unknown
>> dftb+_dev          0000000000EC7B53  Unknown            Unknown  Unknown
>> dftb+_dev          0000000000665C00  hsdparser_MP_dump  867      hsdparser.f90
>> dftb+_dev          0000000000666D4A  hsdparser_MP_dump  933      hsdparser.f90
>> dftb+_dev          0000000000666D96  hsdparser_MP_dump  935      hsdparser.f90
>> dftb+_dev          000000000066574C  hsdparser_MP_dump  836      hsdparser.f90
>> dftb+_dev          0000000000665502  hsdparser_MP_dump  811      hsdparser.f90
>> dftb+_dev          0000000000B711DB  parser_MP_parsehs  172      parser.f90
>> dftb+_dev          000000000040998D  MAIN__             29       dftbplus.f90
>> dftb+_dev          000000000040919E  Unknown            Unknown  Unknown
>> libc.so.6          00000033F021ED1D  Unknown            Unknown  Unknown
>> dftb+_dev          00000000004090A9  Unknown            Unknown  Unknown
>>
>> I was able to successfully run the code for 1204 atoms, so I guess the
>> problem is related to the size of the system.
>>
>> Any help would be greatly appreciated.
>>
>> Best regards,
>> Marton Szendro
>>
>> _______________________________________________
>> _______________________________________________
>> DFTB-Plus-User mailing list
>> DFTB-Plus-User at mailman.zfn.uni-bremen.de
>> https://mailman.zfn.uni-bremen.de/cgi-bin/mailman/listinfo/dftb-plus-user
>>
>>
>> --
>>       Dr. B. Hourahine, SUPA, Department of Physics,
>>     University of Strathclyde, John Anderson Building,
>>             107 Rottenrow, Glasgow G4 0NG, UK.
>>     +44 141 548 2325, benjamin.hourahine at strath.ac.uk
>>
>> 2013/4 THE Awards Entrepreneurial University of the Year
>>       2012/13 THE Awards UK University of the Year
>>
>>    The University of Strathclyde is a charitable body,
>>         registered in Scotland, number SC015263
>>
>

