[DFTB-Plus-User] Cannot calculate transport

Gabriele Penazzi penazzi at uni-bremen.de
Wed Apr 15 20:52:00 CEST 2015


On 15.04.2015 18:57, Alessandro Pirrotta wrote:
> Dear Argo,
>
> Thank you very much for your email and your tips, but none of them 
> fixed the problem.
>
> Dear Gabriele,
>
> Thank you for your email.
> The second paragraph of the troubleshooting section (executing "ulimit 
> -s unlimited" right before running dftb+) fixes the problem. Now the 
> program runs. Is that something that should not happen with a correctly 
> compiled dftb+?

Hi Alessandro,

no, it is not really a problem, it is expected behaviour. A modern 
ifort compiler will allocate arrays intensively on the stack, the 
reason being that the stack is more efficient. On the other hand the 
stack size is usually limited by the OS, therefore you can get a 
segmentation fault for apparently no reason, depending on the code 
flow. You can tell ifort to use heap memory more intensively just by 
setting a flag (I don't have the documentation at hand, it's something 
like -heap-arrays), but performance will be affected; you can try it, 
but I would not recommend that.
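If you want to try it anyway, a minimal sketch of what I mean, 
assuming your arch makefile collects the ifort optimisation flags in a 
variable (FCOPT is only an example name, the actual variable depends 
on your makefile):

  # ifort: move automatic/temporary arrays larger than 10 kB to the heap
  FCOPT = -O2 -heap-arrays 10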

gfortran has the opposite default, but you can also tell gfortran to 
rely more heavily on the stack (-fstack-arrays) to mimic the ifort 
behaviour.
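The gfortran counterpart, again only as a sketch with the same 
assumption about the flag variable:

  # gfortran: keep automatic arrays on the stack, mimicking the ifort default
  FCOPT = -O2 -fstack-arrays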

If your system allows you to set ulimit -s unlimited you're always 
fine (I posted the Intel link because apparently this does not always 
work, even though it has always worked for me) and you don't need to 
fight with compilation flags; just remember to add the command to your 
job scripts, for example as sketched below. I did not try those 
compile options myself.
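A minimal sketch of such a job script (the scheduler header and the 
binary path are just placeholders here, the only line that matters is 
the ulimit call right before the run):

  #!/bin/bash
  # raise the stack limit so the stack-allocated arrays do not
  # overflow and cause a segmentation fault
  ulimit -s unlimited
  srun ./dftb+ > output.log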

Gabriele

>
> I tried compiling again using these flags ( -ipo -O3 -no-prec-div 
> -xP ) as suggested on the Intel forum, but I could not compile it 
> correctly. I got an extremely long list of errors like: 
> ipo_out4.f:(.text.hot000d3+0x33df): undefined reference to .....
>
> Regards,
> Alessandro
>
> *Alessandro Pirrotta*
> PhD student
>
> *Faculty of Science*
> Department of Chemistry &
> Nano-Science Center
> University of Copenhagen
> Universitetsparken 5, C321
> 2100 Copenhagen Ø
> Denmark
>
> DIR +45 21 18 11 90
> MOB +45 52 81 23 41
>
> alessandro.pirrotta at chem.ku.dk
>
> alessandro.pirrotta at gmail.com
>
> www.ki.ku.dk
>
>
> On 15 April 2015 at 14:33, Gabriele Penazzi <penazzi at uni-bremen.de> wrote:
>
>     Hi Alessandro,
>
>     have you checked whether the system stack setting is responsible for
>     this? (see the troubleshooting section in the manual and this link:
>     https://software.intel.com/en-us/articles/intel-fortran-compiler-increased-stack-usage-of-80-or-higher-compilers-causes-segmentation-fault)
>
>
>     Gabriele
>
>
>
>     On 04/15/2015 01:47 PM, Alessandro Pirrotta wrote:
>>     Dear DFTB+ users,
>>
>>     I have compiled and successfully tested (with autotest)
>>     dftb+mpi-negf.
>>
>>     I have then run a drain, source and device calculation.
>>     Drain and source run fine, but then the device calculation
>>     crashes giving this output:
>>
>>     Reading SKF file
>>     ....
>>     .....
>>     Done.
>>
>>     Processed input in HSD format written to 'dftb_pin.hsd'
>>
>>     Starting initialization...
>>     --------------------------------------------------------------------------------
>>     PGRID:ALLPROC: 1 x 1
>>     PGRID:ATOM: 1 x 1
>>     Poisson parallelized on 1 nodes
>>      Atomic density tolerance: 1.000000000000000E-005
>>      Atomic density cutoff:    9.39669825812203  a.u.
>>      Input PoissonBox=   30.0000   30.0000   30.0000  A
>>      PoissAcc=  1.000000000000000E-007
>>      Bulk Boundary Potential:    No
>>      Atomic cutoff radius=   4.97251893391611      A
>>
>>     CENTRAL REGION
>>      Atom start - end =      1    22
>>
>>     CONTACT #  1
>>      Atom start - end =     23   118
>>      direction:  3
>>      Fermi Level=  -4.70500000000000      eV
>>      mu=  0.136056922500000      V
>>
>>     CONTACT #  2
>>      Atom start - end =    119   214
>>      direction: -3
>>      Fermi Level=  -4.70500000000000      eV
>>      mu= -0.136056922500000      V
>>
>>     >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>      INIT MPI-NEGF ON           1 NODES
>>      CPU         0 READY
>>      PRINTING CPU: T
>>     Mode:                      Static calculation
>>     Contact upload:              Yes
>>     Contact calculation:         No
>>     Use Poisson solver:          Yes
>>     Self consistent charges:     Yes
>>     SCC-tolerance:                 0.100000E-06
>>     Max. scc iterations:                   2000
>>     Ewald alpha parameter:         0.000000E+00
>>     Spin polarisation:           No
>>     Nr. of up electrons:          1091.000000
>>     Nr. of down electrons:        1091.000000
>>     Periodic boundaries:         No
>>     Diagonalizer:                Green's Functions (density matrix)
>>     Maximal SCC-cycles:                    2000
>>     Electronic temperature:        0.100000E-07
>>     Initial charges:             Set automatically (system chrg:  
>>     0.000E+00)
>>     Included shells:              S:  s, p, d
>>                           C:  s, p
>>                           O:  s, p
>>                           H:  s
>>                          Au:  s, p, d
>>     Extra options:
>>     --------------------------------------------------------------------------------
>>
>>     ********************************************************************************
>>     ** Geometry step: 0
>>     ********************************************************************************
>>
>>     MAXNEIGHBORS: 161
>>     Structure info:
>>     Number of PLs:           1
>>     Interacting PLs:           1           1
>>
>>       iSCC Total electronic   Diff electronic  SCC error
>>     =========================================================================
>>                 SOLVING POISSON EQUATION
>>     =========================================================================
>>     -------------------------------------------------------------------------
>>      Poisson Box internally adjusted:
>>      x range=   -14.93910    15.06090; Periodic:  F
>>      y range=   -15.02760    14.97240; Periodic:  F
>>      z range=    -3.14441    15.18553; Periodic:  F
>>      Mesh details:
>>      Lx=  30.000  nx= 129   dlx=  0.23437
>>      Ly=  30.000  ny= 129   dly=  0.23437
>>      Lz=  18.330  nz=  65   dlz=  0.28641
>>     -------------------------------------------------------------------------
>>      -------------------------------------------------------------------------
>>      Relative Poisson Error =  1.952627165542703E-008
>>      Number of cycles executed =  8/ 60
>>      -------------------------------------------------------------------------
>>     *************************************************************************
>>     =========================================================================
>>                  COMPUTING DENSITY MATRIX
>>     =========================================================================
>>     CONTOUR INTEGRATION:*** 70 points
>>     srun: error: node244: task 0: Segmentation fault (core dumped)
>>     srun: Terminating job step 615175.0
>>
>>     I have compiled it with mpif90, ifort 13.0.1, and openmpi-1.6.5-i4
>>     Any idea why the tests run fine and this does not?
>>
>>     Kind regards,
>>     Alessandro
>>
>>     *Alessandro Pirrotta*
>>     PhD student
>>
>>     *Faculty of Science*
>>     Department of Chemistry &
>>     Nano-Science Center
>>     University of Copenhagen
>>     Universitetsparken 5, C321
>>     2100 Copenhagen Ø
>>     Denmark
>>
>>     DIR +45 21 18 11 90
>>     MOB +45 52 81 23 41
>>
>>     alessandro.pirrotta at chem.ku.dk
>>
>>     alessandro.pirrotta at gmail.com
>>
>>     www.ki.ku.dk
>>
>>
>>
>>     _______________________________________________
>>     DFTB-Plus-User mailing list
>>     DFTB-Plus-User at mailman.zfn.uni-bremen.de
>>     https://mailman.zfn.uni-bremen.de/cgi-bin/mailman/listinfo/dftb-plus-user
>
>     --
>     Dr. Gabriele Penazzi
>     BCCMS - University of Bremen
>
>     http://www.bccms.uni-bremen.de/
>     http://sites.google.com/site/gabrielepenazzi/
>     phone: +49 (0) 421 218 62337
>     mobile: +49 (0) 151 19650383
>
>
>     _______________________________________________
>     DFTB-Plus-User mailing list
>     DFTB-Plus-User at mailman.zfn.uni-bremen.de
>     https://mailman.zfn.uni-bremen.de/cgi-bin/mailman/listinfo/dftb-plus-user
>
>
>
>
> _______________________________________________
> DFTB-Plus-User mailing list
> DFTB-Plus-User at mailman.zfn.uni-bremen.de
> https://mailman.zfn.uni-bremen.de/cgi-bin/mailman/listinfo/dftb-plus-user
