First steps: walk through from DFT (standalone)

In this tutorial you will learn how to calculate optical spectra using Yambo, starting from a DFT calculation and ending with a look at local field effects in the optical response.

System characteristics

We will use a 3D system (bulk hBN) and a 2D system (hBN sheet).

[Figures: atomic structure of bulk hBN and of 2D hBN]

Hexagonal boron nitride (hBN):

  • HCP lattice, ABAB stacking
  • Four atoms per cell, B and N (16 electrons)
  • Lattice constants: a = 4.716 [a.u.], c/a = 2.582
  • Plane wave cutoff 40 Ry (~1500 RL vectors in wavefunctions)
  • SCF run: shifted 6x6x2 grid (12 k-points) with 8 bands
  • Non-SCF run: gamma-centred 6x6x2 (14 k-points) grid with 100 bands

Prerequisites

You will need:

  • Before starting, get the hBN tutorial files from the Tutorials page (https://www.yambo-code.eu/wiki/index.php/Tutorials#Tutorial_files); a download sketch is shown below
  • yambo executable
  • gnuplot for plotting spectra
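
The sketch below shows one way to download and unpack the tutorial archives. The file names and URLs are taken from an earlier revision of this tutorial (hBN.tar.gz and hBN-2D.tar.gz on yambo-code.org) and may have moved, so check the Tutorials page linked above for the current locations.

$ mkdir -p YAMBO_TUTORIALS && cd YAMBO_TUTORIALS
$ wget https://www.yambo-code.org/educational/tutorials/files/hBN.tar.gz       # bulk hBN (~15 MB)
$ wget https://www.yambo-code.org/educational/tutorials/files/hBN-2D.tar.gz    # hBN sheet (~8.6 MB)
$ tar -xzf hBN.tar.gz
$ tar -xzf hBN-2D.tar.gz
$ ls                       # should show the hBN and hBN-2D folders used in the rest of the tutorial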


Initialization of Yambo databases

Every Yambo run must start with this step. Go to the folder containing the hBN-bulk SAVE directory:

$ cd YAMBO_TUTORIALS/hBN/YAMBO
$ ls
SAVE

TIP: do not run yambo from inside the SAVE folder! This is the wrong way:

$ cd SAVE
$ yambo
yambo: cannot access CORE database (SAVE/*db1 and/or SAVE/*wf)

In fact, if you ever see such a message, it usually means you are trying to launch Yambo from the wrong place.

$ cd ..

Now you are in the proper place:

$ ls
SAVE

and you can simply launch the code:

$ yambo 

This will run the initialization (setup) runlevel.

Run-time output

This is typically written to standard output (on screen) and tracks the progress of the run in real time:

<---> [01] MPI/OPENMP structure, Files & I/O Directories
<---> [02] CORE Variables Setup
<---> [02.01] Unit cells
<---> [02.02] Symmetries
<---> [02.03] Reciprocal space
<---> Shells finder |########################################| [100%] --(E) --(X)
<---> [02.04] K-grid lattice
<---> Grid dimensions      :   6   6   2
<---> [02.05] Energies & Occupations
<---> [03] Transferred momenta grid and indexing
<---> BZ -> IBZ reduction |########################################| [100%] --(E) --(X)
<---> [03.01] X indexes
<---> X [eval] |########################################| [100%] --(E) --(X)
<---> X[REDUX] |########################################| [100%] --(E) --(X)
<---> [03.01.01] Sigma indexes
<---> Sigma [eval] |########################################| [100%] --(E) --(X)
<---> Sigma[REDUX] |########################################| [100%] --(E) --(X)
<---> [04] Timing Overview
<---> [05] Memory Overview
<---> [06] Game Over & Game summary

Specific runlevels are indicated with numeric labels like [02.02].
The hashes (#) track the progress of the run in wall-clock time, showing the elapsed (E) and expected (X) time to complete a runlevel, together with the percentage of the task completed. In this case the run was so fast that no timing output appears; on longer simulations you will be able to appreciate this feature.

New core databases

New databases appear in the SAVE folder:

$ ls SAVE
ns.db1 ns.wf ns.kb_pp_pwscf ndb.gops ndb.kindx
ns.wf_fragments_1_1 ...
ns.kb_pp_pwscf_fragment_1 ...

These contain information about the G-vector shells and k/q-point meshes as defined by the DFT calculation.

In general: a database called ns.xxx is a static database, generated once by p2y, while databases called ndb.xxx are dynamically generated while you use yambo.

TIP: if you launch yambo, but it does not seem to do anything, check that these files are present.
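
If you want to check explicitly, a plain ls of the key databases will do (just a quick sanity check, not a required step of the tutorial):

$ ls SAVE/ns.db1 SAVE/ns.wf SAVE/ndb.kindx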


Report file

A report file r_setup is generated in the run directory. This mostly reports information about the ground state system as defined by the DFT run, but also adds information about the band gaps, occupations, shells of G-vectors, IBZ/BZ grids, the CPU structure (for parallel runs), and so on. Some points of note:


Up to Yambo version 4.5

 [02.03] RL shells
 =================
 Shells, format: [S#] G_RL(mHa)
  [S453]:8029(0.7982E+5) [S452]:8005(0.7982E+5) [S451]:7981(0.7982E+5) [S450]:7957(0.7942E+5)
  ...
  [S4]:11( 1183.) [S3]:5( 532.5123) [S2]:3( 133.1281) [S1]:1( 0.000000)

From Yambo version 5.0

 [02.03] Reciprocal space
 ========================
 
 nG shells         :  217
 nG charge         :   3187
 nG WFs            :  1477
 nC WFs            :  1016
 G-vecs. in first 21 shells:  [ Number ]
    1    3    5   11   13   25   37   39   51
   63   65   71   83   95  107  113  125  127
  139  151  163
 ...
 Shell energy in first 21 shells:  [ mHa ]
   0.00000      133.128      532.512      1183.37      1198.15      1316.50      1715.88      2130.05      2381.52
   3313.42      3328.20      3550.11      3683.24      4082.62      4511.57      4733.48      4748.27      4792.61
   4866.61      5266.00      5680.16
 ...


This reports the set of closed reciprocal lattice (RL) shells defined internally that contain G-vectors with the same modulus. The highest number of RL vectors we can use is 8029. Yambo will always redefine any input variable in RL units to the nearest closed shell.

Up to Yambo version 4.5

 [02.05] Energies [ev] & Occupations
 ===================================
 Fermi Level        [ev]:  5.112805
 VBM / CBm          [ev]:  0.000000  3.876293
 Electronic Temp. [ev K]:  0.00      0.00
 Bosonic    Temp. [ev K]:  0.00      0.00
 El. density      [cm-3]: 0.460E+24
 States summary         : Full        Metallic    Empty
                          0001-0008               0009-0100
 Indirect Gaps      [ev]: 3.876293  7.278081
 Direct Gaps        [ev]:  4.28829  11.35409
 X BZ K-points :  72

From Yambo version 5.0

 [02.05] Energies & Occupations
 ==============================
 
 [X] === General ===
 [X] Electronic Temperature                        :  0.000000  0.000000 [eV K]
 [X] Bosonic    Temperature                        :  0.000000  0.000000 [eV K]
 [X] Finite Temperature mode                       : no
 [X] El. density                                   :  0.46037E+24 [cm-3]
 [X] Fermi Level                                   :  5.110835 [eV]
 
 [X] === Gaps and Widths ===
 [X] Conduction Band Min                           :  3.877976 [eV]
 [X] Valence Band Max                              :  0.000000 [eV]
 [X] Filled Bands                                  :   8
 [X] Empty Bands                                   :    9  100
 [X] Direct Gap                                    :  4.289853 [eV]
 [X] Direct Gap localized at k-point               :   7
 [X] Indirect Gap                                  :  3.877976 [eV]
 [X] Indirect Gap between k-points                 :  14   7
 [X] Last valence band width                       :  3.401086 [eV]
 [X] 1st conduction band width                     :  4.266292 [eV]


Yambo recalculates the Fermi level (close to the value of 5.06 eV noted in the PWscf SCF calculation). From here on, however, the Fermi level is set to zero, and the other eigenvalues are shifted accordingly. The system is insulating (8 filled bands, 92 empty) with an indirect band gap of 3.87 eV. The direct and indirect gaps are indicated. There are 72 k-points in the full BZ, generated using symmetry from the 14 k-points in our user-defined grid.

TIP: You should inspect the report file after every run for errors and warnings.
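
A quick way to scan a report for problems is an ordinary grep (a generic shell one-liner, not a Yambo feature):

$ grep -in -E 'warning|error' r_setup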

Different ways of running yambo

So far we have run Yambo only interactively.

Let's try to re-run the setup with the command

$ nohup yambo &
$ ls
l_setup  nohup.out  r_setup  r_setup_01  SAVE

If Yambo is launched using a script, as a background process, or in parallel, this output will appear in a log file prefixed by the letter l, in this case l_setup. If a log file already exists from a previous run, it is not overwritten; instead, a new file is created with an incrementing numerical label, e.g. l_setup_01, l_setup_02, etc. This applies to all files created by Yambo. Here we see that l_setup was created for the first time, but r_setup already existed from the previous run, so now we have r_setup_01. If you check the differences between the two report files, you will notice that in the second run Yambo reads the previously created ndb.kindx instead of recomputing the indexes; indeed, the output inside l_setup does not show the timing for X and Sigma.
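
To see exactly what changed between the two runs, a plain diff of the report files is enough (assuming both files are in the current directory):

$ diff r_setup r_setup_01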

As a last step we run the setup in parallel, but first we delete the ndb.kindx file

$ rm SAVE/ndb.kindx
$ mpirun -np 4 yambo 
$ ls
LOG  l_setup  nohup.out  r_setup  r_setup_01  r_setup_02  SAVE

There is now an r_setup_02 report. In the case of parallel runs, CPU-dependent log files will appear inside a LOG folder, e.g.

$ ls LOG
l_setup_CPU_1   l_setup_CPU_2  l_setup_CPU_3  l_setup_CPU_4

This behaviour can be controlled at runtime; see the Parallel tutorial for details.
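
For long parallel runs it can be handy to follow one of the CPU-dependent logs in real time; a generic way to do this (not specific to Yambo) is:

$ tail -f LOG/l_setup_CPU_1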


2D hBN

Simply repeat the steps above. Go to the folder containing the hBN-sheet SAVE directory and launch yambo:

$ cd YAMBO_TUTORIALS/hBN-2D/YAMBO
$ ls
SAVE
$ yambo

Again, inspect the r_setup file and output logs, and verify that ndb.gops and ndb.kindx have been created inside the SAVE folder.

You are now ready to use Yambo!

Yambo's command line interface

Yambo uses a command line interface to select tasks, generate input files, and control the runtime behaviour.

In this module you will learn how to do each of these things using Yambo's command line options.

Input file generator

We are going to work again with bulk hBN. First, move to the appropriate folder and initialize the Yambo databases if you haven't already done so.

$ cd YAMBO_TUTORIALS/hBN/YAMBO
$ yambo                    (initialize)

Yambo generates its own input files: you just tell the code what you want to calculate by launching Yambo with one or more options. To see the list of possible options, run yambo -h (only the part we focus on here is reported):

$ yambo -h
'A shiny pot of fun and happiness [C.D.Hogan]' 

This is      : yambo
Version      : 5.0.1 Revision 19547 Hash e90d90f2d
Configuration: MPI+OpenMP+SLK+SLEPC+HDF5_MPI_IO

...

Initializations:
-setup           (-i)            :Initialization
-coulomb         (-r)            :Coulomb potential

Response Functions:
-optics          (-o) <string>   :Linear Response optical properties (more with -h optics)
-X               (-d) <string>   :Inverse Dielectric Matrix (more with -h X)
-dipoles         (-q)            :Oscillator strenghts (or dipoles)
-kernel          (-k) <string>   :Kernel (more with -h kernel)

Self-Energy:
-hf              (-x)            :Hartree-Fock
-gw0             (-p) <string>   :GW approximation (more with -h gw0)
-dyson           (-g) <string>   :Dyson Equation solver (more with -h dyson)
-lifetimes       (-l)            :GoWo Quasiparticle lifetimes

Bethe-Salpeter Equation:
-Ksolver         (-y) <string>   :BSE solver (more with -h Ksolver)

Total Energy:
-acfdt                           :ACFDT Total Energy

Utilites:
...
-slktest                         :ScaLapacK test

The options can be split into two sets:

  • A set of options needed to generate the appropriate input file (default name: yambo.in), selecting the kind of simulation you would like to perform
  • A set of options used to manage auxiliary functions (redirecting the I/O, choosing a specific name for the input file, etc.)

Runlevel selection

First of all, you need to specify which kind of simulation you are going to perform and generate an input file, using the first set of options. By default, when generating the input file, Yambo will launch the vi editor. The editor choice can be changed at configure time, before compilation; alternatively, you can use the -Q runtime option to skip the automatic editing (do this if you are not familiar with vi!):

$ yambo -hf -Q
yambo: input file yambo.in created
$ emacs yambo.in     or your favourite editing tool

Multiple options can be used together to activate various tasks or runlevels (in some cases this is actually a necessity). For instance, to generate an input file for optical spectra including local field effects (Hartree approximation), do (and then exit)

$ yambo -optics c -kernel hartree       which switches on:
optics                       # [R] Linear Response optical properties
chi                          # [R][CHI] Dyson equation for Chi.
Chimod= "Hartree"            # [X] IP/Hartree/ALDA/LRC/PF/BSfxc

To perform a Hartree-Fock and GW calculation using a plasmon-pole approximation, do (and then exit):

$ yambo -hf -gw0 p -dyson n        which switches on:
HF_and_locXC                 # [R XX] Hartree-Fock Self-energy and Vxc
gw0                          # [R GW] GoWo Quasiparticle energy levels
ppa                          # [R Xp] Plasmon Pole Approximation
em1d                         # [R Xd] Dynamical Inverse Dielectric Matrix      

Each runlevel activates its own list of variables and flags.

The previous command is also equivalent to

$ yambo -hf -gw0 r -dyson n -X p

Changing input parameters

Yambo reads various parameters from existing database files and/or input files and uses them to suggest values or ranges. Let's illustrate this by generating the input file for a Hartree-Fock calculation.

$ yambo -hf

Inside the generated input file you should find:

EXXRLvcs =  3187        RL    # [XX] Exchange RL components
%QPkrange                    # [GW] QP generalized Kpoint/Band indices
  1| 14|  6|10|
%

The QPkrange variable (see the Variables page for a detailed explanation of any variable) selects the k-points and bands for which the quasiparticle quantities are computed; the k-point range (1 to 14) is suggested based on what Yambo finds in the core database SAVE/ns.db1, i.e. as defined by the DFT code, while here the band range covers a few bands around the gap (6 to 10).
Leave that variable alone, and instead modify the previous variable to EXXRLvcs = 1000 RL

Save the file, and now generate the input a second time with yambo -x. You will see:

 EXXRLvcs=  1009        RL

This indicates that Yambo has read the new input value (1000 G-vectors), checked the database of G-vector shells (SAVE/ndb.gops), and changed the input value to one that fits a completely closed shell.

Last, note that Yambo variables can be expressed in different units. In this case, RL can be replaced by an energy unit like Ry, eV, Ha, etc. Energy units are generally better as they are independent of the cell size. Technical information is available on the Variables page.
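
As an illustration of the units syntax only (the value below is not a converged or recommended setting), the exchange cutoff could equivalently be written as an energy:

EXXRLvcs = 10 Ry             # same variable as above, now given as an energy cutoff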

The input file generator of Yambo is thus an intelligent parser, which interacts with the user and the existing databases. For this reason we recommend that you always use Yambo to generate the input files, rather than making them yourself.

Extra options

Extra options modify some of the code's default settings. They can be used when launching the code but also when generating input files.

Let's have another look at the possible options (again, only the part we focus on is reported here):

$ yambo -h
This is      : yambo
Version      : 5.0.1 Revision 19547 Hash e90d90f2d 
Configuration: MPI+OpenMP+SLK+SLEPC+HDF5_MPI_IO 

Help & version:
-help            (-h) <string>   :<string> can be an option (e.g. -h optics)
-version                         :Code version & libraries

Input file & Directories:
-Input           (-F) <string>   :Input file
-Verbosity       (-V) <string>   :Input file variables verbosity (more with -h Verbosity)
-Job             (-J) <string>   :Job string
-Idir            (-I) <string>   :Input directory
-Odir            (-O) <string>   :I/O directory
-Cdir            (-C) <string>   :Communication directory

Parallel Control:
-parenv          (-E) <string>   :Environment Parallel Variables file
-nompi                           :Switch off MPI support
-noopenmp                        :Switch off OPENMP support

...

Utilites:
-Quiet           (-Q)            :Quiet input file creation
-fatlog                          :Verbose (fatter) log(s)
-DBlist          (-D)            :Databases properties
...

Command line options are extremely important to master if you want to use yambo productively. Often, the meaning is clear from the help menu:

$ yambo -F yambo.in_HF -hf   Make a Hartree-Fock input file called yambo.in_HF
$ yambo -D                   Summarize the content of the databases in the SAVE folder
$ yambo -I ../               Run the code, using a SAVE folder in a directory one level up
$ yambo -C MyTest            Run the code, putting all report, log, plot files inside a folder MyTest

Other options deserve a closer look.

Verbosity

Yambo uses many input variables, many of which can be left at their default values. To keep input files short and manageable, only a few variables appear by default in the input file. More advanced variables can be switched on by using the -V verbosity option. These are grouped according to the type of variable. For instance, -V RL switches on variables related to G-vector summations, and -V io switches on options related to I/O control. Try:

$ yambo -optics c -V RL       switches on:
FFTGvecs=  3951        RL    # [FFT] Plane-waves

$ yambo -optics c -V io       switches on:
StdoHash=  40                # [IO] Live-timing Hashes
DBsIOoff= "none"             # [IO] Space-separated list of DB with NO I/O. DB= ...
DBsFRAGpm= "none"            # [IO] Space-separated list of +DB to be FRAG and ...
#WFbuffIO                    # [IO] Wave-functions buffered I/O

Unfortunately, -V options must be invoked and changed one at a time. When you are more expert, you may go straight to -V all, which turns on all possible variables. However note that yambo -o c -V all adds an extra 30 variables to the input file, which can be confusing: use it with care.

Job script label

The best way to keep track of different runs using different parameters is through the -J flag. This inserts a label in all output and report files, and creates a new folder containing any new databases (i.e. they are not written in the core SAVE folder). Try:

$ yambo -V RL -hf -F yambo_hf.in        and modify to
FFTGvecs = 1 Ry
EXXRLvcs = 1 Ry
VXCRLvcs = 1 Ry
$ yambo -J 1Ry -F yambo_hf.in           Run the code
$ ls
yambo_hf.in SAVE  
o-1Ry.hf r-1Ry_HF_and_locXC 1Ry 1Ry/ndb.HF_and_locXC

This is extremely useful when running convergence tests, trying out different parameters, etc.
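
As a sketch of how -J can drive a small convergence test (the loop, the sed edit and the file names are illustrative assumptions, not part of the tutorial):

$ for cut in 1 2 4 ; do
>   sed "s/EXXRLvcs.*/EXXRLvcs = ${cut} Ry/" yambo_hf.in > yambo_hf_${cut}Ry.in
>   yambo -J ${cut}Ry -F yambo_hf_${cut}Ry.in
> done

Each run is then labelled by its own job string (o-1Ry.hf, o-2Ry.hf, ... plus the matching 1Ry/, 2Ry/, ... database folders), so the results can be compared side by side.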

Exercise: use yambo to report the properties of all database files (including ndb.HF_and_locXC)
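
A possible starting point for the exercise (assuming the -D database listing also scans the job folder when combined with -J; adjust if your version behaves differently):

$ yambo -D
$ yambo -J 1Ry -D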
