Toolchains

From VASP Wiki
Below we list the [[toolchains]] (compilers plus assorted libraries) that we use to build and test VASP in our nightly tests during development.
Starting from VASP.6.3.0, the [[toolchains]] are listed separately for each version of VASP.


* These lists of [[toolchains]] are not comprehensive: they show what we employ on a regular basis. Other or newer versions of the compilers and libraries than those listed below will, in all probability, work just as well (or better).
{{NB|tip|We encourage using up-to-date versions of compilers and libraries since they are continuously improved and bugs are identified and fixed.|:}}


* We recommend up-to-date compilers and libraries for older versions of VASP as well. In most cases, this causes no problems, but occasionally the VASP code had to be adjusted to accommodate changes in compiler behavior. For example, compilation with GCC > 7.X.X is only possible as of VASP.6.2.0, because the GCC compilers became stricter and no longer accept certain code constructs used in older VASP versions.<ref name="gcc-beyond-7-support"/>
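Before picking a row from the tables below, it can help to see which build tools are actually available in your environment. A minimal sketch (the tool list is illustrative; extend it with the compilers and MPI wrappers you plan to use):

```shell
# Print the first version line of each build tool found in PATH,
# or a notice if it is absent. Tool names are examples only.
for tool in gfortran mpif90 ifx nvfortran; do
    if command -v "$tool" >/dev/null 2>&1; then
        printf '%s: ' "$tool"
        "$tool" --version 2>/dev/null | head -n 1
    else
        echo "$tool: not found"
    fi
done
```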
== VASP.6.4.3 ==
{| class="wikitable" style="text-align: center;"
|-
! Compilers
! MPI
! FFT
! BLAS
! LAPACK
! ScaLAPACK
! CUDA
! HDF5
! Other
! Remarks
! Known issues
|-
| intel-oneapi-compilers-2024.0.2 || intel-oneapi-mpi-2021.10.0 || colspan="4" | intel-oneapi-mkl-2023.2.0 || - || hdf5-1.14.0 || wannier90-3.1.0<br />libxc-5.2.3
| Rocky Linux 8.8
| -
|-
| gcc-12.3.0 || openmpi-4.1.6 || colspan="3" | intel-oneapi-mkl-2023.2.0 || netlib-scalapack-2.2.0 || - || hdf5-1.14.0 || wannier90-3.1.0<br />libxc-5.2.3
| Rocky Linux 8.8
| -
|-
| nvhpc-24.1<br />(OpenACC)
| openmpi-4.1.6<br />(CUDA-aware)
| colspan="3" | intel-oneapi-mkl-2023.2.0
| netlib-scalapack-2.2.0
| cuda-11.8
| hdf5-1.14.0
| wannier90-3.1.0
| Rocky Linux 8.8<br />NVIDIA GPUs<br />(A30)
| -
|-
| style="background-color:#f5d9da;" | aocc-4.0.0
| openmpi-4.1.4
| amdfftw-4.0
| amdblis-4.0
| amdlibflame-4.0
| amdscalapack-4.0
| -
| hdf5-1.13.0
| wannier90-3.1.0<br />libxc-5.2.2
| On AMD CPUs<br />(Zen3)
| <span style=background:#f5d9da>[[Known_issues#KnownIssue11|Reduce optimization level]]</span>
|-
| nec-5.0.2
| nmpi-2.25.0
| colspan="3" | nlc-3.0.0
| netlib-scalapack-2.2.0
| -
| -
| wannier90-3.1.0
| Rocky Linux 8.8<br />NEC SX-Aurora TSUBASA<br />vector engine
| VASP >= 6.3.0<ref name="nec-aurora-support"/>
|}
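On clusters that use environment modules, a toolchain from the table above is typically assembled by loading matching modules before running make. The module names below are hypothetical and follow the gcc-12.3.0 row; sites name and version their modules differently, so check `module avail` first.

```shell
# Hypothetical module names matching the gcc-12.3.0 toolchain row;
# adjust to whatever `module avail` reports on your system.
if command -v module >/dev/null 2>&1; then
    module load gcc/12.3.0 openmpi/4.1.6 intel-oneapi-mkl/2023.2.0 hdf5/1.14.0
    mpif90 --version | head -n 1
else
    echo "no 'module' command found; set up the toolchain in PATH manually"
fi
```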
 
== VASP.6.3.0 ==
{| class="wikitable" style="text-align: center;"
|-
! Compilers
! MPI
! FFT
! BLAS
! LAPACK
! ScaLAPACK
! CUDA
! HDF5
! Other
! Remarks
! Known issues
|-
| intel-oneapi-compilers-2022.0.1 || intel-oneapi-mpi-2021.5.0 || colspan="4" | intel-oneapi-mkl-2022.0.1 || - || hdf5-1.13.0 || wannier90-3.1.0
| CentOS 8.3<br />Intel Broadwell
| -
|-
| colspan="5" | intel-parallel-studio-xe-2021.4.0
| netlib-scalapack-2.1.0
| -
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />Intel Broadwell
| -
|-
| gcc-11.2.0 || openmpi-4.1.2 || colspan="3" | intel-oneapi-mkl-2022.0.1 || netlib-scalapack-2.1.0 || - || hdf5-1.13.0 || wannier90-3.1.0<br />libxc-5.2.2
| CentOS 8.3<br />Intel Broadwell
| -
|-
| gcc-11.2.0 || openmpi-4.1.2 || fftw-3.3.10 || colspan="2" | openblas-0.3.18 || netlib-scalapack-2.1.0 || - || hdf5-1.13.0 || wannier90-3.1.0<br />libxc-5.2.2
| CentOS 8.3<br />Intel Broadwell
| -
|-
| gcc-11.2.0 || openmpi-4.1.2 || amdfftw-3.1 || amdblis-3.1 || amdlibflame-3.1 || amdscalapack-3.1 || - || hdf5-1.13.0 || wannier90-3.1.0<br />libxc-5.2.2
| CentOS 8.3<br />AMD Zen3
| -
|-
| gcc-9.3.0
| style="background-color:#f5d9da;" | openmpi-4.0.5
| fftw-3.3.8
| colspan="2" | openblas-0.3.10
| netlib-scalapack-2.1.0
| -
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />Intel Broadwell
| <span style=background:#f5d9da>Memory-leak<ref name="ompi-bug-1"/></span>
|-
| gcc-7.5.0
| style="background-color:#f5d9da;" | openmpi-4.0.5
| colspan="3" | intel-mkl-2020.2.254
| netlib-scalapack-2.1.0
| -
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />Intel Broadwell
| <span style=background:#f5d9da>Memory-leak<ref name="ompi-bug-1"/></span>
|-
| nvhpc-22.2<br />(OpenACC) || openmpi-4.1.2 || colspan="3" | intel-oneapi-mkl-2022.0.1 || netlib-scalapack-2.1.0 || nvhpc-22.2<br />(cuda-11.0) || hdf5-1.13.0 || wannier90-3.1.0
| CentOS 8.3<br />NVIDIA GPUs<br />(P100 & V100)
| OpenACC<br />+<br />OpenMP<ref name="omp-acc-bug-1"/>
|-
| nvhpc-21.2<br />(OpenACC)
| style="background-color:#f5d9da;" | openmpi-4.0.5<br />(CUDA-aware)
| colspan="3" | intel-mkl-2020.2.254
| netlib-scalapack-2.1.0
| nvhpc-21.2<br />(cuda-11.0)
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />NVIDIA GPUs<br />(P100 & V100)
| <span style=background:#f5d9da>Memory-leak<ref name="ompi-bug-1"/></span>
|-
| nvhpc-21.2
| style="background-color:#f5d9da;" | openmpi-4.0.5
| colspan="3" | intel-mkl-2020.2.254
| netlib-scalapack-2.1.0
| -
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />Intel Broadwell
| <span style=background:#f5d9da>Memory-leak<ref name="ompi-bug-1"/></span>
|-
| nvhpc-21.2
| style="background-color:#f5d9da;" | openmpi-4.0.5
| fftw-3.3.8
| colspan="2" | openblas-0.3.10
| netlib-scalapack-2.1.0
| -
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />Intel Broadwell
| <span style=background:#f5d9da>Memory-leak<ref name="ompi-bug-1"/></span>
|-
| aocc-3.2.0 || openmpi-4.1.2 || amdfftw-3.1 || amdblis-3.1 || amdlibflame-3.1 || amdscalapack-3.1 || - || hdf5-1.13.0 || wannier90-3.1.0<br />libxc-5.2.2
| On AMD CPUs<br />(Zen3)
| -
|-
| nec-3.4.0 || nmpi-2.18.0 || colspan="3" | nlc-2.3.0 || netlib-scalapack-2.2.0 || - || - || wannier90-3.1.0
| CentOS 8.3<br />NEC SX-Aurora TSUBASA<br />vector engine
| VASP >= 6.3.0<ref name="nec-aurora-support"/>
<!-- these toolchains were used to build the code, but not to regularly run the test suite
|-
| nvhpc-22.2 || openmpi-4.1.2 || colspan="3" | intel-oneapi-mkl-2022.0.1 || netlib-scalapack-2.1.0 || - || hdf5-1.13.0 || wannier90-3.1.0
| -
| -
|-
| nvhpc-21.9 || nvhpc-21.9<br />(openmpi-3.1.5) || fftw-3.3.10 || colspan="3" | nvhpc-21.9 || - || - || -
| -
| -
-->
|}
 
== Older versions of VASP.6 ==
{| class="wikitable" style="text-align: center;"
|-
! Compilers
! MPI
! FFT
! BLAS
! LAPACK
! ScaLAPACK
! CUDA
! HDF5
! Other
! Remarks
! Known issues
|-
| colspan="5" | intel-parallel-studio-xe-2021.1.1
| netlib-scalapack-2.1.0
| -
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />Intel Broadwell
| -
|-
| gcc-9.3.0
| style="background-color:#f5d9da;" | openmpi-4.0.5
| fftw-3.3.8
| colspan="2" | openblas-0.3.10
| netlib-scalapack-2.1.0
| -
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />Intel Broadwell
| <span style=background:#f5d9da>Memory-leak<ref name="ompi-bug-1"/></span><br />VASP >= 6.2.0<ref name="gcc-beyond-7-support"/>
|-
| gcc-7.5.0
| style="background-color:#f5d9da;" | openmpi-4.0.5
| colspan="3" | intel-mkl-2020.2.254
| netlib-scalapack-2.1.0
| -
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />Intel Broadwell
| <span style=background:#f5d9da>Memory-leak<ref name="ompi-bug-1"/></span>
|-
| nvhpc-21.2<br />(OpenACC)
| style="background-color:#f5d9da;" | openmpi-4.0.5<br />(CUDA-aware)
| colspan="3" | intel-mkl-2020.2.254
| netlib-scalapack-2.1.0
| nvhpc-21.2<br />(cuda-11.0)
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />NVIDIA GPUs<br />(P100 & V100)
| <span style=background:#f5d9da>Memory-leak<ref name="ompi-bug-1"/></span>
|-
| nvhpc-21.2
| style="background-color:#f5d9da;" | openmpi-4.0.5
| colspan="3" | intel-mkl-2020.2.254
| netlib-scalapack-2.1.0
| -
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />Intel Broadwell
| <span style=background:#f5d9da>Memory-leak<ref name="ompi-bug-1"/></span>
|-
| nvhpc-21.2
| style="background-color:#f5d9da;" | openmpi-4.0.5
| fftw-3.3.8
| colspan="2" | openblas-0.3.10
| netlib-scalapack-2.1.0
| -
| hdf5-1.10.7
| wannier90-3.1.0
| CentOS 8.3<br />Intel Broadwell
| <span style=background:#f5d9da>Memory-leak<ref name="ompi-bug-1"/></span>
|}
 
== Footnotes and references ==
<references>
<ref name="ompi-bug-1">A bug in OpenMPI versions 4.0.4-4.1.1 causes a memory leak in some ScaLAPACK calls. This mainly affects long [[:Category:Molecular dynamics|molecular-dynamics]] runs. This issue is fixed as of openmpi-4.1.2.</ref>
<ref name="nec-aurora-support">The NEC SX-Aurora TSUBASA vector engine is supported as of VASP.6.3.0.</ref>
<ref name="omp-acc-bug-1">The NVIDIA HPC-SDK versions 22.1 and 22.2 contain a serious bug that prevents running the OpenACC GPU port of VASP in conjunction with OpenMP threading. When using these compiler versions, compile the OpenACC GPU port of VASP without OpenMP support. This bug is fixed as of NVIDIA HPC-SDK version 22.3.</ref>
<ref name="gcc-beyond-7-support">Support for GCC > 7.X.X was added with VASP.6.2.0. Do not use GCC-8.X.X compilers: the way we use the <tt>CONTIGUOUS</tt> construct in VASP is broken when using these compilers.</ref>
</references>
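The affected OpenMPI range in the memory-leak footnote can be checked mechanically. A small sketch using `sort -V` (GNU coreutils version sort) to test whether an installed version falls in 4.0.4–4.1.1; obtain the real version string with, e.g., `mpirun --version`:

```shell
# Return success if an Open MPI version string lies in the leak-affected
# 4.0.4 - 4.1.1 range (fixed in 4.1.2).
in_affected_range() {
    v="$1"
    low=$(printf '%s\n' "$v" 4.0.4 | sort -V | head -n 1)   # min(v, 4.0.4)
    high=$(printf '%s\n' "$v" 4.1.1 | sort -V | tail -n 1)  # max(v, 4.1.1)
    [ "$low" = "4.0.4" ] && [ "$high" = "4.1.1" ]
}

for v in 4.0.3 4.0.5 4.1.1 4.1.2; do
    if in_affected_range "$v"; then
        echo "openmpi-$v: affected, upgrade to >= 4.1.2"
    else
        echo "openmpi-$v: not affected"
    fi
done
```

In the sample loop, 4.0.5 and 4.1.1 are reported as affected; 4.0.3 and 4.1.2 are not.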
<!--
* [http://www.fftw.org/ FFTW]
* [http://www.netlib.org/lapack/index.html LAPACK]
* MPI library
* [http://www.netlib.org/scalapack/ ScaLAPACK]
 
 
* GNU
** [https://gcc.gnu.org/fortran/ gcc and gfortran 7.3.0]
** [http://www.fftw.org/ FFTW 3.3.8]
** [https://www.openblas.net/ Openblas 0.3.7]
** [https://www.open-mpi.org/software/ompi/v4.0/ OpenMPI 4.0.1]
** [http://www.netlib.org/scalapack/ ScaLAPACK 2.1.0]
 
* GNU + MKL
** [https://gcc.gnu.org/fortran/ gcc and gfortran 7.3.0]
** [https://software.intel.com/en-us/mkl/choose-download/linux MKL 2020.0.166] (for FFTW and LAPACK)
** [https://www.open-mpi.org/software/ompi/v4.0/ OpenMPI 4.0.1]
** [http://www.netlib.org/scalapack/ ScaLAPACK 2.1.0]
-->
 
== Related articles ==
[[Installing VASP.6.X.X]],
[[makefile.include]],
[[Compiler options]],
[[Precompiler options]],
[[Linking to libraries]],
[[OpenACC GPU port of VASP]],
[[Validation tests]],
[[Known issues]],
[[Personal computer installation]]


----
----
[[The_VASP_Manual|Contents]]


[[Category:VASP]][[Category:Installation]][[Category:Performance]][[Category:GPU]]

Revision as of 08:08, 11 April 2024
