Commit 88eaadc

Merge pull request NREL#472 from tayloraubry/friendly-users-vasp

updates to friendly users VASP documentation for VASP module use

yandthj authored Aug 30, 2023
2 parents 7e4ee74 + f494e7f

Showing 1 changed file with 49 additions and 28 deletions: docs/Friendly_User/Applications/vasp.md
## VASP modules on Kestrel


## Running VASP

There are modules for CPU builds of VASP 5.4.4 and VASP 6.3.2, each built with solvation, transition state tools, and BEEF-vdW functional support. These modules can be loaded with ```module load vasp/5.4.4``` or ```module load vasp/6.3.2```. A sample job script is shown below.
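
If you are unsure which builds are installed, you can query the module system first (a minimal sketch; ```module avail``` is a standard module command, but the exact names and versions shown depend on the current deployment):

```
# List the installed VASP modules before choosing one to load
module avail vasp
```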

!!! Note
    It is necessary to specify the MPI launcher with ```srun --mpi=pmi2```.

??? example "Sample job script: using modules"

```
#!/bin/bash
#SBATCH --nodes=2
#SBATCH --tasks-per-node=104
#SBATCH --time=2:00:00
#SBATCH --mem=0 # ensures you are given all the memory on a node
#SBATCH --account=<your-account-name>
#SBATCH --job-name=<your-job-name>

module purge
source /nopt/nrel/apps/env.sh #the need for this will eventually be removed
module load vasp/6.3.2

srun --mpi=pmi2 vasp_std |& tee out

```
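
Once the module is loaded, you can confirm the executable is on your path before submitting (a quick sanity check; VASP's standard binaries are vasp_std, vasp_gam, and vasp_ncl, though which ones each module provides is not stated here):

```
module load vasp/6.3.2
which vasp_std   # should print the module-provided path to the executable
```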

## Compiling VASP yourself

This section has recommendations for toolchains to use for building and running VASP. Please read it carefully before compiling on Kestrel's Cray architecture.

### Building VASP

We recommend building VASP with a full Intel toolchain and launching with cray-mpich-abi at runtime. Additionally, you should build on a compute node so that you build on the same architecture you will run on:
```
salloc -N 1 -p standard -t TIME -A ACCOUNT
```
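For example, with the placeholders filled in (hypothetical values; use your own account name and a time limit suited to your build):

```
# Request one standard node for four hours for an interactive build session
salloc -N 1 -p standard -t 4:00:00 -A <your-account-name>
```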
Then load the appropriate modules for your MPI, compilers, and math packages:
```
module load intel-oneapi-mkl
```
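
For a full Intel toolchain, the load sequence might look like the following sketch. ```intel-oneapi-compilers``` and ```intel-oneapi-mkl``` appear elsewhere on this page; ```intel-oneapi-mpi``` is an assumption here, so check ```module avail``` for the actual names on your system:

```
module purge
module load intel-oneapi-compilers   # Intel compilers (named elsewhere on this page)
module load intel-oneapi-mpi         # assumed MPI module for a full Intel toolchain
module load intel-oneapi-mkl         # Intel math libraries (MKL)
```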
On Kestrel, any modules you have loaded on the login node are copied over to the compute node, and many are loaded by default for the Cray programming environment, so make sure you are using only what you intend to.
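
A quick way to audit what is loaded (```module list``` and ```module purge``` are standard module commands):

```
module list    # show everything inherited from the login node and Cray defaults
module purge   # start clean, then load only the toolchain you intend to use
```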

Sample makefiles for VASP 5 and VASP 6 on Kestrel can be found in our [Kestrel Repo](https://github.com/NREL/HPC/tree/master/kestrel) under the vasp folder.
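
One way to get the sample makefiles locally (assuming the repository layout shown in the link above):

```
git clone https://github.com/NREL/HPC.git
ls HPC/kestrel   # the vasp folder here contains the sample makefiles
```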

### Running your build

We have found that it is optimal to run an Intel-toolchain build of VASP with cray-mpich-abi at runtime. The cray-mpich-abi module has several dependencies on Cray network modules, so the easiest way to load it is to first load ```PrgEnv-intel``` and then swap the default cray-mpich module for cray-mpich-abi with ```module swap cray-mpich cray-mpich-abi```. You must then load your Intel compilers and math libraries and unload Cray's libsci. A sample script showing all of this is in the dropdown below.

!!! Note
    It is necessary to specify the MPI launcher with ```srun --mpi=pmi2```.

??? example "Sample job script: using your own build"

```
#!/bin/bash
#SBATCH --nodes=2
#SBATCH --tasks-per-node=104
#SBATCH --time=2:00:00
#SBATCH --account=<your-account-name>
#SBATCH --job-name=<your-job-name>

# Load cray-mpich-abi and its dependencies within PrgEnv-intel, intel compilers, mkl, and unload cray's libsci
source /nopt/nrel/apps/env.sh
module purge
module load PrgEnv-intel
module swap cray-mpich cray-mpich-abi
module unload cray-libsci
module load intel-oneapi-compilers
module load intel-oneapi-mkl

# Set this to the directory containing your compiled VASP executables (e.g. vasp_std, vasp_gam, vasp_ncl)
export VASP_PATH=/PATH/TO/YOUR/vasp_exe

srun --mpi=pmi2 ${VASP_PATH}/vasp_std |& tee out

```


