NWChem
You must use an implementation of MPI that supports MPI-3. These are listed on the front page of this website.
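If you are not sure whether your MPI supports MPI-3, a minimal compile-and-run check is sketched below (MPI_Get_version is part of the MPI standard and may be called before MPI_Init):

```
# Quick MPI-3 check: compile a tiny program with your mpicc and run it.
cat > mpi3check.c <<'EOF'
#include <mpi.h>
#include <stdio.h>
int main(void) {
    int major, minor;
    MPI_Get_version(&major, &minor);  /* valid before MPI_Init */
    printf("MPI standard version: %d.%d\n", major, minor);
    return 0;
}
EOF
mpicc mpi3check.c -o mpi3check && ./mpi3check   # expect 3.x or later
```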
For NWChem 6.3, download the source and fetch ARMCI-MPI:

```
wget http://www.nwchem-sw.org/download.php?f=Nwchem-6.3.revision2-src.2013-10-17.tar.gz -O nwchem-6.3.tgz
tar -xzf nwchem-6.3.tgz
cd nwchem-6.3.revision2-src.2013-10-17
export NWCHEM_TOP=$PWD
git clone git://git.mpich.org/armci-mpi.git || git clone http://git.mpich.org/armci-mpi.git
cd armci-mpi
git checkout mpi3rma
./autogen.sh
```
If ./autogen.sh fails because your autotools are too old, see https://wiki.alcf.anl.gov/parts/index.php/Autotools and run the script there to install newer versions. You will need to prepend $HOME/TOOLS/bin to PATH so that the new tools are found when you rerun ./autogen.sh.
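For example, assuming the ALCF script installed the tools under $HOME/TOOLS as described on that page:

```
export PATH=$HOME/TOOLS/bin:$PATH
./autogen.sh
```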
For NWChem 6.6, the ARMCI-MPI setup is scripted for you:

```
wget http://www.nwchem-sw.org/download.php?f=Nwchem-6.6.revision27746-src.2015-10-20.tar.bz2 -O nwchem-6.6.tbz
tar -xjf nwchem-6.6.tbz
cd nwchem-6.6
export NWCHEM_TOP=$PWD
cd $NWCHEM_TOP/src/tools
./install-armci-mpi
```
The output of ./install-armci-mpi is important. In particular, it will tell you something like this:
```
Please set the following environment variables when you build NWChem:
ARMCI_NETWORK=ARMCI
EXTERNAL_ARMCI_PATH=/Users/jrhammon/Work/NWCHEM/svn/src/tools/..//external-armci
```
You will use these in phase 2.
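For example, copying the values from your own output into the build environment (the path below is a placeholder; use exactly what install-armci-mpi printed):

```
export ARMCI_NETWORK=ARMCI
export EXTERNAL_ARMCI_PATH=$NWCHEM_TOP/external-armci   # substitute the path from your output
```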
Only use these instructions if you are collaborating with Jeff using his GitHub fork of NWChem.
You need to fork https://github.com/jeffhammond/nwchem before the next step.
```
git clone https://github.com/$YOUR_GITHUB_USERNAME/nwchem.git nwchem-git
cd nwchem-git
export NWCHEM_TOP=$PWD
cd $NWCHEM_TOP/src/tools
./get-tools
./install-armci-mpi
```
The output of ./install-armci-mpi is important. In particular, it will tell you something like this:
```
Please set the following environment variables when you build NWChem:
ARMCI_NETWORK=ARMCI
EXTERNAL_ARMCI_PATH=$NWCHEM_TOP/src/tools/..//external-armci
```
You will use these in phase 2.
If you built ARMCI-MPI by hand (the NWChem 6.3 path above), configure, build, and install it once ./autogen.sh has succeeded, then set the environment variables NWChem needs:

```
mkdir build
cd build
../configure CC=mpicc --prefix=$NWCHEM_TOP/external-armci
make -j12
make install
export ARMCI_NETWORK=ARMCI
export EXTERNAL_ARMCI_PATH=$NWCHEM_TOP/external-armci
```
At this point, you should be done with everything related to ARMCI-MPI. The rest of the build is identical to a normal NWChem build, but we include it below for completeness.
The MPICH, MPICH-GLEX, and MVAPICH options below are commented out; uncomment the one that is right for your system. For Cray MPI, just set MPI_DIR to /usr so that the path is valid; it will not actually be used, since cc and ftn are already MPI wrappers.
```
export NWCHEM_TARGET=LINUX64
export NWCHEM_MODULES=all
export TARGET=LINUX64
export LARGE_FILES=TRUE
export USE_MPI=yes
```
There are implementation-specific options for different MPIs:
```
MPI_DIR=`which mpicc | sed "s/\/bin\/mpicc//g"`
export MPI_LIB="${MPI_DIR}/lib"
export MPI_INCLUDE="${MPI_DIR}/include"
MPICH_LIBS="-lmpifort -lmpi"
GLEX_LIBS="-L/usr/local/glex/lib64 -L/usr/local/glex/lib -lglex"
MVAPICH_LIBS="-lmpich -lopa -lmpl -libmad -lrdmacm -libumad -libverbs -lrt -lhwloc"
#MPICH# export LIBMPI="-L${MPI_DIR}/lib -Wl,-rpath -Wl,${MPI_DIR}/lib ${MPICH_LIBS} -lpthread"
#GLEX# export LIBMPI="-L${MPI_DIR}/lib -Wl,-rpath -Wl,${MPI_DIR}/lib ${MPICH_LIBS} ${GLEX_LIBS} -lpthread"
#MVAPICH# export LIBMPI="-L${MPI_DIR}/lib -Wl,-rpath -Wl,${MPI_DIR}/lib ${MVAPICH_LIBS} -lpthread"
```
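For example, on a plain MPICH system you would uncomment (i.e. strip the #MPICH# prefix from) that line:

```
export LIBMPI="-L${MPI_DIR}/lib -Wl,-rpath -Wl,${MPI_DIR}/lib ${MPICH_LIBS} -lpthread"
```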
These options change frequently. The best way to determine what should go in LIBMPI is via mpicc -show.
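For example, with an MPICH-style wrapper the output might look like this (illustrative, not from any particular installation):

```
$ mpicc -show
gcc -I/opt/mpich/include -L/opt/mpich/lib -Wl,-rpath -Wl,/opt/mpich/lib -lmpi
```

The -L and -l flags in that output are what belongs in LIBMPI.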
On a Cray XC30 system like NERSC Edison, you need to load the following modules:
```
module load PrgEnv-intel cray-mpich torque
module load craype-ivybridge
```
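You can verify the environment afterwards (module names and versions vary by system):

```
module list   # PrgEnv-intel, cray-mpich, and craype-ivybridge should appear
```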
Then your MPI settings are trivial:
```
export MPI_LIB="."
export MPI_INCLUDE="."
export LIBMPI=""
```
These are compiler and library settings. The Intel compilers are recommended because Intel Fortran appears to be the best compiler for NWChem, while some versions of gfortran appear to miscompile it. The other advantage is that the Intel compilers link against MKL trivially.
```
export FC=ifort
export CC=icc
export BLASOPT="-mkl=sequential"
```
If you want to use a different BLAS library, read its documentation and set BLASOPT accordingly. You should always be able to use CC=gcc, but in some environments (e.g. Cray, where the module system makes it hard to mix and match compiler toolchains) it is simpler to use a matched compiler pair.
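For example, a hypothetical OpenBLAS configuration might look like this (the install path is illustrative):

```
export BLASOPT="-L/opt/openblas/lib -lopenblas"
```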
In order to automatically include MPI on a Cray system, you need to use the Cray compiler wrappers (with the Intel compilers under the hood, per the modules noted above):
```
export FC=ftn
export CC=cc
```
Now you should be ready to blast your login nodes with the build.
```
cd $NWCHEM_TOP/src
gmake nwchem_config
gmake -j8
```
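If the build succeeds, the binary lands in $NWCHEM_TOP/bin/$NWCHEM_TARGET. A minimal smoke test might look like this (the input file and process count are placeholders; on Cray systems use aprun or srun instead of mpiexec):

```
mpiexec -n 4 $NWCHEM_TOP/bin/LINUX64/nwchem your_input.nw
```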
If this doesn't work, please email Jeff for assistance.