WRF

The Weather Research and Forecasting (WRF) Model is a next-generation mesoscale numerical weather prediction system designed for both atmospheric research and operational forecasting needs. It features two dynamical cores, a data assimilation system, and a software architecture facilitating parallel computation and system extensibility. For more information on WRF, see the WRF website.

For the purposes of this build, the following components are used:

 Component                   Form
 WRF                         Version 3.9.1.1
 Arm Compiler                Version 19.1
 Arm Performance Libraries   Version 19.1
 Open MPI                    Version 4.0.0
 HDF5                        Version 1.10.4
 NetCDF                      Version 4.6.1
 NetCDF for Fortran          Version 4.4.4
 Operating system            RHEL 7.3
 Hardware                    Cavium ThunderX2

Prerequisites

  • Installed Arm Compiler and Arm Performance Libraries. For more information, see our instructions on Installing Arm Compiler for HPC.

  • Built and installed Open MPI. For more information on building Open MPI with Arm Compiler for HPC, see our instructions on building Open MPI with Arm Compiler.

  • Built and installed HDF5. For more information on building HDF5 with Arm Compiler for HPC, see our instructions on Building HDF5 with Arm Compiler.

  • Built and installed NetCDF and NetCDF Fortran libraries. For more information on building NetCDF and NetCDF Fortran libraries with Arm Compiler for HPC, see our instructions on Building NetCDF with Arm Compiler.

  • You must be a registered WRF user before you can download the WRF source. To register, visit the WRF website.

Note: For the purposes of this guide, it is assumed that the NetCDF Fortran interface has been installed in the same location as the NetCDF library, thus they share the same lib and include directories.
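
Before starting the build, it can be worth confirming that the compilers and library tools from the prerequisites above are visible in your environment. The following is an optional sketch and assumes the relevant bin directories are already on your PATH:

    armclang --version      # Arm C compiler
    armflang --version      # Arm Fortran compiler
    mpicc --version         # Open MPI compiler wrapper
    nc-config --version     # NetCDF C library
    nf-config --version     # NetCDF Fortran interface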


Procedure

  1. Download the WRF source from the WRF website and unpack the archive.

  2. Change into the unpacked WRFV3 directory: 

    cd WRFV3/
  3. Set the environment variables HDFDIR and NETCDF to be the location of the HDF5 and NetCDF installation directories:

    export HDFDIR=/path/to/hdf5_install
    export NETCDF=/path/to/netcdf_install

    replacing /path/to/hdf5_install and /path/to/netcdf_install with the paths to your HDF5 and NetCDF installations, respectively.

    Reminder: For the purposes of this guide, it is assumed that the Fortran NetCDF interface has been installed in the same location as the C library, thus they share the same lib and include directories.
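
    To check that this assumption holds for your installation, you can confirm that both the C and Fortran NetCDF libraries and headers are present under ${NETCDF}. This is an optional sketch; the exact file names may differ slightly depending on how the libraries were built:

    ls ${NETCDF}/lib | grep -E 'libnetcdff?\.'          # expect both libnetcdf and libnetcdff
    ls ${NETCDF}/include | grep -E 'netcdf\.(h|mod)'    # C header and Fortran module file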

  4. Add details of the HDF5 and NetCDF libraries to CPPFLAGS and LDFLAGS:

    export CPPFLAGS="-I${HDFDIR}/include -I${NETCDF}/include"
    export LDFLAGS="-L${HDFDIR}/lib -L${NETCDF}/lib -lnetcdf -lhdf5_hl -lhdf5 -lz"
  5. Turn on large file support, and unset Parallel NetCDF:

    export WRFIO_NCD_LARGE_FILE_SUPPORT=1
    unset PNETCDF
  6. Include NetCDF in the LD_LIBRARY_PATH:

    export LD_LIBRARY_PATH=${NETCDF}/lib:$LD_LIBRARY_PATH
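
    With the flags and library path from steps 4 to 6 in place, one optional way to verify that the NetCDF and HDF5 installations resolve correctly is to compile, link, and run a small throwaway test program with the Arm compiler before configuring WRF. This is an illustrative sketch only: conftest.c is a temporary file name, a NetCDF build with extra dependencies (for example curl) may need additional -l flags, and depending on how the libraries were built you may also need ${HDFDIR}/lib on LD_LIBRARY_PATH to run the resulting binary:

    echo '#include <stdio.h>'                                   >  conftest.c
    echo '#include <netcdf.h>'                                  >> conftest.c
    echo 'int main(void){ puts(nc_inq_libvers()); return 0; }'  >> conftest.c
    armclang ${CPPFLAGS} conftest.c ${LDFLAGS} -o conftest && ./conftest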
  7. Add a stanza for Arm (AArch64) to the default configuration file arch/configure_new.defaults, before the entry for Fujitsu FX10/FX100:

    ###########################################################
    #ARCH Linux aarch64, Arm compiler OpenMPI # serial smpar dmpar dm+sm
    #
    DESCRIPTION = Arm ($SFC/$SCC): AArch64
    DMPARALLEL =
    OMPCPP = -fopenmp
    OMP = -fopenmp
    OMPCC = -fopenmp
    SFC = armflang
    SCC = armclang
    CCOMP = armclang
    DM_FC = mpif90
    DM_CC = mpicc -DMPI2_SUPPORT
    FC = CONFIGURE_FC
    CC = CONFIGURE_CC
    LD = $(FC)
    RWORDSIZE = CONFIGURE_RWORDSIZE
    PROMOTION =
    ARCH_LOCAL =
    CFLAGS_LOCAL = -w -O3 -c
    LDFLAGS_LOCAL = -fopenmp
    FCOPTIM = -O3 -mcpu=thunderx2t99 -fopenmp -funroll-loops -lamath
    FCREDUCEDOPT = $(FCOPTIM)
    FCNOOPT = -O0 -fopenmp -frecursive
    FCDEBUG = -g $(FCNOOPT)
    FORMAT_FIXED = -ffixed-form
    FORMAT_FREE = -ffree-form
    FCSUFFIX =
    BYTESWAPIO = -fconvert=big-endian
    FCBASEOPTS = -w $(FORMAT_FREE) $(BYTESWAPIO)
    MODULE_SRCH_FLAG = -module $(WRF_SRC_ROOT_DIR)/main
    TRADFLAG = -traditional-cpp
    CPP = /lib/cpp CONFIGURE_CPPFLAGS
    AR = ar
    ARFLAGS = ru
    M4 = m4 -B 14000
    RANLIB = ranlib
    CC_TOOLS = $(SCC)

    Note: The comment lines are required for the configure script to present the appropriate options at configure time.
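
    To confirm that the stanza has been added where the configure script will find it, you can search for its ARCH comment line; the pattern below simply matches the description text used above:

    grep -n "Linux aarch64, Arm compiler" arch/configure_new.defaults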

  8. Configure WRF using the configure script. The script uses the Arm Compiler settings provided in the stanza you added in the previous step:

    ./configure
  9. The configure script prompts you to choose whether to compile a serial or parallel (with Distributed-Memory Parallelism, 'dmpar') version, and how to compile for nesting. For the purposes of this example build, select the serial build and choose no nesting.
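
    When configure finishes, it writes a configure.wrf file in the top-level WRFV3 directory. As an optional check, you can confirm that the compiler settings it selected come from the Arm stanza:

    grep -E "^(SFC|SCC|DM_FC|DM_CC)[[:space:]]*=" configure.wrf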

  10. Use the WRF compile script to build the target wrf executable:

     ./compile wrf

    Note: Depending on your system configuration, you may need to remove the time command from the generated configure.wrf file to avoid a large number of non-critical error reports at compile time. This does not affect the built executable:

    sed -i 's/time \$(DM_FC)/\$(DM_FC)/' ./configure.wrf
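
    The build can take some time, so it is common to capture the compile output in a log file and then confirm that the executable was produced. As a sketch, a successful build should leave wrf.exe in the main/ directory, which the run/ directory references through the symlink used in the CONUS test below:

    ./compile wrf >& compile.log
    ls -l main/wrf.exe run/wrf.exe    # both should exist after a successful build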

CONUS 12km test

  • Copy the existing run directory to a new location, for example:

    cp -r run run_CONUS
    cd run_CONUS
  • Download a CONUS (Contiguous US) test deck and data files from UCAR:

    wget http://www2.mmm.ucar.edu/BENCH/12km/example.txt -O example.txt
    wget http://www2.mmm.ucar.edu/BENCH/12km/namelist.input -O namelist.input
    wget http://www2.mmm.ucar.edu/BENCH/12km/rsl.out.0000 -O rsl.out.0000
    wget http://www2.mmm.ucar.edu/BENCH/12km/wrfbdy_d01 -O wrfbdy_d01
    wget http://www2.mmm.ucar.edu/BENCH/12km/wrfrst_d01_2001-10-24_09:00:00 -O wrfrst_d01_2001-10-24_09:00:00
    wget http://www2.mmm.ucar.edu/BENCH/12km/wrfout_d01_2001-10-24_12:00:00 -O wrfout_d01_2001-10-24_12:00:00-REF
  • Execute the test problem using the symlink to the WRF executable in the run_CONUS directory. For the serial build chosen in this guide, run ./wrf.exe directly; for a parallel (smpar/dmpar) build, launch it through mpirun, for example:

    OMP_NUM_THREADS=4 mpirun -np 4 ./wrf.exe
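
  • Check that the run completed successfully. A WRF run that finishes cleanly ends with the message "SUCCESS COMPLETE WRF": an MPI (dmpar) run writes it to its rsl.out.* / rsl.error.* files, while a serial run prints it to standard output, so redirect that output to a file if you want to grep it. The following is a minimal sketch (wrf.log is a hypothetical log name, and note that the rsl.out.0000 downloaded above is only a reference log from UCAR, not output from your run):

    ./wrf.exe > wrf.log 2>&1                       # serial build
    grep "SUCCESS COMPLETE WRF" wrf.log            # or rsl.out.0000 for a dmpar run
    ncdump -h wrfout_d01_2001-10-24_12:00:00       # optional: the wrfout name depends on the namelist; compare with the downloaded -REF file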

Related information