Below, you will find how the ND Cloud Modification Project runs data through WRF 4.0 instead of WRF 3.7. This is a general instruction on downloading, coding, and running WRF 4.0 but will also include specific examples that are useful for this project in particular. The WRF 4.0 outputs will be compared to the WRF 3.7 original outputs and real-time observations to reveal the differences between the two versions.
If you ever get stuck to the point of no repair … turn it off and on again! Most of the errors I ran into were fixed by simply logging off, getting a snack, and logging back in.
ssh -X -A forecast@wopr.atmos.und.edu : In this directory you will find,
ssh -X -A forecast@western.atmos.und.edu : In this directory you will find,
All data can be found at '/home/data/nam/' on the respective servers.
For this project specifically, I have downloaded WRF v4.0 in /wrf within the Calgary server. Calgary is accessible at 'ssh first.last@134.129.222.140', and the password is specific to each person with an account on Calgary.
To move files from one server to another, use 'scp -r server:path-to-file-on-original-server .' Don't forget the period at the end: it tells scp to copy into your current directory. Example:
scp -r forecast@western.atmos.und.edu:/home/data/wrfout_west_370_MP_8_alt/20220914 .
Find WRF namelist options at https://esrl.noaa.gov/gsd/wrfportal/namelist_input_options.html
To see the namelist settings a WRF output file was run with (WRF stores them as global attributes in the file header), type: ncdump -h wrfout_d0…
The following was taken from the README files for each version. Most of the specifics have remained the same, but there are two primary features that are used in 4.0 and not 3.7: the hybrid sigma-pressure vertical coordinate and flux-adjusting surface data nudging.
Advanced Research WRF (ARW) solver: Eulerian mass, hydrostatic and non-hydrostatic
Two-way nesting:
Moving nest:
Physics options:
Nudging:
Thompson Microphysics options:
Hybrid Vertical Coordinate: The HVC option is a "hybrid" vertical coordinate, in that the eta levels are terrain following near the surface, and then relax towards an isobaric surface aloft. The purpose of this coordinate option is to reduce the artificial influence of topography towards the top of the model.
The following are steps for compiling WRF and WPS. Detailed instructions can be found at https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compilation_tutorial.php and Linux instructions can be found at https://metclim.ucd.ie/2017/06/wrf-installation-on-a-linux-machine/.
Source code information can be found at https://www2.mmm.ucar.edu/wrf/users/download/get_sources.html. This site covers all versions of WRF, so make sure you are downloading the correct one. For this project, we are upgrading to v4.0, so I downloaded the tar files for WRF Version 4.0 and WPS Version 4.0.
You can skip this step if you are compiling WRF on a supercomputer such as Cheyenne. It should already be done in a shared directory.
In the /WRF directory, type:
mkdir Build_WRF
mkdir TESTS
In the /WRF/TESTS directory, type:
wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/Fortran_C_tests.tar
tar -xf Fortran_C_tests.tar
gfortran TEST_1_fortran_only_fixed.f
./a.out
Output: SUCCESS test 1 fortran only fixed format
gfortran TEST_2_fortran_only_free.f90
./a.out
Output:
Assume Fortran 2003: has FLUSH, ALLOCATABLE, derived type, and ISO C Binding
SUCCESS test 2 fortran only free format
gcc TEST_3_c_only.c
./a.out
Output: SUCCESS test 3 c only
gcc -c -m64 TEST_4_fortran+c_c.c
gfortran -c -m64 TEST_4_fortran+c_f.f90
gfortran -m64 TEST_4_fortran+c_f.o TEST_4_fortran+c_c.o
./a.out
Output:
C function called by Fortran
Values are xx = 2.00 and ii = 1
SUCCESS test 4 fortran calling c
./TEST_csh.csh
Output: SUCCESS csh test
./TEST_perl.pl
Output: SUCCESS perl test
./TEST_sh.sh
Output: SUCCESS sh test
You can skip this step if you are compiling WRF on a supercomputer such as Cheyenne. It should already be done in a shared directory.
The WRF libraries required for configuring and compiling are version specific and need to be exactly right. If you download an incorrect version of a library, or simply put the download in the wrong folder, WRF will not compile correctly. The environment variables below need to be defined every time you want to build or run WRF, unless you save them permanently (e.g., in your shell startup file).
Type:
export DIR=/wrf/WRF/Build_WRF/LIBRARIES
export CC=gcc
export CXX=g++
export FC=gfortran
export FCFLAGS=-m64
export F77=gfortran
export FFLAGS=-m64
export JASPERLIB=$DIR/grib2/lib
export JASPERINC=$DIR/grib2/include
export LDFLAGS=-L$DIR/grib2/lib
export CPPFLAGS=-I$DIR/grib2/include
export PATH=$DIR/netcdf/bin:$PATH
export NETCDF=$DIR/netcdf
export LD_LIBRARY_PATH=$DIR/lib:$LD_LIBRARY_PATH
If you hit errors mentioning "libpng12.so.0", also include: export LD_LIBRARY_PATH=$DIR/grib2/lib:$LD_LIBRARY_PATH
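Retyping these exports every session is error-prone. One way to persist them is sketched below; the filename wrf_env.sh is just a name chosen for this example, and the variable values mirror the export list above.

```shell
# Save the build environment to a file once, then source it in each new
# session instead of retyping every export (wrf_env.sh is an example name).
# The quoted 'EOF' keeps $DIR unexpanded in the file; it is resolved when
# the file is sourced, after DIR itself has been exported.
cat > wrf_env.sh <<'EOF'
export DIR=/wrf/WRF/Build_WRF/LIBRARIES
export CC=gcc
export CXX=g++
export FC=gfortran
export FCFLAGS=-m64
export F77=gfortran
export FFLAGS=-m64
export JASPERLIB=$DIR/grib2/lib
export JASPERINC=$DIR/grib2/include
export LDFLAGS=-L$DIR/grib2/lib
export CPPFLAGS=-I$DIR/grib2/include
export PATH=$DIR/netcdf/bin:$PATH
export NETCDF=$DIR/netcdf
# grib2/lib included here to head off the libpng12.so.0 errors noted above
export LD_LIBRARY_PATH=$DIR/grib2/lib:$DIR/lib:$LD_LIBRARY_PATH
EOF
. ./wrf_env.sh
echo "$NETCDF"   # prints /wrf/WRF/Build_WRF/LIBRARIES/netcdf
```

Alternatively, append the same export lines to ~/.bashrc so they load automatically at every login.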
In the /WRF/Build_WRF directory, type:
mkdir LIBRARIES
In the /WRF/Build_WRF/LIBRARIES directory, type:
wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/jasper-1.900.1.tar.gz
wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/libpng-1.2.50.tar.gz
wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/zlib-1.2.7.tar.gz
wget http://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/netcdf-4.1.3.tar.gz
Find other Netcdf-c versions at https://github.com/Unidata/netcdf-c/releases. Find other Netcdf-fortran version at https://github.com/Unidata/netcdf-fortran/releases.
In the /WRF/Build_WRF/LIBRARIES directory, type:
tar xzvf netcdf-4.1.3.tar.gz
cd netcdf-4.1.3
./configure --prefix=$DIR/netcdf --disable-dap --disable-netcdf-4 --disable-shared
make
make install
cd ..
tar xzvf zlib-1.2.7.tar.gz
cd zlib-1.2.7
./configure --prefix=$DIR/grib2
make
make install
cd ..
tar xzvf libpng-1.2.50.tar.gz
cd libpng-1.2.50
./configure --prefix=$DIR/grib2
make
make install
cd ..
tar xzvf jasper-1.900.1.tar.gz
cd jasper-1.900.1
./configure --prefix=$DIR/grib2
make
make install
cd ..
In the /WRF/TESTS directory, type:
wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/Fortran_C_NETCDF_MPI_tests.tar
tar -xf Fortran_C_NETCDF_MPI_tests.tar
cp ${NETCDF}/include/netcdf.inc .
gfortran -c 01_fortran+c+netcdf_f.f
gcc -c 01_fortran+c+netcdf_c.c
gfortran 01_fortran+c+netcdf_f.o 01_fortran+c+netcdf_c.o -L${NETCDF}/lib -lnetcdff -lnetcdf
./a.out
Output:
C function called by Fortran
Values are xx = 2.00 and ii = 1
SUCCESS test 1 fortran + c + netcdf
In the /WRF directory, type:
./configure
Compilation Options:
Type:
./compile em_real >& log.compile
Replace em_real with the compilation option of your choice. Compilation might take 20-30 minutes.
Type:
ls -ls main/*.exe
If either of the executable files wrf.exe or real.exe has not appeared in the /WRF/main directory, stop now! There is an issue with your libraries, and further progress is not possible.
Type (if executable files are missing):
./clean -a
Then redo STEPS 3, 4, and 5, and make sure the libraries match the version of WRF you are using. The issue could also be a missing environment variable. Check out the WRF & MPAS-A Forum at https://forum.mmm.ucar.edu/.
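A quick way to spot a missing environment variable before recompiling is sketched below. check_wrf_env is a helper name made up for this guide, and the variable list mirrors the exports shown earlier; extend it as needed.

```shell
# Report which of the build variables from the export list are set in the
# current shell (they vanish on logout unless saved permanently).
# check_wrf_env is a name invented for this guide, not part of WRF.
check_wrf_env() {
  for v in DIR CC FC NETCDF JASPERLIB JASPERINC LD_LIBRARY_PATH; do
    eval "val=\${$v:-}"          # portable indirect lookup of $v
    if [ -n "$val" ]; then
      echo "set:     $v"
    else
      echo "MISSING: $v"
    fi
  done
}
check_wrf_env
```

Any line printed as MISSING means that export must be re-entered (or sourced from your saved file) before ./configure and ./compile will behave.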
In the /WPS directory, type:
./clean
./configure
./compile >& log.compile
If the executable files geogrid.exe, ungrib.exe, or metgrid.exe have not appeared in the /WPS directory, stop now! There is an issue with your libraries or your WRF compilation, and further progress is not possible. My solution to this issue was deleting the untarred WPS directory and redoing the WRF compilation steps. Check out the WRF & MPAS-A Forum (https://forum.mmm.ucar.edu/) for other solutions.
Find all Geographical Input Data Mandatory Field Downloads at https://www2.mmm.ucar.edu/wrf/users/download/get_sources_wps_geog.html
In the /WRF/Build_WRF directory, type:
wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_high_res_mandatory.tar.gz
gunzip geog_high_res_mandatory.tar.gz
tar -xf geog_high_res_mandatory.tar
Edit the /WPS/namelist.wps file:
&geogrid
 geog_data_path = 'path_to_directory/Build_WRF/WPS_GEOG'
At this point, we will need data to run through WRF. To test if WRF was compiled correctly, you can use a Case Study available online at https://www2.mmm.ucar.edu/wrf/OnLineTutorial/CASES/SingleDomain/index.php. The following steps can be modified in order to accommodate the data set you are using.
On the same level as the /WPS and /WRF directories, type:
mkdir DATA
Put all data files within the /DATA directory.
In the /WPS directory, type:
ln -sf ungrib/Variable_Tables/Vtable.NAM Vtable
./link_grib.csh /pathway-to-DATA/DATA/
Output: GRIBFILE.AAA, GRIBFILE.AAB, GRIBFILE.AAC, …, GRIBFILE.AAZ files in the /WPS directory. Be sure to link the correct data type. My data was NAM, yours might be GFS, ECMWF, RAP, or even SST.
Edit the /WPS/namelist.wps file to match the input files.
vim namelist.wps
Things to look out for:
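The entries below are the ones most often out of sync with the input data. This is a hypothetical &share section, not the project's actual namelist; the dates and max_dom are placeholders.

```
&share
 wrf_core = 'ARW',
 max_dom = 3,
 start_date = '2022-09-14_00:00:00', '2022-09-14_00:00:00', '2022-09-14_00:00:00',
 end_date   = '2022-09-15_00:00:00', '2022-09-15_00:00:00', '2022-09-15_00:00:00',
 interval_seconds = 10800,
/
```

interval_seconds must match the time spacing of your GRIB files (10800 s = 3 h), and the start/end dates must fall inside the period your data actually covers.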
In the /WPS directory, type:
./geogrid.exe
Output: geo_em.d01.nc, geo_em.d02.nc, …, geo_em.d05.nc in the /WPS directory.
In the /WPS directory, type:
./ungrib.exe
Output: FILE:Year-Month-Day_Hour, FILE:Year-Month-Day_Hour+interval_seconds, …, FILE:Year-Month-Day_Hour+interval_seconds in the /WPS directory.
In the /WPS directory, type:
./metgrid.exe
Output: met_em.d01.Year-Month-Day_00:00:00.nc, …, met_em.d02.Year-Month-Day_00:00:00.nc, …, met_em.d05.Year-Month-Day_00:00:00.nc files in the /WPS directory. There should be one "met_em" file for each domain at each WPS output time. For example, we ran a 24-hour case study with an interval time of 3 hours; this resulted in 8 files for each of our 3 domains, 24 files in total.
If you get an error that is similar to,
Processing domain 1 of 3
 Processing 2022-07-28_00
    FILE
WARNING: Couldn't open file FILE:2022-07-28_00 for input.
ERROR: The mandatory field TT was not found in any input data.
This is because you didn't properly “ungrib” your data. Be sure you didn't skip step 8.2!
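To confirm metgrid produced one file per domain per output time, you can count the files. The sketch below runs against dummy file names created in /tmp purely for demonstration; in practice, run the final ls pipeline inside /WPS.

```shell
# Demo in a scratch directory with fake met_em names; in real use, run only
# the final pipeline inside /WPS after metgrid.exe finishes.
mkdir -p /tmp/wps_demo && cd /tmp/wps_demo
touch met_em.d01.2022-09-14_00:00:00.nc met_em.d01.2022-09-14_03:00:00.nc
touch met_em.d02.2022-09-14_00:00:00.nc met_em.d02.2022-09-14_03:00:00.nc

# Count met_em files per domain; every domain should show the same count.
ls -1 met_em.d0* | cut -d. -f2 | sort | uniq -c
# prints a count of 2 for d01 and 2 for d02
```

If one domain shows fewer files than the others, recheck max_dom and the per-domain start/end dates in namelist.wps before moving on.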
In the /WRF/run directory, you can either symbolic link the WPS output files or copy the files to this directory.
ln -sf /pathway-to-WPS/WPS/met_em.d0* .
OR
cp /pathway-to-WPS/WPS/met_em.d0* /pathway-to-WRF/WRF/run/
Edit the /WRF/run/namelist.input file to match the input files.
vim namelist.input
For this project, we used the exact same namelist as the previous WRF version runs. The only differences were date/time shifts, and we added "do_radar_ref = 1" in the physics section to additionally output simulated radar reflectivity. We also had to omit the "afwa" section for version 4.0 because it prevented WRF from running properly.
Things to look out for:
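As a sketch of the entries worth double-checking, here is a hypothetical fragment of namelist.input. The values are illustrative placeholders, except do_radar_ref, which is the option mentioned above (mp_physics = 8 is Thompson microphysics).

```
&time_control
 run_hours = 24,
 interval_seconds = 10800,
 input_from_file = .true., .true., .true.,
/
&domains
 max_dom = 3,
/
&physics
 mp_physics = 8, 8, 8,
 do_radar_ref = 1,
/
```

The dates, interval_seconds, and max_dom here must agree with what you used in namelist.wps, or real.exe will complain about missing met_em files.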
In the /WRF/run directory, type:
./real.exe
Output: wrfinput_d01, wrfinput_d02, …, wrfinput_d05 AND wrfbdy_d01 files in the /WRF/run directory. Note that real.exe produces a wrfbdy file for the outermost domain only; the nested domains get their lateral boundaries from their parent domains.
If you only get 'wrfinput' files for one or two of your domains, make sure your namelist has the correct 'max_dom' and that 'input_from_file' is '.true.' for all of your domains.
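One quick way to check those two settings together is to grep the namelist. The stand-in namelist written to /tmp below exists only so the sketch is self-contained; point grep at your real /WRF/run/namelist.input instead.

```shell
# Create a minimal stand-in namelist just to demonstrate the check;
# in practice, grep your real /WRF/run/namelist.input.
cat > /tmp/namelist_demo.input <<'EOF'
&time_control
 input_from_file = .true., .true., .true.,
/
&domains
 max_dom = 3,
/
EOF
grep -E 'max_dom|input_from_file' /tmp/namelist_demo.input
```

The number of .true. entries on the input_from_file line should be at least max_dom.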
In the /WRF/run directory, type:
./wrf.exe
Output: wrfout_d01_Year-Month-Day_00:00:00, …, wrfout_d02_Year-Month-Day_00:00:00, …, wrfout_d05_Year-Month-Day_00:00:00 files in the /WRF/run directory.
If you receive an error after trying to run wrf.exe that reads,
FATAL CALLED FROM FILE: <stdin> LINE: 186
nesting requires either an MPI build or use of the -DSTUBMPI option
This means that your namelist requests nested domains but WRF was compiled without the MPI support that nesting requires. Recompile WRF, this time selecting a dmpar option (or a serial build with -DSTUBMPI) during ./configure, then rerun real.exe and wrf.exe.