Weather Research and Forecasting (WRF) modeling system
Overview of WRF
- The Weather Research and Forecasting (WRF) Model is a next-generation mesoscale numerical weather prediction system designed to serve both operational forecasting and atmospheric research needs. It features multiple dynamical cores, a 3-dimensional variational (3DVAR) data assimilation system, and a software architecture allowing for computational parallelism and system extensibility. WRF is suitable for a broad spectrum of applications across scales ranging from meters to thousands of kilometers.
- With WRF development now taking priority at NCAR, MM5 development has ended and the model is considered frozen.
- WRF is a weather forecasting system based on the fully compressible, non-hydrostatic equations (unlike MM5, which is based on incompressible, non-hydrostatic equations). As with MM5, WRF offers a number of different physics options.
- It can run on a PC or small workstation as well as on a cluster.
- Link to WRF Tutorial Page
- Link to WRF Overview Powerpoint
- Link to WRF Downloads Page
- Link to WRF Online Tutorial
WRF Official Website
This site has the code and a great walk-through example of how to run a basic WRF model run. It also has excellent plotting examples using NCL.
Some Hints for compiling WRF
1. Software requirements
- A Fortran 90/95 compiler and a C compiler. PGI (pgf90 v6.1-6, not free) or IFC (ifort v9.1, free) can be used. Some older compiler versions may not work.
- perl 5.04 or later (perl 5.7.0 is required for the WRFSI GUI)
- If MPI or OpenMP compilation is desired, the corresponding MPI or OpenMP libraries are required.
- The WRF I/O API supports the netCDF and PHDF5 formats, so one of these libraries needs to be available on the computer where you compile and run WRF.
- Using the PGI or Intel compiler on a Linux computer requires that the netCDF library also be built with the same compiler.
2. System requirements
- 512 MB seems to be enough for compiling WRF with the PGI compiler, but with IFC there is a memory problem. On our new machine, the memory problem can be solved by changing the ifort optimization from -O3 to -O2, but this did not work on /rain. 2 GB of memory is preferred.
- Recommended minimum total disk space: 10 GB
3. Compiling options
- For compiling tips, check the PGI website tips and techniques page (WRF and NETCDF): http://www.pgroup.com/resources/tips.htm
- For the new machines /rime and /hail, we chose 'Settings for AMD x86_64 Intel xeon i686 ia32 Xeon Linux, ifort compiler (single threaded, allows nesting using RSL without MPI)'. x86_64 means 64-bit. Some files did not get compiled 32-bit; the C compiler options ('cc = gcc -DFSEEKO64_OK' in configure.wrf) should include -m32 to force 32-bit compilation. netCDF should also be compiled 32-bit.
- For the PGI compiler, -m32 is not necessary.
- In order to compile on rime and hail with the PG compilers, we need PG-compiled versions of the netCDF libraries. The following environment variables were used to successfully build netCDF on these machines:
setenv CC /usr/pgi/linux86-64/6.2/bin/pgcc
setenv F77 /usr/pgi/linux86-64/6.2/bin/pgf77
setenv F90 /usr/pgi/linux86-64/6.2/bin/pgf90
setenv CPPFLAGS "-DNDEBUG -DpgiFortran"
setenv CFLAGS "-O2 -Msignextend -V"
setenv FC $F90
setenv FFLAGS "-O2 -w -V"
setenv CXX $CC
or, using the newer PGI compiler 7.1:
# use the PGI compilers 7.1 from Oct 2007
setenv CC /usr/pgi/linux86-64/7.1/bin/pgcc
setenv F77 /usr/pgi/linux86-64/7.1/bin/pgf77
setenv F90 /usr/pgi/linux86-64/7.1/bin/pgf90
setenv CPPFLAGS "-DNDEBUG -DpgiFortran"
setenv CFLAGS "-O2 -Msignextend -V"
setenv FC $F90
setenv FFLAGS "-O2 -w -V"
setenv CXX $CC
# use the netcdf libraries created by the PGI compilers March 2007
setenv NETCDF /usr/local/netcdf-pg
setenv NETCDF_LIB /usr/local/netcdf-pg/lib
setenv NETCDF_INC /usr/local/netcdf-pg/include
# use the mpich libraries from PGI
setenv /usr/pgi/linux86-64/7.1/mpi/mpich
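With those environment variables exported, the netCDF build itself is the usual configure/make sequence. A minimal sketch, assuming the netCDF 3.x source tarball is already unpacked (the source directory name is illustrative; the --prefix matches the /usr/local/netcdf-pg location used below):

```shell
# Build netCDF with the PGI compilers picked up from CC/F77/F90 above;
# the prefix keeps the PG-compiled libraries separate from the ifort ones.
cd netcdf-3.6.2                            # assumed source directory name
./configure --prefix=/usr/local/netcdf-pg
make
make test                                  # optional self-checks
make install
```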
Configure was then run so that the PG-compiled versions were installed in a separate directory from the ifort-compiled versions. When compiling WRF with the PG compilers, it will be necessary to use
setenv NETCDF /usr/local/netcdf-pg
If using the jasper library to build GRIB2 I/O, it will be necessary to use
setenv JASPERINC /usr/local/include/jasper
setenv JASPERLIB /usr/local/lib
Dierk's environment for compiling and running WRF
- Create Directories
mkdir DOMAINS; mkdir GRIBdata; mkdir RAW; mkdir WPSdata; ls
- Download WRF and WPS from http://www.dtcenter.org/wrf-nmm/users/downloads/index.php
wget http://www.mmm.ucar.edu/wrf/src/WRFV2.2.1.TAR.gz   # 22 MB
tar -xvzf WRFV2.2.1.TAR.gz
mv WRFV2.2.1.TAR.gz RAW/.
- Download WRF-NMM Test Data for v2.2.1
cd GRIBdata
wget http://www.dtcenter.org/wrf-nmm/users/downloads/met_nmm_20050123.tar.gz   # 34 MB
tar -xvzf met_nmm_20050123.tar.gz
cd ..
- Download WRF Pre-processing System tar file
cd GRIBdata
wget http://www.mmm.ucar.edu/wrf/src/WPSV2.2.1.TAR.gz   # 34 MB
tar -xvzf WPSV2.2.1.TAR.gz
mv WPSV2.2.1.TAR.gz RAW/.
- Download WRF Preprocessing System geographic input data
cd WPSdata
wget http://www.dtcenter.org/wrf-nmm/users/downloads/geog.tar.gz   # 360 MB for the full set of geographic data
tar -xvzf geog.tar.gz
cd ..
- Download WRF Preprocessing System Test Data
NAM files for the 2005012300 forecast cycle, every 3 hours out to 24 hours. This time period corresponds to the 23-24 January 2005 snow storm that impacted the northeast coast of the US.
cd GRIBdata
wget http://www.dtcenter.org/wrf-nmm/users/downloads/NAM_inputdata_20050123.tar.gz
tar -xvzf NAM_inputdata_20050123.tar.gz
cd ..
Compiling and running WRF
- ./clean -a
- ./configure ; choose option 4 for PGI compilers and use the defaults.
Since we are compiling a WRF ARW real data case, type:
- ./compile em_real >& wrf_compile.log
- "SMPar" means "Shared-Memory Parallelism"; in practice, OpenMP directives are enabled and the resulting binary will only run within a single shared-memory system.
- "DMPar" means "Distributed-Memory Parallelism", which means MPI will be used in the build.
Check the wrf_compile.log file for any errors. If your compilation was successful, you should see these executables created in the main/ directory.
main/ndown.exe   Used for one-way nesting
main/nup.exe     Upscaling - used with WRF-Var
main/real.exe    WRF initialization for real data cases
main/wrf.exe     WRF model integration
These executables will be linked from the main/ directory to the directories run/ and test/em_real/, which is where you will be running the code from.
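To confirm at a glance that all four executables made it, a small shell check can help. This is just a sketch (the main/ layout is stock WRF; the function name is ours):

```shell
# check_exes: report which of the expected WRF executables exist in a build dir
check_exes() {
  dir="$1"
  for exe in ndown nup real wrf; do
    if [ -x "$dir/$exe.exe" ]; then
      echo "OK: $exe.exe"
    else
      echo "MISSING: $exe.exe"
    fi
  done
}

check_exes main
```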
Looks good right out of the box.
Compiling the WPS (WRF Preprocessing System)
The WPS README file explains what the WPS does.
You can also go through Sarah Davis's instructions and jpgs of the steps.
- ./clean -a
- ./configure ; choose option 1 for PGI compilers, no MPI, no GRIB2 data
- ./compile >& wps_compile.log
Check the wps_compile.log file for any errors. If your compilation was successful, you should see these executables created.
geogrid.exe -> geogrid/src/geogrid.exe
metgrid.exe -> metgrid/src/metgrid.exe
ungrib.exe -> ungrib/src/ungrib.exe

geogrid.exe   Generates static data
metgrid.exe   Generates input data for WRFV2
ungrib.exe    Unpacks GRIB data
A few utilities will also be linked under the util/ directory.
avg_tsfc.exe -> src/avg_tsfc.exe
g1print.exe -> ../ungrib/src/g1print.exe
g2print.exe -> ../ungrib/src/g2print.exe
mod_levs.exe -> src/mod_levs.exe
plotfmt.exe -> src/plotfmt.exe
plotgrids.exe -> src/plotgrids.exe
rd_intermediate.exe -> src/rd_intermediate.exe
Problem note: when running ./configure, the PGI option did not exist; it instead created an ifort configuration. I went in and manually changed the architecture to PGI.
View the altered configure.wps file used to compile wpsv3.2.1 Aug 2010 on rime.
Detailed explanations of the utilities are available in Chapter 3 of the User's Guide: http://www.dtcenter.org/wrf-nmm/users/docs/user_guide/WPS/users_guide_nmm_chap3_v1_WPS.htm
2b. Prepare to run the WPS wizard:
i. Examine namelist.wps
- Change the data path:

geog_data_path = 'Your WPS_GEOG data location'
opt_geogrid_tbl_path = '../../../WPS/geogrid/'

to

geog_data_path = '../WPSdata/geog'

or whatever you have.
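For reference, geog_data_path lives in the &geogrid section of namelist.wps. A minimal sketch of that section (the resolution and table path here are illustrative placeholders, not our actual settings):

```
&geogrid
 geog_data_res        = '10m',
 geog_data_path       = '../WPSdata/geog',
 opt_geogrid_tbl_path = '../WPS/geogrid/',
/
```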
ii. Create a folder for your domain/location under DOMAINS
ls
DOMAINS  RAW  WPS  WPSdata  WRFV2
ls DOMAINS
JAN_MAYEN
You are now ready to run the WPS and WRF code.
3. Run WPS Wizard
- Launch WRF Domain Wizard. Java version 1.6 should be installed first. On rime (and eventually hail), set up firefox (preferences -> downloads -> actions) to point to /usr/bin/javaws for files with suffix .jnlp
- Set the computer, the directory of the WPS programs, the geography files and the output domains.
- Set the region and nest domains, including projection and grid options
- Set the Grib Vtable name
The Vtable corresponds to the data type of your source data for initialization. You can also create the Vtable files yourself; the Vtable selects the variables you want used for initialization.
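Outside the wizard, choosing the Vtable is just a symlink in the WPS directory. For example, assuming GFS-format source data (WPS ships ready-made Vtables under ungrib/Variable_Tables/):

```shell
# Point ungrib at the Vtable that matches the source data format
ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable
```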
- Select the directory where you store the source data used to generate initialization data for WRF, such as ECMWF reanalysis data. You should download the source files first.
- Select the source files for your initialization. They should cover the whole time range of your case.
- Set the GRIB start and end dates and the GRIB interval (the output time interval of your initialization data).
- Run WPS step by step to interpolate the source files onto the grid and generate the initialization data for WRF.
- Note: sometimes using the WRF Domain Wizard to run WPS causes problems (for me, the Domain Wizard did not save changes to the namelist). In that case, edit the namelist and run the programs from a terminal yourself:
- ./ungrib.exe >&ungrib.log
- ./metgrid.exe >&metgrid.log
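Put together, a hand-run WPS pass (bypassing the wizard) looks roughly like this sketch; link_grib.csh and the three executables come with WPS, while the GRIB path is a placeholder for your own source data location:

```shell
# Run the WPS stages by hand after editing namelist.wps
./geogrid.exe >& geogrid.log         # static/terrestrial fields
./link_grib.csh ../GRIBdata/nam.*    # create GRIBFILE.AAA, .AAB, ... links
./ungrib.exe >& ungrib.log           # degrib to the intermediate format
./metgrid.exe >& metgrid.log         # horizontal interpolation -> met_em* files
```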
- Note: using multiple meteorological data sources for initialization. If you would like to use other meteorological data (here, SST as an example), such as satellite products, for initialization:
- If your SST data is a GRIB file, link the GRIB file to your working directory (use the Domain Wizard). Edit namelist.wps and run ungrib.exe; files like SST_FILE:2004_*_* should be created.
- If your SST data is in another format (e.g. an HDF file), the SST data should be written to the intermediate format (the file format generated by ungrib.exe and used by metgrid.exe). No interpolation is needed for this step; you just write the raw data into the intermediate files. The detailed file format can be found in the ARW User's Guide. Fortran is recommended for writing the file. The output file name looks like SST_FILE:2004-01-26_00.
- Second, edit namelist.wps. The following is an example:
&metgrid
 fg_name = 'FILE','SST_FILE',
 io_form_metgrid = 2,
 opt_output_from_metgrid_path = '/ide5/ltwu/WRF/WPS/domain/20040127/',
 opt_metgrid_tbl_path = '/ide5/ltwu/WRF/WPS/domain/20040127/',
/
- For AMSR-E L3 SST data, the missing values for land and ice are set to -9999.0. In the interpolation done by metgrid.exe, some wrong values may be generated.
To fix this problem, edit METGRID.TBL like:
========================================
name=SST
  interp_option=sixteen_pt+four_pt
  fill_missing=-9999.0
  flag_in_output=FLAG_SST
  missing_value=-9999.0
========================================
- Finally, run metgrid.exe. The SST variable should now be written to the met_em* files.
Data assimilation. See the WRF User's Guide for details.
After setting up WRF-Var, the steps for running the data assimilation are as follows:
- Get observation data and first-guess data from WPS or WRF.
- Edit 3DVAR_OBSPROC/namelist.3dvar_obs; Run 3DVAR_OBSPROC/3dvar_obs.exe. A file like 'obs_gts.3dvar' will be created. It is used for WRF-VAR.
- Edit namelist.input, namelist.3dvar and DA_Run_WRF-Var.csh in wrfvar/run directory. Run DA_Run_WRF-Var.csh. A subdirectory 'wrf-var' will be created in run directory. Some diagnostic files including 'wrf_3dvar_output' will be created.
- Go to WRF_BC_v2.1. Edit new_bc.csh and run it. The WRF boundary file 'wrfbdy_d01' will be updated.
- Use the updated boundary file as the initial conditions for WRF.
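As terminal commands, the data assimilation steps above might look like the following sketch (directory names follow the layout described above; exact paths and namelist contents will differ by installation):

```shell
# 1. Process the observations
cd 3DVAR_OBSPROC
vi namelist.3dvar_obs            # set the analysis time window and obs files
./3dvar_obs.exe                  # -> obs_gts.3dvar

# 2. Run WRF-Var
cd ../wrfvar/run
vi namelist.input namelist.3dvar DA_Run_WRF-Var.csh
./DA_Run_WRF-Var.csh             # -> wrf-var/ subdirectory, wrf_3dvar_output

# 3. Update the lateral boundary file
cd ../../WRF_BC_v2.1
vi new_bc.csh
./new_bc.csh                     # updates wrfbdy_d01
```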
- WRFPortal will be available at some later time.
- Link or copy the initialization files (met_em*) generated by WPS to the WRF run directory.
- Edit namelist.input to fit your case.
- ./real.exe or mpirun -np 4 real.exe
- ./wrf.exe or mpirun -np 4 wrf.exe
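To tell whether a run finished cleanly, look for the completion message in the log: wrf.exe normally prints "SUCCESS COMPLETE WRF" on a normal exit (in rsl.out.0000 under MPI, or on the terminal otherwise). A small helper, as a sketch:

```shell
# wrf_finished: report whether a WRF run completed, based on its log file
wrf_finished() {
  log="$1"
  if grep -q "SUCCESS COMPLETE" "$log" 2>/dev/null; then
    echo "run completed"
  else
    echo "run not finished (or failed); check $log"
  fi
}

wrf_finished rsl.out.0000
```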
WRF output files
- The file format of WRF output is netCDF (you can choose to output other formats). Look at the header, output variables, and global attributes:
ncdump -h met_em.*.nc
- netCDF files can be read in IDL, Matlab, C, etc. Some post-processing tools are available from the WRF users home pages.
WRF Output Results
Here are some examples of WRF Results and output.
Sarah Davis output files have been moved to /data2/WRFout
Her other files and script have been backed up to /data2/SDAVIS_WRF/WRF
They are still on /rime2/sdavis/WRF