There are many ways to visualize the spatial domain of WRF. I have used several of them in my previous papers, but they turned out not to be elegant enough for publication. Therefore, while writing my dissertation recently, I developed some functions that directly digest the WPS namelist (rather than WRF output, as some approaches do) to derive the domain boundaries. Figure 1 is an example plot.
Figure 1. Visualization of a WRF domain with home-made Python functions. The background is topography from the ETOPO1 dataset.
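The core of these functions is simply reading the &geogrid records from namelist.wps and converting each nest's position into its parent's grid coordinates. Below is a minimal sketch of that step; the function names and the sample namelist are illustrative, and the actual map plot would additionally need the projection parameters (e.g. via pyproj or cartopy):

```python
import re

def parse_wps_namelist(text):
    """Parse 'key = v1, v2,' entries from a namelist.wps string into lists."""
    params = {}
    for line in text.splitlines():
        m = re.match(r"\s*(\w+)\s*=\s*(.+?),?\s*$", line)
        if m:
            key, vals = m.group(1).lower(), m.group(2)
            params[key] = [v.strip().strip("'") for v in vals.split(",") if v.strip()]
    return params

def nest_extent(params, d=2):
    """Return the (west, south, east, north) corners of domain d
    expressed in 1-based parent-grid indices."""
    i0 = int(params["i_parent_start"][d - 1])
    j0 = int(params["j_parent_start"][d - 1])
    ratio = int(params["parent_grid_ratio"][d - 1])
    e_we = int(params["e_we"][d - 1])
    e_sn = int(params["e_sn"][d - 1])
    # the nest spans (e_we - 1) cells on its own grid,
    # i.e. (e_we - 1) / ratio cells on the parent grid
    return (i0, j0, i0 + (e_we - 1) // ratio, j0 + (e_sn - 1) // ratio)

# a made-up two-domain example
namelist = """
&geogrid
 parent_grid_ratio = 1, 3,
 i_parent_start    = 1, 31,
 j_parent_start    = 1, 17,
 e_we              = 100, 121,
 e_sn              = 80,  91,
"""
print(nest_extent(parse_wps_namelist(namelist)))  # -> (31, 17, 71, 47)
```

The same corner indices can then be mapped through the projection to lon/lat and drawn as a rectangle over the topography.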
This post records how to process HadGEM2-CC data for WRF downscaling. Unlike GFDL-ESM2M, the atmospheric fields in this dataset are on pressure levels, which saves a lot of trouble. However, the data available on the portal are masked out below terrain, so some interpolation is required to avoid WRF errors during initialization (e.g. unreasonably large RH values produced by interpolating across masked points).
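The exact filling scheme is in the full post; a generic sketch of the idea, assuming NaN-masked arrays of shape (lev, lat, lon) with level index 0 at the bottom, is to extend the lowest valid value downward into the terrain-masked levels (nearest-level extrapolation):

```python
import numpy as np

def fill_below_terrain(var):
    """Fill masked (terrain-intersected) levels of a (lev, lat, lon) array
    by copying the lowest valid value downward along the level axis.
    Assumes level index 0 is the bottom and missing values are NaN."""
    out = var.copy()
    nlev = out.shape[0]
    for k in range(nlev - 2, -1, -1):    # sweep from top toward the surface
        bad = np.isnan(out[k])
        out[k][bad] = out[k + 1][bad]    # borrow from the level above
    return out

# toy single column: the two bottom levels are masked by terrain
col = np.array([[[np.nan]], [[np.nan]], [[85.0]], [[70.0]]])
print(fill_below_terrain(col).ravel())   # -> [85. 85. 85. 70.]
```

Constant extrapolation like this keeps RH and temperature within physical bounds, which is what matters for metgrid/real; any fancier vertical interpolation would also work.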
After a long time working with reanalysis data, I finally came to running WRF with CMIP5 data. Unfortunately, not many resources are available online (even though there have been so many publications based on WRF downscaling of CMIP5). The most useful one is the set of instructions at the CORDEX experiment site, which targets MIROC5 model output. It is worth pointing out one bug in those instructions: in step 2.4, the ps variable should not be removed at that step. For my case, I need to drive WRF with CESM4, HadGEM2-CC, and GFDL-ESM2M data.
CESM4 has been bias-corrected and prepared for WRF by NCAR as ds316.1, which saves a lot of time. Steps for HadGEM2-CC are covered in part (2) of this post. GFDL-ESM2M happens to be not fully compliant with the CF-1.4 convention, so I ran into some extra trouble. Luckily, I was able to solve all of the issues, and here is a record of the steps needed to digest the GFDL-ESM2M data. Continue reading
I have been running WRF for over a year now and have encountered various problems. For some of them I found solutions online, but others I had to fix myself. Fortunately, none of these problems required changes to the model code.
I have been working with the DS131.2 dataset since last week, but only now have I figured out how to use it.
DS131.2 (Twentieth Century Global Reanalysis Version 2c, 20CR) is a reanalysis dataset covering 1850 to the present. It is one of only two options (the other being the ERA dataset) that I could use, since I would like to run WRF simulations starting from the 1910s. Unfortunately, I had to spend about half a week figuring out how to get the correct data from the UCAR website. Continue reading
So here comes one of the nice parts of working with Linux. I am trying to figure out how to run several WRF simulations in parallel (I am not talking about an MPI run of WRF) on the school cluster. From the simple-run experience, we just compile the code and run the model. But what if I want multiple runs at the same time? I do not want to go through the building-running-deleting cycle; instead, I would like to keep several sets of namelists alongside a single wps.exe/wrf.exe to do the simulations. This would also make it easy to recompile wrf.exe or wps.exe when needed. I am writing down what I have tested, but there is no guarantee that it is correct. Continue reading
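The idea I tested boils down to one compiled WRF plus a lightweight directory per experiment that symlinks the shared executables and lookup tables, so that only the namelist differs between runs. A sketch of that setup (the file list and paths are illustrative, not the complete set WRF needs at run time):

```python
import os

def make_run_dir(run_name, build_dir, base="runs"):
    """Create an isolated run directory that symlinks the shared WRF
    executables and static data, so each run only owns its namelist.
    The file list here is a subset; adjust to your build layout."""
    run_dir = os.path.join(base, run_name)
    os.makedirs(run_dir, exist_ok=True)
    # link rather than copy, so recompiling wrf.exe updates every run at once
    for item in ("wrf.exe", "real.exe", "GENPARM.TBL", "LANDUSE.TBL",
                 "SOILPARM.TBL", "VEGPARM.TBL", "RRTM_DATA"):
        src = os.path.abspath(os.path.join(build_dir, item))
        dst = os.path.join(run_dir, item)
        if not os.path.lexists(dst):
            os.symlink(src, dst)
    return run_dir

# each experiment then gets its own namelist.input copied into its directory,
# e.g. make_run_dir("exp_1910s", "WRF/run") followed by editing runs/exp_1910s/namelist.input
```

One caveat: metgrid/real output has to live in (or be linked into) each run directory too, since wrf.exe reads its inputs relative to the working directory.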
WRF (Weather Research and Forecasting) is a powerful numerical simulation tool in atmospheric science. It is maintained by UCAR and has been under continuous development. There are many official (the ARW user guide) and unofficial installation guides, but not much information for the CentOS 7 system, which has some minor differences that must be configured manually. This is nevertheless pretty straightforward if you are familiar with Linux. Here I will show how to set up the basic dependencies and how to compile WRF, along with the related WPS and RIP4. I will also give the steps for both GNU compiler and Intel compiler installation.