I have been using Basemap for years in my Python scripts. Now I would like to slowly switch to Python 3, but Basemap is no longer officially supported there. There seem to be some workarounds (such as using customized versions of Basemap), but personally I do not want to rely on those. So, I guess it is time to find an alternative, such as Cartopy.
In this post, I leave a record of what I learned about Cartopy, along with some useful tricks (that I figured out after hours of experimenting). Continue reading
A notebook on some Linux tricks. Some of them I learned from the Internet; the rest I figured out by experimenting myself.
1.1. Continue an interrupted download
Use the “-C” option. Specifically, if we want cURL to automatically determine where to resume the transfer, use “-C -”.
curl -C - -O https://sample.tgz
This can also rescue a file transfer that failed under wget. Continue reading
This post records some handy uses of CDO, to save myself frequent Googling…
1. GRIB1 file, from reduced Gaussian grid to regular Gaussian grid:
cdo setgridtype,regular <infile> <outfile>
This is mainly used to pre-process ECMWF data. For an explanation of the reduced Gaussian grid, see ref. Specifically, this resolves the following warning:
Warning (cdfDefRgrid) : Creating a netCDF file with data on a gaussian reduced grid.
Warning (cdfDefRgrid) : The further processing of the resulting file is unsupported!
2. Delete selected timesteps from a NetCDF file
cdo delete,timestep=1,10,20 <infile> <outfile>
Note that the timestep counter starts from 1.
[Figure: Reduced Gaussian grid.]
This post is a record of how to process the HadGEM2-CC data for WRF downscaling. Unlike GFDL-ESM2M, the atmospheric results in this dataset are on pressure levels, which saves a lot of trouble. But the data available on the portal are masked out with terrain, so some interpolation is required to avoid WRF errors during initialization (e.g., unreasonably large RH values produced by interpolation).
After a long time of working with reanalysis data, I finally came to running WRF with CMIP5 data. Unfortunately, there are not many resources available online (though there have been many publications based on WRF downscaling of CMIP5). The most useful one is the instructions at the CORDEX experiment site, which are written for MIROC5 model output. It is worth pointing out that there is one bug in those instructions (in step 2.4, we should not remove the ps variable at that step). For my case, I need to drive WRF with CESM4, HadGEM2-CC and GFDL-ESM2M data.
CESM4 has been bias-corrected and prepared for WRF by NCAR as ds316.1, which saves a lot of time. Steps for HadGEM2-CC are covered in part (2) of this post. GFDL-ESM2M happens not to be fully compliant with the CF-1.4 convention, so I ran into some extra trouble. Luckily, I was able to solve it all, and here is a record of the steps needed to digest the GFDL-ESM2M data. Continue reading
Here is a list of the “strange” demands I have had, and their solutions, for my future reference. Continue reading
I have started my first “big” project in Python: I plan to develop a spatio-temporal analysis toolbox that handles the data often used in geophysics and atmospheric sciences.
The first step is to lay out the overall structure of the package. So far, I have worked through several obstacles, and I am putting them here to help myself (and probably others).
- About importing a folder as a package
In the folder's “__init__.py” file, put a line such as “import SAL”.
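The note above can be sketched as a tiny runnable example. This builds a throwaway package on disk to show how an “__init__.py” makes a folder importable; the package name “SAL” comes from the post, while the submodule name “stats” and the function inside it are hypothetical, just for illustration.

```python
import os
import sys
import tempfile

# Build a throwaway package folder named "SAL" (name taken from the post).
root = tempfile.mkdtemp()
pkg = os.path.join(root, "SAL")
os.makedirs(pkg)

# A hypothetical submodule with one function.
with open(os.path.join(pkg, "stats.py"), "w") as f:
    f.write("def anomaly(x, mean):\n    return x - mean\n")

# The "__init__.py" makes the folder a package; re-exporting the submodule
# here lets a plain "import SAL" bring SAL.stats along with it.
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from . import stats\n")

sys.path.insert(0, root)
import SAL

print(SAL.stats.anomaly(25.0, 20.0))  # the submodule is reachable via the package
```

Without the “from . import stats” line in “__init__.py”, “import SAL” would succeed but “SAL.stats” would not be defined until someone explicitly ran “import SAL.stats”.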
I have been running WRF for over a year now, and I have come across various problems. For some of them, I found solutions online; for the others, I had to fix things myself. Fortunately, none of these problems required digging into the model code.
Display Name : YoukuMediaCenter
Path : C:\Users\xxxxx\AppData\Roaming\ytmediacenter\YoukuMediaCenter.exe iku://|start|