Run RUN
The Folder structure
/path/to/Install/folder
├── [...stuff...]
├── Domains
├── WRF
│ ├── [...stuff...]
│ └── run
├── WPS
├── dataGFS
└── RUN
Our wrapper for the WRF model comprises several steps:
- Edit input: time and domain to calculate
- Download GFS data
- Run WRF
  - geogrid
  - ungrib
  - metgrid
  - real
  - wrf
- Post-Processing
All the parameters for each run are controlled from $FOLDER/RUN/config.ini
There are a lot of obsolete garbage entries in this file, which I'll clean up eventually. For now we only need to pay attention to these variables:
start_date = dd/mm/YYYY-HH:MM   # Local time
end_date = dd/mm/YYYY-HH:MM     # Local time
daily_hours = 8,20              # Mask of hours to calculate, regardless of start_date/end_date
                                # (useful, for instance, to calculate only during daylight hours)
FOLDER = ~/METEO                # Root folder of the WRF, WPS and RUN installation
DOMAIN = mydomain               # Name of the domain
domain_folder = ${FOLDER}/Domains/mydomain
domains = 1,2                   # List of nested domains to calculate; may be ignored for now, but set it just in case
GFS_data_folder = ${FOLDER}/dataGFS
output_folder = /storage/WRFOUT/${DOMAIN}
- It is important that the folders do not end in / (maybe I've fixed this already?)
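For reference, here is a minimal sketch of how these keys could be read with Python's standard configparser. It is only an illustration, not the wrapper's actual code; the section name "run" and the parsing details are assumptions.

from configparser import ConfigParser, ExtendedInterpolation
from datetime import datetime

# Read config.ini; ExtendedInterpolation resolves ${FOLDER}-style references,
# and inline '#' comments are allowed.
cfg = ConfigParser(interpolation=ExtendedInterpolation(),
                   inline_comment_prefixes=('#',))
cfg.read('config.ini')

sec = cfg['run']   # hypothetical section name, adapt to the real file
start_date  = datetime.strptime(sec['start_date'], '%d/%m/%Y-%H:%M')
end_date    = datetime.strptime(sec['end_date'],   '%d/%m/%Y-%H:%M')
daily_hours = [int(h) for h in sec['daily_hours'].split(',')]
domains     = [int(d) for d in sec['domains'].split(',')]
output_folder = sec['output_folder'].rstrip('/')   # guard against a trailing '/'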
There are (at least) 3 ways to access the GFS data
- FTP
- HTTP
- HTTP restricted to a subregion
In every case the folder structure on the NOAA server is the same:
{BASE}/pub/data/nccf/com/gfs/prod/gfs.YYYYmmdd/HH/atmos/gfs.tHHz.pgrb2b.0p25.fTTT
where YYYYmmdd is the date of the GFS calculation we want to access, HH is the hour of that calculation, and TTT is the number of hours between the GFS calculation and the forecast time we want to compute.
Example: Imagine that, on April 12th 2021 at 13:00 UTC, we want to make a forecast for April 13th 2021 at 9:00 UTC. In this case, the latest GFS data would have been calculated at 12:00 UTC, so the folder on the server should be:
/pub/data/nccf/com/gfs/prod/gfs.20210412/12/atmos/
and the corresponding data file: gfs.t12z.pgrb2b.0p25.f021
since there are 21 hours between the GFS calculation time (12/04/2021-12:00) and the forecast time (13/04/2021-09:00).
In practice there are some publication delays, so these times may not be exact, but this illustrates the idea.
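The arithmetic above is easy to reproduce. Here is a small sketch (the function name is illustrative, not part of the repository) that finds the most recent GFS cycle and the forecast-hour offset TTT:

from datetime import datetime

def gfs_cycle_and_offset(forecast_utc, now_utc):
    # Last synoptic GFS cycle (00, 06, 12 or 18 UTC) not later than 'now_utc'
    cycle = now_utc.replace(hour=(now_utc.hour // 6) * 6,
                            minute=0, second=0, microsecond=0)
    # TTT: hours from the cycle to the forecast we want
    ttt = int((forecast_utc - cycle).total_seconds() // 3600)
    return cycle, ttt

cycle, ttt = gfs_cycle_and_offset(datetime(2021, 4, 13, 9),   # forecast time
                                  datetime(2021, 4, 12, 13))  # launch time
print(cycle.strftime('gfs.%Y%m%d/%H/atmos/'),
      f'gfs.t{cycle:%H}z.pgrb2b.0p25.f{ttt:03d}')
# -> gfs.20210412/12/atmos/ gfs.t12z.pgrb2b.0p25.f021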
The code in download_gfs_data.py will look for the latest batch containing all the data requested via the start_date and end_date variables in the config.ini file. If the data are requested before the GFS files are available, the program will fall back to the previous batch.
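As an illustration of that fallback idea, here is a hedged sketch that probes the server and steps back one 6-hour cycle when the requested file is not yet available. The base URL and helper names are assumptions, not the actual download_gfs_data.py code.

from datetime import timedelta
import urllib.error
import urllib.request

BASE = 'https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod'   # assumed server

def gfs_url(cycle, ttt):
    return (f'{BASE}/gfs.{cycle:%Y%m%d}/{cycle:%H}/atmos/'
            f'gfs.t{cycle:%H}z.pgrb2b.0p25.f{ttt:03d}')

def is_available(url):
    # HEAD request: we only care whether the file exists, not its content
    try:
        with urllib.request.urlopen(urllib.request.Request(url, method='HEAD'),
                                    timeout=30):
            return True
    except urllib.error.URLError:
        return False

def choose_cycle(cycle, last_forecast_utc):
    # If the newest cycle does not yet cover the whole requested period,
    # step back to the previous cycle, 6 hours earlier.
    while True:
        ttt = int((last_forecast_utc - cycle).total_seconds() // 3600)
        if is_available(gfs_url(cycle, ttt)):
            return cycle
        cycle -= timedelta(hours=6)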
Black-box dark magic divided into 5 smaller black boxes, all of them full of dark magic:
- geogrid
- ungrib
- metgrid
- real
- wrf
The explanations can be found here: https://www2.mmm.ucar.edu/wrf/users/docs/user_guide_v4/v4.0/contents.html
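As a rough picture of what the wrapper does at this stage, here is a minimal sketch that runs the five standard executables in order. This is not the wrapper's actual code; the install path and the absence of mpirun are simplifying assumptions.

import subprocess

FOLDER = '/path/to/Install/folder'   # the ${FOLDER} root from config.ini
steps = [
    (f'{FOLDER}/WPS',     './geogrid.exe'),
    (f'{FOLDER}/WPS',     './ungrib.exe'),
    (f'{FOLDER}/WPS',     './metgrid.exe'),
    (f'{FOLDER}/WRF/run', './real.exe'),
    (f'{FOLDER}/WRF/run', './wrf.exe'),
]
for workdir, exe in steps:
    # Each executable reads its own namelist in 'workdir'; stop at the first failure
    subprocess.run(exe, cwd=workdir, check=True)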
For now everything is messy and chaotic; I hope that in the coming "days" I'll clean up, expand and fix the code.
At this point post_process.py will read the data from a wrfout file and generate a number of images, one for each of the properties we study.
myuser@mycomputer:~$ python post_process.py /path/to/WRFOUTs/wrfout_dXX_YYYY-MM-DD_HH:MM:SS
These images have no margins and they are all created with the same geographical extent, so they can be stacked on top of one another as desired.
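A hedged sketch of that idea (the choice of field, 2-m temperature T2, and all plotting details are illustrative, not the actual post_process.py):

import sys
import matplotlib.pyplot as plt
from netCDF4 import Dataset

wrfout = sys.argv[1]                  # e.g. wrfout_d02_2021-04-13_09:00:00
nc  = Dataset(wrfout)
lon = nc.variables['XLONG'][0]        # 2-D longitudes of the domain
lat = nc.variables['XLAT'][0]         # 2-D latitudes
t2  = nc.variables['T2'][0] - 273.15  # 2-m temperature, Kelvin -> Celsius

fig = plt.figure(figsize=(8, 8))
ax  = fig.add_axes([0, 0, 1, 1])      # axes fill the whole figure: no margins
ax.set_axis_off()
ax.pcolormesh(lon, lat, t2, shading='auto', cmap='viridis')
ax.set_xlim(lon.min(), lon.max())     # fixed geographical extent, the same
ax.set_ylim(lat.min(), lat.max())     # for every property that is plotted
fig.savefig('t2.png', dpi=150, transparent=True)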
UPDATE: check out the code in https://github.com/B4dWo1f/RUNplots