
Introduction

I have never successfully prepared the meteo data for this model. I received meteo input files from Bin and failed to produce new input files when I tried to use his workflow. Below are some general notes about what the model expects and the troubles I had. Sorry not to be of more help!

Hourly NLDAS-2 is used for the meteorological forcing data, but you could use any hourly gridded dataset. For the Sierra and Upper Colorado we used NLDAS_FORA0125_H.002. If you use different forcing data, make sure its assumptions are similar, because a lot of downscaling occurs inside the model; for example, the measurement height of the wind speed, or whether the incoming shortwave is defined at the top of the canopy.

The variables (and filenames) you need are:

  • incoming longwave radiation (dlwrf.dat)
  • incoming shortwave radiation (goes.dat)
  • precipitation (precip.dat)
  • surface pressure (ps.dat)
  • near surface air temperature (sat.dat)
  • near surface specific humidity (spehum.dat)
  • windspeed (windspeed.dat)
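
Before running anything, it's worth confirming that all seven files are present. Here is a minimal R sketch, assuming the forcings sit together in one directory (forcing_dir is a placeholder name, not something the model defines):

# Hypothetical helper: check that every expected forcing file exists.
check_forcings = function(forcing_dir){
  files = c('dlwrf.dat', 'goes.dat', 'precip.dat', 'ps.dat',
            'sat.dat', 'spehum.dat', 'windspeed.dat')
  missing = files[!file.exists(file.path(forcing_dir, files))]
  if (length(missing) > 0)
    stop('missing forcing files: ', paste(missing, collapse = ', '))
  invisible(TRUE)
}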

Details

Bin used a climate analysis package called GrADS, so that's what the workflow is based around. It's not particularly intuitive, but it does handle large GRIB data well. I found it easy to install on a Mac with Homebrew. Bin uses Linux, which led to some differences between our setups.

Bin and I had an email exchange in which I tried to get this working on my computer. I ran into several issues that I never resolved, and Bin wasn't sure how serious they were; they stemmed from version and OS differences between our computers. In the end I never created new input data to run the reconstruction (except forest canopy density and fsca); I only ran the reconstruction with Bin's existing meteo data and with different fsca. Since I never got this workflow to work, all I can pass on is in the email thread. In the thread you will see:

  1. get.sh to download the GRIB files for the forcings (a rough R sketch of this step follows below).
  2. a link to install some GRIB utilities (http://www.cpc.ncep.noaa.gov/products/wesley/grib2ctl.html), which involves installing wgrib and running a Perl script.
  3. some GrADS scripts to convert the downloaded data to the binary files needed for the reconstruction; you'll see the filename for each variable in his scripts.

Bin hosts his library of GrADS scripts at http://bguan.bol.ucla.edu/bGASL.html, but the files attached to his emails can be found by date in the snodis repo.
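
For what it's worth, here is a rough R sketch of what get.sh in step 1 does for a single file. The URL pattern is my assumption based on the GES DISC archive layout, not something taken from get.sh, and downloading requires Earthdata credentials (e.g., a ~/.netrc entry, which the -n flag tells curl to use):

# Hypothetical sketch of downloading one hourly NLDAS-2 GRIB file.
# The URL pattern is an assumption -- verify it against GES DISC.
get_nldas2 = function(date, hour, dest_dir = '.'){
  ymd = format(date, '%Y%m%d')
  yr  = format(date, '%Y')
  doy = format(date, '%j')   # zero-padded day of year
  fn  = sprintf('NLDAS_FORA0125_H.A%s.%02d00.002.grb', ymd, hour)
  url = sprintf('https://hydro1.gesdisc.eosdis.nasa.gov/data/NLDAS/NLDAS_FORA0125_H.002/%s/%s/%s',
                yr, doy, fn)
  # '-n' reads ~/.netrc; '-L' follows the Earthdata login redirect
  download.file(url, file.path(dest_dir, fn), method = 'curl',
                extra = '-n -L -c cookies -b cookies', mode = 'wb')
}
# e.g. get_nldas2(as.Date('2003-03-15'), hour = 12)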

Suggestions

I would recommend a workflow that uses more easily accessible utilities and file formats so that future generations have it easier. NASA's Earthdata portal (linked above) is much improved from when I was going through this: you can now subset the NLDAS-2 files in space and time and download NetCDF files, which I think you will find easier to work with. Based on my recollection, and on quickly rereading the aboutgriddeddata page linked above, you'll need to produce a flat binary file where each raster grid is written row-wise from the bottom-left corner to the top-right, and each XY grid is stacked in time. You can also look at tif2dat.m in prepare_fsca for converting a stack of GeoTIFFs to the correct .dat format. Below is an R function to read the files:

readgrads = function(t, fid, nc = 1950, nr = 2580){ # 1950 columns, 2580 rows
  require(raster)
  numel = nc*nr                # values per time step
  dtype = 4                    # bytes per value (single-precision float)
  tindex = (t - 1)*numel + 1   # 1-based index of the first value for time step t
  seek(fid, where = dtype*(tindex - 1), origin = "start") ## seek is 0 index based!
  num = readBin(fid, size = dtype, n = numel, what = 'numeric')
  maty = matrix(num, byrow = TRUE, ncol = nc) # first row read is the bottom row of the grid
  mat = maty[nrow(maty):1, ]                  # flip so the top row comes first
  r = raster(mat)
  projection(r) = '+init=epsg:4326'
  extent(r) = c(-112.25, -104.125, 33, 43.75) # these are the lat/long coordinates for the Upper Colorado domain
  return(r)
}
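
readgrads expects an already-open binary connection, so usage looks like:

fid = file('sat.dat', 'rb')  # any of the forcing .dat files
r1 = readgrads(1, fid)       # raster for the first hourly time step
close(fid)

Going the other direction, here is a minimal sketch of a writer that produces the layout described above: each grid written row-wise from the bottom-left to the top-right, with time steps appended one after another. The name writegrads and the use of a RasterStack are my choices for illustration, not part of Bin's workflow:

# Hypothetical inverse of readgrads: append each layer of a RasterStack
# to a flat binary .dat file, bottom-left to top-right, 4-byte floats.
writegrads = function(s, outfile){
  require(raster)
  fid = file(outfile, 'wb')
  on.exit(close(fid))
  for (i in 1:nlayers(s)){
    mat = as.matrix(s[[i]])   # raster returns the top row first
    mat = mat[nrow(mat):1, ]  # flip so the bottom row is written first
    writeBin(as.numeric(t(mat)), fid, size = 4) # row-wise within each grid
  }
}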