I want to store values in NetCDF as climatological statistics, and I haven't found a good example or explanation of how this is best done for my use case. For each lon/lat point in a grid I compute distribution-fitting values for each of the twelve months, and I want to store these values for later use when running various calculations on data sets that share the same grid. So for each lon/lat point in the grid there are twelve values, one per month.

The time values therefore don't adhere to the normal conventions, where specific calendar dates are applicable, and it's not clear from any of the documentation I've seen how best to represent generic month values as time coordinate axis values. For example, how do you represent "January" using valid udunits in the time dimension's/variable's units attribute? The data values I'm storing do have a pair of associated dates: they're computed from data beginning at one date and ending at another (i.e. what we refer to as the "calibration period"). Maybe I just use the calibration period start date within the udunits string, such as "days since <calibration_start_date>"?

For my purposes I can easily work with ordinal values (i.e. 0 = January, 1 = February, etc.), but in order to conveniently use the data sets within my Java code as GridDatasets it appears that the time coordinate variable needs to adhere to a conventional representation of time values with standard udunits, and it's not clear what (if any) best practice exists for this.

Can anyone advise as to how this is best represented in NetCDF for my use case? Thanks in advance for any suggestions or insight.

--James
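[Not part of the original post, but for concreteness: the CF conventions define a "climatological statistics" mechanism for exactly this situation — a `time:climatology` attribute naming a bounds variable whose intervals span the calibration period, with the `time` values themselves holding representative dates (e.g. mid-month) in ordinary "days since" udunits. A sketch in CDL, using a hypothetical 1971–2000 calibration period and a made-up variable name `fit_param` as placeholders:]

```
dimensions:
    time = 12 ;   // one entry per climatological month
    nv = 2 ;      // interval endpoints for the bounds variable
    lat = 180 ;
    lon = 360 ;
variables:
    double time(time) ;
        time:units = "days since 1971-01-01" ;
        time:climatology = "climatology_bounds" ;
    double climatology_bounds(time, nv) ;
    float fit_param(time, lat, lon) ;
        // cell_methods would describe the actual statistic computed;
        // "mean within years ... mean over years" is the CF example
        // for a multi-year monthly climatology
        fit_param:cell_methods = "time: mean within years time: mean over years" ;

// time(0) would hold a representative mid-January offset (e.g. day 14),
// and climatology_bounds(0,:) would span 1971-01-01 through the end of
// January in the final calibration year, expressed in the same units.
```

[This keeps the time coordinate in standard udunits (so tools that expect a conventional time axis can still parse it) while the climatology bounds carry the "January of every year from start to end of the calibration period" meaning.]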