Off-the-cuff, personal thought here... You raise a valid and important question: what software was used to create this dataset, and can we find a way to use it?
I suspect that, for long-term archives, we should have a metadata field identifying the code used to create the file. If that code isn't the netCDF library itself but, e.g., a model that reinterpreted a release of the netCDF code/libraries rather than calling the libraries, the metadata should identify the revision/release of the code it derives from.
This also implies that we keep copies of the different releases around and migrate them to new architectures periodically. That's a less daunting task than doing a format migration on a largish archive.
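As a rough sketch of what that provenance stamp could look like (not something from the original thread), here is how the global attributes might be written with the NetCDF-Java 4.x NetcdfFileWriter API; "source" and "history" are conventional attribute names, while the model name and the "netcdf_library_version" attribute are made up for the example:

import java.io.IOException;
import ucar.nc2.Attribute;
import ucar.nc2.NetcdfFileWriter;

public class ProvenanceStamp {
  public static void main(String[] args) throws IOException {
    // Hypothetical output file, just to show the attribute calls.
    NetcdfFileWriter writer =
        NetcdfFileWriter.createNew(NetcdfFileWriter.Version.netcdf3, "archive_sample.nc");

    // Conventional provenance attributes ("source", "history"), plus one
    // identifying the exact library release the writing code derives from.
    writer.addGroupAttribute(null, new Attribute("source",
        "ExampleModel v2.1 (hypothetical model name)"));
    writer.addGroupAttribute(null, new Attribute("history",
        "created by ExampleModel v2.1 on " + new java.util.Date()));
    writer.addGroupAttribute(null, new Attribute("netcdf_library_version",
        "netCDF-Java 4.x (record the actual release used)"));

    writer.create();   // writes the header; no variables are needed for this sketch
    writer.close();
  }
}

Standard attributes like "history" already get partway there; the extra step is recording the exact library or code release so the matching software can be pulled back out of the archive decades later.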
gerry

Dan Wilkinson wrote:
I am participating in the planning for a large future archive that will rely heavily on NetCDF, and this question came to mind: should some sort of NetCDF software time capsule be included in the metadata to ensure that the NetCDF data can be interpreted many decades hence? The prudent thing to do would be to perform a format migration every few years; however, I can see how very large volume archives might be neglected in that regard.

Thanks, Dan
--
Gerry Creager -- gerry.creager@xxxxxxxx
Texas Mesonet -- AATLT, Texas A&M University
Cell: 979.229.5301  Office: 979.862.3982  FAX: 979.862.3983
Office: 1700 Research Parkway Ste 160, TAMU, College Station, TX 77843