> Here are a few examples of data requiring more than 32 bits of precision and
> the measurement time to reach the IEEE 64-bit double precision:

    (Fascinating table omitted)

Remember, however, that we're talking about accessing such datasets via a
computer.  A netCDF object that contained over 2^53 observations would require
quite a bit of storage (by my estimation, at least 10 petabytes (10^15
bytes)).  Since the largest extant mass storage systems are only in the
single-terabyte range (10^12 bytes), and since planned storage systems (e.g.
the Sequoia 2000 project) are only in the 100-terabyte range, I think we can
safely rule out (for the moment, anyway) a requirement for representing more
than 2^53 observations.

> I am a new user of netCDF.  I was attracted to it because it had no inherent
> properties dedicated to a particular discipline, like FITS, for example.
> I hope the developers keep this discipline-free attribute ranked high as they
> decide how to improve a useful system.

"Discipline-freedom" is one of our goals; ease and convenience are two
others.  User feedback is still another.  Please let us know if you feel we
have overlooked anything.

Steve Emmerson <steve@xxxxxxxxxxxxxxxx>
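A quick back-of-the-envelope check of the storage estimate above (a sketch; the one-byte-per-observation lower bound is my assumption, not from the original post):

```python
# Back-of-the-envelope check of the "at least 10 petabytes" estimate.
# Assumption (not from the original post): each observation occupies at
# least one byte, so 2**53 observations need at least 2**53 bytes.
observations = 2 ** 53
bytes_needed = observations * 1     # lower bound: 1 byte per observation
petabyte = 10 ** 15                 # the post's petabyte, 10^15 bytes

print(bytes_needed / petabyte)      # about 9 PB even at 1 byte/observation
```

The figure 2^53 is not arbitrary: an IEEE 754 double-precision value has a 53-bit significand, so it can represent every integer count exactly only up to 2^53; beyond that, an observation counter stored as a double would lose precision.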