Hi Matt,

Your understanding of the quantization implementation is correct on all
counts. The actual number of bits per decimal digit is ln(10)/ln(2) ≈ 3.32.
Note also that the netCDF implementation defines NSB as the number of
_explicitly stored bits_, which is one less than the number of significant
bits, because the IEEE format implicitly defines the first significant bit
as 1. Thus NSB <= 23, not <= 24.

Charlie
--
Charlie Zender, Earth System Sci. & Computer Sci.
University of California, Irvine 949-891-2429 )'(
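[The arithmetic above can be sketched in a few lines of Python. The helper
name `nsb_for_digits` is illustrative only and not part of the netCDF API;
it just applies the ln(10)/ln(2) factor and subtracts the one implicit
IEEE bit.]

```python
import math

# Bits needed to carry one decimal digit of precision: ln(10)/ln(2) ~ 3.32
BITS_PER_DIGIT = math.log2(10)

def nsb_for_digits(nsd: int) -> int:
    """Explicitly stored bits (NSB) needed to preserve nsd significant
    decimal digits. IEEE 754 treats the leading significant bit as an
    implicit 1, so one fewer bit is actually stored.
    (Illustrative helper, not a netCDF library function.)"""
    significant_bits = math.ceil(nsd * BITS_PER_DIGIT)
    return significant_bits - 1  # subtract the implicit leading bit

print(round(BITS_PER_DIGIT, 2))   # ~3.32
# IEEE single precision: 24 significant bits, 23 stored explicitly,
# which corresponds to about 7 decimal digits:
print(nsb_for_digits(7))          # 23
```

This is also why NSB tops out at 23 for single precision: requesting all
24 significant bits still stores only 23 of them.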