
Re: Compressed netCDF library available

John -- Thanks for looking at the libs.  You commented:

> My only comment is that when I was thinking about this same problem a few
> years back, I realized that the amount of compression you can get is very
> data type dependent.  In particular, if you require your data to be stored
> as floating point, many datasets won't compress at all, because the low
> order bits of the mantissa are essentially random, unless there is some
> repeating pattern like a constant field or missing data.  (I was working
> with gridded model output.)
> 
> It might be worth collecting compression ratios from your library on various
> data files and types, so that people get a sense of what to expect for the
> different cases.

You are right that the amount of compression will be data dependent and
not necessarily reflect the information content.  I would be willing to
compile any compression ratios that people want to send me.  You can
use the nczip program to make a compressed netCDF file from any normal
netCDF files you have around.  This doesn't require installing the znetcdf
library or changing your programs.  Send me the compression ratios and
a description of the type of data (or a CDL).
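If you want a quick number to report, the ratio can be estimated by general-purpose
compression of the raw file.  A minimal sketch in Python, assuming zlib as the
compressor (nczip's own scheme and settings may differ; the file name below is
hypothetical):

```python
import zlib

def compression_ratio(path, level=9):
    """Return original_size / compressed_size for a file, using zlib."""
    with open(path, "rb") as f:
        data = f.read()
    compressed = zlib.compress(data, level)
    return len(data) / len(compressed)

# Hypothetical usage:
# print(f"ratio: {compression_ratio('mydata.nc'):.2f}")
```

A ratio near 1.0 means the data is essentially incompressible (as with noisy
floating point fields); larger values mean more redundancy.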

One way to try to improve the compression ratio without changing the netCDF
structure is to round floating point numbers to a reasonable precision, e.g.

    new_val = round(val * 1000.0) / 1000.0

This would be the same as using a scale and offset in the netcdf but wouldn't
require the reading software to understand the particular scheme used.
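The effect of this rounding on compressibility can be sketched as follows.
The Gaussian sample data and the use of zlib on packed 32-bit floats are
assumptions for illustration only; real gains will depend on the data:

```python
import random
import struct
import zlib

def quantize(values, precision=1000.0):
    """Round each value to the nearest 1/precision, i.e.
    new_val = round(val * precision) / precision."""
    return [round(v * precision) / precision for v in values]

def packed_size(values):
    """Bytes needed after packing as 32-bit floats and zlib-compressing."""
    packed = struct.pack(f"{len(values)}f", *values)
    return len(zlib.compress(packed, 9))

# Noisy data: the low-order mantissa bits are essentially random.
random.seed(42)
raw = [20.0 + random.gauss(0.0, 1.0) for _ in range(10000)]

before = packed_size(raw)
after = packed_size(quantize(raw))
# Quantizing collapses many nearby values to identical bit patterns,
# so the rounded data should compress noticeably better (after < before).
```

The precision argument plays the same role as a scale factor would in a
scale/offset scheme, but the stored values remain plain floats that any
reader can use directly.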

--Bill Noon
Northeast Regional Climate Center
Cornell University
