The compression issue (i.e., lack thereof in certain cases) that I noted in 4.3.2 is still present in 4.3.3-rc3. We have confirmed that 4.3.1 works as expected with WRF model output, but 4.3.2 results in larger files that are apparently only partially compressed. ncdump -s gives the same info for both, indicating that the variables should be compressed. I suspect an issue with 4-D variables where the time dimension is unlimited, but I haven't managed to put together a simple test case that demonstrates the problem. (For our local cloud model, COMMAS, the file sizes are the same as CDF-1, i.e., the compression doesn't happen at all despite ncdump -s showing the compression attributes.)

Any ideas? I compiled 4.3.3-rc3 against hdf5-1.8.13, and all tests used netcdf-fortran-4.4.0.

-- Ted

__________________________________________________________
| Edward Mansell <ted.mansell@xxxxxxxx>
| National Severe Storms Laboratory
|--------------------------------------------------------------
| "The contents of this message are mine personally and
|  do not reflect any position of the U.S. Government or NOAA."
|--------------------------------------------------------------

On 1/14/2015 11:09 PM, Ward Fisher wrote:
> Hello all,
>
> The third netCDF 4.3.3 release candidate is out, after a bit of a wait.
> There have been a number of bug-fixes and improvements made since the
> last release. Assuming everything goes well with this release candidate,
> we hope to have a full release in the near future.
>
> The release and associated release notes may be found at:
>
> * https://github.com/Unidata/netcdf-c/releases/tag/v4.3.3-rc3
>
> -Ward
>
> Ward Fisher
> UCAR/Unidata - Software Engineer
> wfisher@xxxxxxxxxxxxxxxx <mailto:wfisher@xxxxxxxxxxxxxxxx>
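[Editor's note: below is a minimal sketch of the kind of standalone test case described above, not the reporter's code. It creates a netCDF-4 file containing a 4-D float variable with an unlimited time dimension and requests zlib compression through nc_def_var_deflate. The file name, dimension sizes, deflate level, and number of records are arbitrary assumptions chosen for illustration. Building it against 4.3.1 and against 4.3.2/4.3.3-rc3 and comparing the resulting file sizes (and the _DeflateLevel/_ChunkSizes attributes that "ncdump -s -h" reports) would show whether deflate is actually being applied in each case.]

    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    /* Abort with the library's error string if a netCDF call fails. */
    #define CHECK(e) do { int _s = (e); if (_s != NC_NOERR) { \
        fprintf(stderr, "netCDF error: %s\n", nc_strerror(_s)); exit(1); } } while (0)

    int main(void)
    {
        int ncid, dimids[4], varid;
        size_t start[4] = {0, 0, 0, 0};
        size_t count[4] = {1, 10, 50, 50};
        static float data[10 * 50 * 50];

        /* Constant data, so deflate should shrink the file substantially
           if compression is really happening. */
        for (size_t i = 0; i < 10 * 50 * 50; i++)
            data[i] = 1.0f;

        CHECK(nc_create("deflate_test.nc", NC_NETCDF4 | NC_CLOBBER, &ncid));

        /* 4-D variable: unlimited time dimension plus three fixed dimensions. */
        CHECK(nc_def_dim(ncid, "time", NC_UNLIMITED, &dimids[0]));
        CHECK(nc_def_dim(ncid, "z", 10, &dimids[1]));
        CHECK(nc_def_dim(ncid, "y", 50, &dimids[2]));
        CHECK(nc_def_dim(ncid, "x", 50, &dimids[3]));

        CHECK(nc_def_var(ncid, "field", NC_FLOAT, 4, dimids, &varid));
        /* Request deflate level 2, no shuffle (an arbitrary choice here). */
        CHECK(nc_def_var_deflate(ncid, varid, 0, 1, 2));
        CHECK(nc_enddef(ncid));

        /* Write several records along the unlimited dimension. */
        for (size_t t = 0; t < 5; t++) {
            start[0] = t;
            CHECK(nc_put_vara_float(ncid, varid, start, count, data));
        }

        CHECK(nc_close(ncid));
        printf("Wrote deflate_test.nc; compare its size across library versions.\n");
        return 0;
    }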