Dear all,

I want to use the nccopy utility to change the chunking of a large 2D dataset (first dimension: time ("snapshots"), second: point index ("gllpoints_all")). The original file has the structure:

$ ncdump -sch original.nc
netcdf original {
dimensions:
        snapshots = 820 ;
        gllpoints_all = 249984 ;
variables:

// global attributes:
                :npoints = 249984 ;

group: Snapshots {
  variables:
        float strain_dsus(snapshots, gllpoints_all) ;
                strain_dsus:_Storage = "chunked" ;
                strain_dsus:_ChunkSizes = 1, 249984 ;
  } // group Snapshots

For further processing of the file, I want to change the chunks so that each one contains all the time steps at a single point. I do this with

$ nccopy -c "snapshots/820,gllpoints_all/1" original.nc new.nc

However, the resulting chunk sizes are unexpected: {55, 17856} instead of {820, 1}:

$ ncdump -sch new.nc
netcdf axisem_output_3 {
dimensions:
        snapshots = 820 ;
        gllpoints_all = 249984 ;

// global attributes:
                :npoints = 249984 ;

group: Snapshots {
  variables:
        float strain_dsus(snapshots, gllpoints_all) ;
                strain_dsus:_Storage = "chunked" ;
                strain_dsus:_ChunkSizes = 55, 17856 ;
  } // group Snapshots

Is there an obvious mistake on my side, or might there be a problem with variables in groups?

I am using netCDF library version 4.3.1.1 of Feb 26 2014 12:06:45.

cheers,
Simon Stähler

--
Simon Stähler, PhD student
Geophysik, Ludwig-Maximilians-Universität München
Theresienstrasse 41, 80333 München, Germany
Tel.: +49 (0)89 2180 4143
www.geophysik.uni-muenchen.de/members/staehler
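For reference, the intended {820, 1} layout can also be written directly through the netCDF C API with nc_def_var_chunking, which sidesteps nccopy's chunk-size heuristics. The sketch below is only illustrative: it assumes the dimension sizes and group/variable names from the ncdump output above, and the snapshot-by-snapshot copy loop is kept deliberately simple rather than tuned for the chunk cache.

  /* Minimal sketch: create a copy of original.nc with {820, 1} chunking
   * via the netCDF C API.  Dimension sizes and the group/variable names
   * are taken from the ncdump output above; the copy loop is simple,
   * not efficient. */
  #include <stdio.h>
  #include <stdlib.h>
  #include <netcdf.h>

  #define CHECK(e) do { int s = (e); if (s != NC_NOERR) { \
      fprintf(stderr, "netCDF error: %s\n", nc_strerror(s)); exit(1); } } while (0)

  int main(void)
  {
      int in_id, out_id, in_grp, out_grp, in_var, out_var;
      int dimids[2];
      size_t chunks[2] = {820, 1};       /* all snapshots, one point per chunk */
      size_t nsnap = 820, npts = 249984;

      CHECK(nc_open("original.nc", NC_NOWRITE, &in_id));
      CHECK(nc_create("new.nc", NC_NETCDF4 | NC_CLOBBER, &out_id));

      /* Recreate the root-group dimensions and the grouped variable,
       * setting the desired chunk sizes explicitly. */
      CHECK(nc_def_dim(out_id, "snapshots", nsnap, &dimids[0]));
      CHECK(nc_def_dim(out_id, "gllpoints_all", npts, &dimids[1]));
      CHECK(nc_def_grp(out_id, "Snapshots", &out_grp));
      CHECK(nc_def_var(out_grp, "strain_dsus", NC_FLOAT, 2, dimids, &out_var));
      CHECK(nc_def_var_chunking(out_grp, out_var, NC_CHUNKED, chunks));
      CHECK(nc_enddef(out_id));

      /* Copy the data one snapshot at a time (slow but simple). */
      CHECK(nc_inq_grp_ncid(in_id, "Snapshots", &in_grp));
      CHECK(nc_inq_varid(in_grp, "strain_dsus", &in_var));
      float *row = malloc(npts * sizeof *row);
      for (size_t t = 0; t < nsnap; t++) {
          size_t start[2] = {t, 0}, count[2] = {1, npts};
          CHECK(nc_get_vara_float(in_grp, in_var, start, count, row));
          CHECK(nc_put_vara_float(out_grp, out_var, start, count, row));
      }
      free(row);
      CHECK(nc_close(in_id));
      CHECK(nc_close(out_id));
      return 0;
  }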