Re: Performance problem with large files

hinsen@xxxxxxxxxxxxxxxxxxxxx writes:

 > ... The data in the files is essentially
 > one single-precision float array of dimensions 8000 x 3 x 16000, the
 > last dimension being declared as "unlimited". I read and write
 > subarrays of shape 1 x 3 x 16000. ...

For simplicity, call the unlimited dimension t. A netCDF file stores all the
data for t=1, then all the data for t=2, and so on, so each record here holds
an 8000 x 3 block of floats (96000 bytes), and the 16000 records together span
roughly 1.5 GB. With your access pattern, each 1 x 3 x 16000 subarray picks up
just three values from every one of those 16000 records, so it is scattered
through the entire file and touches almost every file block. Performance
should be much better if you read and write subarrays of shape 8000 x 3 x 1
(one whole record at a time), or, if you can't do that, if you rearrange the
file so that the 8000 dimension is the unlimited one rather than the 16000
dimension. A sketch of the record-at-a-time pattern follows below.

Martin Dix 

CSIRO Atmospheric Research                Phone: +61 3 9239 4533
Private Bag No. 1, Aspendale                Fax: +61 3 9239 4444
Victoria 3195  Australia                  Email: martin.dix@xxxxxxxxxxxx

