Yes, I get that, but in this case that should only be about an order of magnitude difference in access speed, since he is only reading 8 values (hitting the disk 8 times instead of the 1 time it would take for a variable that is not unlimited). But he is describing a much larger difference in performance...

On Thu, Jun 2, 2016 at 5:01 PM, Dave Allured - NOAA Affiliate <dave.allured@xxxxxxxx> wrote:

> On Thu, Jun 2, 2016 at 2:41 PM, Ed Hartnett <edwardjameshartnett@xxxxxxxxx> wrote:
>
>> I don't think the fact that it is an unlimited dimension should make it
>> any slower, either in classic
>
> <snip>
>
> Ed, I have experienced this slow down many times with Netcdf-3 classic.
> As I said earlier, if the format is classic, then a coordinate variable on
> the unlimited dimension is scattered throughout the file. This is
> confirmed in the Users' Guide. At the low level, many disk blocks must be
> read to get the complete coordinate array.
>
> --Dave
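For anyone following along, here is a minimal sketch (not from the original thread) of the kind of read being discussed, using the netCDF C API. The file name "test.nc" and the variable name "time" are placeholders. The point is that in a classic-format file, a coordinate variable defined on the unlimited dimension is a record variable, so its values are interleaved one per record block; a single nc_get_var call on it becomes roughly one scattered disk access per record instead of one contiguous read.

    /* Sketch: read a coordinate variable on the unlimited dimension.
     * File name "test.nc" and variable/dimension name "time" are
     * placeholders for illustration only. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    int main(void)
    {
        int ncid, varid, dimid, status;
        size_t nrec;
        double *time_vals;

        if ((status = nc_open("test.nc", NC_NOWRITE, &ncid)) != NC_NOERR) {
            fprintf(stderr, "%s\n", nc_strerror(status));
            return 1;
        }

        /* Length of the record (unlimited) dimension, assumed here to be "time". */
        nc_inq_dimid(ncid, "time", &dimid);
        nc_inq_dimlen(ncid, dimid, &nrec);

        time_vals = malloc(nrec * sizeof(double));

        /* In classic format, each of the nrec coordinate values lives in a
         * different record block, so this one call touches ~nrec separate
         * spots in the file rather than a single contiguous region. */
        nc_inq_varid(ncid, "time", &varid);
        nc_get_var_double(ncid, varid, time_vals);

        printf("read %zu record coordinates\n", nrec);

        free(time_vals);
        nc_close(ncid);
        return 0;
    }

If the coordinate array really does need to be read repeatedly, one common workaround is to rewrite the file with the unlimited dimension made fixed (nccopy's -u option does this), so the coordinate variable is stored contiguously.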