On Tue, 12 Dec 1995, Russ Rew wrote:

> Hi Arlindo,
>
> > For the first time I experienced the following error from netCDF
> >
> >     ncvarput: xdr_NC_fill
> >     ncvarput: NCcoordck fill, var taux, rec 1175: File too large
>
> I can't reproduce this problem.  The limit on the size of netCDF files is
> supposed to be determined by the size of byte offsets for locations in the
> file.  These are currently signed 32-bit integers, so that should permit a
> 2 Gbyte netCDF file.

Any chance that this is just a process limit set by setrlimit?  Under irix
csh (I have no experience with Alphas) you can get the soft limits with the
"limit" command and the hard limits with "limit -h".  Just a shot in the
dark, but you never know...

----
Peter Neelin (neelin@xxxxxxxxxxxxxxxxx)
Positron Imaging Laboratories, Montreal Neurological Institute
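A minimal sketch of the check Peter suggests, assuming a POSIX getrlimit()
interface (this program is not part of the original thread): the per-process
file-size cap that setrlimit() controls is RLIMIT_FSIZE, which is the same
limit csh's "limit" and "limit -h" report, so printing it directly shows
whether the process limit, rather than netCDF's 32-bit offsets, is capping
the file.

    /* Sketch: print the soft and hard file-size limits for this process,
     * to rule out a setrlimit() cap before blaming the netCDF format. */
    #include <stdio.h>
    #include <sys/resource.h>

    int main(void)
    {
        struct rlimit rl;

        if (getrlimit(RLIMIT_FSIZE, &rl) != 0) {
            perror("getrlimit");
            return 1;
        }

        if (rl.rlim_cur == RLIM_INFINITY)
            printf("soft file-size limit: unlimited\n");
        else
            printf("soft file-size limit: %lu bytes\n",
                   (unsigned long) rl.rlim_cur);

        if (rl.rlim_max == RLIM_INFINITY)
            printf("hard file-size limit: unlimited\n");
        else
            printf("hard file-size limit: %lu bytes\n",
                   (unsigned long) rl.rlim_max);

        return 0;
    }

If the soft limit printed here is smaller than the file size at which
ncvarput fails, raising it (e.g. "unlimit filesize" in csh, up to the hard
limit) would be the thing to try before suspecting the library.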