Hi Ben,

> A couple of questions. Your suggestion of setting the chunking
> explicitly doesn't seem like it would help, as one first has to define the
> variable before you can set the chunk size, and we are crashing when
> defining it, unless I am missing something. We did notice that there are
> several code changes between 4.2.1 and 4.3.1.1 in
> nc4_find_default_chunksizes2 in nc4var.c, which is where we are crashing.
>
> In 4.2.1, if the dimension is unlimited, the chunksize is set to 1, and
> it looks like one would skip the code where we are crashing:
>
>   suggested_size = (pow((double)DEFAULT_CHUNK_SIZE/(num_values * type_size),
>                         1/(double)(var->ndims - num_set)) * var->dim[d]->len - .5);
>
> In 4.3.1.1 the setting of the unlimited dimension chunksize to 1 was
> removed. I'm guessing the code previous to 4.3.1.1 that set the
> chunksize to 1 for unlimited dimensions was saving us. We did notice
> that the latest version of nc4var.c on GitHub has extra code after the
> line in question to set chunksizes of 1-D record variables, as well as a
> few other changes, so I'm wondering if this is a bug fix?

I committed a change to make sure the statement above wouldn't divide by
zero if num_values is zero, but I'm not sure that was the cause of the
crash you encountered.

> Unfortunately we have not been able to reproduce this in a small
> example program, but has there been some change under the hood
> that we should be taking a look at?

Could you please capture the output of "ncdump -sh yourfile.nc" for the
file you were able to create using netCDF version 4.2.1, and send it to
support-netcdf@xxxxxxxxxxxxxxxx? With that, we might be able to
reproduce the problem using the ncgen utility. Thanks.

--Russ
netcdfgroup
archives: