eric@xxxxxxxxxxxxxxxxxxx (Eric Bachalo) writes:

>When we encounter a low memory situation (malloc's may return NULL) we
>find that the NetCDF code consistently fails in the following files.

[stuff deleted...]

>| 0 NC_free_var(var = 0xffffffff) from array.c#348
>| 1 NC_free_array(array = 0x003ee498) from cdf.c#22
>| 2 NC_free_xcdf(handle = 0x003ee1d0) from cdf.c#35
>| 3 NC_free_cdf(handle = 0x003ee1d0) from cdf.c#84
>| 4 NC_new_cdf(name = "c:\testdata\f50.nc", mode = 1) from file.c#100
>| 5 NC_open(path = "c:\testdata\f50.nc", mode = 1) from file.c#145
>| 6 ncopen(path = "c:\testdata\f50.nc", mode = 1) from dsdata.c#3349
>| 7 RecordOpen(filename = "f50.nc", varName = "V10", path = "c:\testdata") fr

[stuff deleted...]

[excerpt from function NC_new_cdf in cdf.c]

> 80:   if(cdf->xdrs->x_op == XDR_DECODE) /* Not NC_CREAT */
> 81:   {
> 82:           if(!xdr_cdf(cdf->xdrs, &cdf) )
> 83:           {
> 84:                   NC_free_cdf(cdf) ;
> 85:                   return(NULL) ;
> 86:           }

I've run into this problem before (on SGIs running Irix 4.0.5). I was
planning on trying to characterize it properly before reporting it, but
since it has come up...

The problem that I've encountered (with the same or a similar traceback
to the one above) arises when the file is truncated so that the header is
not completely present (data truncation doesn't cause problems). In this
case, xdr_cdf returns an error and NC_free_cdf is called. Unfortunately,
the cdf structure has not been completely initialized (either the cdf
structure is left inconsistent by xdr_cdf, or NC_free_cdf is not checking
it properly - I haven't had time to figure out which is at fault), so we
can get a core dump (from the use of uninitialized pointers) IF ncopts
DOES NOT HAVE NC_FATAL SET. If NC_FATAL is set, then the program exits
before trying to clean up and no problem occurs.

To repeat: I do not believe that this is a problem with malloc checking;
I think it is a problem with the use of uninitialized pointers in the cdf
structure.
If I am correct, then your file "c:\testdata" should be truncated -
running ncdump on it should give the message "ncopen: xdr_NC_array: loop"
or something similar, depending on where it's chopped (of course, it could
also be some DOS weirdness that causes xdr_cdf to return an error).

This bug has forced me into some weird code: I sometimes want to
automatically decompress only the header of a very large file, stopping
the decompression when ncopen stops returning an error. To catch the error
without core dumping, I have to fork (not particularly amenable to your
DOS situation), set ncopts = NC_FATAL before trying ncopen, and exit! So I
will appreciate any fix that appears.

To the NetCDF support folks: if you would like any more information than
I've given (code, files, whatever), let me know - I'm more than happy to
provide it.

--
Peter Neelin (neelin@xxxxxxxxxxxxxxxxx)
Positron Imaging Laboratories, Montreal Neurological Institute