I am using netcdf-4.1.1 and hdf5 1.8.4-patch1 on an AIX 5.3 system. I am opening several netcdf3-format files for reading and writing using the parallel netcdf4 nf90_open_par method. The program was crashing while trying to close one of the files opened for writing, with only a meaningless 'HDF error' message string. So I rebuilt netcdf with --enable-logging (could you consider enabling this feature via an environment variable instead of requiring a recompile?), and now the program crashes earlier. When I try to use ncdump to look at one of the files that was written, I get:

HDF5-DIAG: Error detected in HDF5 (1.8.4-patch1) thread 0:
  #000: H5F.c line 1514 in H5Fopen(): unable to open file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1309 in H5F_open(): unable to read superblock
    major: File accessability
    minor: Read failed
  #002: H5Fsuper.c line 322 in H5F_super_read(): unable to load superblock
    major: Object cache
    minor: Unable to protect metadata
  #003: H5AC.c line 1831 in H5AC_protect(): H5C_protect() failed.
    major: Object cache
    minor: Unable to protect metadata
  #004: H5C.c line 6160 in H5C_protect(): can't load entry
    major: Object cache
    minor: Unable to load metadata into cache
  #005: H5C.c line 10990 in H5C_load_entry(): unable to load entry
    major: Object cache
    minor: Unable to load metadata into cache
  #006: H5Fsuper_cache.c line 467 in H5F_sblock_load(): truncated file
    major: File accessability
    minor: File has been truncated

/contrib/netcdf-4.1.1-mpi/bin/ncdump: camrun.cam2.h0.0000-01-01-00000.nc: HDF error
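For reference, here is a minimal sketch of the kind of parallel open/close sequence described above, with explicit status checking so that a failing nf90_close reports the nf90_strerror text rather than being lost in a bare 'HDF error'. This is not the original program: the program name, the check helper, and the use of NF90_NOWRITE, MPI_COMM_WORLD, and MPI_INFO_NULL are illustrative assumptions; the nf90_open_par argument order (path, mode, comm, info, ncid) follows the netCDF-4.1.1 Fortran 90 API, and the file name is simply the one from the ncdump message above.

    program par_open_check
      use mpi
      use netcdf
      implicit none

      integer :: ierr, ncid, status

      call MPI_Init(ierr)

      ! Collective parallel open across all ranks in MPI_COMM_WORLD.
      ! (File name taken from the ncdump message above; purely illustrative.)
      status = nf90_open_par("camrun.cam2.h0.0000-01-01-00000.nc", NF90_NOWRITE, &
                             MPI_COMM_WORLD, MPI_INFO_NULL, ncid)
      call check(status, "nf90_open_par")

      ! ... parallel reads/writes would go here ...

      ! nf90_close is also collective; check it explicitly, since the close is
      ! where the failure described above occurred.
      status = nf90_close(ncid)
      call check(status, "nf90_close")

      call MPI_Finalize(ierr)

    contains

      ! Print the netCDF error string for a failed call, then abort.
      subroutine check(status, label)
        integer, intent(in) :: status
        character(len=*), intent(in) :: label
        integer :: aerr
        if (status /= nf90_noerr) then
          print *, trim(label), ": ", trim(nf90_strerror(status))
          call MPI_Abort(MPI_COMM_WORLD, 1, aerr)
        end if
      end subroutine check

    end program par_open_check

Checking every return value this way at least narrows down which netCDF call produced the truncated file before ncdump ever sees it.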