"Problems compiling netCDF3" (Nov 28, 9:01pm) References: <m0vTCce-0001T4C@xxxxxxxxxxxxxxx> On Nov 28, 9:01pm, Tim Weilkiens wrote: > ./nc_test -c > make[2]: *** [test.nc] Floating point exception > make[2]: *** Deleting file `test.nc' We have run into a similar problem on other systems with the tests. The test deliberatly tries to encode values which are "Out of Range" for a given external type, in order to verify that the appropiate error code is set. What happens in the code (ncx.m4) is something like this: signed char *cp; double dd = some_big_number; ... if( dd > X_SCHAR_MAX || dd < X_SCHAR_MIN) return_code = NC_ERANGE; *cp = dd; The idea is that we tell you that a range error (overflow) occured, but let the system do what it would normally do. What the system actually does is this situation is "undefined". It turns out that on some systems, this includes generating SIGFPE (Floating point exception). In order to retain the behavior specified and yet have the test proceed, we will have to ignore SIGFPE. To nc_test/nc_test.c, #include <signal.h> and add a line like (void) signal(SIGFPE, SIG_IGN); early in nc_test/nc_test.c main(). -glenn