Hello, I have a problem with the NetCDF C++ library. I have to write a netCDF file with 4-dimensional data (coming from a GRIB file).
I use the code below:

    float values[NT][NZ][NJ][NI];
    for (it = 0; it < NT; it++) {
      for (iz = 0; iz < NZ; iz++) {
        for (j = 0; j < NJ; j++) {
          for (i = 0; i < NI; i++) {
            values[it][iz][j][i] = ....;
          }
        }
      }
    }
    var->set_cur(0, 0, 0, 0);   // using NcVar var
    var->put(&values[0][0][0][0], NT, NZ, NJ, NI);

This works fine: the netCDF file is correct and the data are saved properly. But with this static 4-dimensional array declaration I am using stack memory, which is limited, so with big data I could get a segmentation fault.
So I would like to declare the values array dynamically. I am using the code below:

    float *values = (float *) malloc(NT * NZ * NJ * NI * sizeof(float));
    for (it ...)
      for (iz ...)
        for (j ...)
          for (i ...)
            values[it*NZ + iz*NJ + j*NI + i] = ....;
    var->set_cur(0, 0, 0, 0);
    var->put(values, NT, NZ, NJ, NI);

In this case a netCDF file is generated, but the data are not saved properly.
Do you know why this happens and how I can solve the problem? Thanks, Carole
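For comparison, the conventional row-major offset into a flat buffer that mirrors a float[NT][NZ][NJ][NI] array is ((it*NZ + iz)*NJ + j)*NI + i, which differs from the index expression in the code above. Below is a minimal sketch, not a definitive answer, that fills a heap-allocated buffer using that offset and writes it with the same set_cur/put calls shown in the question. The file name, variable name, dimension names, dimension sizes, and fill value are made up for illustration, and the NcFile/add_dim/add_var setup assumes the legacy netcdfcpp.h interface that the question appears to use.

    #include <netcdfcpp.h>   // legacy netCDF-3 C++ interface (NcFile, NcDim, NcVar)
    #include <vector>

    int main()
    {
        // Hypothetical dimension sizes, chosen only for illustration.
        const long NT = 4, NZ = 10, NJ = 180, NI = 360;

        NcFile nc("example.nc", NcFile::Replace);   // hypothetical output file
        if (!nc.is_valid())
            return 1;

        NcDim* tDim = nc.add_dim("time", NT);       // hypothetical dimension names
        NcDim* zDim = nc.add_dim("level", NZ);
        NcDim* yDim = nc.add_dim("y", NJ);
        NcDim* xDim = nc.add_dim("x", NI);
        NcVar* var  = nc.add_var("values", ncFloat, tDim, zDim, yDim, xDim);

        // Heap allocation instead of a large stack array.
        std::vector<float> values(NT * NZ * NJ * NI);

        for (long it = 0; it < NT; it++)
            for (long iz = 0; iz < NZ; iz++)
                for (long j = 0; j < NJ; j++)
                    for (long i = 0; i < NI; i++)
                        // Row-major offset equivalent to values[it][iz][j][i]
                        // in a float[NT][NZ][NJ][NI] array.
                        values[((it * NZ + iz) * NJ + j) * NI + i] = 0.0f;  // placeholder data

        var->set_cur(0, 0, 0, 0);               // start writing at the origin
        var->put(&values[0], NT, NZ, NJ, NI);   // write the whole NT x NZ x NJ x NI block

        return 0;                               // NcFile destructor closes the file
    }

Because the buffer is filled in the same row-major order that put expects for a 4-dimensional variable, a single put call can then write the entire block.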