Dear all,

I am trying to extract the field CLOUD from 239 history files in netCDF format. Each CLOUD field in each history file has 18 levels, 128 longitudes, and 64 latitudes. When I ran NCL, it reported:

    fatal:NclMalloc Failed:[errno=11]:Resource temporarily unavailable
    fatal:_NclBuildConcatArray : Memory allocation failure
    fatal:Execute: Error occurred at or near line 6

There isn't enough memory to handle it. What can I do? The following is the CLOUD.ncl script I used to extract CLOUD:

begin
  diri = "/disk/cj2/ccm/ccm-mqz/run.amip2/"  ; directory where files reside
  fils = systemfunc ("ls "+diri+"19??-??.nc")
  f    = addfiles (fils, "r")                ; note the "s" of addfiles
  ListSetType (f, "cat")                     ; concatenate or "merge"

  T = f[:]->CLOUD
  print (dimsizes(T))                        ; T will have 60 time steps
  T!0 = "time"
  T!1 = "lev"
  T!2 = "lat"
  T!3 = "lon"
  T&time = f[:]->time                        ; time coord variable
  T&lev  = f[0]->lev                         ; get lev from the 1st file
  T&lat  = f[0]->lat                         ; get lat from the 1st file
  T&lon  = f[0]->lon                         ; get lon from the 1st file

  filo = "./nc/CLOUD.nc"
  system ("/usr/bin/rm "+filo)               ; remove any pre-existing file
  fo = addfile (filo, "c")                   ; open output file
  fo@title = "CLOUD from model files"
  fo->CLOUD = T                              ; write CLOUD to the file
end

Does anyone know how to deal with this using any netCDF software? I can't install NCO on my Sun computer right now.

Thanks in advance.

Best regards,
Qiaozhen

***************************************
*The University of Texas at Austin    *
*Institute for Geophysics             *
*4412 Spicewood Springs Rd., Bldg. 600*
*Austin, Texas 78759-8500             *
*phone:(512) 471-0462                 *
***************************************
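For reference, one way around this memory failure is to stop concatenating all 239 files in memory at once (which is what f[:]->CLOUD does) and instead write CLOUD to the output file one history file at a time, appending along an unlimited time dimension. The following is a minimal, untested sketch of that approach; it reuses the directory, file pattern, and output path from the script above, and it assumes each history file stores CLOUD as (time, lev, lat, lon) with a time coordinate variable.

begin
  diri = "/disk/cj2/ccm/ccm-mqz/run.amip2/"  ; same input directory as above
  fils = systemfunc ("ls "+diri+"19??-??.nc")
  nfil = dimsizes (fils)

  filo = "./nc/CLOUD.nc"
  system ("/usr/bin/rm -f "+filo)            ; remove any pre-existing file
  fo = addfile (filo, "c")
  fo@title = "CLOUD from model files"
  filedimdef (fo, "time", -1, True)          ; declare time as the unlimited (record) dimension

  nt = 0                                     ; next free record in the output file
  do i = 0, nfil-1
    fi    = addfile (fils(i), "r")
    T     = fi->CLOUD                        ; only one file's CLOUD is in memory at a time
    ntime = dimsizes (fi->time)              ; assumes each file has a time coordinate
    if (i .eq. 0) then
      fo->CLOUD = T                          ; first write defines the variable and its coordinates
    else
      fo->time(nt:nt+ntime-1)        = (/ fi->time /)  ; append along the record dimension
      fo->CLOUD(nt:nt+ntime-1,:,:,:) = (/ T /)
    end if
    nt = nt + ntime
    delete (T)                               ; free the field before reading the next file
  end do
end

Because time is the record dimension, the output file grows file by file, so peak memory stays at roughly one history file's worth of CLOUD rather than all 239 at once.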