The netCDF Operators NCO version 4.6.3 are ready.

http://nco.sf.net (Homepage, Mailing lists)
http://github.com/nco (Source Code, Releases, Developers)

What's new?
4.6.3 adds many new convenience features to existing functionality
like JSON, ncap2, ncremap, and ncclimo. Multi-dimensional bracketing
completes our JSON implementation. ncap2 adds a convenient UDUnits
conversion function. ncremap and ncclimo support long options.
ncclimo supports binary climatology generation and annual-mean mode.

Work on NCO 4.6.4 has commenced. Planned improvements include CMake
builds and more flexibility in handling extensive variables during
regridding.

Enjoy,
Charlie

NEW FEATURES (full details always in ChangeLog):

A. ncclimo supports "binary climos" and annual-mean mode. Binary
climos are created by merging two existing climos, rather than by
re-computing a climo from the raw input. This saves disk space and
time for long climos. Annual-mean mode allows ncclimo to process
input files that are annual rather than monthly means.
http://nco.sf.net/nco.html#ncclimo

B. ncrcat and ncra now re-base data (move to a common time origin)
from arbitrary time units in multiple calendar systems. Previously,
re-basing only worked when the basetime (i.e., the YYMMDD in units
like "XXX since YYMMDD") changed. Now re-basing takes into account
the full units, both the increment (XXX) and the basetime (YYMMDD).
Thanks to Dave Allured for the suggestion and Henry Butowsky for the
re-implementation.
http://nco.sf.net/nco.html#rbs
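For example, here is a minimal sketch of the new behavior (the file
names and time units below are hypothetical, not part of the release):

# 85.nc has time:units = "days since 1985-01-01"
# 86.nc has time:units = "hours since 1986-01-01"
ncrcat -O 85.nc 86.nc 8586.nc # times from 86.nc are re-based to "days since 1985-01-01"
ncks -m -v time 8586.nc       # inspect the units of the concatenated time coordinate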
C. ncap2 now supports converting data between any two compatible
unit systems supported by UDUnits. The udunits() function takes an
input variable and a UDUnits unit string:

T[lon]={0.0,100.0,150.0,200.0};
T@units="Celsius";
T=udunits(T,"kelvin");
print(T);
273.15, 373.15, 423.15, 473.15 ;

The method auto-magically reads the var_in@units and var_in@calendar
(so, YES, this works with dates) attributes as necessary. Thanks to
Henry Butowsky for this feature.
http://nco.sf.net/nco.html#udunits_fnc
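The same conversion can be run from the shell with ncap2's -s option;
a sketch assuming a file in.nc whose variable T carries a units
attribute of "Celsius":

ncap2 -O -s 'T=udunits(T,"kelvin");' in.nc out.nc # convert T from Celsius to kelvin
ncks -m -v T out.nc                               # verify the result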
D. ncclimo and ncremap now support long options, e.g.,
ncclimo --case=caseid --input=drc_in --output=drc_out --map=rgr_map
ncremap --input_file=in_fl --destination=dst_fl --output_file=out_fl
http://nco.sf.net/nco.html#ncclimo
http://nco.sf.net/nco.html#ncremap

E. ncclimo and ncremap now save the full command line with which
they were invoked as a single global attribute. Previously, portions
were saved as separate attributes. The new attributes are
climo_command and remap_command. Their contents will exactly
replicate (except for datestamps) the climatologies or regridded
files they were created for. This improved provenance comes at the
cost of up to a few kB more metadata in each file.
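To see the recorded command line, dump the global attributes of an
output file, e.g. (climo.nc here is a hypothetical ncclimo output):

ncks -M climo.nc | grep climo_command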
F. ncks can now print attribute CDL types as comments. CDL attribute
types can be hard for humans to discern, so ncks now prints the type
when invoked with -D 1 or greater. The printed file is fully
CDL-compliant and works with ncgen. Credit to whoever first thought
of this feature and implemented it in some CDL output someone sent me.

zender@firn:~/nco$ ncks -D 1 -C --cdl ~/nco/data/in_4.nc
...
att_var:byte_att = 0b, 1b, 2b, 127b, -128b, -127b, -2b, -1b ; // byte
att_var:char_att = "Sentence one.\n", "Sentence two.\n" ; // char
att_var:short_att = 37s ; // short
att_var:int_att = 73 ; // int
att_var:float_att = 73.f, 72.f, 71.f, 70.01f, 69.001f, 68.01f, 67.01f ; // float
att_var:double_att = 73., 72., 71., 70.01, 69.001, 68.01, 67.010001 ; // double
att_var:ubyte_att = 0ub, 1ub, 2ub, 127ub, 128ub, 254ub, 255ub, 0ub ; // ubyte
att_var:ushort_att = 37us ; // ushort
att_var:uint_att = 73ul ; // uint

G. JSON brackets
Similar to the CDL and XML backends, ncks supports JSON (as of
4.6.2). ncks now prints strided brackets to demarcate the inner
dimensions of multi-dimensional variable data. Invoking with --json
vs. --jsn_fmt=4 on foo(2,3,4) yields, respectively:

"data": [[[0.0, 1.0, 2.0, 3.0], [4.0, 5.0, 6.0, 7.0], [8.0, 9.0, 10.0, 11.0]], [[12.0, 13.0, 14.0, 15.0], [16.0, 17.0, 18.0, 19.0], [20.0, 21.0, 22.0, 23.0]]]

"data": [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0, 13.0, 14.0, 15.0, 16.0, 17.0, 18.0, 19.0, 20.0, 21.0, 22.0, 23.0]

Bracketed data are suitable for pasting into Python. More sample
output is at http://dust.ess.uci.edu/tmp/in.json and other *.json
files there. Thanks to Henry Butowsky for implementing the bracketing.
http://nco.sf.net/nco.html#json
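As a quick check that the bracketed output is valid Python, the
"data" value above can be pasted into a one-liner (a sketch, shown
with python3):

python3 -c 'data = [[[0.0, 1.0, 2.0, 3.0], [4.0, 5.0, 6.0, 7.0], [8.0, 9.0, 10.0, 11.0]], [[12.0, 13.0, 14.0, 15.0], [16.0, 17.0, 18.0, 19.0], [20.0, 21.0, 22.0, 23.0]]]; print(len(data), len(data[0]), len(data[0][0]))'
# prints "2 3 4", the shape of foo(2,3,4)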
BUG FIXES:

A. None!

KNOWN PROBLEMS DUE TO NCO:

This section of ANNOUNCE reports and reminds users of the existence
and severity of known, not yet fixed, problems. These problems occur
with NCO 4.6.3 built/tested under MacOS 10.12.1 with netCDF 4.4.1 on
HDF5 1.8.16, and under Linux with netCDF 4.4.2-development (20161116)
on HDF5 1.8.16.

A. NOT YET FIXED (NCO problem)
Correctly read arrays of NC_STRING with embedded delimiters in
ncatted arguments

Demonstration:
ncatted -D 5 -O -a new_string_att,att_var,c,sng,"list","of","str,ings" ~/nco/data/in_4.nc ~/foo.nc
ncks -m -C -v att_var ~/foo.nc

20130724: Verified problem still exists
TODO nco1102
Cause: NCO parsing of ncatted arguments is not sophisticated enough
to handle arrays of NC_STRINGs with embedded delimiters.

B. NOT YET FIXED (NCO problem?)
ncra/ncrcat (not ncks) hyperslabbing can fail on variables with
multiple record dimensions

Demonstration:
ncrcat -O -d time,0 ~/nco/data/mrd.nc ~/foo.nc

20140826: Verified problem still exists
20140619: Problem reported by rmla
Cause: Unsure. Maybe ncra.c loop structure not amenable to MRD?
Workaround: Convert to fixed dimensions, then hyperslab.
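One way to apply that workaround is sketched below (it assumes the
mrd.nc demonstration file; ncks's --fix_rec_dmn option demotes record
dimensions to fixed dimensions):

ncks -O --fix_rec_dmn all ~/nco/data/mrd.nc ~/mrd_fix.nc # demote all record dimensions to fixed
ncks -O -d time,0 ~/mrd_fix.nc ~/foo.nc                  # hyperslab with ncks instead of ncra/ncrcat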
KNOWN PROBLEMS DUE TO BASE LIBRARIES/PROTOCOLS:

A. NOT YET FIXED (netCDF4 or HDF5 problem?)
Specifying strided hyperslab on large netCDF4 datasets leads to
slowdown or failure with recent netCDF versions.

Demonstration with NCO <= 4.4.5:
time ncks -O -d time,0,,12 ~/ET_2000-01_2001-12.nc ~/foo.nc
Demonstration with NCL:
time ncl < ~/nco/data/ncl.ncl

20140718: Problem reported by Parker Norton
20140826: Verified problem still exists
20140930: Finish NCO workaround for problem
Cause: Slow algorithm in nc_var_gets()?
Workaround #1: Use NCO 4.4.6 or later (avoids nc_var_gets())
Workaround #2: Convert the file to netCDF3 first, then use the stride.
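A sketch of Workaround #2, using the file names from the
demonstration (ncks's -3 switch writes a netCDF3 classic copy):

ncks -O -3 ~/ET_2000-01_2001-12.nc ~/ET_nc3.nc # rewrite as netCDF3 classic
ncks -O -d time,0,,12 ~/ET_nc3.nc ~/foo.nc     # strided hyperslab on the netCDF3 copy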
B. NOT YET FIXED (netCDF4 library bug)
Simultaneously renaming multiple dimensions in a netCDF4 file can
corrupt output

Demonstration:
ncrename -O -d lev,z -d lat,y -d lon,x ~/nco/data/in_grp.nc ~/foo.nc # Completes but file is unreadable
ncks -v one ~/foo.nc

20150922: Confirmed problem reported by Isabelle Dast, reported to Unidata
20150924: Unidata confirmed problem
20160212: Verified problem still exists in netCDF library
20160512: Ditto
20161028: Verified problem still exists with netCDF 4.4.1
Bug tracking: https://www.unidata.ucar.edu/jira/browse/fxm
More details: http://nco.sf.net/nco.html#ncrename_crd
C. FIXED in netCDF development branch as of 20161116 and in
maintenance release 4.4.1.1
nc-config/nf-config produce erroneous switches that cause NCO builds
to fail

This problem affects netCDF 4.4.1 on all operating systems, although
some pre-compiled netCDF packages may have patched it; hence it does
not affect my MacPorts install of netCDF 4.4.1.

Demonstration:
% nc-config --cflags # Produces extraneous text that confuses make
Using nf-config: /usr/local/bin/nf-config
-I/usr/local/include -I/usr/local/include -I/usr/include/hdf

If your nc-config output contains the "Using ..." line, you are
affected by this issue.

20161029: Reported problem to Unidata
20161101: Unidata confirmed reproducibility, attributed to netCDF 4.4.1 changes
20161116: Unidata patch is in tree for netCDF 4.4.2 release
20161123: Fixed in maintenance release netCDF 4.4.1.1
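A quick way to test whether an installed nc-config emits the
problematic line (a sketch; adjust to your shell):

nc-config --cflags | grep 'Using' && echo "affected: patch nc-config or use netCDF 4.4.1.1+"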
D. NOT YET FIXED (would require DAP protocol change?)
Unable to retrieve contents of variables including period '.' in name
Periods are legal characters in netCDF variable names.
Metadata are returned successfully, data are not.
DAP non-transparency: works locally, fails through DAP server.

Demonstration:
ncks -O -C -D 3 -v var_nm.dot -p http://thredds-test.ucar.edu/thredds/dodsC/testdods in.nc # Fails to find variable

20130724: Verified problem still exists. Stopped testing because
inclusion of var_nm.dot broke all test scripts.
NB: Hard to fix since DAP interprets '.' as the structure delimiter
in the HTTP query string.
Bug tracking: https://www.unidata.ucar.edu/jira/browse/NCF-47
E. NOT YET FIXED (would require DAP protocol change)
Correctly read scalar characters over DAP.
DAP non-transparency: works locally, fails through DAP server.
Problem, IMHO, is with the DAP definition/protocol.

Demonstration:
ncks -O -D 1 -H -C -m --md5_dgs -v md5_a -p http://thredds-test.ucar.edu/thredds/dodsC/testdods in.nc

20120801: Verified problem still exists
Bug report not filed
Cause: DAP translates scalar characters into 64-element (this
dimension is user-configurable, but still...), NUL-terminated
strings, so MD5 agreement fails.
"Sticky" reminders:

A. Reminder that NCO works on most HDF4 and HDF5 datasets, e.g.,
HDF4: AMSR MERRA MODIS ...
HDF5: GLAS ICESat Mabel SBUV ...
HDF-EOS5: AURA HIRDLS OMI ...

B. Pre-built executables for many OS's at:
http://nco.sf.net#bnr

--
Charlie Zender, Earth System Sci. & Computer Sci.
University of California, Irvine 949-891-2429 )'(