Hi, John and Dennis!

Thank you for explaining why NetCDF-Java adopted this design decision and what is really going on inside.

The URL for case 3 is here:

http://eosdap.hdfgroup.uiuc.edu:8080/opendap/data/NASAFILES/hdf5/more/mabel_l2a_20110322T165030_005_1.h5

Don't be surprised if ToolsUI gives a pop-up window with an enormous width. :-)

On Fri, Mar 16, 2012 at 1:29 PM, John Caron <caron@xxxxxxxxxxxxxxxx> wrote:
> On 3/16/2012 12:02 PM, Dennis Heimbigner wrote:
>>
>>> 1. It reads the DDS and DAS outputs and then attempts to read the contents
>>> of all string data variables in the DDS. Since some strings may be very
>>> long, this degrades performance. I don't know why it tries to access
>>> string variables when it opens data via OPeNDAP. Reading the DDS and DAS
>>> seems sufficient.
>
> It's because the netCDF data model has a char datatype but DAP2 has String,
> and there's no way to tell the length of a string without reading it.
> Prefetching strings was a compromise for fixing this problem.
> It can go away with DAP4.
>
>> The client-side code attempts to do some prefetching of small variables.
>> The hidden assumption is that string values are not very big, and indeed
>> this is usually the case. I am curious what kind of dataset you have that
>> violates this assumption.
>>
>>> 2. If a string size is more than 32767, it throws an error. I don't yet
>>> understand why it imposes such a restriction. Lines 151 and 152 of
>>> [1] check against Short.MAX_VALUE, which is 2^15 - 1 = 32767 according to
>>> [2]. This restriction makes it fail to access otherwise valid OPeNDAP
>>> data.
>
> Dennis, can you check this? Is the 32767 limit part of the DAP2 spec?
>
>>> 3. It forms a very long URL string if the data file has many variables
>>> with long variable names. If the requested URL is too long for the server
>>> to handle, it returns the error "opendap.dap.DAP2.Exception: Method
>>> failed: HTTP/1.1 400 Bad Request on URL=http...<all variables will be
>>> listed here>". I don't understand why it tries to append all variables
>>> to get the DDS output. Here's the part of the code that constructs a very
>>> long URL:
>
> Joe, can you send a URL with this problem?
>
>> This occurs primarily because of the prefetching of "small" variables.
>> Later, it may need to get the whole DDS (for display in ToolsUI, for
>> example), so it forms a URL requesting all variables except those already
>> pre-fetched.
>>
>> =Dennis Heimbigner
>> Unidata
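To make the string-length issue concrete, here is a minimal, hypothetical Java sketch. It is not NetCDF-Java's actual implementation; the class and method names are made up for illustration. It shows why a DAP2 String must be read before it can be mapped onto the fixed-length netCDF char type, and where a bound of Short.MAX_VALUE (32767), as mentioned in the thread, would reject very long strings.

```java
// Hypothetical illustration of the DAP2 String -> netCDF char mapping problem.
// A DAP2 String carries no declared length, so the client must read ("prefetch")
// the values before it can size the char dimension used to hold them.
public class Dap2StringMapping {

    // Assumed upper bound on string length; the thread points at a check
    // against Short.MAX_VALUE (32767) in the OPeNDAP client code.
    static final int MAX_STRLEN = Short.MAX_VALUE;

    /** Return the char-dimension length needed to hold the prefetched values. */
    static int charDimensionLength(String[] prefetchedValues) {
        int max = 0;
        for (String s : prefetchedValues) {
            if (s.length() > MAX_STRLEN) {
                throw new IllegalArgumentException(
                    "String longer than " + MAX_STRLEN + " chars; cannot map to char");
            }
            max = Math.max(max, s.length());
        }
        return max;
    }

    public static void main(String[] args) {
        // Only after reading the values do we know the required char length (12 here).
        String[] values = { "short", "a longer one" };
        System.out.println("char dimension length = " + charDimensionLength(values));
    }
}
```

The point of the sketch is the ordering constraint John describes: the length is a property of the data values, not of the DDS/DAS metadata, which is why reading only the DDS and DAS is not sufficient for this mapping under DAP2.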