
[netcdf-java] Accessing data files larger than 2GB.

John,

I am trying to access a really large NetCDF file (~3GB), but
it results in an error, as shown below.

uri='http://www.gri.msstate.edu/rsearch_data/nopp/fvcom_2.5gb.ncml';

>>>>
GridDataset gds = GridDataset.open(uri);

?? Java exception occurred:
java.io.IOException: Server has malformed Content-Length header

at ucar.unidata.io.http.HTTPRandomAccessFile.<init>(HTTPRandomAccessFile.java:110)
   ...........
<<<<

In 'HTTPRandomAccessFile.java', I see that this error occurs because the 'Content-Length' HTTP header is parsed as an 'Integer'. So any file whose size exceeds the 32-bit signed integer range (2^31 - 1 bytes, about 2.1GB) will have this problem.

>>>> nj2.2.22
public HTTPRandomAccessFile(String url, int bufferSize) throws IOException {
   .....
   head = method.getResponseHeader("Content-Length");
   .......
   try {
     total_length = Integer.parseInt(head.getValue());
   } catch (NumberFormatException e) {
     throw new IOException("Server has malformed Content-Length header");
   }
<<<<
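The failure mode can be reproduced in isolation. A minimal sketch (the 3000000000 value is just an illustrative size over the 2^31 - 1 limit, not taken from the actual server response):

```java
public class IntOverflowDemo {
    public static void main(String[] args) {
        String headerValue = "3000000000"; // ~3 GB, exceeds Integer.MAX_VALUE (2147483647)

        boolean parseFailed = false;
        try {
            // Integer.parseInt rejects values outside the 32-bit signed range
            // with a NumberFormatException -- which HTTPRandomAccessFile then
            // rewraps as "Server has malformed Content-Length header".
            Integer.parseInt(headerValue);
        } catch (NumberFormatException e) {
            parseFailed = true;
        }
        System.out.println(parseFailed); // prints: true
    }
}
```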

Is there any strong reason the 'Content-Length' cannot be parsed as a 'long' to accommodate file sizes over 2.1GB? My internal tests show that changing the code to parse as 'long' does solve the problem, but I am not sure whether I am setting myself up for some unforeseen disaster in other parts of the netcdf-java API.
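For reference, the change I tried looks roughly like this; a hedged sketch only, not the actual nj2.2.22 source (the method and variable names here are my own, not Unidata's):

```java
import java.io.IOException;

public class ContentLengthFix {
    // Parse the Content-Length header value as a long, which covers
    // file sizes up to 2^63 - 1 bytes, far beyond the 2GB int limit.
    static long parseContentLength(String headerValue) throws IOException {
        try {
            return Long.parseLong(headerValue);
        } catch (NumberFormatException e) {
            // Preserve the existing error behavior for genuinely bad headers.
            throw new IOException("Server has malformed Content-Length header");
        }
    }

    public static void main(String[] args) throws IOException {
        // A ~3 GB size that Integer.parseInt would reject parses cleanly.
        System.out.println(parseContentLength("3000000000")); // prints: 3000000000
    }
}
```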

I will appreciate your valuable input,

thanks

Sachin.

--
Sachin Kumar Bhate, Research Associate
MSU-High Performance Computing Collaboratory, NGI
John C. Stennis Space Center, MS 39529
http://www.northerngulfinstitute.org/

