
Re: large file support




Ah - found the problem. I was using an old jar file in my NetBeans project.

I'm glad you guys finally started naming the jar file with the version. For 2.1, the web site gave me a file called netcdfAll.jar (same name as 2.0), but now the base jar is ncCore-2.2.16.jar.
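
For anyone else who gets bitten by a stale jar, here's the throwaway check I used to see which jar a class is actually loaded from (my own snippet, not part of the library):

    public class WhichJar {
        public static void main(String[] args) {
            // Prints the jar (or directory) that NetcdfFile was loaded from,
            // which makes an old copy on the classpath easy to spot.
            System.out.println(ucar.nc2.NetcdfFile.class
                    .getProtectionDomain().getCodeSource().getLocation());
        }
    }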

Not only is my app reading my new 64-bit-offset files, but the "Negative seek offset" problem cleared up, too.

Keep up the good work,
Chris

__________________________________________________________________
Christopher W. Moore             email: address@hidden
Research Scientist, Oceanography                 tel: 206.526.6779
University of Washington/JISAO/NOAA-PMEL         fax: 206.526.6744
NOAA Center for Tsunami Research                       Seattle, WA
------------------------------------------------------------------

On Thu, 26 Oct 2006, John Caron wrote:

Hi Chris:

Sounds like a bug!
Any way you can put your data on an FTP or HTTP server so I can recreate the problem?

thanks

address@hidden wrote:

Hi John & others,

A lot of us are creating large netcdf files these days, and I was wondering if anyone is having the same kind of problems I am:

When I open a large (~2 GB) netcdf file using the latest stable release, I've been getting an error. The code was working when the file size was smaller (fewer time steps, lower lat/lon resolution).

My data is simply
dimensions:
        LON = 1276 ;
        LAT = 950 ;
        TIME = UNLIMITED ; // (601 currently)
variables:
        double LON(LON) ;
                LON:units = "degrees_east" ;
                LON:point_spacing = "even" ;
        double LAT(LAT) ;
                LAT:units = "degrees_north" ;
                LAT:point_spacing = "uneven" ;
        double TIME(TIME) ;
                TIME:units = "SECONDS" ;
        float HA(TIME, LAT, LON) ;

I start by opening the file, getting the time dimension & variable. It croaks when reading the variable into an Array (note that I'm only reading the TIME axis array, fairly small):

            ...
            Variable testVar = timeDim.getCoordinateVariable();
            System.out.println("testVar.getName: " + testVar.getName());
System.out.println("testVar.getDataType: " + testVar.getDataType());
            System.out.println("testVar.getRank: " + testVar.getRank());
            int[] testShape = testVar.getShape();
            for (int q=0; q<testShape.length; q++)
                System.out.println("testShape["+q+"]: " + testShape[q]);
            int[] testOrigin = new int[testVar.getRank()];
            Array testArr2 = testVar.read(testOrigin,testShape);

error: java.io.IOException: Negative seek offset
java.io.IOException: Negative seek offset
        at java.io.RandomAccessFile.seek(Native Method)
        at ucar.netcdf.RandomAccessFile.read_(RandomAccessFile.java:508)
        at ucar.netcdf.RandomAccessFile.seek(RandomAccessFile.java:350)
        at ucar.netcdf.NetcdfFile$V1DoubleIo.readArray(NetcdfFile.java:1520)
        at ucar.netcdf.NetcdfFile$V1Io.copyout(NetcdfFile.java:896)
        at ucar.netcdf.Variable.copyout(Variable.java:276)
        at ucar.nc2.Variable.read(Variable.java:237)
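
My guess at the mechanism (a toy demo of my theory, not the library's actual code): somewhere an offset is carried in a 32-bit int, and since TIME is a record variable its values are spread through the whole file, so the later ones sit past the 2 GB mark even though the array itself is tiny. Past 2^31 - 1 the int wraps negative and RandomAccessFile.seek() refuses it:

    import java.io.RandomAccessFile;

    public class SeekWrap {
        public static void main(String[] args) throws Exception {
            int offset = Integer.MAX_VALUE; // 2147483647, the 2 GB boundary
            offset += 1;                    // int overflow: wraps to -2147483648
            System.out.println(offset);
            // seek() takes a long, but a wrapped int stays negative when widened:
            RandomAccessFile raf = new RandomAccessFile(args[0], "r");
            raf.seek(offset);               // java.io.IOException: Negative seek offset
        }
    }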


I immediately thought it was a Large File Support problem, but a little investigation showed that I don't (yet) need 64-bit addressing because my record sizes are small. In fact, I can read the file fine with applications that use the C and FORTRAN netcdf libraries (version 3.6 or higher), and ncdump works.
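
If my arithmetic is right, the sizes work out like this:

    HA record   = 1276 * 950 * 4 bytes   ~ 4.85 MB
    TIME record = 8 bytes
    whole file  ~ 601 records * 4.85 MB  ~ 2.9 GB

Since every variable's begin offset is still near the front of the file, the classic (CDF-1) format can hold it; as I understand the large-file rules, only those begin offsets have to fit in 32 bits, not the total file size.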

But I went ahead and created the large file with 64-bit addressing anyway by simply changing one line in my FORTRAN model from

        iret = nf_create(ncfn, NF_CLOBBER, ncid)
to

        iret = nf_create(ncfn, OR(NF_CLOBBER, NF_64BIT_OFFSET), ncid)

It creates the file just fine, checking with:

      od -An -c -N4 foo.nc
gives the expected
      C    D    F 002

and ncdump gives

     netcdf s_2903-563_ha64 { // format variant: 64bit
     dimensions: ...

but my Java code still croaks (in the same way Ferret does if not compiled against netcdf lib 3.6 or higher):

Exception in thread "AWT-EventQueue-0" java.lang.IllegalArgumentException: Not a netcdf file
        at ucar.netcdf.NetcdfFile.readV1(NetcdfFile.java:1745)
        at ucar.netcdf.NetcdfFile.<init>(NetcdfFile.java:130)
        at ucar.netcdf.NetcdfFile.<init>(NetcdfFile.java:148)
        at ucar.nc2.NetcdfFile.<init>(NetcdfFile.java:61)


Anybody else out there creating Java applications that read large files?

I'm running RHEL4 on an EM64T x86_64 machine, with Java 1.5.0_06 and Java netcdf version 2.2.16.

Yours,
Chris

__________________________________________________________________
Christopher W. Moore             email: address@hidden
Research Scientist, Oceanography                 tel: 206.526.6779
University of Washington/JISAO/NOAA-PMEL         fax: 206.526.6744
NOAA Center for Tsunami Research                       Seattle, WA
------------------------------------------------------------------


===============================================================================
To unsubscribe netcdf-java, visit:
http://www.unidata.ucar.edu/mailing-list-delete-form.html
===============================================================================


