
slow write on large files




I'm using NetCDF to stream video to disk in real time.  I'm doing this on a
PC running Windows, with the C (Windows) implementation of NetCDF, built
with Visual C++ 6.0.

The data is coming in at about 7.5 MB/second.
I'm using successive calls to nc_put_vara_short() to write each frame of
video data.
When my file size is small (1000 frames of data or less), I have no
problem.  When my file size is much larger (1500 frames or more), my
application runs horribly.  It appears that the calls to nc_put_vara_short
are not keeping up with the incoming data.  Yes, I'm buffering in RAM, but
my buffer is limited, and eventually I overflow it.  The real bugger is
that the problems start occurring very early in a large file; it's not as
if it's fine for the first 1000 frames and then starts lagging.  I'm seeing
problems very early on in a large file.
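
To be concrete, the per-frame write looks roughly like this (the variable
layout, frame dimensions, and names here are just illustrative, not my
exact code):

    /* Illustrative per-frame write: one nc_put_vara_short() call per frame,
       appending along the record (frame) dimension of a 3-D short variable.
       ROWS and COLS are hypothetical stand-ins for the real frame size. */
    #include <netcdf.h>

    #define ROWS 480
    #define COLS 640

    int write_frame(int ncid, int varid, size_t frame, const short *pixels)
    {
        size_t start[3] = { frame, 0, 0 };   /* frame index, row 0, col 0  */
        size_t count[3] = { 1, ROWS, COLS }; /* one full frame of shorts   */
        return nc_put_vara_short(ncid, varid, start, count, pixels);
    }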

Is there something about the way nc_put_vara_short is coded that makes it
slow down based on the TOTAL SIZE of the file (or just the variable
portion of the file)?

When I write the data out to disk using just CFile::Write()* instead of
the NetCDF library, I have no problems.
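
For comparison, that raw path is essentially one buffered write per frame,
something like the following (again schematic; 'file' is a CFile already
opened for binary writing, setup omitted):

    // Schematic raw write of one frame with MFC's CFile.
    file.Write(pixels, ROWS * COLS * sizeof(short));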


Thanks,

Jim



* CFile is an MFC (Microsoft Foundation Classes) class that makes disk
I/O simple.

