THREDDS out of memory

Hi guys,

I am looking at serving some very large files through THREDDS. I had found through trial and error that on one particular server, for files somewhere between 60 MB and 300 MB, THREDDS stopped being able to begin serving the file before the client timed out.

Unfortunately, that machine serves a number of people, so I had to do my testing elsewhere. I have a 579 MB NetCDF file on my desktop machine and tried a local test, putting the file and the THREDDS server on it. What I found was that the THREDDS server was running out of heap space. Now, I know I can increase the amount of heap space the JVM has available somehow, and that's what I'll try next, but I don't know whether that's a reliable solution. I don't really know how much memory THREDDS needs on top of the size of the file it's trying to serve, and of course multiple concurrent requests might also affect this; I don't know how Tomcat deals with that kind of thing in terms of creating new JVM instances and so on.
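
For what it's worth, the way I understand the heap can be raised (assuming a stock Tomcat install, where catalina.sh reads JAVA_OPTS from the environment; the 1024m figure below is just a guess) is something like:

    # Give the Tomcat JVM a larger heap before starting it.
    # -Xms sets the initial heap size, -Xmx the maximum.
    # (As far as I can tell, Tomcat handles requests as threads in a
    # single JVM, so this heap is shared across concurrent requests.)
    export JAVA_OPTS="-Xms256m -Xmx1024m"
    $CATALINA_HOME/bin/startup.sh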

Here is the error from catalina.out:

DODServlet ERROR (anyExceptionHandler): java.lang.OutOfMemoryError: Java heap space
requestState:
 dataset: 'verylarge.nc'
 suffix: 'dods'
 CE: ''
 compressOK: false
 InitParameters:
   maxAggDatasetsCached: '20'
   maxNetcdfFilesCached: '100'
   maxDODSDatasetsCached: '100'
   displayName: 'THREDDS/DODS Aggregation/NetCDF/Catalog Server'

java.lang.OutOfMemoryError: Java heap space
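
As a sanity check on whatever heap setting I end up with, a trivial standalone class like this (hypothetical, not part of THREDDS) would show what maximum heap the JVM actually received when started with the same flags:

    // HeapCheck.java - hypothetical standalone check, not part of THREDDS.
    // Compile and run with the same -Xmx flag passed to Tomcat to see
    // the effective maximum heap the JVM will allow.
    public class HeapCheck {
        public static void main(String[] args) {
            long maxBytes = Runtime.getRuntime().maxMemory();
            System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
        }
    }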


So my question is: what's the best way to build a reliable server that can serve these large files?

Cheers,
-Tennessee
