
Re: [netcdfgroup] Type conversion when writing netCDF

  • To: Jennifer Adams <jma@xxxxxxxxxxxxx>
  • Subject: Re: [netcdfgroup] Type conversion when writing netCDF
  • From: Rob Ross <rross@xxxxxxxxxxx>
  • Date: Mon, 29 Nov 2010 09:51:23 -0600
So you will want to split the output up into multiple smaller writes to keep memory allocation down. Perhaps 1 GB at a time, given what you describe below. Still a lot easier than doing the conversion by hand!

Rob
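
[A minimal sketch, not part of the original thread, of the chunked-write approach Rob and Ed describe: a large array of doubles is written into an NC_FLOAT variable in roughly 1 GB slabs via nc_put_vara_double, so the conversion memory the library allocates per call is bounded by the slab size rather than the full array. The file name, dimension name, variable name, and sizes are hypothetical; the calls themselves are the standard netCDF C API.]

    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    #define CHECK(e) do { int s_ = (e); if (s_ != NC_NOERR) { \
        fprintf(stderr, "netCDF error: %s\n", nc_strerror(s_)); exit(1); } } while (0)

    int main(void)
    {
        /* Hypothetical sizes: 2.5e9 doubles (~20 GB of input), written in
         * slabs of 2^27 doubles (1 GiB each), so the library's conversion
         * buffer is bounded per call rather than by the full array. */
        const size_t total = 2500000000UL;
        const size_t slab  = 134217728UL;

        int ncid, dimid, varid;
        CHECK(nc_create("example.nc", NC_NETCDF4, &ncid));
        CHECK(nc_def_dim(ncid, "n", total, &dimid));
        /* The variable is declared NC_FLOAT; the library converts the
         * double-precision input on each write. */
        CHECK(nc_def_var(ncid, "data", NC_FLOAT, 1, &dimid, &varid));
        CHECK(nc_enddef(ncid));

        double *buf = malloc(slab * sizeof(double));
        if (!buf) { fprintf(stderr, "out of memory\n"); return 1; }

        for (size_t start = 0; start < total; start += slab) {
            size_t count = (total - start < slab) ? (total - start) : slab;
            for (size_t i = 0; i < count; i++)
                buf[i] = (double)(start + i);   /* stand-in for real data */
            size_t s[1] = { start }, c[1] = { count };
            /* Conversion memory is allocated inside this call and freed
             * before it returns, as described below. */
            CHECK(nc_put_vara_double(ncid, varid, s, c, buf));
        }

        free(buf);
        CHECK(nc_close(ncid));
        return 0;
    }
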

On Nov 29, 2010, at 9:45 AM, Ed Hartnett wrote:

Jennifer Adams <jma@xxxxxxxxxxxxx> writes:

That's great! Does the library use a chunk of memory to do this? If my array of doubles is very large, say 20 GB, and I'm running on a system with
24 GB of memory, will it work?
--Jennifer
On Nov 29, 2010, at 9:32 AM, Jim Edwards wrote:


Memory will be allocated, but only enough to convert data for each
nc_put_vara call. So if you are writing 1 MB of values at a time, it
will allocate 1 MB at a time to convert, and free that 1 MB after the
write operation is complete.

So it doesn't matter how large your variables are, only how much you try
to write at one time.

Thanks,

Ed
--
Ed Hartnett  -- ed@xxxxxxxxxxxxxxxx

_______________________________________________
netcdfgroup mailing list
netcdfgroup@xxxxxxxxxxxxxxxx
For list information or to unsubscribe, visit:
http://www.unidata.ucar.edu/mailing_lists/


