patch #7668: Enhancement, speedup of loading partial data from a hdf5 file

Submitted by:  None
Submitted on:  Thu 17 Nov 2011 09:00:22 AM UTC  
 
Category:  None
Priority:  5 - Normal
Status:  Done
Privacy:  Public
Assigned to:  Rik <rik5>
Originator Email:  -unavailable-
Open/Closed: Closed


Mon 21 Oct 2013 03:32:58 AM UTC, comment #3:

I made small changes to follow Octave coding conventions and applied your patch here (http://hg.savannah.gnu.org/hgweb/octave/rev/5415a9cd61d4). It will be part of the new 3.8 release expected in a few weeks.

Rik <rik5>
Project Administrator; in charge of this item.
Mon 28 Nov 2011 09:18:06 AM UTC, comment #2:

I've attached two small scripts. The first creates an HDF5 file; that seemed like the easiest way to provide a large file with a tree structure similar to the ones I work with.
The second script performs simple measurements: the file is first loaded into memory, then read again in full, and then some of the data is read partially.

This is to take the latency of loading from disk out of the picture and keep the comparison fair.

Note: the second script uses clear, so make sure you don't have anything important in your workspace.
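
In outline, the measurement script does something like the following (a simplified sketch only; the file name "bench.hdf5" and the variable names are placeholders, and the actual attached test_datafile.m may differ in detail):

clear all;                        # clears the workspace, as warned above
filename = "bench.hdf5";

warmup = load (filename);         # first pass just to get the file into the disk cache
clear warmup;

tic ();
p = load (filename);              # timed full load of every variable
fullTime = toc ()

tic ();
p500 = load (filename, "p500");   # timed partial load of a single variable
partialTime = toc ()

whos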

Octave shipped with Ubuntu 11.10 (not sure which build flags were used):

octave:1> test_datafile
fullTime = 3.8980
partialTime = 2.9394
Variables in the current scope:

Attr  Name          Size                 Bytes  Class
====  ====          ====                 =====  =====
      filename      1x11                    11  char
      fullTime      1x1                      8  double
      p             1x1             3840016000  struct
      p500          1x1                3840016  struct
      partialTime   1x1                      8  double

Total is 15 elements using 3843856043 bytes

octave:2>
octave:2> version
ans = 3.2.4
octave:3>

Locally modified copy of Octave (built with -O2 -g, etc.):
octave:1> cd /home/linde
octave:2>
octave:2> test_datafile
fullTime = 4.2305
partialTime = 0.040248
Variables in the current scope:

Attr  Name          Size                 Bytes  Class
====  ====          ====                 =====  =====
      filename      1x11                    11  char
      fullTime      1x1                      8  double
      p             1x1             3840016000  struct
      p500          1x1                3840016  struct
      partialTime   1x1                      8  double

Total is 15 elements using 3843856043 bytes

octave:3> version
ans = 3.5.0+

Anonymous
Sat 26 Nov 2011 03:37:07 AM UTC, comment #1:

Could you provide a few sample hdf5 files (attach them here) so that the effects of this patch can be tested?

Jordi GutiƩrrez Hermoso <jordigh>
Project Administrator
Thu 17 Nov 2011 09:00:22 AM UTC, original submission:

I'm working with "big" datasets in HDF5 format. Files of 20-40 GB are not uncommon.

If possible, I load the entire file at once:

octave:1> tic(); all = load("filename.hdf5"); toc()
Elapsed time is 418.209 seconds.
octave:2>

But when a dataset is bigger than the available RAM, I want to do partial loads to get out-of-core behavior:

octave:1> tic(); extr = load("filename.hdf5", "data000100"); toc()
Elapsed time is 301.926 seconds.
octave:2>

The same file is used in both examples. It is ~20 GB and contains 2700 "data elements", which are returned as structs. The machine I'm testing on has 24 GB of RAM; due to other things running, some swapping occurs when reading the entire file, so the numbers should be seen as rough estimates.

My hope was that reading 1/2700th of the data would take roughly that fraction of the time needed to read the entire file (about 418 s / 2700 ≈ 0.15 s). Unfortunately, that is not the case.

Why?

do_load keeps calling read_hdf5_data as long as there is data to read. Only after read_hdf5_data has returned does do_load check whether the data just read matches the variables that should be extracted, before calling read_hdf5_data again.

As a result, the entire HDF5 file is parsed in both examples above.

I suggest that, if only some variables are to be read from an HDF5 file, the name tests be done within read_hdf5_data, so that only the corresponding nodes in the file are parsed; that saves a lot of time. If the entire file is to be read, things work just as before.

The patch I've attached implements this, and if I repeat the test
"tic(); extr = load("filename.hdf5", "data000100"); toc()", it now takes less than 0.2 seconds.

I hope this patch is of interest; if it needs changes before it can be considered, let me know and I'll try to adapt it.

/ Mattias Linde

Anonymous

 

Attached Files
file #24481:  make_datafile.m added by None (703B - text/x-objcsrc - Two small scripts, one for creating a 3.7GB hdf5 and another for reading the file)
file #24482:  test_datafile.m added by None (227B - text/x-objcsrc - Two small scripts, one for creating a 3.7GB hdf5 and another for reading the file)
file #24391:  octave-hdf5patch.txt added by None (3kB - text/plain)

 

Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • -unavailable- added by rik5 (Posted a comment)
  • -unavailable- added by jordigh (Posted a comment)
  • -unavailable- added by None (Submitted the item)


8 latest changes:

Date                              Changed By   Updated Field   Previous Value => Replaced By
Mon 21 Oct 2013 03:32:58 AM UTC   rik5         Status          Need Info => Done
                                               Assigned to     jordigh => rik5
                                               Open/Closed     Open => Closed
Mon 28 Nov 2011 09:07:30 AM UTC   None         Attached File   - => Added make_datafile.m, #24481
                                               Attached File   - => Added test_datafile.m, #24482
Sat 26 Nov 2011 03:37:07 AM UTC   jordigh      Status          None => Need Info
                                               Assigned to     None => jordigh
Thu 17 Nov 2011 09:00:22 AM UTC   None         Attached File   - => Added octave-hdf5patch.txt, #24391
