bug #45294: [octave forge] (mpi) issues with struct (with test case and patch)

Submitter:  Alexander Barth <abarth>
Submitted:  Wed 10 Jun 2015 12:23:37 PM UTC
   
 
Category:  Octave Package
Severity:  3 - Normal
Priority:  5 - Normal
Item Group:  Incorrect Result
Status:  Fixed
Assigned to:  cdf
Originator Name:  Alexander Barth
Open/Closed:  Closed
Release:  dev
Operating System:  GNU/Linux
Fixed Release:  None
Planned Release:  None


Thu 14 Mar 2019 08:25:02 AM UTC, comment #17: 

Closing as fixed based on previous comments from 2017 indicating that this has been addressed in the mpi package.

Mike Miller <mtmiller>
Group Member
Mon 04 Dec 2017 09:50:58 AM UTC, comment #16: 

Hello

I finally located the problem: I did part of the install as a super user, and as a consequence some files were not in the right directories.
After fixing this, the package you proposed worked just fine. Thank you for this, and thank you for your quick responses.

Val <valeee>
Wed 29 Nov 2017 12:30:34 PM UTC, comment #15: 

It seems very strange, if you used that procedure, that MPI_Init.oct ended up being installed in the directory

/home/valentine/Documents/Codes_calcul/DDmpi_2/mpi/

The oct files installed with the pkg command should go into

OCTAVE_HOME/mpi-2.0.0/canonical_host_type

By default OCTAVE_HOME is "~/octave" and canonical_host_type should be something like "x86_64-pc-linux-gnu", so I would expect the full path of MPI_Init.oct to be something like

/home/valentine/octave/mpi-2.0.0/x86_64-pc-linux-gnu/MPI_Init.oct

Do you have anything installed in /home/valentine/octave/mpi-2.0.0? Maybe you have duplicate installations elsewhere?
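
A quick way to check which copy of the package Octave is actually picking up (a minimal sketch; "pkg list" and "which" are standard Octave commands, and the expected path is only an example):

  pkg load mpi
  which MPI_Init   % should resolve to ~/octave/mpi-2.0.0/<canonical_host_type>/MPI_Init.oct
  pkg list         % look for the mpi entry and its installation directory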

Carlo de Falco <cdf>
Group Member
Wed 29 Nov 2017 11:57:46 AM UTC, comment #14: 

Hello

This is what I did:

* I downloaded mpi-2.0.tar.gz from https://gitserver.mate.polimi.it/redmine/attachments/download/55/mpi-2.0.tar.gz

* Within Octave I ran:
pkg install mpi-2.0.tar.gz

Was that wrong?

Val <valeee>
Wed 29 Nov 2017 11:53:01 AM UTC, comment #13: 

Hi,
Can you please explain exactly what procedure you used to install the package?

Carlo de Falco <cdf>
Group Member
Wed 29 Nov 2017 11:32:23 AM UTC, comment #12: 

Hello

I downloaded the pre-release you proposed for my Octave 4.2.1 on my Ubuntu 16.04 LTS system.
Everything worked well and I managed to get rid of the errors and install the mpi package.

However, when I tried to run a parallel code of mine, I got this:
error: mainddlin: /home/valentine/Documents/Codes_calcul/DDmpi_2/mpi/MPI_Init.oct: failed to load: /home/valentine/Documents/Codes_calcul/DDmpi_2/mpi/MPI_Init.oct: undefined symbol: _ZNK5ArrayISsE17resize_fill_valueEv
error: execution exception in mainddlin.m

Searching on the internet, I found that someone else had this issue:
https://ubuntuforums.org/showthread.php?t=2300259

It was fixed by installing the package from Forge rather than from source, which is exactly what I had avoided doing...

Do you have any ideas?

Val <valeee>
Tue 14 Mar 2017 09:23:04 AM UTC, comment #11: 

Hi,


I have uploaded a new pre-release version of the MPI package that should finally fix this issue.

Instead of trying to add more logic to the current codebase to fix the problems described here, I tried a completely new approach, which consists of rewriting the Send and Recv functions to use binary streams.

In addition to overcoming the current issues, this also simplifies the maintenance of the code a lot. It might introduce some overhead, but I was not able to measure any.

The pre-release can be found here: https://gitserver.mate.polimi.it/redmine/attachments/download/55/mpi-2.0.tar.gz

I would really appreciate some testing before I go through the official release procedure, especially to check compatibility with Octave 4.2.x and earlier, as I have only tested on Octave 4.3.0+.
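
For anyone willing to test: a round trip of nested data (a cell holding a struct) should exercise the new binary-stream path. The sketch below assumes the MPI_Comm_Load / MPI_Send / MPI_Recv signatures used in the package examples, and would be run with something like "mpirun -np 2 octave -q test_roundtrip.m" (the file name is just a placeholder):

  pkg load mpi
  MPI_Init ();
  CW  = MPI_Comm_Load ("NEWORLD");
  rnk = MPI_Comm_rank (CW);
  TAG = 7;
  payload = {struct ("f1", 12.3, "f2", 45.6), "some text", int32 (1:5)};
  if (rnk == 0)
    MPI_Send (payload, 1, TAG, CW);        % rank 0 sends the nested data to rank 1
  elseif (rnk == 1)
    [out, info] = MPI_Recv (0, TAG, CW);   % rank 1 receives it
    assert (isequal (out, payload));       % classes and contents should survive the round trip
  endif
  MPI_Finalize ();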




Carlo de Falco <cdf>
Group Member
Wed 18 Jan 2017 10:06:37 PM UTC, comment #10: 

The good news: the version from hg compiles using my homebrewed Octave 4.2 with a few warnings:

warning: 'real' is deprecated [-Wdeprecated-declarations]
warning: 'imag' is deprecated [-Wdeprecated-declarations]
warning: 'nelem' is deprecated [-Wdeprecated-declarations]
warning: 'capacity' is deprecated [-Wdeprecated-declarations]

I will do some tests (parareal...) but this might take some time.

Bye
Sebastian

Sebastian <sschoeps>
Wed 18 Jan 2017 04:08:30 PM UTC, comment #9: 
Carlo de Falco <cdf>
Group Member
Tue 17 Jan 2017 01:48:00 PM UTC, comment #8: 

Can you post the link please? I think the repo that I found is not up to date.

Sebastian <sschoeps>
Tue 17 Jan 2017 09:07:20 AM UTC, comment #7: 

The version in the repository should compile with 4.2 (I have received in private at least one successful install report), but the issue with tags and structures is not reliably fixed.

If you could test and report issues, that would be helpful.

If the package does at least install correctly on 4.2.0, I would lean towards making a release in order to get more reports about how it works.


Carlo de Falco <cdf>
Group Member
Mon 16 Jan 2017 09:37:29 PM UTC, comment #6: 

Dear all, what is the current status of the mpi package for 4.2?

Sebastian

Sebastian <sschoeps>
Fri 23 Oct 2015 03:13:26 PM UTC, comment #5: 

It is my intention to apply this change to the MPI package, but given some recent changes in core Octave, MPI will need many more updates to be compatible with Octave 4.2.

I don't think I will be able to make a backward-compatible update, so if you need a version of MPI that works with Octave < 4.2, it is better to apply the patch yourself.


Carlo de Falco <cdf>
Group Member
Thu 22 Oct 2015 10:25:44 AM UTC, comment #4: 

I need to install the mpi package on a new machine. I am wondering whether this patch will be applied to the version in the hg repository, or whether I should keep maintaining my own version with this patch.

Best regards,
Alexander

Alexander Barth <abarth>
Fri 12 Jun 2015 01:04:43 PM UTC, comment #3: 

Yes, I have also noticed problems with non-scalar structures.
When you send to multiple nodes at once (using a vector of ranks), the current code can hang. I think this has to do with the handling of the tag parameter, which is somewhat simplified in the attached patch.

For instance, I did not increment the tag by the capacity of the cell. I presume that in the original code this is meant to avoid reusing the same tag number. This is a good idea, but it assumes that each individual cell can be sent with a single MPI_Send call (which is not the case if the individual element is a struct, for instance).

A better approach, in my opinion, would be to pass the tag parameter by reference to the send_class/send_cell (and similar) functions and increment the tag at the actual calls to MPI_Send/MPI_Recv.

But as this is quite a large change, here I wanted to limit the extent of the patch.


Alexander Barth <abarth>
Fri 12 Jun 2015 05:39:28 AM UTC, comment #2: 

Hi,

Thanks for the patch.

It seems you are trying to do more than solve the problem with scalar structures in this code, right?

I'll look at it in more detail later and add this fix in the next release.

thanks again,
c.

Carlo de Falco <cdf>
Group Member
Wed 10 Jun 2015 04:08:37 PM UTC, comment #1: 

Adding the maintainer of the mpi package.

Carnë Draug <carandraug>
Group Member
Wed 10 Jun 2015 12:23:37 PM UTC, original submission:  



It seems that the MPI package cannot send scalar structures, e.g. simple structs created by:


message.f1 = 12.3
message.f2 = 45.6


I get the error message:
error: MPI_Send: unsupported class scalar struct

If you send a struct array, it is transformed into a scalar struct at the receiving end.

The attached script test_mpi_send_receive.m shows this issue. I think there is also a problem in the code when sending a structure to multiple nodes: it loops over all ranks (rankrec), but within the loop it sends the message to all nodes again instead of only to the current node.


  // Now we start the big loop
  const octave_idx_type *rankrec_ptr = rankrec.fortran_vec ();
  for (octave_idx_type i = 0; i < rankrec.nelem (); i++)
    {
[...]

      // iterate through keys(fnames)

      for (octave_map::const_iterator p = map.begin (); p != map.end (); p++)
        {

          // *** inside the loop over all ranks, the message is sent to all ranks (instead of only to rank i) ***

          info = send_class (comm, key, rankrec, ntagkey);
          if (info != MPI_SUCCESS) return info;

         [...]
        }
    }
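
For reference, the scenario this loop affects can be reproduced at the Octave level by sending a struct array to several ranks in a single call. This is only a sketch, assuming the MPI_Send/MPI_Recv signatures used in the package examples, and it needs at least three processes (e.g. mpirun -np 3 octave -q repro.m, where the file name is just a placeholder):

  pkg load mpi
  MPI_Init ();
  CW  = MPI_Comm_Load ("NEWORLD");
  rnk = MPI_Comm_rank (CW);
  TAG = 100;
  message(1).f1 = 12.3;   % a 1x2 struct array, not a scalar struct
  message(2).f1 = 45.6;
  if (rnk == 0)
    MPI_Send (message, [1 2], TAG, CW);       % one call, vector of destination ranks
  else
    [received, info] = MPI_Recv (0, TAG, CW); % each receiver expects its own copy
    disp (size (received));                   % check whether the array shape survived
  endif
  MPI_Finalize ();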


I have prepared a patch which corrects these issues.
There is also a problem with cells, but I leave that for later.

I hope that you will accept the attached patch. It would be great if you could also include the test case (test_mpi_send_receive.m), which also tests other data types.

Best regards,
Alex



Alexander Barth <abarth>

 


Attached Files
file #34192:  mpi_struct.patch added by abarth (8KiB - text/x-patch)
file #34193:  test_mpi_send_receive.m added by abarth (4KiB - text/x-objcsrc)

 

Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • added by valeee (Posted a comment)
  • added by jwe (Updated the item)
  • added by cdf
  • added by cdf
  • added by sschoeps (Posted a comment)
  • added by cdf (Posted a comment)
  • added by carandraug (Carlo de Falco - maintainer of the mpi package)
  • added by abarth (Submitted the item)
  • added by abarth

    11 latest changes:

    Date         Changed by   Updated Field   Previous Value => Replaced by
    2019-03-14   mtmiller     Status          None => Fixed
                              Open/Closed     Open => Closed
    2017-08-13   jwe          Summary         octave-forge-mpi issues with struct (with test case and patch) => [octave forge] (mpi) issues with struct (with test case and patch)
    2017-03-14   cdf          Carbon-Copy     - => Added -email is unavailable-
    2017-03-14   cdf          Release         3.8.0 => dev
                              Carbon-Copy     - => Added -email is unavailable-
    2015-06-12   cdf          Assigned to     None => cdf
    2015-06-10   carandraug   Carbon-Copy     - => Added -email is unavailable-
    2015-06-10   abarth       Attached File   - => Added mpi_struct.patch, #34192
                              Attached File   - => Added test_mpi_send_receive.m, #34193
                              Carbon-Copy     - => Added -email is unavailable-
