bug #45294: [octave forge] (mpi) issues with struct (with test case and patch)

Submitted by:  Alexander Barth <abarth>
Submitted on:  Wed 10 Jun 2015 12:23:37 PM UTC  
Category: Octave Forge Package
Severity: 3 - Normal
Priority: 5 - Normal
Item Group: Incorrect Result
Status: None
Assigned to: Carlo de Falco <cdf>
Originator Name: Alexander Barth
Open/Closed: Open
Release: dev
Operating System: GNU/Linux


Mon 04 Dec 2017 09:50:58 AM UTC, comment #16:


I finally located the problem: I did part of the install as a superuser, so some files ended up in the wrong directories.
After fixing this, the package you proposed worked just fine. Thank you for this, and thank you for your responsiveness.

Val <valeee>
Wed 29 Nov 2017 12:30:34 PM UTC, comment #15:

It seems very strange, if you used that procedure, that
MPI_Init.oct ended up being installed in the directory


The .oct files installed with the pkg command should go into


By default OCTAVE_HOME is "~/octave" and canonical_host_type
should be something like "x86_64-pc-linux-gnu",
so I would expect the full path of MPI_Init.oct to
be something like


Do you have anything installed into


? Maybe you have duplicate installations elsewhere?
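
A quick way to check this from the Octave prompt (just a sketch; pkg ("list") and which are standard Octave commands, and the "dir" field is assumed to hold the package installation directory):

  installed = pkg ("list");                # cell array of package description structs
  for i = 1:numel (installed)
    if (strcmp (installed{i}.name, "mpi"))
      printf ("mpi package installed in: %s\n", installed{i}.dir);
    endif
  endfor
  which MPI_Init                           # path of the MPI_Init.oct Octave will actually load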

Carlo de Falco <cdf>
Project Member, in charge of this item.
Wed 29 Nov 2017 11:57:46 AM UTC, comment #14:


This is what I did:

* I downloaded mpi-2.0.tar.gz from https://gitserver.mate.polimi.it/redmine/attachments/download/55/mpi-2.0.tar.gz

* Within Octave I ran:
pkg install mpi-2.0.tar.gz

Was that wrong?

Val <valeee>
Wed 29 Nov 2017 11:53:01 AM UTC, comment #13:

Can you please explain exactly what procedure you used
to install the package?

Carlo de Falco <cdf>
Project Member, in charge of this item.
Wed 29 Nov 2017 11:32:23 AM UTC, comment #12:


I downloaded the pre-release you proposed for my Octave 4.2.1 on Ubuntu 16.04 LTS.
Everything worked well: I managed to get rid of the errors and install the mpi package.

However, when I tried to run a parallel code of mine, I got this:
error: mainddlin: /home/valentine/Documents/Codes_calcul/DDmpi_2/mpi/MPI_Init.oct: failed to load: /home/valentine/Documents/Codes_calcul/DDmpi_2/mpi/MPI_Init.oct: undefined symbol: _ZNK5ArrayISsE17resize_fill_valueEv
error: execution exception in mainddlin.m

Searching on the internet, I found that someone else had run into this issue.

In their case it was fixed by installing the package from Forge rather than from source, which is exactly what I was trying to avoid...

Do you have any ideas?

Val <valeee>
Tue 14 Mar 2017 09:23:04 AM UTC, comment #11:


I have uploaded a new pre-release version of the MPI
package that should finally fix this issue.

Instead of trying to add more logic to the current codebase
to fix the problems described here, I tried a completely
new approach, which consists of rewriting the Send and Recv
functions to use binary streams.

In addition to overcoming the current issues, this also greatly
simplifies the maintenance of the code. It might introduce some
overhead, but I was not able to measure any.
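
For the record, the idea can be sketched at the Octave level, though the actual implementation is C++ inside the oct-files; the helper names below are hypothetical and this is only a conceptual illustration of turning an arbitrary value into a byte stream that can travel in a single message:

  ## Serialize any Octave value (struct, cell, ...) to a byte stream using
  ## Octave's binary save format, and rebuild it on the receiving side.
  function bytes = serialize_value (value)       # hypothetical helper
    tmp = tempname ();
    save ("-binary", tmp, "value");
    fid = fopen (tmp, "rb");
    bytes = uint8 (fread (fid, Inf, "uint8"));
    fclose (fid);
    unlink (tmp);
  endfunction

  function value = deserialize_value (bytes)     # hypothetical helper
    tmp = tempname ();
    fid = fopen (tmp, "wb");
    fwrite (fid, bytes, "uint8");
    fclose (fid);
    s = load ("-binary", tmp);                   # struct holding the saved variable
    value = s.value;
    unlink (tmp);
  endfunction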

The pre-release can be found here: https://gitserver.mate.polimi.it/redmine/attachments/download/55/mpi-2.0.tar.gz

I would really appreciate some testing before I go through
the official release procedure, especially to check compatibility
with Octave 4.2.x and earlier, as I have only tested on Octave 4.3.0+.

Carlo de Falco <cdf>
Project Member, in charge of this item.
Wed 18 Jan 2017 10:06:37 PM UTC, comment #10:

The good news: the version from hg compiles using my homebrewed Octave 4.2 with a few warnings:

warning: 'real' is deprecated [-Wdeprecated-declarations]
warning: 'imag' is deprecated [-Wdeprecated-declarations]
warning: 'nelem' is deprecated [-Wdeprecated-declarations]
warning: 'capacity' is deprecated [-Wdeprecated-declarations]

I will do some tests (parareal...) but this might take some time.


Sebastian <sschoeps>
Wed 18 Jan 2017 04:08:30 PM UTC, comment #9:


Carlo de Falco <cdf>
Project Member, in charge of this item.
Tue 17 Jan 2017 01:48:00 PM UTC, comment #8:

Can you post the link please? I think the repo that I found is not up to date.

Sebastian <sschoeps>
Tue 17 Jan 2017 09:07:20 AM UTC, comment #7:

The version in the repository should compile with 4.2
(I have received at least one successful install report in private), but the issue with tags and structures is not reliably fixed.

If you could test it and report issues, that would be helpful.

If the package at least installs correctly on 4.2.0, I would lean towards making a release in order to get more reports about
how it works.

Carlo de Falco <cdf>
Project Member, in charge of this item.
Mon 16 Jan 2017 09:37:29 PM UTC, comment #6:

Dear all, what is the current status of the mpi package for 4.2?


Sebastian <sschoeps>
Fri 23 Oct 2015 03:13:26 PM UTC, comment #5:

It is my intention to apply this change to the MPI package,
but given some recent changes in core Octave, MPI will need many
more updates to be compatible with Octave 4.2.

I don't think I will be able to make a backward-compatible update, so if you need a version of MPI that works with Octave < 4.2, it is better to apply the patch yourself.

Carlo de Falco <cdf>
Project Member, in charge of this item.
Thu 22 Oct 2015 10:25:44 AM UTC, comment #4:

I need to install the mpi package on a new machine, and I am wondering whether this patch will be applied to the version in the hg repository, or whether I should continue maintaining my own patched version.

Best regards,

Alexander Barth <abarth>
Fri 12 Jun 2015 01:04:43 PM UTC, comment #3:

Yes, I have also noticed problems with non-scalar structures.
When you send to multiple nodes at once (using a vector of ranks), the current code can hang. I think this has to do with the handling of the tag parameter, which is a bit simplified in the attached patch.

For instance, I did not increment the tag by the capacity of the cell. I presume that in the original code this is done to avoid reusing the same tag number. That is a good idea, but it assumes that each individual cell can be sent with a single MPI_Send call (which is not the case if the individual element is, for instance, a struct).

A better approach, in my opinion, would be to pass the tag parameter by reference to the send_class/send_cell (and similar) functions and have the tag incremented by the actual MPI_Send/MPI_Recv calls (see the sketch below).

But as this is quite a large change, here I wanted to limit the extent of the patch.
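
A rough Octave-level sketch of that idea (purely illustrative: the real send_class/send_cell code is C++ inside the package, the helper below is hypothetical, and a real implementation would also have to transmit sizes and field names so the receiver can rebuild the value with matching tags):

  function tag = send_any (value, dest, tag, comm)   # hypothetical helper
    if (isstruct (value))                # assumes a scalar struct for simplicity
      fn = fieldnames (value);
      tag = send_any (fn, dest, tag, comm);
      for i = 1:numel (fn)
        tag = send_any (value.(fn{i}), dest, tag, comm);
      endfor
    elseif (iscell (value))
      for i = 1:numel (value)
        tag = send_any (value{i}, dest, tag, comm);
      endfor
    else
      MPI_Send (value, dest, tag, comm); # one actual MPI message ...
      tag++;                             # ... so the tag advances once per real send
    endif
  endfunction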

Alexander Barth <abarth>
Fri 12 Jun 2015 05:39:28 AM UTC, comment #2:


Thanks for the patch.

It seems you are trying to do more
than just solve the problem with scalar
structures in this code, right?

I'll look at it in more detail later
and add this fix in the next release.

thanks again,

Carlo de Falco <cdf>
Project Member, in charge of this item.
Wed 10 Jun 2015 04:08:37 PM UTC, comment #1:

Adding maintainer of mpi package.

Carnë Draug <carandraug>
Project Member
Wed 10 Jun 2015 12:23:37 PM UTC, original submission:

It seems that the MPI package cannot send scalar structures, e.g.
simple structs created by:

I get the error message:
error: MPI_Send: unsupported class scalar struct

If you send a struct array, it is transformed into a scalar struct at the receiving end.

The attached script test_mpi_send_receive.m shows this issue. I think there is also a problem in the code when sending a structure to multiple nodes: it loops over all ranks (rankrec), but within the loop it sends the message again to all nodes instead of only to the current node.
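
For reference, a minimal reproduction along these lines (purely illustrative: the MPI_Comm_Load/MPI_Send/MPI_Recv signatures are assumed to match the package's example scripts, and the struct contents are arbitrary):

  ## Run on two ranks, e.g. via mpirun with two processes.
  MPI_Init ();
  comm    = MPI_Comm_Load ("NEWORLD");
  my_rank = MPI_Comm_rank (comm);
  TAG     = 1;

  s = struct ("name", "octave", "value", 42);   # a simple scalar struct

  if (my_rank == 0)
    MPI_Send (s, 1, TAG, comm);                 # fails with "unsupported class scalar struct"
  else
    [r, info] = MPI_Recv (0, TAG, comm);
  endif

  MPI_Finalize ();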

I have prepared a patch which corrects these issues.
There is also a problem with cells, but I leave this for later.

I hope that you will accept the attached patch. It would be great if you could also include the test case (test_mpi_send_receive.m), which also tests other data types.

Best regards,

Alexander Barth <abarth>


Attached Files
file #34192:  mpi_struct.patch added by abarth (8KiB - text/x-patch)
file #34193:  test_mpi_send_receive.m added by abarth (4KiB - text/x-objcsrc)






    9 latest changes:

    Date                             Changed by   Updated field   Previous value => Replaced by
    Sun 13 Aug 2017 01:42:04 PM UTC  jwe          Summary         octave-forge-mpi issues with struct (with test case and patch) => [octave forge] (mpi) issues with struct (with test case and patch)
    Tue 14 Mar 2017 09:37:06 AM UTC  cdf          Carbon-Copy     - => Added -unavailable-
    Tue 14 Mar 2017 09:23:04 AM UTC  cdf          Release         3.8.0 => dev
                                                  Carbon-Copy     - => Added -unavailable-
    Fri 12 Jun 2015 05:48:43 AM UTC  cdf          Assigned to     None => cdf
    Wed 10 Jun 2015 04:08:37 PM UTC  carandraug   Carbon-Copy     - => Added -unavailable-
    Wed 10 Jun 2015 12:23:37 PM UTC  abarth       Attached File   - => Added mpi_struct.patch, #34192
                                                  Attached File   - => Added test_mpi_send_receive.m, #34193
                                                  Carbon-Copy     - => Added -unavailable-
