GNU Parallel - build and execute command lines from standard input in parallel - Bugs: bug #36942, Can't redirect STDOUT: Inappropriate ioctl for device

 
 

bug #36942: Can't redirect STDOUT: Inappropriate ioctl for device

Submitted by:  Jay Hacker <jayqhacker>
Submitted on:  Tue 24 Jul 2012 05:05:51 PM UTC  
 
Category: None
Severity: 3 - Normal
Item Group: None
Status: Fixed
Privacy: Public
Assigned to: Ole Tange <tange>
Open/Closed: Closed


Mon 01 Oct 2012 07:03:25 PM UTC, comment #11:

Limited to 1000 jobs in parallel if perl version == 5.008008 in [5c557c0973]

Ole Tange <tange>
Project Administrator, in charge of this item.
Wed 15 Aug 2012 01:04:55 PM UTC, comment #10:

Doesn't sound like my idea of fun, either. :) I also found a workaround, so I'm not worried about it. Thanks for your help.

Jay Hacker <jayqhacker>
Tue 14 Aug 2012 06:25:13 PM UTC, comment #9:

http://perl5.git.perl.org/perl.git/tags might be useful for the binary search.

Ole Tange <tange>
Project Administrator, in charge of this item.
Tue 14 Aug 2012 06:24:17 PM UTC, comment #8:

I compiled perl 5.10 on the CentOS VM and ran finddupes with this perl. It did not fail with ulimit -n 16384.

$ perl --version
... v5.10.1 (perl-5.10.1-1-gca8de22*)

That leads me to believe that the problem is caused by a bug in perl-5.8.8 itself or in combination with CentOS.

It would be good to test whether perl-5.8.8 on a newer distribution exhibits the same behavior. That would rule out the combination with CentOS. Unfortunately I do not have access to such a system, and the build of perl-5.8.8 that I tried on my Ubuntu failed during configure.

But to get closer to a solution, I think we need to:

  • Download all releases of Perl (git may make this relatively painless).
  • Compile them on the CentOS VM.
  • Do a binary search to figure out which version makes it fail.
  • Report back.

Then I can put a hard limit in GNU Parallel that depends on the version of Perl using $].

As this is a lot of work, and as this bug can easily be worked around (just don't use -j0 on those systems), I will not be the one doing the binary search.

I can put in a hard limit for the version that is known bad (5.008008).
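A minimal sketch of that idea from the shell side (this is not the actual code inside GNU Parallel; the md5sum command and file names are only illustrative):

# Perl's $] variable reports the interpreter version, e.g. 5.008008 for perl-5.8.8.
if [ "$(perl -e 'print $]')" = "5.008008" ]; then
    jobs=1000   # known-bad Perl: stay below the number of job slots that fails
else
    jobs=0      # -j0: run as many jobs as the system allows
fi
parallel -j"$jobs" md5sum ::: /about/1200/big/files.*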

If you want to be the binary searcher, feel free to leave a note here.

Ole Tange <tange>
Project Administrator, in charge of this item.
Tue 14 Aug 2012 05:37:17 PM UTC, comment #7:

I can reproduce your error using the CentOS VM, too.

I tried on my Ubuntu and even with ulimit -n 16384 it works fine. My guess is that we are dealing with an old bug in perl/libc/... that has been fixed on newer systems.

The "solution" might therefore turn out to be: Identify if it is this kind of system, limit the number of job slots to 1019 (which seems to work fine).

Thanks for your hard work.

Ole Tange <tange>
Project Administrator, in charge of this item.
Tue 14 Aug 2012 01:38:34 PM UTC, comment #6:

I can reproduce the error on the CentOS 5.8 (i386) image from http://sourceforge.net/projects/virtualboximage/files/ with the following settings:

Base Memory: 4096 MB
Processors: 4

Using parallel from git revision b6a729c5 and the attached data:

$ su
# ulimit -n 16384
# time ./finddupes.sh small/{1..1021} > /dev/null

[snip...]

parallel: This should not happen. You have found a bug.
Please contact <parallel@gnu.org> and include:

  • The version number: 20120806
  • The bugid: Can't redirect STDOUT: Inappropriate ioctl for device
  • The command line being run
  • The files being read (put the files on a webserver if they are big)

If you get the error on smaller/fewer files, please include those instead.
parallel: Warning: No more processes: Decreasing number of running jobs to 1020. Raising ulimit -u may help.

Jay Hacker <jayqhacker>
Mon 06 Aug 2012 10:56:50 PM UTC, comment #5:

Could the problem be caused by your system allowing more file handles but relatively fewer processes (compared to file handles)? That might be why I cannot reproduce the error on virtual machines.

In that case please test git version [ed69039], which has better support for a limited number of processes.

Ole Tange <tange>
Project Administrator, in charge of this item.
Wed 01 Aug 2012 01:19:51 PM UTC, comment #4:

It is good you have been able to minimize the problem set.

I cannot reproduce the error on Ubuntu64, Debian64 stable, and RHEL WS 4u3.

I do not have access to RedHat 5.8, but according to http://en.wikipedia.org/wiki/CentOS CentOS 5.8 is based on RedHat 5.8.

I cannot reproduce the problem on CentOS 5.8 x86 either.

Can you find a virtual machine from
http://sourceforge.net/projects/virtualboximage/files/ where the error exists?

Ole Tange <tange>
Project Administrator, in charge of this item.
Thu 26 Jul 2012 08:46:56 PM UTC, comment #3:

Also, lowering the limit on open files (ulimit -n) to, say, 1024 makes it work.
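A sketch of that workaround, using the repro script and data from comment #2 (the file range is only illustrative):

ulimit -n 1024              # lower the open-file limit for this shell
./finddupes small/{1..1021}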

Jay Hacker <jayqhacker>
Thu 26 Jul 2012 04:53:08 PM UTC, comment #2:

OK, I was able to narrow it down; I have a script and some data (attached) that will reproduce the problem, at least on Red Hat 5.8 x86_64.

$ ./finddupes small/{1..1021}

will fail with the given error message. Using only the first 1019 files succeeds; 1020 gives the open3 error message, shows a job failed, but does not give the "should not happen" message. I can reproduce this error on several similar machines.
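Spelled out as commands (same script and data, just varying the number of arguments):

./finddupes small/{1..1019}   # succeeds
./finddupes small/{1..1020}   # open3 error, one job fails, but no "should not happen" message
./finddupes small/{1..1021}   # full "This should not happen. You have found a bug." output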

(file #26263)

Jay Hacker <jayqhacker>
Thu 26 Jul 2012 01:00:15 AM UTC, comment #1:

I understand you cannot include data. Do you get the same error with:

parallel md5sum ::: /about/1200/big/files.*

Can you generate the 1200 big files using something like:

seq 100000000 > big
seq 1200 | parallel cp big file{#}

Can you reproduce the error on any of the VMs here: http://sourceforge.net/projects/virtualboximage/files/

Ole Tange <tange>
Project Administrator, in charge of this item.
Tue 24 Jul 2012 05:05:51 PM UTC, original submission:

This may not be very helpful, as I can't include the data, but, on Red Hat 5.8 x86_64:

open3: open(GLOB(0x1f712bb0), >&Job::OUT) failed: Illegal seek at /usr/bin/parallel line 3674
parallel: This should not happen. You have found a bug.
Please contact <parallel@gnu.org> and include:

  • The version number: 20120722
  • The bugid: Can't redirect STDOUT: Inappropriate ioctl for device

The command is something like:

parallel --halt 2 cut -f1 \< {} \| tail -n+2 \> {#}.fifo ::: /about/1200/big/files.*

Where I am cutting out a particular field of a CSV file, getting rid of the header, and redirecting to a FIFO. The command line totals about 52K, and the --max-line-length-allowed is 128K. It works fine with about 1/10th the number of arguments. It seems to work OK if I use

ls /about/1200/big/files.* | parallel ...

but I'm not sure about that. I can't reproduce it with fewer/smaller files. Sorry I can't be more helpful, but maybe you can do something with this much.
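For one input file the escaped command above expands to roughly the following; the file name is only illustrative, {} is replaced by the input file and {#} by the job's sequence number:

cut -f1 < /about/1200/big/files.001 | tail -n+2 > 1.fifo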

Jay Hacker <jayqhacker>

 

Attached Files
file #26263:  bugdata.tar.bz2 added by jayqhacker (38KiB - application/x-bzip-compressed-tar)

 


 


Follow 7 latest changes.

Date                              Changed By   Updated Field   Previous Value => Replaced By
Mon 01 Oct 2012 07:03:25 PM UTC   tange        Status          In Progress => Fixed
                                               Open/Closed     Open => Closed
Tue 14 Aug 2012 06:24:17 PM UTC   tange        Status          Confirmed => In Progress
Tue 14 Aug 2012 05:37:17 PM UTC   tange        Status          Need Info => Confirmed
Tue 07 Aug 2012 09:31:22 AM UTC   tange        Status          None => Need Info
Mon 06 Aug 2012 10:56:50 PM UTC   tange        Assigned to     None => tange
Thu 26 Jul 2012 04:53:08 PM UTC   jayqhacker   Attached File   - => Added bugdata.tar.bz2, #26263
