bug #39814: Feature request: --pipe with :::

Submitter:  Ole Tange <tange>
Submitted:  Mon 19 Aug 2013 09:07:27 AM UTC
   
 
Category: None
Severity: 1 - Wish
Item Group: None
Status: Fixed
Privacy: Public
Assigned to: tange
Open/Closed: Closed

Sun 05 Feb 2017 01:49:16 AM UTC, comment #3: 

Fixed in [c028fa0].
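
A minimal sketch of how the fixed behaviour can be exercised, assuming the fix is the --tee option that appeared in GNU Parallel releases around this date:

seq 3 | parallel --pipe --tee 'echo {}; cat' ::: A B C

Each of the three jobs should receive the full input on stdin, with the usual --group style buffered output.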

Ole Tange <tange>
Group administrator
Fri 14 Nov 2014 03:46:18 PM UTC, comment #2: 

How should it work with --pipe-part?

Ole Tange <tange>
Group administrator
Fri 14 Nov 2014 03:45:45 PM UTC, comment #1: 

What would be the expected behaviour?

::: and :::: and -a should be interchangeable: if they contain the same data, the behaviour should be the same.

In this example the piped input is expected to go to all the commands:

cat id_rsa.pub | parallel --pipe ssh {} "sudo tee -a /home/foobar/.ssh/authorized_keys" :::: serverlist.txt

Like:

cat id_rsa.pub | tee >(ssh s1 "sudo tee -a /home/foobar/.ssh/authorized_keys") >(ssh s2 "sudo tee -a /home/foobar/.ssh/authorized_keys")

With --xapply one block should be given to one argument:

cat lines | parallel --xapply 'cat >{}' ::: {1..10}

would split the lines into 1 MB blocks and save the first 10 blocks to different files.
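
A rough plain-shell analogue of that behaviour (a sketch only: --pipe splits near the 1 MB mark on record boundaries rather than at exact byte offsets, and the output file names here are just for illustration):

cat lines | head -c 10M | split -b 1M - block.   # first ten ~1 MB blocks go to block.aa .. block.aj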

Ole Tange <tange>
Group administrator
Mon 19 Aug 2013 09:07:27 AM UTC, original submission:  

It never makes sense to have --pipe read its input from the command line: the input will always come from a pipe or from a file.

And it is a minimal restriction to have it only read from STDIN (i.e. -a will not work).

By making that restriction you could have commands like:

seq 3 | parallel --pipe 'echo {}; cat' ::: A B C

The idea here is that the same input will be sent to all three commands, so it will run:

seq 3 | parallel --pipe 'tee >(echo A; cat) >(echo B; cat) >(echo C; cat) >/dev/null'

but with normal --group buffering. Maybe like:

seq 3 | parallel --pipe 'tee >((echo A; cat)>/tmp/buffer1) >((echo B; cat)>/tmp/buffer2) >((echo C; cat)>/tmp/buffer3) >/dev/null'

There are several limiting problems:

  • Command line size: too many arguments or combinations will make the command line too big.
  • Number of processes: too many arguments or combinations will spawn too many processes.
  • Getting the exit code back if one of the processes dies: the total exit code should be that of the first process that failed.
  • Cleanup will not be done on Ctrl-C (see the sketch after this list).
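
A minimal plain-shell sketch of the fan-out itself, covering only the last two points; it assumes named pipes instead of process substitutions so the exit codes can be collected with wait, and the job template, argument list and file names are only for illustration:

#!/bin/bash
# Sketch: send the same stdin to several commands, buffer their output,
# report the exit code of the first job (in argument order) that failed,
# and clean up the temporary files on Ctrl-C.
tmpdir=$(mktemp -d) || exit 1
trap 'rm -rf "$tmpdir"' EXIT
trap 'exit 130' INT TERM           # make Ctrl-C run the EXIT trap above

pids=() fifos=()
i=0
for arg in A B C; do
    fifo=$tmpdir/fifo$i
    mkfifo "$fifo"
    fifos+=("$fifo")
    # '{ echo "$arg"; cat; }' stands in for the real job template.
    { echo "$arg"; cat; } <"$fifo" >"$tmpdir/out$i" &
    pids+=($!)
    i=$((i + 1))
done

tee "${fifos[@]}" >/dev/null       # fan stdin out to every job

first_failed=0
for pid in "${pids[@]}"; do
    wait "$pid"; rc=$?
    if [ "$rc" -ne 0 ] && [ "$first_failed" -eq 0 ]; then
        first_failed=$rc           # keep the first non-zero exit code
    fi
done
cat "$tmpdir"/out*                 # --group style: print buffered output last
exit "$first_failed"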


But if implemented, it would make a pretty solution to:
http://stackoverflow.com/questions/1570328/run-few-commands-simultaneously

find ./incoming/kontraktor/ -type f -name '*.html' | sort |
  parallel "awk 'NR % 3 == {}' | ./bin/foo.py -m 3 -b 3 | next_command" ::: 1 2 0  >> log/foo_log.log 2>&1



Ole Tange <tange>
Group administrator

 
