Perl Parallel::ForkManager doesn't allow me to collect data from child processes

Question:

Maybe because the child processes are not aware of my hash (see the code below), the hash %output is not collecting anything. Is there any other way to collect the values apart from writing a temp file?

foreach $Item (@AllItems) {
    $pid = $pm->start($Item) and next;
    $Temp = qx($Item);
    $output{$Item} = $Temp;   # This doesn't collect anything. :-(
    $pm->finish;
}

$pm->wait_all_children;

TIA, Tim

Answer 1:

Forked processes have their own copies (well, copies-on-write) of the parent process's memory. Writing to the hash in a child process won't affect the hash in the parent.

To do what you want, you'll need to employ some sort of IPC. See the perlipc manpage for a lengthy discussion of the various possibilities.

For something like this, I'd probably use something simple like an on-disk hash. DB_File provides a nice tied hash interface. Here's how you might do it:

use strict;
use warnings;

use DB_File;

# Tie %output to an on-disk file: the children inherit the tie, and
# whatever they store ends up in output.dat where the parent can see it.
tie my %output, "DB_File", "output.dat";

# $pm and @AllItems are assumed to be set up as in the question.
foreach my $item (@AllItems) {
    my $pid = $pm->start and next;
    $output{$item} = qx($item);
    $pm->finish;
}
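After the loop, the parent should call wait_all_children and can then read the results back through the same tied hash. A minimal sketch of that read-back step (note that DB_File itself does no locking, so this assumes the children's writes don't collide):

$pm->wait_all_children;

# Every value the children stored in output.dat is visible here.
while ( my ($item, $result) = each %output ) {
    print "$item:\n$result\n";
}

untie %output;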


Answer 2:

Each process has its own memory, and data is not shared between processes. But you have several options:

  1. Write data from the child processes to temp files to be read by the parent, as you suggest
  2. Use sockets or pipes to accomplish the same thing (a rough sketch follows this list)
  3. Use threads with shared variables instead of fork()
  4. Use a shared memory facility
  5. Use a lightweight database (SQLite, or maybe DBD::CSV). This is a fancy way of using temporary files.
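A rough sketch of option 2, using a plain fork() with one pipe per command rather than Parallel::ForkManager (the echo commands are stand-ins for whatever @AllItems really holds):

use strict;
use warnings;

my @AllItems = ('echo one', 'echo two');    # stand-ins for the real commands
my (%output, %readers);

foreach my $item (@AllItems) {
    pipe(my $reader, my $writer) or die "pipe: $!";
    defined(my $pid = fork()) or die "fork: $!";
    if ($pid == 0) {                        # child: run the command, write its output to the pipe
        close $reader;
        print {$writer} scalar qx($item);
        close $writer;
        exit 0;
    }
    close $writer;                          # parent keeps only the read end
    $readers{$item} = $reader;
}

foreach my $item (@AllItems) {              # parent: slurp each child's output
    my $fh = $readers{$item};
    $output{$item} = do { local $/; <$fh> };
    close $fh;
}
1 while wait() != -1;                       # reap the children

print "$_ => $output{$_}" for sort keys %output;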

Any more? I don't really know how to use the built-in shmget/shmread/shmwrite functions, or whether they would help here; a rough sketch based on the perlipc manpage is below. Other commenters, please feel free to edit.
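A minimal single-child sketch along the lines of the perlipc example (the segment size and message are placeholders, and error handling is kept to a minimum):

use strict;
use warnings;
use IPC::SysV qw(IPC_PRIVATE IPC_RMID S_IRUSR S_IWUSR);

my $size = 1024;                                      # fixed-size segment, placeholder value
my $id   = shmget(IPC_PRIVATE, $size, S_IRUSR | S_IWUSR);
defined $id or die "shmget: $!";

defined(my $pid = fork()) or die "fork: $!";
if ($pid == 0) {                                      # child: write its result into the segment
    my $msg = "result from child $$";
    shmwrite($id, $msg, 0, length $msg) or die "shmwrite: $!";
    exit 0;
}

waitpid($pid, 0);                                     # parent: wait for the child, then read back
my $buf;
shmread($id, $buf, 0, $size) or die "shmread: $!";
$buf =~ s/\0+\z//;                                    # the rest of the segment is zero-filled; strip it
print "parent read: $buf\n";

shmctl($id, IPC_RMID, 0);                             # free the segment when done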