
perl - How to run multiple perl open commands asynchronously and display the output in order


I am trying to run several SSH commands against multiple servers asynchronously, and I want to capture the output of the commands and display it in order. To throw in an extra curveball, I want pid3 to run only after pid2 has completed, and pid4 to run only after the first three commands have completed. What is the best way to achieve this?

Example:

# $pid1 and $pid2 should run asynchronously
my $pid1 = open(my $SSH1, "|ssh -t -t runuser\@$server{'app'} 'sudo chef-client'");

my $pid2 = open(my $SSH2, "|ssh -t -t runuser\@$server{'auth'} 'sudo chef-client'");

# This command should wait for $pid2 to complete.
my $pid3 = open(my $SSH3, "|ssh -t -t runuser\@$server{'auth'} \"sudo -- sh -c '$update_commands'\"");

# This command should wait for $pid1 through $pid3 to complete before running.
my $pid4 = open(my $SSH4, "|ssh -t -t runuser\@$server{'varn'} \"sudo -- sh -c '$varn_commands'\"");

Best Answer

Forks::Super handles all of these requirements:

use Forks::Super;

# run $command1 and $command2, make stderr available
my $job1 = fork { cmd => $command1, child_fh => 'err' };
my $job2 = fork { cmd => $command2, child_fh => 'err' };

# job 3 must wait for job 2. Collect stdout, stderr
my $job3 = fork { cmd => $command3, depend_on => $job2, child_fh => 'out,err' };

# and job 4 waits for the other 3 jobs
my $job4 = fork { cmd => $command4, depend_on => [ $job1, $job2, $job3 ],
                  child_fh => 'out,err' };

# wait for jobs to finish, then we'll collect output
$_->wait for $job1, $job2, $job3, $job4;
my @output1 = $job1->read_stderr;
my @output2 = $job2->read_stderr;
my @output3 = ($job3->read_stdout, $job3->read_stderr);
my @output4 = ($job4->read_stdout, $job4->read_stderr);
...
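To then display the captured output in order (the step elided by the ... above), a minimal sketch could look like the following. It assumes the @output1..@output4 arrays were filled as shown, and the server labels are purely illustrative, not part of the Forks::Super API:

# a minimal sketch: print each job's captured lines in sequence,
# assuming the four jobs above have finished and their output
# was collected into @output1..@output4; labels are hypothetical
for my $pair ( [ 'app'           => \@output1 ],
               [ 'auth'          => \@output2 ],
               [ 'auth (update)' => \@output3 ],
               [ 'varn'          => \@output4 ] ) {
    my ($label, $lines) = @$pair;
    print "=== output from $label ===\n";
    print @$lines;
}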

Regarding "perl - How to run multiple perl open commands asynchronously and display the output in order", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/15825879/
