
How can I implement multithreading in my code to reduce the run time? Specifically, how do I implement multithreading in Perl while fetching files from a remote server?

use File::Fetch;    # needed for the fetch calls below

if (exists $ddts_attachments->{$id}->{'urls'}) {
    foreach my $url (sort keys %{ $ddts_attachments->{$id}->{'urls'} }) {
        # route the download through the proxy, then restore the environment
        $ENV{HTTP_proxy} = $proxy_url;
        my $ff    = File::Fetch->new(uri => $url);
        my $where = $ff->fetch(to => "/attachments5/$id/");
        my $file  = $ff->file;
        delete $ENV{HTTP_proxy};
        print "url: $file attached to $id key\n ......\n";
    }
}

The hash $ddts_attachments holds the list of URLs; I want to fetch the files from those URLs and store them in a directory. Could anyone please help me implement multithreading here so the downloads take less time?
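
For context, the hash is assumed to be laid out roughly as below; the concrete ID and URLs are placeholders, not values from the question. Only the keys of the inner 'urls' hash matter, since the loop above iterates over them with sort keys.

use strict;
use warnings;

# Hypothetical layout of $ddts_attachments (placeholder data).
my $ddts_attachments = {
    'DDTS00123456' => {                          # this key plays the role of $id
        'urls' => {
            'http://example.com/file1.zip' => 1,  # values are unused by the loop
            'http://example.com/file2.zip' => 1,
        },
    },
};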


Which operating system? – ysth


Right now I am running it on Windows, but in the future it will have to run on Linux. –

Answer


Here is a possible solution:

use strict;
use warnings;
use threads;
use Thread::Queue;
use File::Fetch;

# $ddts_attachments, $id and $proxy_url are assumed to be set up earlier in your script.

my $queue = Thread::Queue->new();
my @threads;
my $maxthread = 5;    # how many worker threads to start
push @threads, threads->create(\&worker) for 1 .. $maxthread;

if (exists $ddts_attachments->{$id}->{'urls'}) {
    foreach my $url (sort keys %{ $ddts_attachments->{$id}->{'urls'} }) {
        $queue->enqueue($url);
    }
}
# tell every worker there is no more data to process
# (done outside the if, so the joins below never hang when there are no URLs)
$queue->enqueue(undef) for 1 .. $maxthread;

# wait here until all workers finish
$_->join for @threads;

sub worker {
    while (defined(my $url = $queue->dequeue)) {
        my $tid = threads->tid;
        print "Thread $tid got $url\n";
        # download the URL and store it under /attachments5/$id/
        local $ENV{HTTP_proxy} = $proxy_url;
        my $ff    = File::Fetch->new(uri => $url);
        my $where = $ff->fetch(to => "/attachments5/$id/");
        my $file  = $ff->file;
        print "Thread $tid url: $file attached to $id key\n ......\n";
    }
}
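
To make the data flow explicit: the main thread enqueues each URL, and every worker blocks in $queue->dequeue until it receives either an item or the terminating undef. Below is a minimal, self-contained sketch of just that handoff, independent of File::Fetch; the URLs are placeholders.

use strict;
use warnings;
use threads;
use Thread::Queue;

my $queue = Thread::Queue->new();

# One worker that prints whatever it receives until it sees undef.
my $thr = threads->create(sub {
    while (defined(my $item = $queue->dequeue)) {   # blocks until an item arrives
        print "worker got: $item\n";
    }
});

$queue->enqueue($_) for qw(http://a.example/x http://b.example/y);
$queue->enqueue(undef);    # tell the worker there is no more work
$thr->join;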

Thanks for the reply, amon, but how does the worker subroutine get the value of the URL? –


@user3177669 I didn't write this code, user1126070 did. The URL is passed to the worker through '$queue'. What exactly is your question? – amon


It works great. –
