Why is my Perl script for slow file decompression when using streams?

So, I run perl 5.10 on the 2 duo macbook pro core compiled with threading support: usethreads=define, useithreads=define. I have a simple script to read 4 gzipped files containing aroud 750,000 lines each. I am using Compress :: Zlib to perform file compression and reading. I have 2 implementation, the only difference between them use threads. In addition, both scripts execute the same subroutine for reading. Therefore, in psuedocode, a program without streaming does the following:

read_gzipped(file1);
read_gzipped(file2);
read_gzipped(file3);
read_gzipped(file4);

The threaded version is as follows:

my thr0 = threads->new(\$read_gzipped,'file1')
my thr1 = threads->new(\$read_gzipped,'file1')
my thr2 = threads->new(\$read_gzipped,'file1')
my thr3 = threads->new(\$read_gzipped,'file1')

thr0->join()
thr1->join()
thr2->join()
thr3->join()

2 , -threaded script. , , . - , ?

+3
4

- GZIP - . , platter, . . , , .

+9

, , IO-, . -, script.

+10

ithreads , -, . cpu.

Parallel::ForkManager.

- Perl .

+4

, /, top, . depesz, , / ( ), , .

, /, , [1] - 100%, / - , , , .

[1] , , , , make , , , . , , , , , . , . I/O ( - , // -), .

0

Source: https://habr.com/ru/post/1722335/


All Articles