Unable to parse large XML file using XML::Twig

I am trying to parse a large XML file (about 100,000 records) using XML::Twig, but the Perl process crashes with an error:

perl.exe - Application Error: The instruction at "0x28086920" referenced memory at "0x00000004". The memory could not be "written"... 

I have read that XML::Twig can parse large XML files without any problems, but in my case it fails with the above error.

My .pl file contains a foreach loop that runs 100,000 times, like this:

 foreach my $d1 (@detailData) {
     if ($d1->first_child('a')->getElementsByTagName('b')) {
         $id = $d1->first_child('a')->first_child('x')->field('b');
     }
     # ...
 }

Inside the foreach loop I have about 20 blocks like the one above. Could this be causing the memory problem?

Can anyone suggest how to overcome this memory problem?

1 answer

After googling for perl "The memory could not be written", I would suggest that the problem occurs either from loading the whole file into memory at once (see, for example, http://www.perlmonks.org/?node_id=457265) or, less likely, from mixing modules compiled with different compilers (for example, using ActiveState packages with Cygwin perl, see http://cygwin.com/ml/cygwin/2006-12/msg00798.html).

To make XML::Twig work with huge files, you need to tell it at which level parts of the file should be processed, usually by defining handlers that process a subtree and then discard it. See the module's documentation.
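A minimal sketch of that pattern, assuming a structure like the one in the question (element names `detail`, `a`, `x`, `b` are guesses based on the posted loop): a `twig_handlers` callback fires once each `detail` record is fully parsed, extracts what it needs, and then calls `purge` so the already-processed part of the tree is freed instead of accumulating in memory.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use XML::Twig;

my @ids;

my $twig = XML::Twig->new(
    twig_handlers => {
        # called once per <detail> element, as soon as it is complete
        detail => sub {
            my ( $t, $detail ) = @_;
            my $a = $detail->first_child('a');    # cache the lookup instead of repeating it
            if ( $a && $a->first_child('b') ) {
                push @ids, $a->first_child('x')->field('b');
            }
            $t->purge;    # discard everything parsed so far to free memory
        },
    },
);

# In real use you would call $twig->parsefile('big.xml');
# here a small inline document stands in for it.
my $xml = <<'XML';
<details>
  <detail><a><b>yes</b><x><b>id-1</b></x></a></detail>
  <detail><a><x><b>id-2</b></x></a></detail>
  <detail><a><b>yes</b><x><b>id-3</b></x></a></detail>
</details>
XML

$twig->parse($xml);
print scalar(@ids), " ids extracted\n";
```

Because each record is handled and purged as it streams past, peak memory stays roughly proportional to one record rather than to the whole 100,000-record file.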


Source: https://habr.com/ru/post/1393902/
