I built a script in Laravel that reads a JSON file line by line and imports its contents into my database.
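The reading loop looks roughly like this (a simplified sketch: I'm showing one JSON object per line and an example file path here; the real script just hands each decoded object to the callback shown further below):

$handle = fopen(storage_path('features.json'), 'r'); // example path, not the real one

while (($line = fgets($handle)) !== false) {
    // Decode one JSON object per line and hand it to the insert callback below.
    $this->callback(json_decode(trim($line), true));
}

fclose($handle);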
However, when I run the script, I get an out-of-memory error after inserting roughly 80 thousand records:
mmap() failed: [12] Cannot allocate memory
mmap() failed: [12] Cannot allocate memory
PHP Fatal error: Out of memory (allocated 421527552) (tried to allocate 12288 bytes) in /home/vagrant/Code/sandbox/vendor/laravel/framework/src/Illuminate/Database/Query/Builder.php on line 1758
mmap() failed: [12] Cannot allocate memory
PHP Fatal error: Out of memory (allocated 421527552) (tried to allocate 32768 bytes) in /home/vagrant/Code/sandbox/vendor/symfony/debug/Exception/FatalErrorException.php on line 1
I created a kind of temporary queue so that the collected items are only inserted every 100 records, but that made no difference.
This is what the part of my code that does the inserts looks like:
public function callback($json)
{
    if ($json) {
        $this->queue[] = [
            'type'       => serialize($json['type']),
            'properties' => serialize($json['properties']),
            'geometry'   => serialize($json['geometry'])
        ];

        if (count($this->queue) == $this->queueLength) {
            DB::table('features')->insert($this->queue);
            $this->queue = [];
        }
    }
}
It is the actual inserts (DB::table('features')->insert($this->queue);) that cause the error; if I leave them out, I can iterate over all the rows and output them without any performance problems.
I suppose I could allocate more memory, but I doubt that would be a solution: I am trying to insert 3 million records, and it already fails after 80K with 512 MB of allocated memory. Besides, I really want to be able to run this script on a low-budget server.
The execution time of the script is not a concern, so if I could somehow slow down the insertion of records, that would be a solution I could live with.
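For example, I imagine something along these lines, where each batch insert is followed by a short pause (the sleep duration here is arbitrary, purely to illustrate the idea):

if (count($this->queue) == $this->queueLength) {
    DB::table('features')->insert($this->queue);
    $this->queue = [];

    usleep(100000); // arbitrary 100 ms pause between batches, just as an illustration
}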