The fastest way to read a csv file

I am looking for a very fast method for reading a csv file. My data structure is as follows:

    timestamp, float, string, ip, string
    1318190061,1640851625, lore ipsum,84.169.42.48,appname

and I use fgetcsv to read this data into arrays.

Problem: performance. On a regular basis, the script must read (and process) more than 10,000 records.

My first attempt is very simple:

    //Performance: 0,141 seconds / 13.5 MB
    while (!feof($statisticsfile)) {
        $temp = fgetcsv($statisticsfile);
        $timestamp[] = $temp[0];
        $value[]     = $temp[1];
        $text[]      = $temp[2];
        $ip[]        = $temp[3];
        $app[]       = $temp[4];
    }

My second attempt:

    //Performance: 0,125 seconds / 10.8 MB
    while (($userinfo = fgetcsv($statisticsfile)) !== FALSE) {
        list($timestamp[], $value[], $text, $ip, $app) = $userinfo;
    }
  • Is there a way to improve performance even more, or is my method as fast as it could get?
  • Perhaps more important: is there a way to choose which columns are read? For example, sometimes only the timestamp and float columns are needed. Is there a better way than mine (see my second attempt :)?

Thanks:)

2 answers

How long is the longest line? Pass that as the second parameter to fgetcsv() and you will see the biggest improvement.
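
A minimal sketch of that suggestion, applied to the asker's second loop (the filename and the 4096-byte length hint are assumptions, not from the original post). Listing only the first two variables in list() also answers the column question, since the remaining columns are simply ignored:

    // Sketch: same loop as the second attempt, but with an explicit length
    // hint so fgetcsv() does not have to probe for the end of each line.
    $statisticsfile = fopen('statistics.csv', 'r'); // hypothetical filename
    while (($userinfo = fgetcsv($statisticsfile, 4096)) !== FALSE) {
        // list() with fewer variables than columns keeps only the first two fields
        list($timestamp[], $value[]) = $userinfo;
    }
    fclose($statisticsfile);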


Check where PHP is reading this file from: if it is a big file, move it to a ramdisk or an SSD.

  • [..] sometimes only a timestamp

Something like this:

    $f = file_get_contents('statistics.csv'); // hypothetical filename
    // Note: the IP part of the pattern needs escaped dots and multi-digit octets
    preg_match_all('#\d{10},\d{10}, (.*?),\d+\.\d+\.\d+\.\d+,appname#', $f, $res);
    print_r($res);
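
If really only the timestamp column is needed, a simpler pattern over the same string avoids capturing the other fields entirely (a sketch building on the $f above; the variable names are illustrative):

    // Sketch: grab just the leading 10-digit timestamp of every line.
    // The m flag makes ^ match at the start of each line.
    preg_match_all('#^(\d{10}),#m', $f, $matches);
    $timestamps = $matches[1];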
