I am looking for a very fast method for reading a csv file. My data structure is as follows:
timestamp, float, string, ip, string
1318190061,1640851625,lore ipsum,84.169.42.48,appname
and I use fgetcsv to read this data into arrays.
Problem: performance. On a regular basis, the script must read (and process) more than 10,000 records.
My first attempt is very simple:
// Performance: 0.141 seconds / 13.5 MB
while (!feof($statisticsfile)) {
    $temp = fgetcsv($statisticsfile);
    if ($temp === false) {
        // fgetcsv returns false on the final read after EOF;
        // skip it to avoid warnings when indexing $temp
        continue;
    }
    $timestamp[] = $temp[0];
    $value[]     = $temp[1];
    $text[]      = $temp[2];
    $ip[]        = $temp[3];
    $app[]       = $temp[4];
}
My second attempt:
// Performance: 0.125 seconds / 10.8 MB
while (($userinfo = fgetcsv($statisticsfile)) !== FALSE) {
    list($timestamp[], $value[], $text, $ip, $app) = $userinfo;
}
- Is there a way to improve performance further, or is my method about as fast as it can get?
- Perhaps more importantly: is there a way to choose which columns are read? For example, sometimes only the timestamp and float columns are required. Is there a better way than mine (see my second attempt)?
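For reference, here is a sketch of what I mean by selecting columns: a hypothetical helper (`read_columns` is my own name, not an existing function) that takes a list of column indices and only appends those into result arrays, leaving the rest untouched. The column indices and file name are assumptions based on the layout above.

```php
<?php
// Hypothetical helper: read only the given column indices from a CSV file.
// Returns an array keyed by column index, each holding one array of values.
function read_columns(string $path, array $cols): array
{
    $out = array_fill_keys($cols, []);
    $fh  = fopen($path, 'r');
    while (($row = fgetcsv($fh)) !== false) {
        foreach ($cols as $i) {
            // Only the requested columns are copied; the other fields
            // in $row are discarded when $row is reused next iteration.
            $out[$i][] = $row[$i];
        }
    }
    fclose($fh);
    return $out;
}

// Usage sketch: keep only the timestamp (0) and float (1) columns.
// $data = read_columns('statistics.csv', [0, 1]);
// $timestamps = $data[0];
// $values     = $data[1];
```

Whether skipping columns this way actually saves time is an open question for me, since fgetcsv still parses the full line either way; it mainly saves the memory of the unused result arrays.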
Thanks:)