Best practice for high-load file input/output?

What is your recommended best practice for a LAMP server with many concurrent downloads? Do I need special handling for file I/O so that file locking doesn't cause requests to hang?

I mean, say I have a SUBSCRIBERS.CSV file that contains a bunch of names and email addresses, and I want people to be able to fill out an unsubscribe form. The unsubscribe action will scan this file and delete the line matching the given email address, if one exists. That seems like a simple task in PHP, but what happens when 10 people try to unsubscribe at the same moment that 10 new subscribers are being added? I suspect PHP may run into difficulties and throw errors because of file locking, unless Linux or PHP is more efficient than I think.
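To be concrete, the rewrite-in-place approach I have in mind would look something like this (a sketch only; `unsubscribe` is a name I made up, and it assumes the email address is the second CSV column):

```php
<?php
// Sketch: remove one email from the CSV under an exclusive lock, so
// concurrent subscribe/unsubscribe requests queue up on flock() instead
// of corrupting the file.
function unsubscribe(string $file, string $email): bool
{
    $fp = fopen($file, 'c+');   // read/write, create if missing, no truncate
    if ($fp === false) {
        return false;
    }
    flock($fp, LOCK_EX);        // block until we hold the exclusive lock

    // Read every row, keeping the ones that don't match.
    $kept = [];
    $removed = false;
    while (($row = fgetcsv($fp)) !== false) {
        // Assumption: column 0 is the name, column 1 is the email address.
        if (isset($row[1]) && strcasecmp($row[1], $email) === 0) {
            $removed = true;
            continue;
        }
        $kept[] = $row;
    }

    if ($removed) {
        // Rewrite the file in place while still holding the lock.
        ftruncate($fp, 0);
        rewind($fp);
        foreach ($kept as $row) {
            fputcsv($fp, $row);
        }
        fflush($fp);
    }

    flock($fp, LOCK_UN);
    fclose($fp);
    return $removed;
}
```

My worry is exactly this serialization: every request holds the lock for the full read-rewrite cycle, so under load they all queue up.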

Please note that my client requires a CSV file, not a database table. With a database table this wouldn't be a problem, but with file I/O I may run into trouble, right?

(BTW, to prevent identity theft, I use the .htaccess trick so the CSV can't be downloaded over the Internet by guessing its name; only my PHP script or FTP can access it.)
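The .htaccess trick I mean is along these lines (a sketch; exact directives depend on the Apache version, this is the 2.4 syntax):

```apache
# Block direct HTTP requests for the CSV. PHP (running server-side)
# and FTP access are unaffected.
<Files "SUBSCRIBERS.CSV">
    Require all denied
</Files>
```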

+3
2 answers

Don't keep the master data in a CSV file at all; keep it in a database and generate the CSV from it on demand. That way PHP never has to rewrite a shared CSV file under concurrent access, and the locking problem disappears.

For example, when someone requests http://example.com/SUBSCRIBERS.CSV, have a PHP script generate SUBSCRIBERS.CSV on the fly:

header("Content-type: text/csv");
$data = get_subscriber_data(); // e.g. a SELECT from the subscribers table
foreach ($data as $row) {
  // $row is an array of columns; note implode() does not quote or
  // escape, so this only works for values without commas or quotes
  print implode(',', $row) . "\n";
}
+8
header("Content-type: text/csv");
$data = get_subscriber_data();
$fp = fopen('php://stdout', 'w'); // write straight to the response body
foreach ($data as $row) {
  // $row is an array of columns; fputcsv() handles quoting and escaping
  fputcsv($fp, $row);
}
fclose($fp);
+1
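To make the static-looking URL actually invoke the script, a mod_rewrite rule along these lines would do it (`export.php` is a hypothetical name for the script above):

```apache
RewriteEngine On
# Serve requests for SUBSCRIBERS.CSV from the PHP exporter
# instead of a real file on disk.
RewriteRule ^SUBSCRIBERS\.CSV$ export.php [L]
```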

Source: https://habr.com/ru/post/1782219/
