I have a PHP script that calls an API method that can easily return 6k+ results.
I am using PEAR DB_DataObject to write each row to the DB inside a foreach loop.
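Roughly, the write loop looks like the sketch below; the table and column names are made up for illustration, and $apiResults stands in for the decoded API response:

    require_once 'DB/DataObject.php';

    foreach ($apiResults as $row) {
        // each ->insert() issues its own INSERT statement
        $record = DB_DataObject::factory('user_result'); // hypothetical table
        $record->user_id   = $row['user_id'];
        $record->api_value = $row['value'];
        if (!$record->insert()) {
            error_log('insert failed for user ' . $row['user_id']);
        }
    }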
The above script batch-processes 20 users at a time, and although some of them will only have a few results, others will have more; worst case, every user has 1000s of results.
The loop that calls the API seems to be fine; batches of 20 every 5 minutes work well. My only concern is the 1000s of MySQL INSERTs per user (with a long pause between users while fresh API calls are made).
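To show what I mean, here is a rough sketch of the alternative I've been wondering about: collapsing each user's rows into a few multi-row INSERTs via plain mysqli instead of one DB_DataObject insert per row (connection details, table, and column names are placeholders):

    $mysqli = new mysqli('localhost', 'user', 'pass', 'mydb'); // assumed credentials

    $values = array();
    foreach ($apiResults as $row) {
        $values[] = sprintf("(%d, '%s')",
            (int)$row['user_id'],
            $mysqli->real_escape_string($row['value']));
    }

    // chunk the rows so each statement stays under max_allowed_packet;
    // 500 rows per statement is an arbitrary guess
    foreach (array_chunk($values, 500) as $chunk) {
        $mysqli->query('INSERT INTO user_result (user_id, api_value) VALUES '
            . implode(',', $chunk));
    }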
Is there a better way to do this, or is what I'm doing OK?