Very good question, +1 from me, and the answer is not simple.
PHP has no built-in way to persist data across pages and across sessions, so you cannot restrict access by IP address unless you store the access information somewhere.
If you don't want to use a database for this, you can of course use the filesystem. I'm sure you already know how to do that, but here is one example:
DL Script Archives http://www.digi-dl.com/ (click on "HomeGrown PHP Scripts", then on "IP/networking", then on "View Source" for the "IP Blocker with Time Limit" section)
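If that archive disappears, the idea is easy to reproduce. Here is a minimal sketch of the file-per-IP approach (my own illustration, not the digi-dl script; the storage directory and the 5-second window are arbitrary choices):

    <?php
    // Sketch of file-based per-IP throttling: one timestamp file per client IP.
    $window = 5;                                    // seconds an IP must wait between requests
    $dir    = sys_get_temp_dir() . '/ip-throttle';  // any web-server-writable directory will do
    $file   = $dir . '/' . md5($_SERVER['REMOTE_ADDR']);

    if (!is_dir($dir)) {
        mkdir($dir, 0700, true);
    }

    $last = @filemtime($file);                      // mtime of this IP's previous request, if any
    if ($last !== false && time() - $last < $window) {
        header('HTTP/1.1 429 Too Many Requests');   // actually reject, don't just delay
        exit('Too many requests - please slow down.');
    }

    touch($file);                                   // record this request's timestamp
    // ...normal page processing continues here...

You would also want a cron job or similar to clean out stale timestamp files from time to time.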
The best option is "mod_throttle". With it you can limit each IP address to one request every five seconds by adding this directive to your Apache configuration file:
    <IfModule mod_throttle.c>
        ThrottlePolicy Request 1 5
    </IfModule>
But there is bad news: the author of mod_throttle has abandoned the project:
"Snert Apache modules currently CLOSED to the public until further notice. Questions as to why or requests for archives are ignored."
Another Apache module, mod_limitipconn, seems to be the one most commonly used these days. It does not allow arbitrary rate limits (such as "no more than ten requests every 15 seconds"); all it can do is cap the number of simultaneous connections from each IP address. Many webmasters argue that this is a good way to fight bot spam, but it is clearly less flexible than mod_throttle. A sample configuration follows the download links below.
You need different versions of mod_limitipconn, depending on which version of Apache you are using:
mod_limitipconn.c (for Apache 1.3): http://dominia.org/djao/limitipconn.html
mod_limitipconn.c (Apache 2.0 port): http://dominia.org/djao/limitipconn2.html
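A typical configuration looks roughly like this (my sketch based on the module's documented directives; the limit of 3 and the locations are just examples, and the module relies on mod_status, hence the ExtendedStatus line):

    ExtendedStatus On

    <IfModule mod_limitipconn.c>
        <Location />
            # at most 3 simultaneous connections per client IP for the whole site
            MaxConnPerIP 3
            # do not count image downloads against the limit
            NoIPLimit image/*
        </Location>
    </IfModule>

Note that MaxConnPerIP counts concurrent connections, not requests per unit of time, which is exactly the limitation described above.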
Finally, if your Apache server is hosted on a Linux machine, there is a solution that does not require recompiling the kernel: iptables firewall rules. This method is quite elegant and flexible enough to express restrictions such as "no more than three connections from this IP in one minute". Here is where it is described:
Linux Noob forums - SSH Rate Limit per IP http:
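In case that thread is unreachable, the usual recipe uses the iptables "recent" match. A sketch of the idea, adapted to HTTP (the port and thresholds are examples only; verify rule order on your own system):

    # Drop a client that has already opened 3 new connections to port 80 in the last 60 seconds...
    iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
             -m recent --name HTTP --rcheck --seconds 60 --hitcount 3 -j DROP
    # ...otherwise remember this new connection for the counter above.
    iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
             -m recent --name HTTP --set

Unlike the PHP and Apache approaches, this rejects the connection before it ever reaches the web server.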
I realize that none of these options is perfect, but they illustrate what is possible. Perhaps a local database would turn out to be the better approach after all. In any case, keep in mind that simply rate-limiting requests or throttling bandwidth does not solve the bot problem: the bots may take longer, but in the end they consume just as many resources as they would have without the slowdown. You have to actually reject their HTTP requests, not merely delay or spread them out.
Good luck with the escalating battle between content and spam!