Tracking and testing for abusive clients in PHP

Here is a problem that could be tackled in many different ways. Hopefully I can pin it down as I describe it and start getting suggestions.

I am developing a site that will replace an existing one. Historically, one of the problems we had was spider bots that swoop in and suck down all of the content. We don't actually mind the content being downloaded - in fact we're pleased about it - but some downloaders and download accelerators have proven problematic for the current site.

What I'm looking for is something that sits at the very top of my PHP and runs first. It takes a fingerprint of the page request (IP, referrer, request URI, cookies, session ID, whatever) and hands it off to... something. That something compares the fingerprint against the fingerprints seen in the last second or three and returns a verdict, based on some pre-configured thresholds, on what to do with the request (a rough sketch of such a check follows the threshold list below).

Some threshold values:

  • The client requested more than x pages in the last 0.n seconds.
  • The client requested the same page again within 0.n seconds.
  • The client submitted identical form data within the last n seconds.
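
Here is a minimal sketch of what that check could look like, assuming the APCu extension is available as the short-lived shared store; the function names and thresholds are made up for illustration:

    <?php
    // Fingerprint the request from whatever identifies the client well enough.
    function request_fingerprint(): string
    {
        return sha1(implode('|', [
            $_SERVER['REMOTE_ADDR']     ?? '',
            $_SERVER['HTTP_USER_AGENT'] ?? '',
            $_COOKIE[session_name()]    ?? '',
        ]));
    }

    // Count hits for a fingerprint+page pair inside a short window and return
    // a verdict. apcu_add() creates the counter with a TTL, so stale entries
    // expire on their own and the churn takes care of itself.
    function throttle_check(string $fingerprint, string $page, int $maxHits, int $windowSeconds): string
    {
        $key = "throttle:$fingerprint:" . sha1($page) . ':' . intdiv(time(), $windowSeconds);

        if (!apcu_add($key, 1, $windowSeconds)) {
            $hits = apcu_inc($key);
            if ($hits !== false && $hits > $maxHits) {
                return 'block';
            }
        }
        return 'allow';
    }

    // At the very top of every page: more than 5 hits on the same page
    // within a 2-second window gets blocked.
    if (throttle_check(request_fingerprint(), $_SERVER['REQUEST_URI'] ?? '/', 5, 2) === 'block') {
        http_response_code(503);
        header('Retry-After: 2');
        exit('Woah there - slow down a little.');
    }

Note that this is a fixed-window counter rather than a true sliding window; for windows this short that is usually close enough, and it avoids storing per-request timestamps.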

So you can see I am looking at fairly tight windows. Is tracking such things even feasible? Can I do it with some kind of flat file or database? Whatever I use to store the fingerprints between page loads will see a lot of churn, since most of the data will only be kept for a second or two. Should I just have something parse the Apache logs and check them against the thresholds? Should I look for some kind of external daemon that keeps the last second or two of data in memory and can be called from the script? Is there something in Apache that can handle this, so I only need to lean on the server admin to set it up?
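
For the "external daemon that keeps a second or two of data in memory" option, memcached is a natural fit: every key can be written with a short expiry, so the churn cleans itself up. A rough sketch using the Memcached extension, with a placeholder server address and limits:

    <?php
    $memcached = new Memcached();
    $memcached->addServer('127.0.0.1', 11211);

    // One counter per client+page, kept for only two seconds.
    $key = 'hits:' . sha1(($_SERVER['REMOTE_ADDR'] ?? '') . '|' . ($_SERVER['REQUEST_URI'] ?? ''));

    // add() succeeds only if the key does not exist yet; otherwise bump it.
    if ($memcached->add($key, 1, 2)) {
        $hits = 1;
    } else {
        $hits = $memcached->increment($key);
    }

    if ($hits !== false && $hits > 5) {
        http_response_code(503);
        exit;
    }

The same pattern works with Redis (INCR followed by EXPIRE) if that is already part of the stack, and it keeps the bookkeeping out of both the Apache logs and the main database.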

Finally, once the PHP decides to act, what is the appropriate response? An HTTP status code, maybe 408 or 503, and then just die? Something else? Some kind of "Woah there" page?
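
One possible way to combine those (just a sketch, not a recommendation): send 503 with a Retry-After header, since well-behaved automated clients understand it, and include a short "Woah there" body for anything that looks like a browser:

    <?php
    // $looks_like_browser is a placeholder for whatever heuristic is used
    // (e.g. inspecting the User-Agent header).
    http_response_code(503);      // 503 Service Unavailable suits throttling
    header('Retry-After: 3');     // seconds before the client should retry
    if (!empty($looks_like_browser)) {
        echo '<h1>Woah there</h1><p>You are requesting pages too quickly. Please wait a moment and try again.</p>';
    }
    exit;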

, /, ? DOS- ( ) , .

mod_evasive
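
mod_evasive keeps an in-memory hash of recent requests per client and starts returning 403 once per-page or per-site counts exceed configured thresholds, which maps closely onto the windows described above. A rough example of the directives it accepts (values are placeholders, and the filename in <IfModule> differs between Apache versions):

    <IfModule mod_evasive24.c>
        # Block when the same URI is requested more than 5 times in 1 second
        DOSPageCount      5
        DOSPageInterval   1
        # ...or when more than 50 requests hit the whole site in 1 second
        DOSSiteCount      50
        DOSSiteInterval   1
        # Offending clients stay blocked for 10 seconds
        DOSBlockingPeriod 10
    </IfModule>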

Source: https://habr.com/ru/post/1721546/

