How do I stop bots and malicious scrapers that slow down my site?

What can I do to keep users from running scrapers or automated bots that post to my site and slow it down?

Is it enough to record the time of every post a user makes and enforce a posting delay? How long should the interval be?

What else can I do besides the above and CAPTCHAs on form posts?

thanks

+4
2 answers

A time interval is a good idea and is widely used. Different operations should have different limits, depending on:

  • How often ordinary users are likely to use the feature.
  • How resource-intensive the operation is.

If an operation requires a lot of processing time, you can give it a longer delay than a relatively simple operation.
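
As a rough illustration of per-operation delays (not part of the answer itself), here is a minimal Python sketch; the operation names and intervals are made up for the example and would need tuning for a real site.

```python
import time

# Hypothetical per-operation minimum intervals in seconds; tune to your own site.
MIN_INTERVAL = {
    "post_comment": 30,      # cheap operation, short delay
    "run_search": 10,
    "generate_report": 300,  # expensive operation, much longer delay
}

# last_action[(user_id, operation)] -> timestamp of the user's last allowed attempt
last_action = {}

def is_allowed(user_id: str, operation: str) -> bool:
    """Return True if the user has waited long enough since their last attempt."""
    now = time.monotonic()
    key = (user_id, operation)
    min_interval = MIN_INTERVAL.get(operation, 30)  # unknown operations fall back to 30 s
    previous = last_action.get(key)
    if previous is not None and now - previous < min_interval:
        return False
    last_action[key] = now
    return True
```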

Stack Overflow combines rate limits with CAPTCHAs for post edits: if you edit too often, you have to pass a CAPTCHA.
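
The exact thresholds Stack Overflow uses are not given here; the following Python sketch only shows the general "too many actions in a window, so require a CAPTCHA" pattern, with assumed numbers.

```python
import time
from collections import defaultdict, deque

EDIT_WINDOW = 60         # sliding window in seconds (assumed value for illustration)
MAX_EDITS_IN_WINDOW = 5  # edits allowed before a CAPTCHA is demanded (assumed value)

recent_edits = defaultdict(deque)  # user_id -> timestamps of recent edit attempts

def needs_captcha(user_id: str) -> bool:
    """Record an edit attempt and decide whether to demand a CAPTCHA."""
    now = time.monotonic()
    edits = recent_edits[user_id]
    edits.append(now)
    # Drop edits that have fallen out of the sliding window.
    while edits and now - edits[0] > EDIT_WINDOW:
        edits.popleft()
    return len(edits) > MAX_EDITS_IN_WINDOW
```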

+1

I searched for this a year or so ago and found a list of known "bad user agents" that I added to my .htaccess to block their access to my blog. That small change had a significant impact on my bandwidth usage.
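
The answer does not include the actual blocklist; below is a minimal .htaccess sketch of the technique for Apache 2.4, where "BadBot" and "EvilScraper" are placeholder user-agent substrings you would replace with entries from whatever list you trust.

```apache
# Tag requests whose User-Agent matches a known bad pattern (placeholders shown).
SetEnvIfNoCase User-Agent "BadBot"      bad_bot
SetEnvIfNoCase User-Agent "EvilScraper" bad_bot

# Allow everyone except requests tagged as bad_bot.
<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>
```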

+1

Source: https://habr.com/ru/post/1305500/

