I want to block all bots / scanners / spiders from one specific directory. How can I do this using .htaccess? I searched a bit and found a solution based on blocking by user agent:
RewriteCond %{HTTP_USER_AGENT} googlebot
Now I need to add more user agents (ideally all known bots), and the rule should apply only to that one directory. I already have a robots.txt file, but not all scanners respect it ... Blocking by IP address is not an option. Are there other solutions? I know about password protection, but I'd first have to ask whether that is an option here. For now, I'm looking for a user-agent-based solution.
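A minimal sketch of what I have in mind, assuming mod_rewrite is available and the .htaccess file is placed inside the directory to be protected (so the rule only applies there); the bot names in the pattern are just examples, not a complete list of all known bots:

# .htaccess inside the protected directory
RewriteEngine On
# Match any of the listed user agents, case-insensitively
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|yandex|baiduspider|ahrefsbot|semrushbot|mj12bot) [NC]
# Return 403 Forbidden for matching requests
RewriteRule .* - [F,L]

Is this the right approach, and is there a maintained list of user agents I could plug into that pattern?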