Deny access to sitemap.xml, but allow robots (e.g. Google)

Is there a way to allow only robots, such as Google, Yahoo, or other search engine crawlers, to access my sitemap file, which is located at http://www.mywebsite.com/sitemap.xml? Is it possible to block direct user access and serve the file only to robots?

+6
3 answers

Well, basically no, but you can get close by keying off the User-Agent string and denying access based on it (assuming Apache):

<Location /sitemap.xml>
    SetEnvIf User-Agent GodBot GoAway=1
    Order allow,deny
    Allow from all
    Deny from env=!GoAway
</Location>

But as they say here (where I found the syntax):

Warning:

Access control based on the User-Agent is an unreliable technique, because the User-Agent header can be set to anything at all, at the whim of the end user.
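
One way around that on the Apache side, which also avoids the 2.2-era Order/Allow/Deny directives (on Apache 2.4 they require mod_access_compat), is host-based access control: Require host performs a double reverse DNS check, so a forged PTR record alone is not enough. A minimal sketch, assuming Apache 2.4 and that only Googlebot should be let through:

    <Location "/sitemap.xml">
        # Allowed only if the client IP reverse-resolves to a name ending in
        # googlebot.com AND that name forward-resolves back to the same IP.
        Require host googlebot.com
    </Location>

The trade-off is a DNS lookup on every request, so this is best kept scoped to the one file, as above.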

+5

In my source, in PHP:

    $ip = $_SERVER["REMOTE_ADDR"];   // REMOTE_ADDR, not REMOTE_PORT
    $host = gethostbyaddr($ip);      // reverse DNS lookup on the visitor's IP
    if (strpos($host, ".googlebot.com") !== false) {
        readfile("sitemap.xml");     // serve the sitemap to Googlebot
    } else {
        header("Location: /");       // redirect everyone else to the home page
    }

+1

sitemap.php

    <?php
    $ip = $_SERVER["REMOTE_ADDR"];   // again: REMOTE_ADDR, not REMOTE_PORT
    $host = gethostbyaddr($ip);
    if (strpos($host, ".googlebot.com") !== false) {
        readfile("sitemap.xml");
    } else {
        header("Location: /");
    }
0
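
A caveat on both of the PHP answers above: a plain reverse lookup is itself forgeable, because whoever controls the IP block controls its PTR record and can make it claim to be under .googlebot.com. Google's documented verification is therefore a double lookup: reverse-resolve the IP, check the domain, then forward-resolve that host name and confirm it maps back to the same IP. A minimal sketch of sitemap.php with that extra step (the variable names and regex are illustrative, and gethostbyname() is IPv4-only):

    <?php
    $ip   = $_SERVER["REMOTE_ADDR"];
    $host = gethostbyaddr($ip);                      // reverse lookup: IP -> PTR name

    $verified = $host !== false
        && preg_match('/\.googlebot\.com$/', $host)  // name must end in .googlebot.com
        && gethostbyname($host) === $ip;             // forward lookup must round-trip

    if ($verified) {
        readfile("sitemap.xml");                     // genuine Googlebot: serve the sitemap
    } else {
        header("Location: /");                       // everyone else: bounce to the home page
    }
    exit;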

Source: https://habr.com/ru/post/891949/

