Sitemap contains URLs blocked by robots.txt

We had a freak situation where a robots.txt file that blocked crawling was deployed to our WordPress site for about 7 days. Now I'm trying to clean up the aftermath, but Webmaster Tools still says "Sitemap contains URLs that are blocked by robots.txt" even AFTER I corrected the robots.txt file and re-allowed crawling. There is no reason the URLs should still be blocked, and when I check the example URLs it shows, they look fine.

Robots.txt URL: http://bit.ly/1u2Qlbx

Sitemap URL: http://bit.ly/1BfkSmx

Example URLs that Webmaster Tools reports as blocked by robots.txt: http://bit.ly/1uLBRea
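
For anyone who wants to double-check locally: the allowed/blocked status can also be verified against the live robots.txt with Python's standard robotparser. This is only a rough sketch; example.com and the sample paths stand in for my real domain and the flagged URLs.

```python
# Rough check: are specific URLs blocked by the live robots.txt?
# example.com and the sample paths are placeholders, not the real site.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://example.com/robots.txt")
rp.read()  # fetch and parse the current robots.txt

urls = [
    "http://example.com/",
    "http://example.com/sample-post/",
]
for url in urls:
    status = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", status)
```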

+5
3 answers

Don't worry: now that you have re-allowed the URLs, Googlebot will notice and crawl your pages again. The message will disappear from GWT within a few days.

In the meantime, you can test your robots.txt file with the corresponding GWT tool (the robots.txt Tester).

+5

Use this plugin:

https://wordpress.org/plugins/wp-robots-txt/

It will remove the previous robots.txt file and install a simple default WordPress robots.txt. Wait a day or so.

That should solve the problem.
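
For reference, a simple permissive WordPress robots.txt usually looks something like this (the sitemap URL is a placeholder; adjust it to your site):

```
User-agent: *
Disallow: /wp-admin/

Sitemap: http://example.com/sitemap_index.xml
```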

+4

Google may take some time to re-crawl your site; I would say waiting is probably your only option.

It has taken me up to 7 days to get things indexed properly after submitting a full sitemap through Webmaster Tools.

It looks like you are using Yoast SEO; that plugin should also tell you if there are any other problems.

+3

Source: https://habr.com/ru/post/1203466/

