We are making a whitelabel site that should not be indexed by Google.
Does anyone know of a tool to check whether Googlebot will index a given URL?
I placed <meta name="robots" content="noindex" /> on all pages, so the site should not be indexed, but I'd prefer to verify it at 110%.
I know I could use robots.txt, but the problem with robots.txt is this: our main site should be indexed, and it is the same IIS (ASP.NET) application as the whitelabel site - the only difference is the URL.
I can't vary the robots.txt file depending on the incoming URL, but I can add a meta tag to all pages from my code, roughly as in the sketch below.
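For context, this is a minimal sketch of how I'm emitting the tag conditionally from a Web Forms master page; whitelabel.example.com is just a placeholder for the actual whitelabel host: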
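```csharp
using System;
using System.Web.UI;
using System.Web.UI.HtmlControls;

public partial class SiteMaster : MasterPage
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Host the request actually arrived on; this is the only thing that
        // differs between the main site and the whitelabel site.
        string host = Request.Url.Host;

        // Placeholder hostname: replace with the real whitelabel domain.
        if (host.Equals("whitelabel.example.com", StringComparison.OrdinalIgnoreCase))
        {
            // Inject <meta name="robots" content="noindex" /> into <head runat="server">.
            Page.Header.Controls.Add(new HtmlMeta
            {
                Name = "robots",
                Content = "noindex"
            });
        }
    }
}
```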