I am using ASP.NET and I have a custom 404 page. When a user requests a URL that is not found, they are redirected to the custom 404 page. However, Google is indexing my 404 page itself, so it appears in search results as "404 (page not found)".
Does anyone have a solution?
You can make the error directory or the individual file inaccessible to Google by adding it to the robots.txt file in the root directory of your site. This applies to any directories or files that you do not want Google to index.
More on robots.txt: http://en.wikipedia.org/wiki/Robots.txt and http://www.robotstxt.org/.
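For example, a minimal robots.txt entry might look like this (the `/errors/` path is an assumption; use whatever directory actually holds your custom error page):

```
User-agent: *
Disallow: /errors/
```

Note that robots.txt only stops crawling; a page can still appear in results if other sites link to it, so the status-code and meta-tag approaches below are more reliable.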
If your "404" page is being indexed, it is most likely being served with an HTTP 200 status instead of HTTP 404. Google (and other search engines) will index any page returned with a 200 status, even if its content says "page not found".
Make sure your custom error page actually returns a 404 status code.
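In ASP.NET Web Forms, one way to do this is to set the status code in the error page's code-behind (the page name here is hypothetical; use your own):

```csharp
// NotFound.aspx.cs — code-behind of the custom 404 page
protected void Page_Load(object sender, EventArgs e)
{
    // Send a real 404 status instead of the default 200
    Response.StatusCode = 404;

    // On IIS 7+ integrated mode, keep IIS from replacing
    // this custom body with its own error page
    Response.TrySkipIisCustomErrors = true;
}
```

With this in place, search engines see a genuine 404 response and will drop the page from their index rather than treating it as ordinary content.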
Alternatively, you can tell Google not to index the 404 page with this meta tag:
<META NAME="ROBOTS" CONTENT="NOINDEX">
Google’s webmaster blog indicates that they will not index pages using this meta tag.
You will still see log entries where Google requests the 404 page, since Google must request a page in order to discover that it returns a 404 status. However, you should never see this page in the search results, if that is what you are asking.
Source: https://habr.com/ru/post/1699195/