robots.txt is the well-known URL that search engines and other robots request before crawling your site. See the Wikipedia article on the robots exclusion standard for more details.
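For reference, a minimal robots.txt that allows all crawlers to index the whole site looks like this (an empty Disallow line means nothing is blocked):

```text
User-agent: *
Disallow:
```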
The best way to handle these requests (and the 404 errors they otherwise produce) on Google App Engine is to add a robots.txt file to your project and declare it as a static file. In app.yaml, for GAE/Python:
- url: /(robots\.txt)
  static_files: \1
  upload: (robots\.txt)
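For context, here is a sketch of how that handler might sit inside a minimal app.yaml; the application id and runtime values below are placeholders, not taken from the original answer:

```yaml
application: your-app-id   # placeholder, use your own app id
version: 1
runtime: python27
api_version: 1
threadsafe: true

handlers:
# Serve robots.txt from the project root as a static file
- url: /(robots\.txt)
  static_files: \1
  upload: (robots\.txt)
```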
And in appengine-web.xml, for GAE/Java:
<?xml version="1.0" encoding="UTF-8"?>
<appengine-web-app xmlns="http://appengine.google.com/ns/1.0"
    xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'
    xsi:schemaLocation='http://kenai.com/projects/nbappengine/downloads/download/schema/appengine-web.xsd appengine-web.xsd'>
  ....
  <static-files>
    <include path="/favicon.ico" />
    <include path="/robots.txt" />
    <include path="/img/**.png" />
    <include path="/img/**.gif" />
    <include path="/css/**.css" />
  </static-files>
Of course, you can also simply ignore these errors: they matter to no one but you, since no visitor actually encounters them.