Adding a header to the response for specific URLs using HAproxy

I have a simple condition in my HAProxy configuration (I tried this in both the frontend and the backend):

    acl no_index_url path_end .pdf .doc .xls .docx .xlsx
    rspadd X-Robots-Tag:\ noindex if no_index_url

It should add an X-Robots-Tag: noindex header to content that should not be indexed. However, checking the configuration gives me this WARNING:

 acl 'no_index_url' will never match because it only involves keywords that are incompatible with 'backend http-response header rule' 

and

 acl 'no_index_url' will never match because it only involves keywords that are incompatible with 'frontend http-response header rule' 

According to the documentation, rspadd can be used in both the frontend and the backend, and path_end appears in frontend examples. Why am I getting this warning, and what does it mean?


Starting with HAProxy 1.6, you cannot simply ignore this warning. To get it working, capture the path in a temporary variable:

    frontend main
        http-request set-var(txn.path) path

    backend local
        http-response set-header X-Robots-Tag noindex if { var(txn.path) -m end .pdf .doc }
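You can verify the fix by requesting a matching file and inspecting the response headers. A minimal sketch: against a live proxy you would point curl at your own host and a real .pdf path (the host and file name below are assumptions); here a canned response stands in for the curl output so the check is self-contained:

```shell
# Against a running HAProxy you would run:
#   curl -sI http://localhost/sample.pdf
# (host and file name are assumptions). A canned response stands
# in for the live output below.
response='HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex'

# Case-insensitive match, since header names are not case-sensitive.
printf '%s\n' "$response" | grep -i '^x-robots-tag'
# prints: X-Robots-Tag: noindex
```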

Apparently, even with the warning, the acl in the frontend works fine: all resources ending in .pdf, .doc, etc. get the correct X-Robots-Tag header added to them.

In other words, the WARNING is misleading and the acl does in fact match.
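For reference, a minimal sketch of the frontend setup this answer describes, with the acl and rspadd from the question placed together in the frontend (the bind port and backend name are assumptions):

```
frontend main
    bind *:80
    acl no_index_url path_end .pdf .doc .xls .docx .xlsx
    rspadd X-Robots-Tag:\ noindex if no_index_url
    default_backend local
```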


Source: https://habr.com/ru/post/985458/
