Search engine bots and meta refresh for disabled JavaScript

I have a site where JavaScript must be enabled for it to work.

There is a <noscript> block containing a meta refresh that redirects the user to a page warning them that JavaScript is disabled...

I wonder whether this is bad for search engine robots.

I send myself an email whenever someone visits without JS, so I can judge whether the site needs to be rebuilt for those people. It turns out 100% of real visitors have JS enabled; the only ones without JS are search robots... I thought Google, Yahoo, etc. don't take meta refreshes seriously when they're inside <noscript>?

Do I have to detect whether they are bots and skip the meta redirect for them?

Thanks,
Joe

+4
6 answers

Instead of forcibly redirecting the user / bot, why not just put text at the top of the page asking them to enable JavaScript to use the site?

That lets bots still read the page and follow the non-JavaScript links. It gets rid of the redirect problems, there's no need to serve bots a different page, and the same approach works across multiple pages.

You can also take a look at Google's Webmaster Tools to see exactly what Google currently reads on your site, and improve things based on that.

Example: disabling JavaScript on Stack Overflow produces a red banner at the top that simply says "Stack Overflow works best with JavaScript enabled". You could make that banner a link to a page with more information if you feel it isn't enough.
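
A minimal sketch of that approach (the banner text and the /no-js.html help page are placeholders, not anything from the question):

 <noscript>
   <div style="background: #c00; color: #fff; padding: 6px; text-align: center;">
     This site works best with JavaScript enabled.
     <a href="/no-js.html" style="color: #fff;">Learn more</a>
   </div>
 </noscript>

Bots and no-JS visitors see the banner but can still crawl the rest of the page; everyone else never notices it.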

+3

Have you tried <!--googleoff: all--> <noscript><meta redirect... /></noscript> <!--googleon: all--> ? It's not a complete solution, but it might be worth a try...
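
Spelled out, that idea looks something like this; note that googleon/googleoff are documented for the Google Search Appliance rather than the public web crawler, and the refresh target below is a made-up expansion of the elided "meta redirect..." above:

 <!--googleoff: all-->
 <noscript>
   <meta http-equiv="refresh" content="0; url=/no-js.html">
 </noscript>
 <!--googleon: all-->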

+2

Here is what I would do:

  • Make the site work at least somewhat without JavaScript. If you use AJAX everywhere, make sure your links still have a real href pointing at the URL you would navigate to; that way the site works "somewhat" even with JavaScript off (see the sketch after this list).
  • Add some .htaccess redirects for bots: send them somewhere sensible where they can follow links and index content.
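
A minimal sketch of that link pattern, where loadPage is a hypothetical AJAX helper:

 <!-- The href is a real URL that bots and no-JS users can follow; -->
 <!-- JS users get the AJAX version and the default navigation is cancelled. -->
 <a href="/articles.html" onclick="loadPage('/articles.html'); return false;">Articles</a>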

As it stands, your site is probably doing very poorly in terms of search and SEO.

Edit: OK, I see your problem. Crawlers are being redirected because they do read what's inside <noscript>.

What about this solution, then:

If you have only one page with the <noscript>, you can add some rewrite rules to your Apache configuration that show bots a different version of the page, one without the <noscript> tag. For example:

 # Serve the meta-free version of the page to the major crawlers.
 RewriteEngine On
 RewriteCond %{HTTP_USER_AGENT} Googlebot [OR]
 RewriteCond %{HTTP_USER_AGENT} msnbot [OR]
 RewriteCond %{HTTP_USER_AGENT} Slurp
 RewriteRule ^.*$ nometa.html [L]

Also, what technology are you using? Do you use any server-side language? Are you even on Apache? I assumed Apache + plain HTML with nothing server-side. If you do have something server-side, this gets even easier.

+1

Since <meta> is not allowed in the <body> of the page and <noscript> is not legal in the <head> section, bots may simply reject the page when they run into the invalid HTML.

I suggest you simply use <noscript> to wrap a warning message and a link the user can click if they don't have JavaScript.

You can keep search engines from following that link via the /robots.txt file, or by placing the

 <meta name="ROBOTS" content="NOINDEX,NOFOLLOW" /> 

tag on the page the link points to.
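
Putting both pieces together, here is a sketch; /no-js.html is a hypothetical name for the warning page:

 <!-- On every page of the site: -->
 <noscript>
   <p>This site requires JavaScript. <a href="/no-js.html">Why?</a></p>
 </noscript>

 <!-- In the <head> of /no-js.html, so it is neither indexed nor followed: -->
 <meta name="ROBOTS" content="NOINDEX,NOFOLLOW" />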

+1

You could have a page that says "You need JavaScript," and then add this to that page:

 <script>
   // Visitors with JavaScript get forwarded to the real page immediately.
   window.location.href = '/thejspage.html';
 </script>

That way, people with JavaScript support are quickly sent on to the working page, while spiders simply stay put instead of indexing a page that requires JavaScript.

It should also help your SEO (since search engines will find a page that ordinary users can see).
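
As a complete page, that might look like the following sketch; /thejspage.html comes from the snippet above, everything else is placeholder:

 <!DOCTYPE html>
 <html>
 <head>
   <title>JavaScript required</title>
   <script>
     // JS-capable visitors are forwarded right away; bots and no-JS
     // visitors stay on this static, indexable page.
     window.location.href = '/thejspage.html';
   </script>
 </head>
 <body>
   <p>You need JavaScript to use this site.</p>
 </body>
 </html>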

+1

Perhaps you could use a headless browser to generate an HTML snapshot of the page for those who don't have JavaScript, including crawlers.

http://code.google.com/web/ajaxcrawling/docs/getting-started.html
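
In that scheme, a page opts in with a fragment meta tag, and the crawler then requests an _escaped_fragment_ URL that your server answers with the pre-rendered snapshot; a sketch:

 <!-- In the <head> of an AJAX page: tells the crawler a snapshot exists. -->
 <meta name="fragment" content="!">

 <!-- The crawler then fetches the page as: -->
 <!--   http://example.com/page?_escaped_fragment_= -->
 <!-- and the server should answer with the headless-browser snapshot. -->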

0

Source: https://habr.com/ru/post/1307451/

