Do the benefits of using jQuery to load content outweigh the SEO negatives?

I’m redesigning the site at the moment and considering using .load for most navigation, to make it faster and smoother for the user.

For this, I have links such as <a href="/the/link" id="linkId">link</a>

Then I bind a delegated handler with $("#main").on("click", "#linkId", ...) and return false so that the links are not followed normally.

I have /load/page.php and /page.php so I can serve either just the content to be loaded, or the full page if the user navigates to it directly.

Finally, whenever a page is loaded this way, I update the page hash with document.location.hash = "/" + $(this).attr("href");
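
Put together, a rough sketch of the setup described above, assuming a single #main container that fragments are loaded into and one delegated handler for all links inside it; the /load/ URL scheme and selectors are placeholders taken from the question, not a definitive implementation:

 // Delegated click handler on the #main container: one handler covers all
 // links inside it, rather than one handler per link id.
 $('#main').on('click', 'a', function () {
     var href = $(this).attr('href');   // e.g. "/file/page"
     // Assumption: prefixing the href with /load/ returns just the fragment.
     $('#main').load('/load' + href);
     // With hrefs that already start with "/", this yields "#/file/page".
     document.location.hash = href;
     return false;
 });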

This means site URLs will look like this:

 domain.com/#/file/page 

and like this for search engines:

 domain.com/file/page 

If a user arrives at a hashed URL directly, they are redirected to the corresponding search-engine URL with the following code, so I think I have everything covered:

 if (location.href.indexOf("#") > -1) { location.assign(location.href.replace(/\/?#/, "")); } 

I would block the hashed URLs from being indexed and only allow the valid URLs. However, if people link to the hashed URLs, would that be a problem for SEO?

Are there any serious flaws in this approach, and/or are there ways to improve it when trying to build a fully dynamic site?

+6
4 answers

First of all, I would not use two versions of the PHP handler. Instead, check whether the request is an AJAX request: if so, serve only the content, otherwise serve the full HTML. That way web crawlers still see the full HTML version.

Secondly, instead of serving different hrefs, use something like this:

 $(function () {
     $('a').click(function () {
         // load the content of $(this).attr('href') into the container
         // & change the hash
         return false;
     });
 });

This keeps the site SEO-friendly (as well as friendly to users without JS ;))

Take a look at http://tkyk.github.com/jquery-history-plugin/

// - added 18:45Z

If for some reason changing the dispatcher is too complicated or impossible, you can use the fragment-filtering option of jQuery's .load() (see the jQuery .load() docs); however, this creates unnecessary overhead and is better avoided.
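
For reference, a minimal sketch of that filtering option; the URL and container id here are just placeholders, and note that the full page is fetched even though only part of it is used, which is the overhead mentioned above:

 // When the URL passed to .load() is followed by a space and a selector,
 // jQuery fetches the whole page but injects only the matching fragment.
 $('#main').load('/the/link #main', function () {
     document.location.hash = '/the/link';
 });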

+4

You might want to read Google's guide on making AJAX applications crawlable: http://code.google.com/web/ajaxcrawling/

+1

You are doing it right. This is basic progressive enhancement. If the user does not have JS available, they get the static version no matter which URL they visit. If they do have JS, they get the AJAXified version.

Your method is correct, except that I would follow @migajek's suggestion about checking whether the request is AJAX. Here is one such solution: http://davidwalsh.name/detect-ajax

That way you do not need to maintain both /load/page.php and /page.php; a single file handles both cases.

In addition, search engines will not index the hash portion of the URL, so you do not have to worry about blocking it.

+1

Regarding the "hash location update": major browsers now support changing the entire URL without reloading the page (History pushState), not just the part after the #.

You should implement this functionality because:

  • URLs look better
  • If a user posts a link on a forum / blog / wherever, you want crawlers to associate the right content with that link.
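
A rough sketch of that approach, reusing the #main container and the /load/ convention from the question (both are assumptions borrowed from it, not part of the History API itself):

 $('#main').on('click', 'a', function () {
     var href = $(this).attr('href');
     // Fetch the AJAX fragment, then put the real URL in the address bar.
     $('#main').load('/load' + href, function () {
         if (window.history && history.pushState) {
             history.pushState({ href: href }, '', href);
         }
     });
     return false;
 });

 // Reload the matching content when the user presses Back/Forward.
 window.onpopstate = function () {
     $('#main').load('/load' + location.pathname);
 };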

As for "I would block the hashed URLs from being indexed and only allow the valid URLs":

The content after the hash stays on the client side; it never reaches the server unless you explicitly send it with JavaScript or something similar.

And since crawlers usually do not run JavaScript or "something similar", there is no #key for them to index in the first place, so there is nothing to block.


AJAX + SEO = No Biggie!

If you do it right, there really is no SEO penalty for using AJAX to make things nicer for your visitors (and easier on your servers).

You will need to keep your wits about you, as there are a few pitfalls along the way, but if you pay attention it will all work out.

+1

Source: https://habr.com/ru/post/903939/

