Google noscript snapshot, safe way

I have a page that relies so heavily on JavaScript that I can't realistically write an equivalent PHP version of it. As it stands, the page has a <noscript> tag telling non-JS users that the page cannot be used without JavaScript.

That said, I could take a snapshot of the rendered version of the page so that Google can see what the page contains and people can find my pages by searching for phrases.

So, is there a way to have my cake and eat it too? Would it make sense to keep a blank page for non-JS users, but serve a skeleton page when the Google bot comes in?

How could I do this? And would something like this hurt my ranking?

javascript php seo noscript
4 answers

It may be worth noting that Googlebot is known to run a significant amount of JavaScript now. As long as the page, once loaded, has all the necessary content, you may be fine (for Google, at least; other search engines, perhaps less so).

Failing that: you do not have to have exactly the same content in the noscript version of the page, just make an attempt to convey the same information. I don't know what your site involves, but if you can just dump a basic textual representation of the page's content, that may be enough.
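As a minimal, hypothetical sketch of that idea (the product markup and text are invented for illustration), the noscript fallback simply mirrors the text the script renders:

```html
<!-- The script-rendered text and the noscript fallback carry the same content. -->
<div id="product"></div>
<script>
  document.getElementById("product").innerHTML =
    '<p>Acme Widget, $19.99, in stock.</p><a href="/widgets">All widgets</a>';
</script>
<noscript>
  <p>Acme Widget, $19.99, in stock.</p>
  <a href="/widgets">All widgets</a>
</noscript>
```

Keeping the two in lockstep matters: Google's guidelines warn that substantially different content in the scripted and noscript variants can itself be treated as cloaking.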


In theory, you could serve something else to user agents that identify as Googlebot, but this is very risky, since Google can detect it as an attempt to deceive their robot.
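For illustration only, the kind of user-agent check being warned about might look like the following Node.js sketch (the function names and responses are hypothetical, and this approach is exactly what Google's cloaking policy targets):

```javascript
// DISCOURAGED: user-agent sniffing like this is what the cloaking
// policy penalizes; it is sketched here only to show what the answer
// is warning against. Names and markup are hypothetical.
function isGooglebot(userAgent) {
  // Naive check: the header is trivial to fake, and Google also
  // crawls with unlisted user agents to detect cloaking.
  return /Googlebot/i.test(userAgent || "");
}

function pageFor(userAgent) {
  return isGooglebot(userAgent)
    ? "<h1>Skeleton content for crawlers</h1>"
    : '<div id="app"></div><script src="/app.js"></script>';
}

console.log(pageFor("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"));
```

Note that the check runs on the raw User-Agent header, which anyone can send, so it neither reliably identifies Googlebot nor hides the trick from it.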


Just so you know, here is Google's policy on serving different content based on user agent:

Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.

Some examples of cloaking include:

- Serving a page of HTML text to search engines, while showing a page of images or Flash to users.
- Serving different content to search engines than to users.

If your site contains elements that search engines cannot crawl (for example, rich media files other than Flash, JavaScript, or images), you should not provide cloaked content to search engines. Rather, you should consider visitors to your site who cannot view these elements. For example:

- Provide alt text that describes images for visitors with screen readers or images turned off in their browsers.
- Provide the textual contents of JavaScript in a noscript tag.

Make sure that you provide the same content in both elements (for instance, provide the same text in the JavaScript as in the noscript tag). Including substantially different content in the alternative element may cause Google to take action on the site.

Sneaky JavaScript redirects

When Googlebot indexes a page containing JavaScript, it will index that page, but it cannot follow or index any links hidden in the JavaScript itself. The use of JavaScript is an entirely legitimate web practice. However, the use of JavaScript with the intent to deceive search engines is not. For instance, placing different text in JavaScript than in a noscript tag violates our Webmaster Guidelines because it displays different content for users (who see the JavaScript-based text) than for search engines (which see the noscript-based text). Along those lines, it also violates the Webmaster Guidelines to embed a link in JavaScript that redirects the user to a different page, with the intent to show the user a different page than the search engine sees. When a redirect link is embedded in JavaScript, the search engine indexes the original page rather than following the link, whereas users are taken to the redirect target. Like cloaking, this practice is deceptive because it displays different content to users and to Googlebot, and it can take a visitor somewhere other than where they intended to go.

Please note that placing links within JavaScript is not by itself deceptive. When examining JavaScript on your site to ensure it adheres to our guidelines, consider the intent.

Keep in mind that since search engines generally cannot access the content of JavaScript, legitimate links within JavaScript will likely be inaccessible to them (as well as to visitors without JavaScript-enabled browsers). You might instead keep links outside of JavaScript, or replicate them in a noscript tag.

Doorway pages

Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase. In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination.

Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users and are in violation of our Webmaster Guidelines.

Google's aim is to give our users the most valuable and relevant search results. Therefore, we frown on practices that are designed to manipulate search engines and deceive users by directing them to sites other than the ones they selected, and that provide content solely for the benefit of search engines. Google may take action on doorway sites and other sites making use of such deceptive practices, including removing them from the Google index.

If your site has been removed from our search results, review our Webmaster Guidelines for more information. Once you have made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.


I am working on a similar problem with a website, and there is a documented way to serve an HTML snapshot of the page. You can find it here on Google Developers:

https://developers.google.com/webmasters/ajax-crawling/docs/getting-started

That page was last updated in February 2012.
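A minimal Node.js sketch of that AJAX crawling scheme (which Google has since deprecated): the crawler rewrites a URL like /#!products into ?_escaped_fragment_=products, and the server answers that form with a prerendered snapshot. The snapshot table here is an invented stand-in for output from a headless browser:

```javascript
// Sketch of the AJAX crawling scheme: when the crawler asks for
// ?_escaped_fragment_=<page>, return a prerendered HTML snapshot
// instead of the empty JavaScript shell.
const snapshots = {
  // In practice these would be generated with a headless browser.
  products: "<html><body><h1>Products</h1><p>Full product text here.</p></body></html>",
};

function bodyFor(requestUrl) {
  const params = new URL(requestUrl, "http://example.com").searchParams;
  const fragment = params.get("_escaped_fragment_");
  if (fragment !== null && snapshots[fragment]) {
    return snapshots[fragment]; // the crawler sees real text
  }
  // Everyone else gets the normal JavaScript-driven page.
  return '<html><body><div id="app"></div><script src="/app.js"></script></body></html>';
}

console.log(bodyFor("/?_escaped_fragment_=products"));
```

Because the snapshot is only served for the `_escaped_fragment_` form of the URL, which the scheme reserves for crawlers, this was considered distinct from user-agent cloaking.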

It is also possible to add a noscript tag in front of the content, which is a quicker solution, but one that Google may see as an attempt to manipulate ranking. From the research I did, people were posting about that kind of solution around 2008-2010.

Although Googlebot can run some JavaScript, and this is something Google is working on improving, that does not mean it runs all the JavaScript on a site. From what I found online, Google only runs JavaScript on the most heavily visited pages of a site. I would still recommend serving the HTML snapshot, even though I don't like the approach.

You can check your Apache log to see whether Google runs the JavaScript on your page.

http://arstechnica.com/information-technology/2012/05/googles-bots-learn-to-read-interactive-web-pages-more-like-humans/



