Is search engine optimization still a major problem for a MEAN.js full-stack JavaScript application, and how do I solve it?

I am working on my first full-stack JavaScript application, using MEAN.js specifically as a starting point, and I have begun to get nervous (and somewhat embarrassed) about the problem of search engine optimization (SEO).

Has Google made a recent effort (over the past year or so) to improve JavaScript crawling? Is this no longer a problem, or is it something I still need to consider when planning and structuring my project?

If Google can now crawl AngularJS / Ajax-heavy applications, why do we still see blog posts about SEO workarounds such as http://blog.meanjs.org/post/78474995741/mean-seo ?

  • Is this type of solution still needed?
  • Will it be as effective as server-side rendering in terms of SEO?
  • Is the hashbang (#!) a necessary evil, or just evil?

I know that questions about SEO and AngularJS have been asked before, but there seem to be so many different opinions on the issue that I am lost, and it would be nice to have some more concrete thoughts. My main concerns:

  • Will an AngularJS-heavy implementation turn the site into an SEO black hole?
  • Will I end up recreating almost the entire project in static files just for SEO?
  • Do I need to look at a server-side rendering solution?
+5
3 answers

If you render most of your content with JavaScript, then yes, it becomes a search-engine black hole. This is one of the big drawbacks of a thick-client application. If you need high visibility in search engines, this is a challenge, but there is a middle ground.

You will need a combination of server-side and client-side rendering. When the page first loads, it should already contain all the content the user needs to see, or at least the content that appears "above the fold" (at the top of the page). Links should be descriptive and allow search engines to dive deeper into the site. Your site's main menu should also be delivered with the page, to give search engines something to crawl.

Content below the fold, or paginated content, can be pulled in dynamically and rendered on the client with whatever JavaScript framework you like. This gives you a good combination: server-side rendering that feeds the search engines, plus the performance boost that dynamic content can offer.
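A minimal sketch of this hybrid approach in plain Node (the page shape and all names here are my own illustration, not MEAN.js code): the server embeds the above-the-fold content and navigation directly in the initial HTML, and leaves a placeholder element that the client-side framework later fills in via AJAX.

```javascript
// Server-rendered shell: crawlers see the headline, summary, and nav links
// immediately; only the below-the-fold section depends on client-side JS.
function renderInitialPage(article) {
  return [
    '<!DOCTYPE html>',
    '<html><head><title>' + article.title + '</title></head>',
    '<body>',
    '  <nav><a href="/articles">All articles</a></nav>',
    '  <h1>' + article.title + '</h1>',
    '  <p>' + article.summary + '</p>',
    '  <!-- Below the fold: the client framework fills this in via AJAX -->',
    '  <div id="comments" data-src="/api/articles/' + article.id + '/comments"></div>',
    '</body></html>'
  ].join('\n');
}

var html = renderInitialPage({
  id: 42,
  title: 'SEO and the MEAN stack',
  summary: 'What search engines actually see on first load.'
});
console.log(html);
```

The point is that everything a crawler needs lives in the markup the server sends, while the comments list (or any paginated content) stays dynamic.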

+5

Well, you only need to worry about the public face of your site; anything behind a login screen doesn't need to be considered. To me, the headless-browser approach using PhantomJS seems the way to go: it is the one that will consume the least time, and since you have already looked at mean-seo, it's not that hard to implement.
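For context, this is roughly how the (since-deprecated) AJAX crawling scheme that mean-seo builds on works: crawlers rewrite `#!` URLs into a `?_escaped_fragment_=` query string, and server middleware can detect that query and serve a pre-rendered snapshot instead of the normal Angular shell. The sketch below is a simplified illustration, not mean-seo's actual internals:

```javascript
// Detect the query string that crawlers send in place of a "#!" fragment.
function isCrawlerRequest(url) {
  return url.indexOf('_escaped_fragment_=') !== -1;
}

// Decide which response path a request takes.
function handleRequest(url) {
  if (isCrawlerRequest(url)) {
    return 'snapshot'; // render with PhantomJS, cache, serve static HTML
  }
  return 'spa';        // serve the normal client-side Angular app
}

console.log(handleRequest('/products?_escaped_fragment_=/phones')); // 'snapshot'
console.log(handleRequest('/products#!/phones'));                   // 'spa'
```

Note that a browser never sends the `#!` fragment to the server at all, which is exactly why the scheme needs the `_escaped_fragment_` rewrite.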

Take a look at this question; there are some answers on how to structure links and pagination to be SEO-friendly, and almost all of the recent answers agree.

https://support.google.com/webmasters/answer/174992?hl=en

Also try registering at https://webmasters.stackexchange.com/ , where you will find more about SEO.

+1

I just wanted to mention the npm package https://www.npmjs.com/package/mean-seo , which uses PhantomJS to render your application and caches the result on disk or in Redis for whatever period you set.
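For reference, wiring it into an Express app looks roughly like this. The option names (`cacheClient`, `cacheDuration`) are taken from my reading of the package's README; double-check them against the current docs before relying on this sketch:

```javascript
var express = require('express');
var seo = require('mean-seo');

var app = express();

// Intercept crawler requests (?_escaped_fragment_=...) and serve a
// PhantomJS-rendered snapshot; cache it on disk (or in Redis) for the
// configured duration.
app.use(seo({
  cacheClient: 'disk',                     // 'disk' or 'redis'
  cacheDuration: 2 * 60 * 60 * 24 * 1000   // snapshot lifetime, in milliseconds
}));
```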

0

Source: https://habr.com/ru/post/1207740/

