I'm trying my best to make an AJAX-based website SEO-friendly. As recommended in online tutorials, I added "pretty" href attributes to the links: <a href="#!site=contact" data-id="contact" class="navlink"></a>, and inside the div where the content is normally loaded by AJAX, I put a PHP fallback for crawlers:
$files = glob('./pages/*.php');
foreach ($files as &$file) {
    // strip the './pages/' prefix (8 chars) and the '.php' suffix (4 chars)
    $file = substr($file, 8, -4);
}
unset($file); // drop the reference left over from the foreach

// only include pages that actually exist in ./pages/ (whitelist check)
if (isset($_GET['site']) && in_array($_GET['site'], $files)) {
    include './pages/' . $_GET['site'] . '.php';
}
I have a feeling that I first need to cut the _escaped_fragment_= part out of (...)/index.php?_escaped_fragment_=site=about, because otherwise the script won't be able to read the site value from $_GET. Am I right?
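If so, something like this is what I have in mind (just a minimal sketch, assuming the crawler really requests index.php?_escaped_fragment_=site=about):

if (isset($_GET['_escaped_fragment_'])) {
    // parse_str() turns "site=about" into ['site' => 'about']
    parse_str($_GET['_escaped_fragment_'], $fragmentParams);
    if (isset($fragmentParams['site'])) {
        $_GET['site'] = $fragmentParams['site'];
    }
}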
But be that as it may, how do I verify that the crawler actually converts the pretty links (those with #!) into ugly ones (containing ?_escaped_fragment_=)? I was told this happens automatically and that I don't need to provide the mapping myself, but Fetch as Googlebot doesn't give me any information about what happens to the URL.
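The only check I've come up with so far is to log every request that arrives with _escaped_fragment_ and then run Fetch as Googlebot: if the conversion happens, the rewritten URL should show up in the log. (crawler.log is just a hypothetical path, not anything Google requires.)

if (isset($_GET['_escaped_fragment_'])) {
    // append the raw request URI so I can see exactly what the crawler asked for
    file_put_contents(
        __DIR__ . '/crawler.log',
        date('c') . ' ' . $_SERVER['REQUEST_URI'] . "\n",
        FILE_APPEND
    );
}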