I’m redesigning my site and considering using jQuery’s .load() for most navigations, to make it faster and smoother for the user.
For this, I have links like <a href="/the/link" id="linkId">link</a>
Then I use $("#main").on("click", "#linkId", …) and return false so the default link navigation is not executed.
I have /load/page.php and /page.php to serve either the partial content for AJAX loading or the full page if the user navigates to it directly.
Finally, on every AJAX page load, I update the URL hash with document.location.hash = "/" + $(this).attr("href");
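The click-and-load flow above can be sketched roughly like this (a minimal sketch of the setup described, not the exact production code: the helper name hrefToHash is mine, and the jQuery wiring is shown commented out because it only runs in a browser with a #main container):

```javascript
// Pure helper: build the hash fragment from a link's href.
// The leading slash is normalized here so an href that already starts
// with "/" does not yield a doubled "//" — a pitfall of the literal
// "/" + href concatenation from the question.
function hrefToHash(href) {
  return "/" + href.replace(/^\//, "");
}

// Browser wiring (requires jQuery; delegates on links inside #main):
// $("#main").on("click", "a", function () {
//   var href = $(this).attr("href");
//   $("#main").load("/load" + href);           // fetch the partial markup
//   document.location.hash = hrefToHash(href); // reflect it in the URL
//   return false;                              // suppress full-page navigation
// });
```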
This means site URLs will look like this for users:
domain.com/#/file/page
and like this for search engines:
domain.com/file/page
If a user enters a hashed URL, they are redirected to the corresponding search-engine URL with the following code, so I think I have everything covered?
if (location.href.indexOf("#") > -1) { location.assign(location.href.replace(/\/?#/, "")); }
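For what it’s worth, that redirect line boils down to deleting the first "#" together with an optional slash right before it; extracted as a standalone function (the name stripHash is mine), the behaviour is easy to check outside the browser:

```javascript
// Same regex as the redirect above: remove an optional "/" plus the "#".
function stripHash(url) {
  return url.replace(/\/?#/, "");
}

// stripHash("http://domain.com/#/file/page") -> "http://domain.com/file/page"
```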
I would block the hashed URLs from indexing and allow only the clean URLs. But if people link to the hashed version of a page, would its SEO value still carry over to the clean URL?
Are there any serious flaws in this approach, and/or are there ways to improve it when building fully dynamic sites?