Ajax #! URL support?
-
Hi Moz,
My site is currently following the convention outlined here:
https://support.google.com/webmasters/answer/174992?hl=en
Basically, since our pages are generated via Ajax, we are set up to direct bots that replace the #! in a URL with ?_escaped_fragment_= to cached versions of the Ajax-generated content.
For example, if the bot sees this URL:
http://www.discoverymap.com/#!/California/Map-of-Carmel/73
it will instead access this page:
http://www.discoverymap.com/?_escaped_fragment_=/California/Map-of-Carmel/73
In that case my server serves the cached HTML instead of the live page. This is all per Google's direction and is indexing fine.
However, the Moz bot does not do this. It seems like a fairly straightforward feature to support: rather than ignoring the hash, you check whether it is a #! and, if so, spider the URL with the fragment replaced by ?_escaped_fragment_=. Our server does the rest.
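To illustrate, here is a minimal sketch of the rewrite I have in mind. The function name is mine and it is TypeScript purely for illustration; I am not suggesting this is how Roger is actually built:

```typescript
// Minimal sketch of the #! -> ?_escaped_fragment_= rewrite described above.
// The function name is illustrative; this is not how Roger is actually coded.
function toEscapedFragmentUrl(url: string): string {
  const hashBangIndex = url.indexOf("#!");
  if (hashBangIndex === -1) {
    return url; // no #! fragment, so fetch the URL as-is
  }
  const base = url.slice(0, hashBangIndex);
  const fragment = url.slice(hashBangIndex + 2);
  // Google's spec also calls for percent-encoding a few characters in the
  // fragment value; that detail is omitted here for brevity.
  const separator = base.includes("?") ? "&" : "?";
  return `${base}${separator}_escaped_fragment_=${fragment}`;
}

// toEscapedFragmentUrl("http://www.discoverymap.com/#!/California/Map-of-Carmel/73")
// returns "http://www.discoverymap.com/?_escaped_fragment_=/California/Map-of-Carmel/73"
```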
If this is something Moz plans on supporting in the future, I would love to know. Any other information would be great as well.
Also, pushState is not practical for everyone due to limited browser support, etc.
Thanks,
Dustin
Updates:
I am editing my question because the site won't let me respond to my own question. It says I need to sign up for Moz Analytics. I was signed up for Moz Analytics?! Now I am not? I responded to my invitation weeks ago.
Anyway, you are misunderstanding how this process works. There is no sitemap involved. The bot reads this URL on the page:
http://www.discoverymap.com/#!/California/Map-of-Carmel/73
And when it is ready to spider the page for content, it spiders this URL instead:
http://www.discoverymap.com/?_escaped_fragment_=/California/Map-of-Carmel/73
The server does the rest. It is simply a matter of telling Roger to recognize the #! format and replace it with ?_escaped_fragment_=. I obviously do not know how Roger is coded, but it is a simple string replacement.
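To show what "the rest" means on my end, here is a rough Express-style sketch of the server behavior. The stack, route, and cache here are hypothetical stand-ins, not my actual code; the only point is that any request carrying ?_escaped_fragment_= gets the cached snapshot:

```typescript
import express from "express";

const app = express();

// Hypothetical cache of pre-rendered HTML snapshots, keyed by fragment path.
const snapshotCache = new Map<string, string>();

app.get("*", (req, res, next) => {
  const fragment = req.query._escaped_fragment_;
  if (typeof fragment === "string") {
    const html = snapshotCache.get(fragment);
    if (html !== undefined) {
      // Serve the cached, crawler-friendly HTML for this fragment.
      res.send(html);
      return;
    }
  }
  // Otherwise fall through to the normal, Ajax-driven page.
  next();
});

app.listen(3000);
```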
Thanks.
-
Hello Dustin, this is Abe on the Moz Help team.
This question is a bit intricate, so I apologize if I am not reading it correctly.
With AJAX content like this, I know Google's full specification
https://developers.google.com/webmasters/ajax-crawling/docs/specification
indicates that the #! and ?_escaped_fragment_= technique works for their crawlers. However, Roger is a bit picky and isn't robust enough yet to use only the sitemap as the reference in this case. Luckily, one of our wonderful users came up with a solution using the pushState() method:
http://www.moz.com/blog/create-crawlable-link-friendly-ajax-websites-using-pushstate
That post explains how to create crawlable content using pushState, which should help our crawler read AJAX content. Let me know if this information works for you!
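In broad strokes, the pushState() approach looks something like the sketch below. This is only a minimal illustration and not code from that blog post; loadMapContent is a stand-in for your own Ajax loader:

```typescript
// Stand-in for the site's own Ajax content loader (illustrative only).
function loadMapContent(path: string): void {
  // In a real site this would fetch and render the content for `path`.
  console.log(`loading content for ${path}`);
}

// When content loads, put a real, crawlable path in the address bar
// instead of a #! fragment.
function showMap(path: string): void {
  loadMapContent(path);
  history.pushState({ path }, "", path); // e.g. "/California/Map-of-Carmel/73"
}

// Keep the content in sync with the URL on back/forward navigation.
window.addEventListener("popstate", (event: PopStateEvent) => {
  const state = event.state as { path?: string } | null;
  if (state && state.path) {
    loadMapContent(state.path);
  }
});
```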
I hope this helps.