How to make an AJAX site crawlable when pushState and #! can't be used?
-
Dear Mozzers,
Does anyone know a solution to make an AJAX site crawlable if:
1. You can't make use of #! (with HTML snapshots) due to tracking in Analytics
2. pushState can't be implemented
Could it be a solution to create two versions of each page (one without #!, so campaigns can be tracked in Analytics, and one with #! that is presented to Google)?
Or is there another magical solution that works as well?
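For clarity, here is a rough sketch of what I mean by the "#! version presented to Google": a server that returns an HTML snapshot when Googlebot requests the ?_escaped_fragment_= form of a #! URL. This is only an illustration (TypeScript/Express, with made-up paths and a placeholder snapshot function), not our actual setup:

```typescript
import express from "express";

const app = express();

// Placeholder: in practice this would return HTML pre-rendered for the
// given AJAX state, e.g. generated with a headless browser.
function getSnapshotHtml(fragment: string): string {
  return `<html><body><h1>Pre-rendered content for ${fragment}</h1></body></html>`;
}

app.get("/products", (req, res) => {
  const fragment = req.query._escaped_fragment_;
  if (typeof fragment === "string") {
    // Googlebot rewrites example.com/products#!phones into
    // example.com/products?_escaped_fragment_=phones and expects a snapshot.
    res.send(getSnapshotHtml(fragment));
  } else {
    // Regular visitors get the normal AJAX page (trackable in Analytics).
    res.sendFile("/var/www/app/index.html");
  }
});

app.listen(3000);
```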
Any input or advice is highly appreciated!
Kind regards,
Peter
Related Questions
-
Website can't break into Google Top 100 for main keywords; considering a 301 redirect to a new domain
A little background on our case. Our website (e.g. http://ourwebsite.com) officially went live in December 2015, but it wasn't on-site optimized and we hadn't done any off-site SEO for it. In April we decided to do a small redesign and we did it on an online development server. Unfortunately, the developers didn't disallow crawlers and the website got indexed while we were developing it on the development server. The development version that got indexed in Google was http://dev.web.com/ourwebsite. We learned that it had been indexed when we migrated the redesigned website back to the original domain. During the migration we decided to add www, so the site now looks like http://www.ourwebsite.com.

Meanwhile, we deleted the development version from the development server and submitted a "Remove outdated content" request in the development server's Search Console. This was back in early May. It took about 15-20 days for the development version to get de-indexed and around 30 days for the original website (http://www.ourwebsite.com) to get indexed.

Since then we have started our SEO campaign with press releases, outreach to bloggers for guest and sponsored posts, etc. The website currently has 55 backlinks from 44 referring domains (Ahrefs: UR 25, DR 37; Moz: DA 6, PA 1) with various anchor text. We are tracking our main keywords and our brand keyword in the SERPs; for our brand keyword we are at position #10 in Google, but for the rest of the main (money) keywords we are not in the top 100 results.

It is very frustrating to see no movement in the rankings for the past couple of months, and our bosses are demanding rankings and traffic. We are currently exploring the option of using another, similar domain of ours and doing a complete 301 redirect from the original http://www.ourwebsite.com to http://www.ournewebsite.com. Does this sound like a good option to you? If we do the 301 redirect, will the link juice from the backlinks we already have on the referring domains be passed to the new domain? Or, because the site seems "stuck," would it not pass any power to the new domain? Also, please share any other suggestions that might help us at least break into the top 100 results in Google. Thanks.
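To visualize the redirect step being considered, here is a minimal sketch of a site-wide 301 that preserves paths and query strings, using the placeholder domains from the question (TypeScript/Express; in practice this would more likely be configured at the web server or hosting level):

```typescript
import express from "express";

const app = express();

// Site-wide 301: every path on the old domain points to the same path on
// the new domain, so existing backlinks continue to resolve.
app.use((req, res) => {
  res.redirect(301, `http://www.ournewebsite.com${req.originalUrl}`);
});

app.listen(80);
```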
Intermediate & Advanced SEO | DanielGorsky -
Why isn't my site being indexed by Google?
Our domain was originally pointing to a Squarespace site that went live in March. In June, the site was rebuilt in WordPress and is currently hosted with WPEngine. Oddly, the site is being indexed by Bing and Yahoo, but it is not indexed at all in Google, i.e. site:example.com yields nothing. As far as I know, the site has never been indexed by Google, neither before nor after the switch. What gives? A few things to note:
- I am not "discouraging search engines" in WordPress
- Robots.txt is fine - I'm not blocking anything that shouldn't be blocked
- A sitemap has been submitted via Google Webmaster Tools, and I have "fetched as Google" and submitted for indexing - no errors
- I've entered both the www and non-www versions in WMT and chosen a preferred one
- There are several incoming links to the site, some from popular domains
- The content on the site is pretty standard and crawlable, including several blog posts
- I have linked up the account to a Google+ page
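As a generic sanity check alongside the list above (not something from the original post), a small script like the one below can confirm that neither the X-Robots-Tag header nor a meta robots tag is quietly blocking indexing; example.com stands in for the real domain:

```typescript
// Quick indexability check: look for noindex in the X-Robots-Tag header
// and in the <meta name="robots"> tag of the returned HTML.
async function checkIndexability(url: string): Promise<void> {
  const res = await fetch(url);
  const headerDirective = res.headers.get("x-robots-tag") ?? "none";
  const html = await res.text();
  const metaMatch = html.match(/<meta[^>]+name=["']robots["'][^>]*>/i);

  console.log(`X-Robots-Tag: ${headerDirective}`);
  console.log(`Meta robots:  ${metaMatch ? metaMatch[0] : "none"}`);
}

checkIndexability("https://example.com/").catch(console.error);
```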
Intermediate & Advanced SEO | jtollaMOT -
My site has a lot of leftover content that's irrelevant to the main business -- what should I do with it?
Hi Moz! I'm working on a site that has thousands of pages of content that are no longer relevant to the business since it took a different direction. Some of these pages still get a lot of traffic. What should I do with them? 404? Keep them? Redirect? Are these pages hurting rankings for the target terms? Thanks for reading!
Intermediate & Advanced SEO | DA2013 -
Can we retrieve all 404 pages of my site?
Hi, can we retrieve all 404 pages of my site? Is there any syntax I can use in Google search to list just the pages that return a 404? Is there a tool or site that can scan all the pages in Google's index and give me this report? Thanks
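As far as I know there is no Google search operator that filters by HTTP status code, so this usually comes down to checking a list of known URLs yourself. A rough sketch of that kind of report, assuming the URL list comes from a sitemap or Search Console export (TypeScript; the URLs are placeholders):

```typescript
// Check a list of URLs and report the ones that respond with 404.
async function find404s(urls: string[]): Promise<string[]> {
  const notFound: string[] = [];
  for (const url of urls) {
    try {
      // HEAD keeps the check lightweight; some servers may require GET instead.
      const res = await fetch(url, { method: "HEAD" });
      if (res.status === 404) {
        notFound.push(url);
      }
    } catch {
      console.error(`Could not reach ${url}`);
    }
  }
  return notFound;
}

find404s([
  "https://example.com/old-page",
  "https://example.com/current-page",
]).then((urls) => console.log("404 pages:", urls));
```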
Intermediate & Advanced SEO | mtthompsons -
Can I use rel=canonical and then remove it?
Hi all! I run a ticketing site and I am considering using rel=canonical temporarily. In Europe, when someone is looking for tickets for a soccer game, they search differently depending on which city the game is played in. I.e.: "liverpool arsenal tickets" - game played in the 1st leg in 2012; "arsenal liverpool tickets" - game played in the 2nd leg in 2013. We have two different events, each with its own unique text, but sometimes Google chooses the 2013 one over the closest one, especially for queries without dates or years. I don't want to remove the second game from our site - occasionally people browse our website and buy tickets months in advance. So I am considering placing a rel=canonical on the game played in 2013, pointing to the game played in a few weeks. After that, I would remove it. Would that make any sense? Thanks!
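To make the mechanics concrete, here is a hypothetical sketch (TypeScript, with invented URLs and dates) of how the temporary tag could be emitted on the later event's page only until the nearer game has been played, after which the page reverts to referencing itself:

```typescript
// Emit a canonical tag for the later event page: point at the upcoming
// (nearer) game until it has been played, then revert to self-referencing.
function canonicalTagFor(
  laterEventUrl: string,
  earlierEventUrl: string,
  earlierKickoff: Date,
): string {
  if (new Date() < earlierKickoff) {
    return `<link rel="canonical" href="${earlierEventUrl}" />`;
  }
  return `<link rel="canonical" href="${laterEventUrl}" />`;
}

console.log(
  canonicalTagFor(
    "https://example-tickets.com/arsenal-liverpool-2013",
    "https://example-tickets.com/liverpool-arsenal-2012",
    new Date("2012-03-03"),
  ),
);
```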
Intermediate & Advanced SEO | jorgediaz -
Best procedure for distributing identical content about your company/site for affiliates to use?
When dealing with affiliate websites, whereby you send them a stock standard bio or info on your company for them to use on their sites, what is best practice? Is it OK to have multiple websites all linking to you with pages that contain the same content? Should I ask them to implement canonical or noindex tags on those particular pages? Should I ask them to rewrite the content (which may be impractical, or something they're unwilling to do)? Thanks
Intermediate & Advanced SEO | Martin_S -
Can I Use Cross Domain Canonical For Duplicate Categories & Product Pages?
I want to fix an issue with duplicate categories & product pages on my multiple eCommerce websites.
http://www.vistastores.com/patio-umbrellas-fiberbuilt-umbrellas-llc-7gcrw-teal.html - Want to rank with this...
http://www.vistapatioumbrellas.com/patio-umbrellas-fiberbuilt-umbrellas-llc-7gcrw-teal.html - Duplicate one!
http://www.vistastores.com/patio-umbrellas - Want to rank with this...
http://www.vistapatioumbrellas.com/patio-umbrellas - Duplicate one!
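For reference, a cross-domain canonical here would mean each duplicate page on vistapatioumbrellas.com references its counterpart on vistastores.com in its head. A minimal sketch of generating that tag (the helper function is hypothetical, not from either site):

```typescript
// Hypothetical helper: on each duplicate page of the secondary domain,
// emit a canonical tag pointing at the matching page on the preferred domain.
function crossDomainCanonical(path: string): string {
  const preferredDomain = "http://www.vistastores.com";
  return `<link rel="canonical" href="${preferredDomain}${path}" />`;
}

// Rendered into the <head> of the duplicate category page:
console.log(crossDomainCanonical("/patio-umbrellas"));
// => <link rel="canonical" href="http://www.vistastores.com/patio-umbrellas" />
```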
Intermediate & Advanced SEO | CommercePundit -
Is it OK to have a site that has some URLs with hyphens and other, older, legacy URLs that use underscores?
I'm working with a VERY large site that has recently been redesigned/recategorized. They kept only about 20% of the URLs from the legacy site - the URLs that had revenue tied to them - and these URLs use underscores, whereas the new URLs created for the site use hyphens. I don't think this would be an issue for Google, as long as the pages are of quality, but I wanted to get everyone's opinion on it. Will it hurt me to have two different sets of URLs, some using hyphens and others using underscores?
Intermediate & Advanced SEO | Business.com