Issues with Duplicates and AJAX-Loader
-
Hi,
On one website, the "real" content is loaded via AJAX when the visitor clicks on a tile (I'll call a page with such tiles a tile-page here). A parameter is added to the URL at that point and the content of that tile is displayed. That content is also available via a URL of its own ... which is actually never called.
What I want to achieve is a canonicalised tile-page that contains all of the tiles' content and is indexed by Google - if possible with Google also recognising that the single URLs of the tiles are only fallback solutions and that the tile-page should be displayed instead.
The current tile-page leads to duplicate meta tags, titles etc., and only minimal differences between what Google considers separate pages (i.e. the same page with different tiles' content).
Does anybody have an idea on what one can do here?
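For context, the kind of canonical tag involved would look something like this (the URLs here are placeholders, not the site's real ones):

```html
<!-- Hypothetical example: each parameterised tile URL declares the
     tile-page as its canonical version. URLs are placeholders. -->
<link rel="canonical" href="http://www.example.com/projekte.aspx" />
```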
-
Hi CleverPhD,
Thanks for your answer! The website is indeed a little dated and did not consider SEO - or so I have been informed.
http://www.g1.de/projekte.aspx is the URL with the clearest problems, although similar tiles also exist on other pages. As you can see by checking the code, the URL is changed, albeit in a non-ideal way (via a parameter), and the page basically stays the same with only a tiny fraction of its content changed.
The USA Today approach is interesting and I will look into it. I have a slight feeling, though, that their approach is quite a bit different(?).
-
I would really need to see the page you mention to make sure I am following you, but I think one approach would be: when the content is loaded via AJAX, update the browser to the tile's actual URL, not the one with the parameter. That way you do not have the two URLs that need to be canonicalized in the first place. You would still need to test this with a spider program to make sure the URLs are found. I am thinking you would also need a sitemap or alternative navigation to allow the spiders to find the pages and get them cataloged.
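A minimal XML sitemap listing the tiles' own URLs might look like this (paths are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One entry per tile URL; these paths are hypothetical examples. -->
  <url>
    <loc>http://www.example.com/projekte/project-one</loc>
  </url>
  <url>
    <loc>http://www.example.com/projekte/project-two</loc>
  </url>
</urlset>
```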
All of that said, I have to be honest: my gut is telling me that if you are having to work this hard to get the spider to find the URLs correctly, then the design may be too clever for its own good, and you may need to rethink how you approach this. USA Today uses a setup that seems similar to yours - check it out at http://www.usatoday.com/ When you click on a tile to view a story, there is an AJAX-type overlay of the home page with the article on top. It allows you to X out and go back to the home page. Likewise, from the article you can page through other articles (left and right arrows). While you do this, notice that USA Today updates the address bar with an SEO-friendly URL. I have not tested this site spider-wise, but just by the look of it they seem to have the balance correct.
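The USA Today behaviour can be sketched with the History API: when a tile is opened via AJAX, push the tile's real, crawlable URL into the address bar instead of appending a parameter. Function names and the `/projekte/...` path scheme below are my own assumptions, not the site's actual code:

```javascript
// Sketch, assuming a hypothetical /projekte/<tile-id> URL scheme.
// Build the clean, crawlable URL for a tile (instead of ?tile=foo).
function cleanUrlFor(tileId) {
  return '/projekte/' + encodeURIComponent(tileId);
}

// Called when a tile is opened via AJAX.
function openTile(tileId) {
  var url = cleanUrlFor(tileId);
  // Guarded so the sketch also runs outside a browser environment.
  if (typeof history !== 'undefined' && history.pushState) {
    // The address bar now shows the same URL a crawler would fetch,
    // so there is no parameterised duplicate left to canonicalise away.
    history.pushState({ tile: tileId }, '', url);
  }
  return url;
}
```

The same clean URLs would then also be the ones listed in the sitemap, so users, spiders, and the address bar all agree on one URL per tile.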
Good luck!