Issues with Duplicates and AJAX-Loader
-
Hi,
On one website, the "real" content is loaded via AJAX when the visitor clicks on a tile (I'll call a page with such tiles a "tile-page" here). A parameter is added to the URL at that point and the content of that tile is displayed. That content is also available at a URL of its own ... which is actually never called.
What I want to achieve is a canonicalised tile-page that contains all of the tiles' content and is indexed by Google, ideally with Google also recognising that the tiles' individual URLs are only fallbacks and the tile-page should be displayed instead.
The current tile-page produces duplicate meta tags, titles, etc., with only minimal differences between what Google considers separate pages (i.e. the same page with different tiles' content).
Does anybody have an idea on what one can do here?
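For reference, the standard way to express "index the tile-page instead of the parameterised/standalone tile URLs" is a rel=canonical link. A hypothetical sketch; the path below is made up, since the real URLs are not given here:

```html
<!-- Placed in the <head> of each parameterised or standalone tile URL,
     this tells Google that the tile-page is the version to index. -->
<link rel="canonical" href="http://www.example.de/projekte.aspx" />
```

The tile-page itself can carry the same tag as a self-referencing canonical.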
-
Hi CleverPhD,
Thanks for your answer! The website is indeed a little dated and did not consider SEO - or so I have been informed.
http://www.g1.de/projekte.aspx is the URL with the clearest problems, although similar tiles also exist on other pages. As you can see by checking the code, the URL is changed, albeit in a non-ideal way (a parameter), and the page basically stays the same with only a tiny fraction of its content changed.
The USA Today approach is interesting and I will look into it. I have a slight feeling, though, that their approach is quite a bit different(?).
-
I would really need to see the page you mention to make sure I am following you, but I think one approach would be: when the page is called via AJAX, use the actual URL, not the one with the parameter. That way you do not have two URLs that need to be canonicalised in the first place. You would still need to test this with a spider program to make sure the URLs are found. I am thinking you would also need a sitemap or alternative navigation to allow the spiders to find the pages and get them catalogued.
All of that said, I have to be honest: my gut tells me that if you have to work this hard to get the spider to find the URLs correctly, the design may be too clever for its own good, and you may need to rethink your approach. USA Today uses a setup that seems similar to yours; check it out at http://www.usatoday.com/. When you click on a tile to view a story, there is an AJAX-type overlay of the home page with the article on top. It allows you to X out and go back to the home page. Likewise, from the article you can page through other articles (left and right arrows). While you do this, notice that USA Today updates the address bar with an SEO-friendly URL. I have not tested that site spider-wise, but just by the look of it they seem to have the balance right.
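The URL-updating behaviour described here (AJAX overlay plus an SEO-friendly address) is typically built on the History API. A rough sketch; the function names and the /projekte/&lt;slug&gt; pattern are illustrative assumptions, not taken from either site:

```javascript
// Map a tile's identifier to its standalone, crawlable URL.
function tileUrlFor(slug) {
  return '/projekte/' + encodeURIComponent(slug);
}

// Open a tile: load its content via the caller-supplied AJAX loader,
// then put the tile's real URL into the address bar so the visible
// address is always an indexable one.
function openTile(slug, loadContent) {
  loadContent(slug);
  if (typeof history !== 'undefined' && history.pushState) {
    history.pushState({ tile: slug }, '', tileUrlFor(slug));
  }
}
```

A real implementation would also handle the browser's back button via the popstate event, and the standalone tile URL must serve the full page to anyone (visitor or crawler) who lands on it directly.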
Good luck!
Related Questions
-
Duplicate content issue with ?utm_source=rss&utm_medium=rss&utm_campaign=
Hello,
Recently I was checking how my site content is getting indexed in Google, and today I noticed two links indexed for the same article. This is the proper link: https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/ But I don't know why this URL was indexed as well: https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/?utm_source=rss&utm_medium=rss&utm_campaign=hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims Could you please tell me how to solve this issue? Thank you
Technical SEO | | Dinsh007
-
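For the utm_source question: one common fix, assuming the CMS allows it, is a self-referencing canonical tag on the article, which asks Google to fold the utm-tagged copy into the clean URL (the URL below is the one from the question):

```html
<!-- In the article's <head>: declare the clean URL as canonical, so
     tracking-parameter variants consolidate onto it. -->
<link rel="canonical"
      href="https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/" />
```

The canonical makes the preferred URL explicit regardless of which tracking parameters a feed reader or crawler appends.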
Best way to fix duplicate content issues
Another question for the Moz Community. One of my clients has 4.5k duplicate content issues. For example: http://www.example.co.uk/blog and http://www.example.co.uk/index.php?route=blog/blog/listblog&year=2017. Most of the issues are coming from product pages. My initial thought is to set up 301 redirects in the first instance and, if the issue persists, add canonical tags. Is this the best way of tackling the issue?
Technical SEO | | Laura-EMC0 -
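For the duplicate-content question above: if the parameterised duplicates follow a predictable pattern, a mod_rewrite rule in .htaccess can 301 them to the clean URL. A sketch for the example URL from the question (assumes Apache, and that the year parameter can simply be dropped; test on a staging copy before deploying):

```apache
RewriteEngine On
# Match /index.php?route=blog/blog/listblog... and send it to /blog.
# The trailing "?" on the target drops the original query string.
RewriteCond %{QUERY_STRING} ^route=blog/blog/listblog [NC]
RewriteRule ^index\.php$ http://www.example.co.uk/blog? [R=301,L]
```

Product-page duplicates would each need their own pattern, which is why a canonical tag emitted by the CMS often scales better for thousands of URLs.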
Crawl issues
Hello there, When crawling my site I get errors saying the meta description is missing from a few pages. I checked these pages, but there is a meta description. I also ran the same report with other tools and they come up with the same issues. What should I do?
Technical SEO | | PremioOscar0 -
Is duplicate content ok if it's on LinkedIn?
Hey everyone, I am doing a duplicate content check using Copyscape and realized we have used a ton of the same content on LinkedIn as on our website. Should we change the LinkedIn company page to be original? Or does it matter? Thank you!
Technical SEO | | jhinchcliffe0 -
Duplicate Content
We have a ton of duplicate content/title errors in our reports, many of them showing errors for: http://www.mysite.com/(page title) and http://mysite.com/(page title) Our site has been set up so that mysite.com 301-redirects to www.mysite.com (we did this a couple of years ago). Is it possible that I set up my campaign the wrong way in SEOmoz? I'm thinking it must be user error when I set up the campaign, since we already have the 301 redirect. Any advice is appreciated!
Technical SEO | | Ditigal_Taylor0 -
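For the www question above: the usual shape of a non-www to www 301 in .htaccess looks like this (mysite.com is the placeholder from the question). If a rule like this is already live and working, the duplicates more likely come from how the campaign was configured than from the site:

```apache
RewriteEngine On
# Send any request for the bare domain to the www host, keeping the path.
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```

Checking a bare-domain URL with curl -I and confirming a 301 with a www Location header is a quick way to verify the redirect really fires.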
Htaccess issue
I have some URLs on my site due to a rating counter. These are like:
domain.com/?score=4&rew=25
domain.com/?score=1&rew=28
domain.com/?score=5&rew=95
These are all duplicate content of my homepage and I want to 301 redirect them there. So far I have tried:
RedirectMatch 301 /[a-z]score[a-z] http://domain.com
RedirectMatch 301 /.score. http://domain.com
RedirectMatch 301 /^score$.* http://domain.com
RedirectMatch 301 /.^score$.* http://domain.com
RedirectMatch 301 /[a-z]score[a-z] http://domain.com
RedirectMatch 301 score http://domain.com
RedirectMatch 301 /[.]score[.] http://domain.com
RedirectMatch 301 /[.]score[.] http://domain.com
RedirectMatch 301 /[a-z,0-9]score[a-z,0-9] http://domain.com
RedirectMatch 301 /[a-z,0-9,=,&]score[a-z,0-9,=,&] http://domain.com
RedirectMatch 301 /[a-z,0-9,=&?/.]score[a-z,0-9,=&] http://domain.com
None of them works. Anybody? Solution? Would be very much appreciated.
Technical SEO | | sesertin
-
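The likely reason none of the RedirectMatch attempts in the htaccess question match: RedirectMatch tests only the URL path, never the query string, and these URLs differ only in the query string. mod_rewrite can test it via RewriteCond. A sketch (assumes Apache with mod_rewrite enabled; verify on a staging copy first):

```apache
RewriteEngine On
# Match a "score=" parameter anywhere in the homepage's query string...
RewriteCond %{QUERY_STRING} (^|&)score= [NC]
# ...and 301 to the bare homepage. In per-directory .htaccess context the
# root path matches ^$; the trailing "?" strips the query string.
RewriteRule ^$ http://domain.com/? [R=301,L]
```

On Apache 2.4 the trailing "?" can be replaced with the QSD flag.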
Duplicate Content Issue
Hi Everyone, I ran into a problem I didn't know I had (thanks to the SEOmoz tool) regarding duplicate content. My site is oxford ms homes.net, and when I built the site, the web developer used PHP. After he was done I saw that the URLs looked like this: "/blake_listings.php?page=0", and I wanted them like this: "/blakes-listings". He changed them with no problem, and he did the same with all 300 or so pages on the site. Using the crawl diagnostics tool, I just found that I have about 3,000 duplicate content issues. Is there an easy fix for this at all, or does he have to go in and 301-redirect EVERY SINGLE URL? Thanks for any help you can give.
Technical SEO | | blake-766240 -
Duplicate content conundrum
Hey Mozzers- I have a tricky situation with one of my clients. They're a reputable organization and have been mentioned in several major news articles. They want to create a Press page on their site with links to each article, but they want viewers to remain within the site rather than be redirected to the press sites themselves. The other issue is that some of the articles have been removed from the original press sites where they were first posted. I want to avoid duplicate content issues, but I don't see how else to repost the articles within the client's site. I figure I have three options: 1. Create PDFs (with SEO-friendly URLs) with the articles embedded in them that open in a new window. 2. Post an image with a screenshot of the article at a unique URL with brief content. 3. Copy and paste the article to a unique URL. If anyone has experience with this issue or any suggestions, I would greatly appreciate it. Jaime Brown
Technical SEO | | JamesBSEO0