Duplicate Content Issue from using filters on a directory listing site
-
I have a directory listing site of harpists, and a lot of issues are coming up that say:
Content that is identical (or nearly identical) to content on other pages of your site forces your pages to unnecessarily compete with each other for rankings.
Because this is a directory listing site, the content is quite generic. The main issue appears to come from the functionality of the page: the "spider" seems to be picking up each different filter choice as a new page. If you have a look at this link you will see what I mean.
People searching the site can filter a harpist's songs by changing the dropdowns, but for some reason the filter arguments are being picked up as separate pages. Do you have any good approaches to solving this issue?
A similar issue comes from the video pages for each harpist: they are being flagged as identical content, as there are currently no videos on either page.
http://www.find-a-harpist.co.uk/user/39/videos
http://www.find-a-harpist.co.uk/user/37/videos
Do you have any suggestions?
Many thanks for taking the time to read this and respond.
-
Thank you both for your responses. Yes, the site is relatively new. I shall implement your suggestions and hopefully they will do the trick.
-
Is your site relatively new? I currently see no pages in the Google index at all, which makes the duplicate content issue a bit moot (at least in the short term).

The search filters and pagination are somewhat different issues. You could META NOINDEX any pages with the filter parameters active, or rel-canonical them to the unfiltered version (as @Steve25 said). Since no pages are indexed yet, you could also just "nofollow" the filter links ("Title", etc.), which should help prevent those filtered versions from getting crawled.

Pagination (pages 2+ of search) is a trickier issue, but it might be best to just NOINDEX, FOLLOW those. You could also tell Google in Google Webmaster Tools that the page= parameter is for pagination (I've had hit-or-miss results with that, but it's easy relative to other solutions).

For the empty profiles, it really depends on the scope. If you have a lot of them, I'd ideally want to code them to carry a META NOINDEX while they're empty, and lift the NOINDEX once they have content posted. You'd have to do that dynamically, but it shouldn't be too tricky. That way, Google would only see new pages once they have some content in place.
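The dynamic NOINDEX idea could be sketched roughly like this (an illustrative Python helper, not the site's actual code; the function name, parameters, and the assumption that the template can call it are all hypothetical):

```python
def robots_meta_tag(video_count: int, is_filtered: bool) -> str:
    """Return the robots meta tag for a profile page.

    Empty or filtered pages get NOINDEX, FOLLOW so crawlers can
    still follow links but won't index thin or duplicate versions.
    The NOINDEX lifts automatically once real content is posted.
    """
    if video_count == 0 or is_filtered:
        return '<meta name="robots" content="noindex, follow">'
    # Pages with real content are indexable as normal.
    return '<meta name="robots" content="index, follow">'

# An empty video page vs. one with content:
print(robots_meta_tag(video_count=0, is_filtered=False))
print(robots_meta_tag(video_count=12, is_filtered=False))
```

The template would emit the returned tag in the page `<head>`, so the index/noindex decision tracks the page's content with no manual intervention.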
-
Could you set up canonical tags so that when users select certain criteria, the parent (unfiltered) page is referenced as the canonical?
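One common way to implement that suggestion is to derive the canonical from the current URL by stripping the filter query string, so every filtered variant points at the unfiltered parent. A minimal sketch (the example URL and parameter names are illustrative, not taken from the actual site code):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    """Strip the query string and fragment so every filtered
    variant of a listing points its rel=canonical at the
    unfiltered parent page."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

tag = '<link rel="canonical" href="%s">' % canonical_url(
    "http://www.find-a-harpist.co.uk/user/39/videos?style=celtic&page=2")
print(tag)
# -> <link rel="canonical" href="http://www.find-a-harpist.co.uk/user/39/videos">
```

Note this blanket approach would also canonicalize pagination parameters; if paginated pages should be handled differently (e.g. NOINDEX, FOLLOW), those parameters need separate treatment.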
Related Questions
-
Should I use location in keywords in a campaign?
We're setting up 25 campaigns for clients and many of the keywords in each are identical. Rather than creating a keyword with a specific location for each client (pool builder san francisco), would it be more effective to use only "pool builder" as the keyword and then set the location for the San Francisco area? I'm concerned that we are going to run out of our keyword allotment since many of our clients have multiple locations.
Moz Pro | getsmartgroup
-
Duplicate canonical tag issue
I have this site, https://www.dealsmango.com/, which I have selected as the canonical, but Google is still selecting my old website, https://www.selldealsmango.com/. I have removed everything from the old site except one page with a link to the new site, and I have also put a 301 redirect in place. But when I request indexing in Google Search Console, the same error about a duplicate canonical tag still appears. What should I do? Should I remove the canonical tag from the old site, which I don't want Google to index, or is there a better solution?
Moz Pro | MudassirSultn
-
Site Crawl 4xx Errors?
Hello! When I check our website's critical crawler issues with Moz Site Crawler, I'm seeing over 1000 pages with a 4xx error. All of the pages that are showing to have a 4xx error appear to be the brand and product pages we have on our website, but with /URL at the end of each permalink. For example, we have a page on our site for a brand called Davinci. The URL is https://kannakart.com/davinci/. In the site crawler, I'm seeing the 4xx for this URL: https://kannakart.com/davinci/URL. Could this be a plugin on our site that is generating these URLs? If they're going to be an issue, I'd like to remove them. However, I'm not sure exactly where to begin. Thanks in advance for the help, -Andrew
Moz Pro | mostcg
-
Getting rid of duplicate content
Hi everyone, I'm a newbie and at the moment don't know very much about SEO. I have a problem with some of my campaigns, where I keep getting reports with Duplicate Page and/or Duplicate Content errors. I have no idea how to rectify, remove, or fix this error on the relevant websites. Can anyone please explain how to do this, maybe step by step? I really appreciate your views and opinions! Regards, Hugh
Moz Pro | DigitalAcademyZA
-
Duplicate Page Titles and Content
The SeoMoz crawler has found many pages like this on my site with /?Letter=Letter, e.g. http://www.johnsearles.com/metal-art-tiles/?D=A. I believe it is finding multiple caches of a page and identifying them as duplicates. Is there any way to screen out these multiple cache results?
Moz Pro | johnsearles
-
Issue in number of pages crawled
I wanted to figure out how our friend Roger Bot works. On the first crawl of one of my large sites, the number of pages crawled stopped at 10,000 (due to the restriction on the Pro account). However, after a few weeks, the number of pages crawled went down to about 5,500, which seemed to be a more accurate count of the pages on our site. Today, it seems that Roger Bot has completed another crawl and the number is up to 10,000 again. I know there has been no downtime on our site, and the items that we fixed did not reduce or increase the number of pages we have. Just making sure there are no known issues with Roger Bot before I look deeper into our site to see if there is a problem. Thanks!
Moz Pro | cchhita
-
Solving duplicate content errors for what is effectively the same page.
Hello,
I am trying out SEOmoz and I quite like it. I've managed to remove most of the errors on my site; however, I'm not sure how to get round this last one. If you look at my errors, you will see most of them revolve around pairs like this:
http://www.containerpadlocks.co.uk/categories/32/dead-locks
http://www.containerpadlocks.co.uk/categories/32/dead-locks?PageSize=9999
These are essentially the same page, because the Dead Locks category does not contain enough products to span more than one page, so when I click 'View all products' on my webpage, the results are the same. This functionality works correctly for categories with more than the 20-per-page limit. My question is, should I be:
Removing the link to 'show all products' (which adds the PageSize query string value) if no more products will be shown?
Or putting a no-index meta tag on the page?
Or some other action entirely?
Looking forward to your reply and to you showing how effective Pro is. Many thanks,
James Carter
Moz Pro | jcarter
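The first option James describes, suppressing the 'View all products' link whenever it would produce an identical page, could be sketched like this (an illustrative helper; the function name and the 20-per-page default are assumptions based on the question, not the site's actual code):

```python
def show_view_all_link(product_count: int, page_size: int = 20) -> bool:
    """Only render the 'View all products' link (which appends
    ?PageSize=9999) when the category actually spans multiple
    pages; otherwise the link would just duplicate the default
    listing and create a crawlable duplicate URL."""
    return product_count > page_size

# Dead Locks case (few products): link hidden, no duplicate URL.
print(show_view_all_link(5))
# A large category: link shown, since 'view all' changes the results.
print(show_view_all_link(45))
```

This avoids generating the duplicate URL in the first place, which is generally preferable to letting it exist and patching it afterwards with a noindex tag.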