Duplicate Page Content
-
Hi there,
We keep getting duplicate page content issues, but it's not actually the same page.
For example, there might be 5 pages in, say, a Media Release section of the website, and each URL says page 1, 2, etc. However, they still come up as duplicates. How can this be fixed so Moz knows it's actually different content? -
Thanks all - will give those options a try and see which works the best for us.
-
Hi!
I suggested the noindex in order to deindex pages that may already be indexed. But, yes, the rel="canonical" should do the same (the problem is that Google may not respect it).
The nofollow is there to stop the crawler from wasting crawl budget following the links on those (many) pages.
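For reference, the combined directive described here is a single meta tag in the head of each date-parametered URL (a generic sketch, not site-specific code):

```html
<!-- Keeps the page out of the index (noindex) and stops crawlers
     from following its links and wasting crawl budget (nofollow) -->
<meta name="robots" content="noindex,nofollow">
```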
-
Gianluca,
Wouldn't it be much more work to check whether the parameter is set and then add the noindex meta? Wouldn't it be easier to just set the canonical? I'm sure that's a dynamic site, so it's just one canonical call without any extra code (PHP or whatever).
And why the nofollow? If I'm just preventing that page from being indexed because it would constitute a duplicate content issue, why the nofollow? noindex should be enough in this case.
We recently fixed a similar issue with our blog tags, which were showing duplicate content on about 400 pages. We fixed it by adding the noindex (they already had the canonical, but it wasn't enough, as the canonical couldn't point to a definite version: that changed whenever the tag gained another post). Within a few days all those pages were deindexed. We noticed a loss in search traffic, so I decided to run a small test removing the noindex tag. Result: 2 weeks later, none of those pages had returned to the index. (I added the noindex tag back, as it was just a test to see if we could regain that traffic, but ultimately decided that trading a duplicate content issue for that lost traffic wouldn't help.)
-
Federico is right.
Your duplicate content issue is due to the date parameters: you are potentially duplicating every page that has that calendar, once for each possible combination of dates... and that is a huge issue.
You should implement the rel="canonical" so that all these kinds of URLs declare the URL without the parameters as their canonical.
Or, even better, you should implement the meta robots "noindex,nofollow" on every date-parametered URL.
That said, the most logical thing would have been to block these URLs via robots.txt when launching the site. Unfortunately, blocking them now is not enough, as they are already indexed (even if they don't appear in the index because Google filters them out).
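As a sketch of the first option, each date-parametered URL would declare the parameter-free page as its canonical. Using the example URL from this thread (assuming the property ID should stay and only the checkin/checkout parameters should be dropped):

```html
<!-- In the <head> of every checkin/checkout variant of the page -->
<link rel="canonical"
      href="http://www.hihh.com.au/property-details?hihhpropertyId=HCP006">
```

And for the robots.txt option at launch time, a single wildcard rule such as `Disallow: /*checkin=` would have kept these variants from being crawled in the first place (Google supports the `*` wildcard, though a robots.txt block does not remove URLs that are already indexed).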
-
Ah, you mean that if the dates of the reservation change, then it creates duplicate page content?
If that's the case, you should point the rel="canonical" at the definitive page: no dates selected, just the page that shows the property.
-
Did you try adding the rel="canonical" tag to the pages?
-
So they might look at this page: http://www.hihh.com.au/property-details?hihhpropertyId=HCP006&checkin=2013-08-06&checkout=2013-08-09&search=checkindate%3D2013-08-06%26checkoutdate%3D2013-08-09
Then the same page would come up on the error list but with different dates.
-
Can you provide us with some examples? It would make our job easier.
-
It's basically all separate pages/URLs with different information on each. However, each page seems to get crawled for every possible date range, e.g. for check-in/check-out dates. The crawler goes through a range of dates and treats each one as a page with different information, when in fact it's all exactly the same.
-
Is the issue with pagination? Sometimes some pages from categories/tags/etc. can have the same content as another page.
If that's the issue, I would recommend you add a noindex meta tag to the least important pages (tags, for example).
Hope that helps.
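A minimal sketch of that noindex meta tag, placed in the head of each tag or category page you want kept out of the index:

```html
<!-- noindex alone is enough here: the page can still be crawled and
     its links followed, but it won't appear in search results -->
<meta name="robots" content="noindex">
```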
Related Questions
-
Duplicate and thin content - advanced
On-Page Optimization | nick-name123
Hi guys, two issues to sort out. We have a website that lists products and has many pages for: a) the list pages, which list all the products for that area; and b) the detailed pages, which, when clicked into from the list page, show the specific product in full. On the list page we perhaps have half the description written down; when clicked into, you see the full description.
If you search Google for a phrase on a detailed page, you will see results for that specific page, including multiple list pages it appears on. For example, let's say we are promoting 'trees' which are situated in Manhattan, and we are also promoting trees in Brooklyn; there is a crossover. So a tree listed in Manhattan will also be listed in Brooklyn, as it's close by (I'm not from America, so don't laugh if I have the areas muddled).
As a result, we have quite a few pages with the same content. I read a post a while back from the mighty Cutts who said not to worry about duplicates unless they're spammy, but what is good for one person is spammy to another. Does anyone have any ideas as to whether this is a genuine problem, and how you would solve it? Also, we know we have a lot of thin content on the site, but we don't know how to identify it. It's a large site, so this needs something automated (I think). Thanks in advance, Nick
-
Duplicate product content/disclaimers for non-e-commerce sites
On-Page Optimization | sbs219
This is more a follow-up to Rand's recent Whiteboard, "Handling User-Generated & Manufacturer-Required Duplicate Content Across Large Numbers of URLs." I posed my question in the comments, but I'm unsure it will get picked up. My situation isn't exactly the same, but it's similar: our site isn't an e-commerce site and doesn't have user reviews yet, but we do have maybe 8 pages across 2 product categories featuring very similar product features with duplicate verbiage. However, we don't want to rewrite it, because we want to make it easy for users to compare apples to apples and easily see which features are actually different. We also have to run disclaimers at the bottom of each page.
Would iframing the product descriptions and disclaimers be beneficial in this scenario, with the addition of good content? It would still be nice to have some crawlable content on those pages, so the iframing makes me nervous unless we compensate with at least some above-the-fold, useful content that could be indexed. Thanks, Sarah
-
Duplicate Content on Event Pages
On-Page Optimization | mattdinbrooklyn
My client has a pretty popular event-listings service and, in hopes of gathering more events, they opened up the platform to allow users to add events. This works really well for them, and they are able to garner a lot more events this way. The major problem I'm finding is that many event coordinators and site owners will take the copy from their own website and copy and paste it, duplicating a lot of the content. We have editor picks that contain a lot of unique content, but the duplicate content scares me. It hasn't hurt our page ranking (we have a page ranking of 7), but I'm wondering if this is something we should address. We don't have the manpower to eliminate all the duplication, but if we cut it down, would we gain a significant advantage over people posting the same event?
-
Empty public profiles are viewed as duplicate content. What to do?
On-Page Optimization | thomasvanderkleij
Hi! I manage a social networking site. We have a lot of public user profiles that are viewed as duplicate content. This is because these users haven't filled out any public profile info, and thus the profiles are "empty" (except for the name). Is this something I should worry about? If yes, what are my options to solve this? Thanks!
-
Duplicate Page Content: Should we 301? Best practices?
On-Page Optimization | 365ToursSafaris
What would be the best way to avoid duplicate page content for these types of pages? Our website generates user-friendly URLs for each page, so it is the same exact page, and both versions of the URL work. Example:
http://www.safari365.com/about-africa/wildebeest-migration
http://www.safari365.com/wildebeest-migration
I don't think adding code to the page will work, because it's the same page for the incorrect and correct versions. I don't think I can use the URL parameter setting, because the version with /about-africa/ is the correct one (correct in that it follows the site navigation). I was thinking of using the htaccess to redirect to the correct version. Will that work, and does it follow best practices? Any other suggestions that would work better?
-
How do I avoid duplicate content and page title errors when using a single CMS for a website
On-Page Optimization | kpreneur
I am currently hosting a client site on a CMS with both a Canadian and a USA version of the website. We have the .com as the primary domain, and the .ca is redirected from the registrar to the Canadian home page. The problem I am having is that my campaign produces errors for duplicate page content and duplicate page titles. Is there a way to set up the two versions on the CMS so that these errors are not produced? My concern is getting penalized by search engines. Appreciate any help. Mark Palmer
-
Duplicate content issues with products page 1, 2, 3 and so on
On-Page Optimization | Essentia
Hi, we have this products page, for example, as a landing page:
http://www.redwrappings.com.au/australian-made/gift-ideas
and then we have the links to pages 2, 3, 4 and so on:
http://www.redwrappings.com.au/products.php?c=australian-made&p=2
http://www.redwrappings.com.au/products.php?c=australian-made&p=3
In SEOmoz, they are recognized as duplicate page content. What would be the best way to solve this problem? One easy way I can think of is to nominate the first landing page as the 'master' page (http://www.redwrappings.com.au/australian-made/gift-ideas), and add canonical meta links on pages 2, 3 and so on. Any other suggestions? Thanks 🙂
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
On-Page Optimization | smaavie
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc.)
Cooking Method (fry, bake, boil, steam, etc.)
Preparation Time (under 30 min, 30 min to 1 hour, over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search-results URL variations. This all works well on the site, but it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions, such as:
Setting a canonical tag
Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
Changing the title tag in the head dynamically based on which URL variables are present
However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but that isn't the case here, as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across it before, but I cannot find the ideal solution. Any help would be much appreciated. Kind regards