How would you handle this duplicate content - noindex or canonical?
-
Hello
Just trying to work out how best to deal with this duplicated content.
On our Canada holidays page we have a number of holidays listed (PAGE A)
http://www.naturalworldsafaris.com/destinations/north-america/canada/suggested-holidays.aspx
We also have a more specific Arctic Canada holidays page with different listings (PAGE B):
http://www.naturalworldsafaris.com/destinations/arctic-and-antarctica/arctic-canada/suggested-holidays.aspx
Of the two, the Arctic Canada page (PAGE B) receives a far higher number of visitors from organic search.
From a user perspective, people expect to see all holidays in Canada (PAGE A), including the Arctic-based ones. We can tag these holidays to appear on both pages; however, that means the PAGE B content will be duplicated on PAGE A.
Would it be best to set up a canonical link tag to stop this duplicate content causing an issue? Alternatively, would it be better to noindex PAGE A?
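For reference, the two options being weighed would look something like this in the `<head>` of PAGE A. Note this is only a sketch: the canonical direction shown (PAGE A pointing at PAGE B) is an assumption based on PAGE B getting more organic traffic, and you would use one option or the other, never both:

```html
<!-- Option 1: canonical on PAGE A pointing to PAGE B.
     Asks Google to consolidate indexing and ranking signals into PAGE B. -->
<link rel="canonical" href="http://www.naturalworldsafaris.com/destinations/arctic-and-antarctica/arctic-canada/suggested-holidays.aspx" />

<!-- Option 2: noindex PAGE A entirely.
     "follow" keeps its outgoing links crawlable even though
     the page itself is dropped from the index. -->
<meta name="robots" content="noindex, follow" />
```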
Interested to see others' thoughts. I've used this article (Jan 2011, so quite old) for reference, in case anyone else finds this topic while searching for information on something similar:
Duplicate Content: Block, Redirect or Canonical - SEO Tips
-
OK, I think I understand what you are asking now.
Canonicals are for identical or near-identical pages. I don't know that those two pages would be considered to be identical, even after you added the arctic listings to the Canada page, especially as the above-the-fold content is different.
Keep in mind that the "penalty" for duplicate content is that Google will choose only one page to show, depending on which one it thinks is most relevant. And if you have one page that gets a lot more traffic and engagement, that is likely to be the one Google chooses, anyway.
If I were you, I'd probably make sure the description sections at the top of those pages each have a good amount of unique content, and maybe change the titles and h1s to make them a little more different from each other (if you can do that). Then I'd just leave it at that and see what Google makes of it.
If it seems that your higher traffic page starts to lose traffic, you can always add the canonicals then, and resubmit the URL through Fetch as Google in Webmaster Tools.
-
Hi both
Thank you.
Linda - It's people arriving at the Canada page who want to see all Canada, not the other way round. People select Canada as a destination but are also interested in our Arctic Canada trips.
The Canada page itself doesn't rank well or act as a landing page portal, however it is important in terms of site structure as people check that destination to see if we do trips there once they reach the site. People equally come onto the site looking for a trip to the Arctic as a destination so we do need both within the site in terms of the user journey.
The canonical tag would be my preference. If there is enough unique content on both pages, do you think it matters if the holidays list is the same? That could be an alternative, although we won't escape a percentage of duplication.
-
I don't recommend noindexing either page. The canonical tag should help with the duplicate content errors. If it were my site, I would combine the two pages and list all of the holidays on one page only. If you use the canonical tag you will decrease your chances of having both pages rank; however, you will be telling the engines which page is the authoritative one.
-
First, are you sure that the people who are arriving at the arctic page really want to see all of the holidays and not the arctic ones? The arctic page is pretty well optimized for "arctic", and it is in the title and description. Take a look in your Webmaster Tools at those pages and see which keywords are bringing them up.
If you have a good reason to think that people really want the more general page (page A) but it is not getting a lot of traffic, putting that content on the arctic page (page B) probably won't solve your problem: there is obviously some reason page A is not doing as well, and you would just be spreading around content that is not working.
I don't think your answer lies in making the pages duplicates. You should actually be making them more different from each other, so the arctic page is very clearly specific to arctic trips and the overview page to general inquiries.
And in the meantime you could put a prominent link at the top of your arctic page linking back to the overview page, saying something like, "For more ideas, see all of our suggested holidays." (In fact there should be a link like that on each of your specialty pages, pointing back to the general page; that will help build the authority of page A and help it rank higher in the SERPs.)
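As a sketch, the cross-link described above is just a plain anchor near the top of the PAGE B template. The wording is the suggested text from this thread, the href is the PAGE A URL from the question, and the class name is purely illustrative:

```html
<p class="all-holidays-link">
  <a href="http://www.naturalworldsafaris.com/destinations/north-america/canada/suggested-holidays.aspx">For more ideas, see all of our suggested holidays.</a>
</p>
```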