Duplicate content & canonicals
-
Hi,
Working on a website for a company that operates in several European countries.
The setup is like this:
www.website.eu/nl
www.website.eu/be
www.website.eu/fr
...You see that every country has its own subdirectory, but NL & BE share the same language, Dutch...
The copywriter wrote some unique content for NL and for BE, but it isn't possible to write unique copy for every product detail page, because those pages contain fairly technical material.
Now we want to add canonical tags to those identical product pages. Do we point the canonical on the /be products to the /nl products, or vice versa?
Another question regarding SEOmoz: if we add canonical tags to a number of pages, do they still appear in the crawl diagnostics under "Duplicate Page Content", or do we have to do our own math and take "Duplicate Page Content" minus "Rel Canonical"?
-
Hey Joris,
As of now it will most likely see it as duplicate content, because technically it still is duplicate content to a crawler; the bot won't know your intentions or target audience for each subfolder. The only way to keep our crawler from seeing it as duplicate is to block rogerbot from that subfolder with robots.txt or meta robots. Beyond that, there is adding rel=canonical tags, which is the best way to handle it.
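To make that concrete, here is a minimal sketch only (it assumes, for illustration, that the /nl pages are kept as the primary Dutch versions; the reverse direction works the same way, and the product URL is a placeholder). A /be product page could carry one of these in its <head>:

<!-- Option 1: the /be page declares its /nl counterpart as the preferred version -->
<link rel="canonical" href="http://www.website.eu/nl/products/example-product" />

<!-- Option 2: keep the /be page out of the index entirely instead of canonicalizing it -->
<meta name="robots" content="noindex, follow" />

Blocking rogerbot specifically via robots.txt on the /be folder would only hide the duplication from our crawler, whereas the canonical also tells the search engines which version you want indexed.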
Hope this sheds some light on the duplicate content issues.
Best,
Nick
SEOmoz
-
Thanks Robert!
-
Will do!
-
Now, that was a good question. Why not send a quick email to help@SEOmoz.org and just ask if there is a way to circumvent it? Let me know, please.
-
Hi Robert,
Thanks for your quick answer. I will make sure that in Google Webmaster Tools we set /be to target Belgium and /nl to target the Netherlands, but the duplicate content will still show up in our SEOmoz reports, no?
-
First question is: have you thought of using ccTLDs instead of the subdirectories? Rand speaks to the .fr issue in his Whiteboard Friday (WBF) mentioned by iBiz Leverage.
As to using canonicals to avoid duplicate content: you shouldn't have a duplicate content issue, even with the two sections sharing a language, as long as you set a country target for each. But read or watch the WBF by Rand, as it is full of info on this subject, domain authority, etc.
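As a purely illustrative sketch on top of that (the URLs are placeholders and assume the /nl and /be product pages mirror each other), hreflang annotations are one markup-level way to signal the same country targeting:

<!-- Placed on both the /nl and the /be version of a product page -->
<link rel="alternate" hreflang="nl-NL" href="http://www.website.eu/nl/products/example-product" />
<link rel="alternate" hreflang="nl-BE" href="http://www.website.eu/be/products/example-product" />

Note that this treats both versions as worth indexing, so it is an alternative to canonicalizing one country's pages to the other's rather than something to combine with that.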
-
I have the same problem and found this URL: http://www.youtube.com/watch?v=Ets7nHOV1Yo
Here is also another link from SEOmoz; I think this one is most helpful: http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
Hope this helps.
Related Questions
-
How to avoid Duplicate Page Content errors when using WordPress categories & tags?
I get a lot of duplicate page errors in my crawl diagnostics reports from 'categories' and 'tags' on my WordPress sites. The post itself is one URL, and then the content is 'duplicated' on the 'category' or 'tag' archive that is added to the page. Should I exclude the tags and categories from my sitemap, or are these issues not that important? Thanks for your help, Stacey
Moz Pro | skehoe1 -
Why does the SEOmoz bot consider these duplicate pages?
Hello here, the SEOmoz bot has recently marked the following two pages as duplicates: http://www.virtualsheetmusic.com/score/PatrickCollectionFlPf.html?tab=mp3 http://www.virtualsheetmusic.com/score/PatrickCollectionFlPf.html?tab=pdf I don't personally see how these pages can be considered duplicates, since their content is quite different. Thoughts?
Moz Pro | fablau0 -
I'm seeing duplicate links listed in Open Site Explorer. Can anyone explain why this might be happening?
For each link listed in Open Site Explorer, I am seeing two identical entries. You can see why this might be a problem for both workflow and data accuracy. Any ideas why this might be happening? Is there a setting or filter that I'm missing?
Moz Pro | StephenEggett0 -
Duplicate Content being caused by home page?
Hello everyone, I am new to SEOmoz and SEO in general, and I have a quick question. When running an SEO web crawler report on my URL, I noticed in the report that my home page (also known as my index page) was listed twice. Here is what the report was showing: www.example.com/ www.example.com/index.php So are these two different URLs? If so, is this considered duplicate content, and should I block crawler access to index.php? Thanks in advance for the help!
Moz Pro | threebiz0 -
&amp; vs & - Title too long?
It seems that SEOmoz interprets & as the HTML character code &amp; in my titles. This is pushing the titles over the limit by 1 or 2 characters in some cases. Does this matter? Does Google actually treat &amp; the same way, or is this an SEOmoz bug?
Moz Pro | adriandg1 -
How to handle crawl diagnostic errors for the same URL: /products & /products/
I have copied one of the errors out of the crawl diagnostics report. Both /products and /products/ are returning an error, and both have pretty good domain authority, so I feel like it's hurting my site that these show up this way. Both URLs serve the same page; should I just set up a 301 on /products (without the trailing slash), or will that cause more harm? I am using the MODx CMS, and that could have something to do with it. The two rows from the report:
Products | Datalight | http://www.datalight.com/products | 1 | 37 | 5
Products | Datalight | http://www.datalight.com/products/ | 1 | 30 | 1
Moz Pro | tjsherrill0 -
Reducing duplicate content
Callcatalog.com is a complaint directory for phone numbers. People post information about the phone calls they get. Since there are many, many phone numbers, obviously people haven't posted information on ALL of them, so I have many phone numbers with zero content. SEOmoz is telling me that pages with zero content look like duplicate content to each other. The only difference between two pages that have zero comments is the title and the phone number embedded in the page. For example, http://www.callcatalog.com/phones/view/413-563-3263 is a page that has zero comments. I don't want to remove these zero-comment phone number pages from the directory, since many people find the pages via a phone number search. Here's my question: what can I do so that Google / SEOmoz doesn't treat these zero-comment pages as duplicate content?
Moz Pro | seo_ploom0