Duplicate content & canonicals
-
Hi,
Working on a website for a company that works in different European countries.
The setup is like this:
www.website.eu/nl
www.website.eu/be
www.website.eu/fr
...You see that every country has its own subdirectory, but NL & BE share the same language, Dutch...
The copywriter wrote some unique content for NL and for BE, but it isn't possible to write unique copy for every product detail page, because those pages contain pretty technical content.
Now we want to add canonical tags to those identical product pages. Do we point the canonical on the /be products to the /nl products, or vice versa?
Other question regarding SEOmoz: if we add canonical tags to x pages, do they still appear in the Crawl Errors under "duplicate page content", or do we have to do our own math and subtract "Rel Canonical" from "duplicate page content"?
-
Hey Joris,
As of now it will most likely be flagged as duplicate content, because technically it still is duplicate content to a crawler; bots won't know your intentions or the target audience for each subfolder. The only way to keep our crawler from seeing it as duplicate is to block rogerbot from that subfolder with robots.txt or meta robots. Then there is adding rel canonicals, which is the best way.
Hope this sheds some light on the duplicate content issues.
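To make the blocking option concrete, here is a minimal robots.txt sketch (the subfolder path is illustrative; this only hides the pages from Moz's crawler and has no effect on Google or other engines):

```text
# robots.txt at www.website.eu/robots.txt -- illustrative paths
# Block only Moz's crawler (rogerbot) from the duplicated subfolder;
# Googlebot and all other bots are unaffected.
User-agent: rogerbot
Disallow: /be/
```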
Best,
Nick
SEOmoz -
Thanks Robert!
-
Will do!
-
Now, that was a good question. Why not send a quick email to help@SEOmoz.org and just ask if there is a way to circumvent it? Let me know, please.
-
Hi Robert,
Thanks for your quick answer. I will make sure that in Google Webmaster Tools we say that /be is for Belgium and /nl is for the Netherlands, but the duplicate content will still show up in our reports in SEOmoz, no?
-
First question is: have you thought of using country-code TLDs instead of the subdirectories? Rand speaks to the .fr issue in his Whiteboard Friday mentioned by iBiz Leverage.
As to using canonicals to avoid duplicate content: you shouldn't have a duplicate content issue even with the two languages, so long as you set your country target for each. But read or watch the Whiteboard Friday by Rand, as it is full of info on this subject, domain authority, etc.
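As a sketch of the two options for the duplicated Dutch product pages (URLs are illustrative), note that they pull in opposite directions: canonicalizing /be to /nl tells Google to index only the /nl version, while hreflang annotations keep both indexed and steer each country to its own URL.

```html
<!-- Option 1: canonical -- on www.website.eu/be/products/widget,
     consolidate to the /nl version (only /nl stays in the index) -->
<link rel="canonical" href="http://www.website.eu/nl/products/widget" />

<!-- Option 2: hreflang -- on both pages, annotate the Dutch variants
     so Google shows /nl in the Netherlands and /be in Belgium -->
<link rel="alternate" hreflang="nl-nl" href="http://www.website.eu/nl/products/widget" />
<link rel="alternate" hreflang="nl-be" href="http://www.website.eu/be/products/widget" />
```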
-
I have the same problem and found this video: http://www.youtube.com/watch?v=Ets7nHOV1Yo
Here is also another link from SEOmoz; I think this one is the most helpful: http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
Hope this helps.
Related Questions
-
How do you create tracking URLs in Wordpress without creating duplicate pages?
I use Wordpress as my CMS, but I want to track click activity to my RFQ page from different products and services on my site. The easiest way to do this is by adding a string to the end of a URL (a la http://www.netrepid.com/request-for-quote/?=colocation). The downside to this, of course, is that when Moz does its crawl diagnostic every week, I get notified that I have multiple pages with the same page title and duplicate content. I'm not a programming expert, but I'm pretty handy with Wordpress and know a thing or two about 'href-fing' (yeah, that's a thing). Can someone who tracks click activity in WP with URL variables please enlighten me on how to do this without creating duplicate pages? Appreciate your expertise. Thanks!
Moz Pro | Netrepid
-
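One common approach (a sketch; the parameter name below is illustrative, not from the original question): keep the tracking string in the URL, but have the page emit a self-referencing canonical pointing at the clean permalink, so every parameterized variant consolidates to one URL.

```html
<!-- Requested URL (illustrative tracking parameter):
     http://www.netrepid.com/request-for-quote/?src=colocation -->
<!-- In the page <head>, canonicalize to the parameter-free permalink: -->
<link rel="canonical" href="http://www.netrepid.com/request-for-quote/" />
```

Most WordPress SEO plugins (e.g. WordPress SEO by Yoast) can output a self-referencing canonical like this on posts and pages without custom code.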
Crawl report - duplicate page title/content issue
When the crawl report is finished, it says there are duplicate content/page title issues. However, there is a canonical tag that is formatted correctly, so I just wondered if this is a bug or if anyone else is having the same issue? For example, I'm getting an error warning for this page: http://www.thegreatgiftcompany.com/categories/categories_travel?sort=name_asc&searchterm=&page=1&layout=table
Moz Pro | KarlBantleman
-
Why does Rel Canonical show up as a notice?
In the crawl diagnostics screen, "Rel Canonical" shows up as a notice for every page that has a rel="canonical" meta tag in it. Why is this the case? Shouldn't every page have a canonical tag on it to point to the absolute URL of the content? Wouldn't a better notice be to display pages that do not have a canonical tag instead? I could be wrong but that would make more sense to me. (In fact... let's be honest here... I probably am wrong... but I'd like someone to explain it if they could.) Thanks
Moz Pro | rrolfe
-
How to delete/redirect duplicate content
Hello, our site thewealthymind(dot)com has a lot of duplicate content. How do you clear up duplicate content when there's a lot of it? The owners redid the site several times and didn't update the URLs. Thank you.
Moz Pro | BobGW
-
Crawl Diagnostics returning duplicate content based on session id
I'm just starting to dig into crawl diagnostics, and it is returning quite a few errors. Primarily, the crawl is indicating duplicate content (page titles, meta tags, etc.) because of a session ID in the URL. I have set up a URL parameter in Google Webmaster Tools to help Google recognize the existence of this session ID. Is there any way to tell the SEOmoz spider the same thing? I'd like to get rid of these errors since I've already handled them for the most part.
Moz Pro | csingsaas
-
Roger keeps telling me my canonical pages are duplicates
I've got a site that's brand spanking new that I'm trying to get the error count down to zero on, and I'm basically there except for this odd problem. Roger got into the site like a naughty puppy a bit too early, before I'd put the canonical tags in, so there were a couple thousand 'duplicate content' errors. I put canonicals in (programmatically, so they appear on every page) and waited a week and sure enough 99% of them went away. However, there's about 50 that are still lingering, and I'm not sure why they're being detected as such. It's an ecommerce site, and the duplicates are being detected on the product page, but why these 50? (there's hundreds of other products that aren't being detected). The URLs that are 'duplicates' look like this according to the crawl report: http://www.site.com/Product-1.aspx http://www.site.com/product-1.aspx And so on. Canonicals are in place, and have been for weeks, and as I said there's hundreds of other pages just like this not having this problem, so I'm finding it odd that these ones won't go away. All I can think of is that Roger is somehow caching stuff from previous crawls? According to the crawl report these duplicates were discovered '1 day ago' but that simply doesn't make sense. It's not a matter of messing up one or two pages on my part either; we made this site to be dynamically generated, and all of the SEO stuff (canonical, etc.) is applied to every single page regardless of what's on it. If anyone can give some insight I'd appreciate it!
Moz Pro | icecarats
-
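Since the lingering duplicates differ only in letter case (/Product-1.aspx vs. /product-1.aspx) and the site is ASP.NET, one belt-and-braces fix alongside the canonicals is a server-side 301 to lowercase, so only one casing is ever crawlable. A hypothetical IIS URL Rewrite sketch for web.config:

```xml
<!-- Hypothetical web.config fragment: permanently redirect any URL
     containing an uppercase letter to its lowercase form, using the
     URL Rewrite module's built-in {ToLower:} function -->
<rewrite>
  <rules>
    <rule name="LowercaseRedirect" stopProcessing="true">
      <match url=".*[A-Z].*" ignoreCase="false" />
      <action type="Redirect" url="{ToLower:{R:0}}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```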
Canonical for Mobile
Hi guys, I am curious why SEOmoz shows our mobile site as using the canonical tags from the desktop site, when double-checking the mobile site's code shows m.domain.com. Any thoughts on why we are seeing this? Also, is there any lag in the code updates being reported through the SEOmoz toolset? Thanks for all your help! Cheers,
Moz Pro | lwalker
-
Should I worry about duplicate content errors caused by trailing slashes?
Frequently we get red-flagged for duplicate content in the Moz Pro Crawl Diagnostics for URLs with and without a trailing slash. For example: www.example.com/ gets flagged as being a duplicate of www.example.com. I assume that we could rel=canonical this, if needed, but our assumption has been that Google is clever enough to discount this as a genuine crawl error. Can anyone confirm or deny that? Thanks.
Moz Pro | MackenzieFogelson