Reinforcing Rel Canonical? (Fixing Duplicate Content)
-
Hi Mozzers,
We're having trouble with duplicate content between two sites, so we're looking to add some oomph to the rel canonical link elements we put on one of our sites pointing towards the other to help speed up the process and give Google a bigger hint.
Would adding a hyperlink on the "copying" website pointing towards the "original" website speed this process up?
Would we get in trouble if added about 80,000 links (1 on each product page) with a link to the matching product on the other site? For example, we could use text like "Buy XY product on Other Brand Name and receive 10% off!"
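For concreteness, the rel canonical in question is a single link element in the head of each duplicate product page, pointing at the full URL of the matching product on the other domain. A minimal sketch (the domain and product path here are placeholders, not the actual sites):

```html
<!-- In the <head> of each product page on the "copying" site -->
<!-- example.com and the product path are hypothetical; use the matching product URL -->
<link rel="canonical" href="http://www.example.com/products/example-product.html" />
```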
-
Have you seen a corresponding drop-off in the ListFinder pages over that time? If the canonical is kicking in, you should see some of those pages fall out as more ConsumerBase pages kick in.
Is there a reason you're canonical'ing from the more indexed site to the less indexed one? It could be a mixed signal if Google thinks that ListFinder is a more powerful or authoritative site. Cross-domain can get tricky fast.
Unfortunately, beyond NOINDEX'ing, it's about your best option, and certainly one of your safest. It's really hard to predict what the combo of cross-domain canonical plus link would do. From a dupe content standpoint, it's risk free. From the standpoint of creating 80K links from one of your sites to another of your sites, it's a little risky (don't want to look like a link network). Since you're only talking two sites, though, it's probably not a huge issue, especially with the canonical already in place.
Google treats cross-domain canonical as a strong hint rather than a directive, so it can be a little hard to predict and control. Interestingly, the ConsumerBase site has higher Domain Authority, but the page you provided has lower Page Authority than its "sister" page. That might be a result of your internal linking structure giving more power to the ListFinder pages.
-
Great post Peter.
Here are some links of a product that is on both sites. Hopefully this will help you provide some more insight.
http://www.consumerbase.com/mailing-lists/shutterbugsphotography-enthusiasts-mailing-list.html
http://www.listfinder.com/mailing-lists/shutterbugsphotography-enthusiasts-mailing-list.html
The ListFinder pages are currently mostly indexed (70k out of 80k), which makes me think they are different enough from one another not to warrant a penalty.
The ConsumerBase pages started indexing well when we added the rel canonical code to LF (we went from about 2k pages to 30k in early December), but since 1/2/2013 we have seen a drop-off in indexed pages, down to about 5k.
Thanks!
-
With products, it's a bit hard to say. Cross-domain canonical could work, but Google can be a bit finicky about it. Are you seeing the pages on both sides in the Google index, or just one or the other? Sorry, it's a bit hard to diagnose without seeing a sample URL.
If this were more traditional syndicated content, you could set a cross-domain canonical and link the copy back to the source. That would provide an additional signal of which site should get credit. With your case, though, I haven't seen a good example of that - I don't think it would be harmful, though (to add the link, that is).
If you're talking about 80K links, then you've got 80K+ near-duplicate product pages. Unfortunately, it could go beyond just having one or the other version get filtered out. This could trigger a Panda or Panda-like penalty against the site in general. The cross-domain canonical should help prevent this, whereas the links probably won't. I do think it's smart to be proactive, though.
Worst case, you could META NOINDEX the product pages on one site - they'd still be available to users, but wouldn't rank. I think the cross-domain canonical is probably preferable here, but if you ran into trouble, META NOINDEX would be the more severe approach (and could help solve that trouble).
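For reference, the META NOINDEX fallback is a one-line tag in the head of each product page you want kept out of the index; the `follow` value lets crawlers still pass link equity through the page even though it won't rank:

```html
<!-- In the <head> of each product page to be excluded from the index -->
<meta name="robots" content="noindex, follow">
```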
-
Yes, sir - that would be correct.
www.consumerbase.com and www.listfinder.com.
The sites are not 100% identical, just the content on the product pages.
-
Are these two sites on the same root domain? It seems like most of the feedback you're getting is from people who are assuming they are; however, it sounds to me like these are two separate domains.
-
Zora,
Google accepts cross-domain canonical as long as the pages have sufficiently similar content.
It is not necessary to add a hyperlink pointing to the canonical page. If your sites are crawler-friendly, the canonical hints will change search results very quickly.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
Ensure that Google doesn't report any issues with your Sitemaps. If you add products frequently, submit an updated Sitemap on the same schedule.
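For example, a Sitemap entry with a lastmod date lets Google see which product pages have changed (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per product page; loc is a hypothetical example -->
    <loc>http://www.example.com/products/new-product.html</loc>
    <lastmod>2013-01-15</lastmod>
  </url>
</urlset>
```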
All the best.
-
I am sorry, I am not understanding why you need a rel canonical in this case if the sites are two different sites.
What is your end goal ?
-
We chose rel canonical because we still want users to be able to visit and navigate through site 2.
They are both e-commerce sites with similar products, not exactly identical sites.
-
Zora, totally understand, but my input, and what the majority of people do, is redirect the traffic.
A server-side .htaccess 301 redirect is your best choice here.
Why don't you want to use a 301 and prefer a rel canonical? I'm curious what your take is on this.
And thanks for the rel canonical update info; I didn't know that.
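For reference, a server-side 301 in Apache .htaccess can be done with a single rewrite rule when the two sites share a URL pattern. A sketch, assuming the product paths match across domains (the path pattern and target domain are placeholders):

```apache
# In the .htaccess at the document root of the redirecting site.
# Matches /products/anything.html and 301s to the same path on the other domain.
RewriteEngine On
RewriteRule ^products/(.+)\.html$ http://www.example.com/products/$1.html [R=301,L]
```

Note that, unlike rel canonical, this sends users to the other site as well, which is why the original poster preferred the canonical.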
-
Thanks for the info Hampig, I'll definitely take a look.
Rel canonical actually works cross-domain now; Google updated it from when it originally came out.
-
Zora, hope you are doing well.
I came across this video a few weeks ago. I think this is supposed to be found under Webmaster Tools. Although I have not used it, I think it might be the best solution to draw Google's attention to portions of the pages and what they are supposed to be.
http://www.youtube.com/watch?v=WrEJds3QeTw
OK, but I am confused a bit. Do you have two different domains, or two versions of the same domain?
Because from the sound of it you have two different domains, and in that case I believe rel canonical won't work and you would have to do a 301 redirect. Even for my sites, when I change the pages around I use a 301 redirect within the same existing site.