Reinforcing Rel Canonical? (Fixing Duplicate Content)
-
Hi Mozzers,
We're having trouble with duplicate content between two sites, so we're looking to add some oomph to the rel=canonical link elements we've put on one of our sites (pointing towards the other) to help speed up the process and give Google a bigger hint.
Would adding a hyperlink on the "copying" website pointing towards the "original" website speed this process up?
Would we get in trouble if we added about 80,000 links (one on each product page), each pointing to the matching product on the other site? For example, we could use text like "Buy XY product on Other Brand Name and receive 10% off!"
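To illustrate, the tag we've put in the <head> of each product page on the "copying" site looks something like this (placeholder URL, not one of our real pages):

<link rel="canonical" href="http://www.original-site.com/products/example-product.html" />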
-
Have you seen a corresponding drop-off in the ListFinder pages over that time? If the canonical is kicking in, you should see some of those pages fall out of the index as more ConsumerBase pages kick in.
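A rough way to eyeball that is to compare Google's result counts for a site: query scoped to each product directory, e.g.:

site:listfinder.com/mailing-lists/
site:consumerbase.com/mailing-lists/

The counts are only estimates, but the trend over time is what you're after.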
Is there a reason you're canonical'ing from the more-indexed site to the less-indexed one? It could be a mixed signal if Google thinks that ListFinder is a more powerful or authoritative site. Cross-domain canonicals can get tricky fast.
Unfortunately, beyond NOINDEX'ing, it's about your best option, and certainly one of your safest. It's really hard to predict what the combo of a cross-domain canonical plus a link would do. From a dupe-content standpoint, it's risk-free. From the standpoint of creating 80K links from one of your sites to another of your sites, it's a little risky (you don't want to look like a link network). Since you're only talking about two sites, though, it's probably not a huge issue, especially with the canonical already in place.
Google interprets the cross-domain canonical fairly loosely (it's a hint, not a directive), so it can be a little hard to predict and control. Interestingly, the ConsumerBase site has higher Domain Authority, but the page you provided has lower Page Authority than its "sister" page. That might be a result of your internal linking structure giving more power to the ListFinder pages.
-
Great post, Peter.
Here are links to a product that appears on both sites. Hopefully this will help you provide some more insight.
http://www.consumerbase.com/mailing-lists/shutterbugsphotography-enthusiasts-mailing-list.html
http://www.listfinder.com/mailing-lists/shutterbugsphotography-enthusiasts-mailing-list.html
The ListFinder pages are currently mostly indexed (70k out of 80k), which makes me think they are different enough from one another not to warrant a penalty.
The ConsumerBase pages started indexing well when we added the rel canonical code to LF (we went from about 2k indexed pages to 30k in early December), but since 1/2/2013 we have seen a drop-off in indexed pages, down to about 5k.
Thanks!
-
With products, it's a bit hard to say. A cross-domain canonical could work, but Google can be a bit finicky about it. Are you seeing the pages on both sites in the Google index, or just one or the other? Sorry, it's a bit hard to diagnose without seeing a sample URL.
If this were more traditional syndicated content, you could set a cross-domain canonical and link the copy back to the source. That would provide an additional signal of which site should get credit. In your case, though, I haven't seen a good example of that combo - I don't think adding the link would be harmful, though.
If you're talking about 80K links, then you've got 80K+ near-duplicate product pages. Unfortunately, it could go beyond just having one or the other version get filtered out. This could trigger a Panda or Panda-like penalty against the site in general. The cross-domain canonical should help prevent this, whereas the links probably won't. I do think it's smart to be proactive, though.
Worst case, you could META NOINDEX the product pages on one site - they'd still be available to users, but wouldn't rank. I think the cross-domain canonical is probably preferable here, but if you ran into trouble, META NOINDEX would be the more severe approach (and could help solve that trouble).
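For reference, that's just a tag like this in the <head> of each product page on the site you de-index (the "follow" value is optional, but it lets the page keep passing link equity):

<meta name="robots" content="noindex, follow">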
-
Yes, sir - that would be correct.
www.consumerbase.com and www.listfinder.com.
The sites are not 100% identical - it's just the content on the product pages that's duplicated.
-
Are these two sites on the same root domain? It seems like most of the feedback you're getting is from people who are assuming they are; however, it sounds to me like these are two separate domains.
-
Zora,
Google accepts the cross-domain canonical as long as the pages have very similar content.
It is not necessary to add a hyperlink pointing to the canonical page. If your sites are crawler-friendly, the canonical hints will be reflected in search results fairly quickly.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
Ensure that Google doesn't find any issues with your Sitemaps. If you add products frequently, resubmit the updated Sitemap on that same schedule.
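A minimal Sitemap entry looks like this (the URL and date below are just examples):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>http://www.consumerbase.com/mailing-lists/example-list.html</loc>
<lastmod>2013-01-02</lastmod>
</url>
</urlset>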
All the best.
-
I am sorry, I am not understanding why you need a rel=canonical in this situation if the sites are two different sites.
What is your end goal?
-
We chose rel canonical because we still want users to be able to visit and navigate through site 2.
They are both e-commerce sites with similar products, not exactly identical sites.
-
Zora, totally understand, but my input - and what the majority of people do - is to redirect the traffic.
A server-side .htaccess 301 redirect is your BEST choice here.
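Something like this in the .htaccess of the duplicate site would send every page (users and crawlers alike) to the matching URL on the main site. This is just a sketch that assumes the two domains share the same URL paths, like your product pages do:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?listfinder\.com$ [NC]
RewriteRule ^(.*)$ http://www.consumerbase.com/$1 [R=301,L]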
Why don't you want to use a 301 and prefer a rel=canonical instead? Curious what your take is on this.
And thanks for the rel=canonical update info - I didn't know that.
-
Thanks for the info, Hampig - I'll definitely take a look.
Rel canonical actually works cross-domain now; Google updated it from when it originally came out.
-
Zora, hope you are doing well.
I came across this video a few weeks ago. I think the feature is supposed to be found under Webmaster Tools. Although I have not used it, I think it might be the best solution for calling Google's attention to portions of the pages and what they are supposed to be:
http://www.youtube.com/watch?v=WrEJds3QeTw
OK, but I am a bit confused. Do you have two different domains, or two versions of the same domain?
Because from the sound of it you have two different domains, and in that case using rel=canonical won't work and you would have to do a 301 redirect. Even for my own sites, when I move pages around I use 301 redirects within the same existing site.