Reinforcing Rel Canonical? (Fixing Duplicate Content)
-
Hi Mozzers,
We're having trouble with duplicate content between two sites, so we're looking to add some oomph to the rel=canonical link elements we've placed on one site pointing to the other, to help speed up the process and give Google a stronger hint.
Would adding a hyperlink on the "copying" website pointing towards the "original" website speed this process up?
Would we get in trouble if we added about 80,000 links (one on each product page), each pointing to the matching product on the other site? For example, we could use text like "Buy XY product on Other Brand Name and receive 10% off!"
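To make the setup concrete, here's roughly what each "copying" product page would carry — a sketch only, with a hypothetical domain and path standing in for the real ones:

```html
<!-- On the "copying" site's product page -->
<head>
  <link rel="canonical" href="http://www.originalbrand.com/products/xy-product.html" />
</head>
<body>
  <!-- The proposed extra hint: a visible link to the matching product -->
  <a href="http://www.originalbrand.com/products/xy-product.html">
    Buy XY product on Other Brand Name and receive 10% off!
  </a>
</body>
```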
-
Have you seen a corresponding drop-off in the ListFinder pages over that time? If the canonical is kicking in, you should see some of those pages fall out as more ConsumerBase pages kick in.
Is there a reason you're canonicalizing from the more-indexed site to the less-indexed one? It could send a mixed signal if Google thinks that ListFinder is the more powerful or authoritative site. Cross-domain canonical can get tricky fast.
Unfortunately, beyond NOINDEX'ing, it's about your best option, and certainly one of your safest. It's really hard to predict what the combo of cross-domain canonical plus link would do. From a dupe content standpoint, it's risk free. From the standpoint of creating 80K links from one of your sites to another of your sites, it's a little risky (don't want to look like a link network). Since you're only talking two sites, though, it's probably not a huge issue, especially with the canonical already in place.
Google treats cross-domain canonical as a hint rather than a directive, so it can be a little hard to predict and control. Interestingly, the ConsumerBase site has higher Domain Authority, but the page you provided has lower Page Authority than its "sister" page. That might be a result of your internal linking structure giving more power to the ListFinder pages.
-
Great post Peter.
Here are some links of a product that is on both sites. Hopefully this will help you provide some more insight.
http://www.consumerbase.com/mailing-lists/shutterbugsphotography-enthusiasts-mailing-list.html
http://www.listfinder.com/mailing-lists/shutterbugsphotography-enthusiasts-mailing-list.html
The ListFinder pages are currently mostly indexed (70k out of 80k), which makes me think they are different enough from one another not to warrant a penalty.
The ConsumerBase pages started indexing well when we added the rel canonical code to LF (went from about 2k pages to 30k in early December), but since 1/2/2013 we have seen a drop-off in indexed pages, down to about 5k.
Thanks!
-
With products, it's a bit hard to say. Cross-domain canonical could work, but Google can be a bit finicky about it. Are you seeing the pages on both sides in the Google index, or just one or the other? Sorry, it's a bit hard to diagnose without seeing a sample URL.
If this were more traditional syndicated content, you could set a cross-domain canonical and link the copy back to the source. That would provide an additional signal of which site should get credit. With your case, though, I haven't seen a good example of that - I don't think it would be harmful, though (to add the link, that is).
If you're talking about 80K links, then you've got 80K+ near-duplicate product pages. Unfortunately, it could go beyond just having one or the other version get filtered out. This could trigger a Panda or Panda-like penalty against the site in general. The cross-domain canonical should help prevent this, whereas the links probably won't. I do think it's smart to be proactive, though.
Worst case, you could META NOINDEX the product pages on one site - they'd still be available to users, but wouldn't rank. I think the cross-domain canonical is probably preferable here, but if you ran into trouble, META NOINDEX would be the more severe approach (and could help solve that trouble).
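For reference, that fallback is just a meta robots tag in the head of each product page on the de-emphasized site; "noindex, follow" (rather than "noindex, nofollow") is the common choice so that crawlers can still follow the page's links:

```html
<head>
  <!-- Keeps the page out of the index but lets crawlers follow its links -->
  <meta name="robots" content="noindex, follow" />
</head>
```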
-
Yes, sir - that would be correct.
www.consumerbase.com and www.listfinder.com.
The sites are not 100% identical, just the content on the product pages.
-
Are these two sites on the same root domain? It seems like most of the feedback you're getting is from people who assume they are; however, it sounds to me like these are two separate domains.
-
Zora,
Google honors cross-domain canonical as long as the pages have substantially similar content.
It is not necessary to add a hyperlink pointing to the canonical page. If your sites are crawler-friendly, the canonical hints should change search results fairly quickly.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
Make sure that Google doesn't report any issues with your Sitemaps. If you add products frequently, submit an updated Sitemap on the same schedule.
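As a sketch, a Sitemap entry for a single product page might look like the following (the URL path and lastmod date are illustrative); resubmit the file through Webmaster Tools whenever it changes:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.consumerbase.com/mailing-lists/example-product-mailing-list.html</loc>
    <lastmod>2013-01-15</lastmod>
  </url>
</urlset>
```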
All the best.
-
Sorry, I'm not understanding why you need a rel=canonical here if the sites are two different sites.
What is your end goal?
-
We chose rel canonical because we still want users to be able to visit and navigate through site 2.
They are both e-commerce sites with similar products, not exactly identical sites.
-
Zora, totally understand, but my input (and what the majority of people do) is to redirect the traffic.
A server-side .htaccess 301 redirect is your BEST choice here.
Why don't you want to use a 301 and prefer a rel=canonical? Curious what your take is on this.
And thanks for the rel=canonical update info; I didn't know that.
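For illustration only (since Zora prefers to keep both sites browsable), a blanket 301 in .htaccess would look something like this; it assumes Apache with mod_rewrite enabled and identical URL paths on both domains, which may not hold here:

```apache
RewriteEngine On
# Send every request for listfinder.com to the same path on consumerbase.com
RewriteCond %{HTTP_HOST} ^(www\.)?listfinder\.com$ [NC]
RewriteRule ^(.*)$ http://www.consumerbase.com/$1 [R=301,L]
```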
-
Thanks for the info Hampig, I'll definitely take a look.
Rel canonical actually works cross-domain now; Google updated it after it originally came out.
-
Zora, hope you are doing well.
I came across this video a few weeks ago. I think this feature is supposed to be found under Webmaster Tools; although I haven't used it, I think it might be the best solution for getting Google's attention to portions of the pages and what they're supposed to be:
http://www.youtube.com/watch?v=WrEJds3QeTw
OK, but I'm a bit confused. Do you have two different domains, or two versions of the same domain?
Because from the sound of it you have two different domains, and in that case I'd think rel=canonical won't work and you would have to do a 301 redirect. Even for my own sites, when I change pages around I use 301 redirects within the same existing site.