Duplicating content from manufacturer for client site and using canonical reference.
-
We manage content for many clients in the same industry, and many of them understandably wish to keep their customers on their own individualized websites. To do this, we have duplicated content in part from the manufacturers' pages for several "models" on the clients' sites. We have put a canonical reference at the start of the content pointing back to the manufacturer's page from which we duplicated the content. We have only done a handful of pages while we work out any potential issues with the canonical reference.
So, my questions are:
- Is this necessary?
- Does this hurt, help or not do anything SEO-wise for our ranking of the site?
Thanks!
-
Thank you all for your information. It is very insightful and will help us move towards the correct decisions.
-
Hello,
Laura and EGOL really nailed it, as they both usually do!
By using a canonical tag you have basically told the search engines, "Hey, this content all belongs to X."
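For anyone following along, a cross-domain canonical is just a link element in the `<head>` of the duplicated page pointing at the original URL. A minimal sketch (the domains and paths here are made up for illustration):

```html
<!-- On the client page that duplicates the manufacturer's copy, -->
<!-- e.g. https://clientsite.example/models/widget-3000 -->
<head>
  <link rel="canonical" href="https://manufacturer.example/models/widget-3000">
</head>
```

This tells the engines to consolidate indexing and ranking signals onto the manufacturer's URL rather than the client's copy.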
What I would suggest is to use the manufacturer's description in conjunction with the site owner's own description. There is absolutely nothing wrong with using a manufacturer's description, but you have to own it, which means unique content for every client for every product. Amazon, for example, uses manufacturers' descriptions, but they also usually add a slew of other things to a page to make it theirs: Manufacturer Description / Amazon Description / Technical Information / User Reviews / User Questions / User Images / Shipping Information.
And here is the crux of the matter, people don't want to buy things from companies who know nothing about what they are selling. If a site can't add some sort of information or opinion about what the product is and why it is worth buying, they honestly have no business trying to sell such a product.
Just my thoughts along with the other 2 great answers,
Don
-
If I were the manufacturer, I would be jumping for joy. Every page on your clients' sites with rel=canonical on it is passing the SEO value of the page to my website. This is going to make me buckets of money and make my website really hard to beat. I might send each of your clients a fruit cake!
Your clients need to know that writing original, substantive content for each product description is the minimum investment needed to be visible in the SERPs. What you have done will prevent them from getting a Panda problem for duplicate content and keep the client pages from being filtered out of the SERPs; however, it will not make them any money.
Your options are:

1. Get a cheap writer to compose minimal content (assuming they don't cheat and copy/paste or spin the content from another website). This might bring in a tiny amount of traffic, but if the content is really thin you will have a Panda problem for a low-quality page.

2. Get a better writer to write average-quality, better-than-minimal content. This will avoid Panda problems and rank better than choice #1.

3. Get a decent writer to write substantive, quality content. This will avoid Panda problems and rank better than choice #2.

4. Get a good writer and a photographer to prepare superior content with a few generously sized, attractive photos with nice captions. This will rank better than #3 and pull in traffic for more long-tail keywords that appear in the product description and captions.
Even with #4, rankings will be largely determined by the quality of the optimization efforts that you put into your pages, the authority of your domain and the linkage going into each product page. Doing #4 for a weak website is probably a waste of money, but if your website is of average authority in your niche then #4 could be a good investment. If you have a strong site in your niche then #4 will be a kickass investment.
We do #4 on almost all of our product pages. Money maker products get better than #4 often with video. Best products have articles on separate pages that explain how to use the product, how to fix it, how to select, how to enjoy, etc.
Your reward is usually proportional to your investment, again, as long as your domain has enough authority to take advantage of the content investment.
But... most websites only make money for the hosting company and the developer, because the investments are inadequate to become competitive (or the wrong investments are made).
-
Adding a cross-domain canonical tag like this is fine assuming you are doing it for customer service (and the manufacturer doesn't mind you copying content from their site). You won't see any SEO benefit from the content on those pages because they are unlikely to be indexed. On the other hand, it wouldn't hurt your site either.
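If you want to audit the handful of pages you have already tagged, one quick check is to parse each page's HTML and see whether it declares a canonical pointing at a different host. A rough sketch using only the Python standard library (the function names and example domains are my own, not from any SEO tool):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            d = dict(attrs)
            if d.get("rel", "").lower() == "canonical" and d.get("href"):
                self.canonicals.append(d["href"])

def cross_domain_canonical(page_url, html):
    """Return the canonical URL if it points at a different host, else None."""
    finder = CanonicalFinder()
    finder.feed(html)
    page_host = urlparse(page_url).netloc
    for href in finder.canonicals:
        host = urlparse(href).netloc
        if host and host != page_host:
            return href
    return None

# Hypothetical example: a client page canonicalized to the manufacturer.
sample = '<head><link rel="canonical" href="https://manufacturer.example/model-x"></head>'
print(cross_domain_canonical("https://clientsite.example/model-x", sample))
# -> https://manufacturer.example/model-x
```

Run against the live pages (fetched however you like), this confirms each duplicated page actually points at the manufacturer and not back at itself.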