Will having duplicate content on four websites cause a problem?
-
A client of ours has four websites for different shops they run in the surrounding area. Each website has original content as well as duplicate content. This is for things like product advice, which needs to be the same across all four sites.
Will having duplicate content on these four websites cause a problem? How can it be mitigated? We can't refer the visitor to another website to get the product information, as this would break the user experience, and of course shopping cart sessions will not pass on.
-
Hi Tom,
Are they on the same server and linking to each other? If so, then just canonical them. I'm also assuming from what you said that it's just a few pages, so it shouldn't be any problem. That, or just edit the pages a wee bit.
If they are on different servers and practically can't be traced as being related to each other, then I wouldn't even worry about it. Just being practical.
-
Hi Tom
It might also be worth checking whether the client's e-commerce platforms will allow you to add tags into the head. Some ERP or cloud-based site builders don't, so it's worth checking from the get-go.
Bruce
-
Hi Tom
The short answer is that it probably won't be a problem. From what you are saying, the duplication is 'natural' in the sense that it is information you might normally expect to see duplicated, since it relates to similar products across multiple sites (also think privacy policy or terms and conditions pages). In this case it is unlikely to attract a penalty.
Matt Cutts covered that topic in this video (posted on Search Engine Land): Duplicate Content Won't Hurt You, Unless It's Spammy.
However, it will probably mean you are leaving it up to the search engine to decide which 'version' of your duplicate content it should prioritise and serve up to people searching. If it is not important to rank for the content on these duplicate pages, then again it is not really an issue.
However, if you want to play it safe, or aim to get rankings for one specific page among all the duplicate versions, you can use a rel=canonical tag to let the search engines know which page is the "original" so that they will prioritise it (i.e. point the link juice at a specific page). Matt Cutts talks about that in another video (although he discusses it in the context of a news article).
Check out the Moz article on Duplicate Content - it also has a short explanation of how to use rel=canonical.
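To make that concrete, a rel=canonical tag is a single line added to the head of each duplicate page, pointing at whichever version you want the search engines to treat as the original. A minimal sketch (the domain and path here are placeholders, not the client's actual URLs):

```html
<!-- Placed in the <head> of each duplicate product-advice page, on every
     site that carries the shared copy. The href points at the one version
     you want indexed and ranked (example URL only). -->
<link rel="canonical" href="https://www.example-shop.com/product-advice/widget-care" />
```

The page carrying the tag stays live for visitors and shopping-cart sessions; the tag only tells search engines where to consolidate ranking signals.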
All the best
Neil
-
Use the canonical tag: it will let you keep the pages (and visitors) where they are, but tell Google which page is the original. As for harm, it depends on how many pages are affected. I wouldn't expect it to cause much damage, but the tag is so easy to add that you might as well do it. A quick heads-up, though: by putting the tag on the pages, only one page will rank and the others will not, so be aware of that.
Hope that helps.