Duplicate content on my domain
-
I have several pages on my domain that use the same content except for a changed paragraph or sentence. Do I need to create unique content, even though much of the information pertains to a feature and is related?
-
Whatever you do, don't memorize the content in your efforts to alter each variation. Leave your desk/laptop, have a cup of coffee/tea, then write the content down in your own words. That doesn't always work for me, but when it does I get good variations. Another thing I rarely do, but find amusing, is grabbing a cup of coffee and going to wherever my colleagues go for a smoke. While eavesdropping on their conversation, I try to replace the subject of their discussion with the subject of the page I'm stuck on. The results are often bizarre, but I usually come away with a few good expressions and synonyms.
-
There is a very good chance that some of these pages will be filtered from the search results. It might not happen, but your current content does carry that risk.
It is always best to produce unique content.
-
I would write as much unique content as possible. Even where the feature sets are the same, explain how they apply to each different product.
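If you want a rough sense of how similar two of these pages currently are before rewriting them, a quick similarity check can help you prioritise. The sketch below is a minimal illustration using Python's standard library; the two URLs are placeholders for your own pages, and the "risky" threshold is an arbitrary assumption, not a figure search engines publish.

```python
import difflib
import re
from urllib.request import urlopen

def visible_text(url):
    """Fetch a page and crudely strip tags to approximate its visible text."""
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    text = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)  # drop scripts/styles
    text = re.sub(r"(?s)<[^>]+>", " ", text)                  # drop remaining tags
    return re.sub(r"\s+", " ", text).strip()

# Placeholder URLs: swap in two of your near-duplicate feature pages.
page_a = visible_text("https://www.example.com/feature-page-1")
page_b = visible_text("https://www.example.com/feature-page-2")

ratio = difflib.SequenceMatcher(None, page_a, page_b).ratio()
print(f"Similarity: {ratio:.0%}")

# Arbitrary rule of thumb for prioritising rewrites, not an official threshold.
if ratio > 0.8:
    print("These pages are close enough that a rewrite should be a priority.")
```

Anything that scores very high is a good candidate for the full rewrite approach described above.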
Related Questions
-
Page title contents
In my page title, I have my product name. Is it beneficial to also include another keyword like "buy wedding dress online Australia", e.g. (page title) "Amelie wedding dress | buy wedding dress online Australia"? Or is it better to just use "Amelie wedding dress"?
On-Page Optimization | CostumeD
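Title length is one practical constraint when deciding whether to append the extra keyword phrase: the combined title still needs to display without truncation in the results. The sketch below compares the two candidates from the question against a rough 60-character guideline, which is only an approximation of typical display limits, not an official figure.

```python
# Rough check only: compare candidate page titles against an approximate
# 60-character display guideline (search engines truncate by pixel width,
# so this is a heuristic, not an exact rule).
CANDIDATES = [
    "Amelie wedding dress",
    "Amelie wedding dress | Buy wedding dress online Australia",
]

GUIDELINE = 60  # approximate character budget before truncation is likely

for title in CANDIDATES:
    verdict = "fits" if len(title) <= GUIDELINE else "likely truncated"
    print(f"{len(title):>3} chars ({verdict}): {title}")
```
-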
Duplicate ecommerce domains and canonical
Hi everybody! I'd like to discuss the SEO strategy I've planned for a client of mine and ask for help, because it's a serious case of duplicate content.

There is a main website (the business-model one) where he compares the cost of medicines across several pharmacies, to show the customer the cheapest shopping cart. But the purchase has to be made on another domain, within the selected pharmacy, because my country's law in Europe makes it compulsory to sell the medicines only on the pharmacy's own website. So my client has started to create domains, one for each pharmacy, where the only differences between them are some of the products, the business information of the pharmacy and the template's colour. All of them share the same product database.

My aim is to rank the comparison website (it contains all the products), not each pharmacy, so I've started to create different content for it. Should I place rel=canonical on the pharmacy domains pointing to the original one? For instance:
www.pharmacie1.com >> www.originaltorank.com
www.pharmacie2.com >> www.originaltorank.com
www.pharmacie1.com/product-10 >> www.originaltorank.com/product-10

I've already discussed the possibility of focusing all the content on only one website, but it's compulsory to have different domains in order to sell the medicines. By the way, I can't 301 redirect because these websites need to exist for the same reason (the law). He is creating 1-3 new domains every week, so obviously he has had a drop in his SEO traffic, and I have to solve this fast. Do you think the canonical will be the best solution? I don't want to noindex these domains because we're creating Google Local pages for each one so they can be found in their villages. Please, I'd appreciate any piece of advice. Thanks!
On-Page Optimization | Estherpuntu
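Cross-domain canonicals are a hint rather than a directive, so it is worth verifying that every pharmacy URL actually emits the tag you expect. Below is a minimal sketch of such a check, assuming the third-party requests and beautifulsoup4 packages are installed; the domain names are the ones used in the question and stand in for the real sites.

```python
# A rough verification script: for each pharmacy URL, confirm the page declares
# the matching URL on the comparison site as its canonical.
# Assumes `pip install requests beautifulsoup4`; domains are placeholders from the question.
import requests
from bs4 import BeautifulSoup

PAIRS = {
    "https://www.pharmacie1.com/": "https://www.originaltorank.com/",
    "https://www.pharmacie2.com/": "https://www.originaltorank.com/",
    "https://www.pharmacie1.com/product-10": "https://www.originaltorank.com/product-10",
}

for page, expected in PAIRS.items():
    html = requests.get(page, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    actual = tag.get("href") if tag else None
    status = "OK" if actual == expected else "MISMATCH"
    print(f"{status}: {page} -> {actual!r} (expected {expected!r})")
```

If the report shows mismatches, the template on those domains isn't emitting the tag, which would help explain why the duplicates keep surfacing.
-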
Can Robots.txt on Root Domain override a Robots.txt on a Sub Domain?
We currently have beta sites on sub-domains of our own domain. We have had issues where people forget to change the robots.txt, and these non-relevant beta sites get indexed by search engines (a nightmare). We are going to move all of these beta sites to a new domain whose root robots.txt disallows everything. If we then put a fully configured robots.txt on these sub-domains (ready to go live and open for crawling by the search engines), is there a way for the robots.txt on the root domain to override the robots.txt on these sub-domains? Apologies if this is unclear. I know we can handle this relatively easily by changing the robots.txt on the sub-domain when it goes live, but due to a few instances where people have forgotten, I want to reduce the chance of human error! Cheers, Dave.
On-Page Optimization | davelane.verve
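Crawlers resolve robots.txt per host, so a file at the root domain never overrides the one served by a sub-domain: beta.example.com is governed only by beta.example.com/robots.txt. The sketch below illustrates that behaviour with Python's standard-library robotparser, using made-up host names and rules as placeholders.

```python
# Each host is governed only by its own robots.txt; there is no inheritance
# or override from the root domain. Host names and rules here are illustrative.
from urllib.robotparser import RobotFileParser

# Rules you might serve at https://example.com/robots.txt (root domain).
root_rules = ["User-agent: *", "Disallow: /"]

# Rules you might serve at https://beta.example.com/robots.txt (sub-domain).
beta_rules = ["User-agent: *", "Allow: /"]

root = RobotFileParser()
root.parse(root_rules)

beta = RobotFileParser()
beta.parse(beta_rules)

# A crawler visiting the sub-domain consults only the sub-domain's file,
# so the root domain's blanket Disallow has no effect on it.
print(root.can_fetch("*", "https://example.com/private/"))        # False
print(beta.can_fetch("*", "https://beta.example.com/some-page"))  # True
```

In practice, this means the protection has to live in each beta sub-domain's own robots.txt (or, more robustly, behind authentication), which is why forgetting to change that one file keeps causing indexing accidents.
-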
Product Attribute pages and Duplicate content
Hiya, I have two queries about a jewellery shop running on WordPress and WooCommerce.

1. I am a little indecisive about how to index the product categories without creating duplicate pages that will get me into trouble. For example, all earrings are listed on the category page chainsofgold.co.uk/buy/earrings/. We also have product attribute pages which list the subcategories of earrings:
chainsofgold.co.uk/earrings/creoles/
chainsofgold.co.uk/earrings/drop/
chainsofgold.co.uk/earrings/studs/
I have the category URL and the product attribute URLs set to be indexed in my sitemaps. Will this get me into trouble by creating duplicate content with the main category page? Should I only have the main category indexed and "noindex, follow" all the product attribute pages?

2. I am also thinking about incorporating these product attribute URLs into my menu, so when people hover over "earrings" they are shown the types of earrings they can buy. However, I have the WooCommerce faceted navigation working on the category pages, so if someone visits chainsofgold.co.uk/buy/earrings/ they can click on the left-hand side and select "drops". The URL they get, though, is one that is not indexed: http://www.chainsofgold.co.uk/buy/earrings/?filter_earrings=123. Can I link to those product attribute pages without the risk of being accused of creating duplicate content? Thank you for your help. Carolina
On-Page Optimization | bongoheads
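One common pattern here (a judgement call, not the only valid setup) is to keep the category page and the curated attribute landing pages indexable, and to mark the faceted filter URLs (the ?filter_earrings=... variants) "noindex, follow". The sketch below simply expresses that decision as a small rule, using the URLs from the question; the function name and the rule itself are assumptions for illustration.

```python
# Illustrative only: decide which robots meta directive a URL would get under
# the "index curated landing pages, noindex faceted filters" approach.
from urllib.parse import urlparse, parse_qs

def robots_directive(url: str) -> str:
    parts = urlparse(url)
    query = parse_qs(parts.query)
    # Faceted-navigation URLs (e.g. ?filter_earrings=123) stay crawlable but not indexable.
    if any(key.startswith("filter_") for key in query):
        return "noindex, follow"
    # Category and attribute landing pages stay indexable.
    return "index, follow"

for url in [
    "http://www.chainsofgold.co.uk/buy/earrings/",
    "http://www.chainsofgold.co.uk/earrings/studs/",
    "http://www.chainsofgold.co.uk/buy/earrings/?filter_earrings=123",
]:
    print(robots_directive(url), "->", url)
```
-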
Duplicate Content Again
Hello good people. I know this is another post about duplicate content (boring), but I am going crazy with this. The SEOmoz crawl and other tools tell me that I have duplicate content between the site root and index.html. The site is www.sisic-product.com. The server is IIS, so I cannot use .htaccess. Please help... thanks!
On-Page Optimization | Makumbala
-
Duplicate content list by SEOMOZ
Hi friends, I am seeing a lot of duplicates (about 10%) in the SEOmoz crawl report. The report says "Duplicate Page Content", but the URLs it lists have different titles, different URLs and also different content. I am not sure how to fix this issue. My site has both Indian cinema news and a photo gallery, and the problem mainly comes up in the photo gallery posts. For example, this is the main URL of a post: apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos. In this post, each image links to its enlarged version (default WordPress), and the problem shows up on each individual image page within the post. Examples from the SEOmoz report, three individual URLs flagged as duplicate content from the same post above:
http://apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos/poonam-kaur-hot-photo-shoot-stills-4
http://apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos/poonam-kaur-hot-photo-shoot-stills-3
http://apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos/poonam-kaur-hot-photo-shoot-stills-2
Somebody please advise me. I appreciate your help.
On-Page Optimization | ksnath
-
How to avoid duplicate content on ecommerce pages?
I am currently building the site architecture for a very large ecommerce site, and I am wondering how I should build it out if I have products that I want to include in multiple categories within my site. For example, let's say I sell fitness equipment and I have categories for things such as treadmills, exercise bikes, stair steppers, weight benches, etc. But then I also have specific brand category pages such as Precor, Life Fitness, Hammer and Body Solid. So my question is: how do I structure this so I am building it correctly? If I sell a Precor treadmill, I will want to include that product under the "Treadmill" category page as well as under the "Precor Equipment" category page. Can I get some advice on the best way to structure this? It's obviously something I want to avoid doing improperly at all costs, so I don't have to fix it later. Thank you, Jake
On-Page Optimization | PEnterprises
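A common way to handle products that live in several categories (one approach among several, not necessarily right for every catalogue) is to give each product a single canonical URL and treat the category and brand listings as alternate paths to it. The sketch below is only an illustration of that idea; the product data, URL scheme and helper functions are made up.

```python
# Illustrative only: one canonical URL per product, with category and brand
# pages acting as alternate listing paths that all point to the same product page.
from dataclasses import dataclass

@dataclass
class Product:
    slug: str
    brand: str
    category: str

def canonical_url(product: Product) -> str:
    # Hypothetical URL scheme: products live under a single /products/ path.
    return f"https://www.example-store.com/products/{product.slug}"

def listing_urls(product: Product) -> list[str]:
    # The same product is listed under both its category and its brand page,
    # but those listings link to (and canonicalise to) the single product URL.
    return [
        f"https://www.example-store.com/{product.category}/",
        f"https://www.example-store.com/brands/{product.brand}/",
    ]

treadmill = Product(slug="precor-treadmill", brand="precor", category="treadmills")
print(canonical_url(treadmill))
for url in listing_urls(treadmill):
    print("listed under:", url)
```

The design choice this illustrates: category and brand pages can both link to the product without creating competing product URLs, because only one URL ever carries the product content.
-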
How would you deal with blog TAG & CATEGORY listings that are marked as 'duplicate content' in SEOmoz campaign reports?
We're seeing "Duplicate Content" warnings / errors in some of our clients' sites for blog / event calendar tags and category listings. For example the link to http://www.aavawhistlerhotel.com/news/?category=1098 provides all event listings tagged to the category "Whistler Events". The Meta Title and Meta Description for the "Whistler Events" category is the same as another other category listing. We use Umbraco, a .NET CMS, and we're working on adding some custom programming within Umbraco to develop a unique Meta Title and Meta Description for each page using the tag and/or category and post date in each Meta field to make it more "unique". But my question is .... in the REAL WORLD will taking the time to create this programming really positively impact our overall site performance? I understand that while Google, BING, etc are constantly tweaking their algorithms as of now having duplicate content primarily means that this content won't get indexed and there won't be any really 'fatal' penalties for having this content on our site. If we don't find a way to generate unique Meta Titles and Meta Descriptions we could 'no-follow' these links (for tag and category pages) or just not use these within our blogs. I am confused about this. Any insight others have about this and recommendations on what action you would take is greatly appreciated.
On-Page Optimization | RoyMcClean
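Since the Umbraco code itself isn't shown here, the sketch below just illustrates the pattern being described: composing a meta title and description from the tag or category name plus the most recent post date so that each listing page gets its own values. It is a plain Python mock-up, not Umbraco/.NET code, and the site name, templates and length limits are assumptions.

```python
# Illustrative pattern only: build a unique meta title and description for a
# tag/category listing from its name and the latest post date. The templates,
# site name and length limits here are assumptions, not Umbraco specifics.
from datetime import date

SITE_NAME = "Aava Whistler Hotel"   # placeholder
TITLE_LIMIT = 60                    # rough character guideline, not an official limit
DESCRIPTION_LIMIT = 155             # rough character guideline, not an official limit

def meta_for_listing(category: str, latest_post: date) -> dict:
    month = latest_post.strftime("%B %Y")
    title = f"{category} - Updated {month} | {SITE_NAME}"
    description = (
        f"Browse the latest {category.lower()} from {SITE_NAME}, "
        f"most recently updated in {month}."
    )
    return {
        "title": title[:TITLE_LIMIT],
        "description": description[:DESCRIPTION_LIMIT],
    }

print(meta_for_listing("Whistler Events", date(2013, 2, 14)))
print(meta_for_listing("Hotel News", date(2013, 1, 30)))
```

Because the month changes as new posts arrive, two category listings only share meta values if they share both a name and an update month, which addresses the duplication the report is flagging.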