No-index pages with duplicate content?
-
Hello,
I have an e-commerce website selling about 20,000 different products. For the most popular of those products, I created unique, high-quality content. The content was written by a professional player who describes how and why those products are useful, which is of huge interest to buyers.
It would cost too much to write that high-quality content for 20,000 different products, but we still have to sell them. Therefore, our idea was to no-index the products that only have the same copy-paste descriptions every other website has.
Do you think it's better to do that or to just let everything indexed normally since we might get search traffic from those pages?
Thanks a lot for your help!
-
We recommend that such clients apply the robots noindex,follow meta tag to the duplicated pages until they are rewritten. We aim for 20% of all products on the site to have completely unique, indexable content. The other 80% can be rewritten gradually over time and released back into the index as they are rewritten.
So to answer your question: yes, I think your plan is perfectly acceptable, and it is what I would do myself in the same situation.
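For reference, the tag in question is a one-line addition to the `<head>` of each duplicated product page (the exact template hook depends on your platform):

```html
<!-- Keeps the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex,follow">
```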
-
Duplicate content is not a penalty; it's a filter. Deindexing ensures those pages never rank. Leave them indexed and they at least have a chance of ranking; the worst-case scenario is that they don't rank well because of the duplication.
-
I think Devanur gives some good advice regarding the gradual improvement of the content, though you're stuck in a bit of a catch-22 with regard to how Google views websites: you want to be able to sell lots of products, but you don't have the resources for your company to present them in a unique or engaging fashion. That is something Google wants webmasters to do, but the reality of your situation paints a completely different picture of what will give your company a decent ROI for updating vast amounts of product content.
If there isn't an obvious Panda problem, I wouldn't noindex lots of pages without some thought and planning first. Before noindexing the pages, I would look at what SEO traffic they're getting. Noindexing alone seems like a tried and tested method of bypassing potential Panda penalties, and although PageRank will still be passed, there's a chance that you will remove pages from the index that are driving traffic (even if it's long tail).
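One way to operationalise that check is to pull an organic-landing-page report and bucket the duplicate pages by traffic before deciding. A minimal sketch, in which the URLs, session counts, and threshold are all assumptions (in practice the rows would come from your analytics export):

```python
# Hypothetical sketch: bucket duplicate product pages by organic traffic before
# deciding which ones to noindex.
def split_by_traffic(rows, threshold=10):
    safe, check_first = [], []
    for url, sessions in rows:
        # Pages below the threshold lose little by being noindexed; the rest
        # are driving (possibly long-tail) traffic and deserve a closer look.
        (safe if int(sessions) < threshold else check_first).append(url)
    return safe, check_first

rows = [("/product/a", "0"), ("/product/b", "42"), ("/product/c", "3")]
safe, check_first = split_by_traffic(rows)
print(safe)         # pages safe to noindex for now
print(check_first)  # pages to review before removing from the index
```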
In addition to prioritising content production for indexed pages per Devanur's advice, I would also do some keyword analysis and prioritise the production of new content for terms which people are actually searching for before they purchase.
There's a Moz discussion here which might help you: http://moz.com/community/q/noindex-vs-page-removal-panda-recovery.
Regards
George
@methodicalweb
-
Hi, the suggestion was not to get quality articles written that take an hour each. I meant changing the product descriptions that were copied and pasted with little variation, so that they no longer look like a copy-paste job.
Now, coming to the de-indexing part, let us look at a scenario:
Suppose I built a website to promote Amazon products through the Amazon Associates program. I populated its pages using the Amazon API through a plugin like WProbot or Protozon. In this case, the content is purely scraped from Amazon and other places. After a while, I realize that my site has not been performing well in the search engines because of the scraped content, but I haven't seen any penalty levied or manual action taken. As of now, I have about 3,000 pages in Google's index, and I want to tackle the duplicate content issue. This is what I would do to stay on the safe side of a possible future penalty like Panda:
1. First, make the top pages unique.
2. Add noindex to the rest of the duplicate content pages.
3. Keep making pages unique in phases, removing the noindex tag from the ones that have been updated with unique content.
4. Repeat the previous step until all the duplicate content pages on the website are fixed.
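The phased plan above amounts to a per-page flag that drives the robots meta tag. A minimal sketch, with hypothetical page records and field names:

```python
# Sketch of the phased plan: a per-page "unique" flag decides the robots tag.
def robots_tag(page):
    # Rewritten pages are released back into the index; duplicates stay out of
    # the index but still pass link equity via "follow".
    return "index,follow" if page["unique"] else "noindex,follow"

pages = [
    {"url": "/product/top-seller", "unique": True},
    {"url": "/product/long-tail", "unique": False},
]
for page in pages:
    print(page["url"], robots_tag(page))
```

As each batch of descriptions is rewritten, flipping the flag re-releases those pages into the index with no other template changes.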
It greatly depends on the level of content duplication and a few other things, so we will be able to suggest more if we can have a look at the website in question. You can send a private message if you want any of us to look at it.
-
Hello,
Like I said in my first post, this has already been done. I was asking a specific question.
On another topic: 300 quality pages of content is not possible in a month. We're talking about articles that take at least an hour each to write.
That being said, I'll ask my question again: once I have done, let's say, 750 pages of unique content, should I no-index the rest or not? Is there something better to do that doesn't involve writing content for 20,000 pages?
Thanks.
-
Very true, my friend. If you look at your top pages for the last 30 days, there probably won't be more than about 2,000. So you can make the content on these unique over a period of six months or a bit more, going at 300 per month. Trust me, this would be effort well spent.
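As a quick sanity check on that estimate, the arithmetic works out like this (the page and throughput figures are the rough ones from above):

```python
import math

# Roughly 2,000 top pages, rewritten at about 300 per month.
pages_to_rewrite = 2000
pages_per_month = 300
months_needed = math.ceil(pages_to_rewrite / pages_per_month)
print(months_needed)  # 7 months, i.e. "six months or a bit more"
```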
-
Hello,
I agree with you that it would be the best option, but like I said, writing content for 20,000 pages is not an option. Thanks for your answer!
-
Going off what Devanur said: giving your product pages unique content is the way to go, and this can include pictures, sizes, materials, and so on. I am in the rug business, and this is how we pull it off; it's also how RugsUSA does it. If you can't do that, I would do what Devanur suggested and change the descriptions of your top-selling products first.
All the best!
-
Hi,
While it's not recommended to have content on your pages that duplicates what is found elsewhere, de-indexing pages from Google is not a good thing either. If I were you, I would try to beef up these duplicate pages with a bit of unique content, or at least rewrite the existing content so that it becomes unique.
Please go ahead and initiate the task of rewriting the product descriptions in phases, starting with the ones that get the most traffic according to your web analytics data. Those are my two cents, my friend.
Best regards,
Devanur Rafi