No-index pages with duplicate content?
-
Hello,
I have an e-commerce website selling about 20 000 different products. For the most popular of those products, I created unique, high-quality content, written by a professional player who describes how and why those products are useful, which is of huge interest to buyers.
It would cost too much to write that high-quality content for 20 000 different products, but we still have to sell them. Our idea was therefore to no-index the products that only have the same copy-and-pasted descriptions every other website has.
Do you think it's better to do that or to just let everything indexed normally since we might get search traffic from those pages?
Thanks a lot for your help!
-
We recommend that clients in this situation apply the robots noindex,follow meta tag to the duplicated pages until they get rewritten. We aim for 20% of all products on the site to be completely unique in content, and indexable. The other 80% can be rewritten gradually over time and released back into the index as they are rewritten.
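For reference, the tag in question is a single line in each page's `<head>`; a sketch of how it might look on one of the duplicated product pages:

```html
<!-- In the <head> of each duplicated product page. -->
<!-- "noindex" keeps the page out of the index; "follow" still lets -->
<!-- crawlers follow (and pass equity through) the page's links. -->
<meta name="robots" content="noindex,follow">
```

Once a page's description has been rewritten, remove the tag and let the page be recrawled.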
So to answer your question: yes, I think your plan is perfectly acceptable, and it is what I would do myself in the same situation.
-
Duplicate content is not a penalty; it's a filter. De-indexing those pages ensures they never rank. Leave them indexed and they at least have a chance of ranking; the worst-case scenario is that they don't rank well because of the duplication.
-
I think Devanur gives some good advice regarding the gradual improvement of the content, though you're stuck in a bit of a catch-22 with regard to how Google views websites: you want to be able to sell lots of products, but don't have the resources for your company to present them in a unique or engaging fashion. This is something Google wants webmasters to do, but the reality of your situation paints a completely different picture of what will give your company a decent ROI for updating vast amounts of product content.
If there isn't an obvious Panda problem, I wouldn't noindex lots of pages without some thought and planning first. Before noindexing the pages, I would look at what SEO traffic they're getting. Noindexing alone seems like a tried-and-tested method of bypassing potential Panda penalties, and although PageRank will still be passed, there's a chance you will remove pages from the index that are driving traffic (even if it's long tail).
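As a rough illustration of that traffic check (the page paths, numbers, and threshold here are all hypothetical; the real data would come from your analytics export), you could flag which duplicate pages are actually earning organic visits before deciding to noindex them:

```python
# Sketch: split duplicate pages into "keep indexed" vs "safe to noindex"
# based on organic sessions from an analytics export (hypothetical data).
def split_by_traffic(pages, min_sessions=10):
    """Return (keep_indexed, safe_to_noindex) lists of page paths."""
    keep, noindex = [], []
    for path, sessions in pages.items():
        if sessions >= min_sessions:
            keep.append(path)      # earning long-tail traffic: leave indexed
        else:
            noindex.append(path)   # little to lose: candidate for noindex
    return sorted(keep), sorted(noindex)

organic = {"/products/rug-a": 42, "/products/rug-b": 0, "/products/rug-c": 3}
keep, candidates = split_by_traffic(organic)
print(keep)        # pages worth keeping indexed (and rewriting first)
print(candidates)  # pages that can be noindexed until rewritten
```

The threshold is a judgment call; the point is simply to look at the numbers before removing anything from the index.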
In addition to prioritising content production for indexed pages per Devanur's advice, I would also do some keyword analysis and prioritise the production of new content for terms which people are actually searching for before they purchase.
There's a Moz discussion here which might help you: http://moz.com/community/q/noindex-vs-page-removal-panda-recovery.
Regards
George
@methodicalweb
-
Hi, the suggestion was not to get quality articles written that take an hour each. I meant changing the product descriptions that were copied and pasted with little variation, so that they no longer look like a copy-and-paste job.
Now, coming to the de-indexing part, let us look at a scenario:
Suppose I built a website to promote Amazon products through the Amazon Associates program, and populated its pages using the Amazon API through a plugin like WProbot or Protozon. In this case, the content would be purely scraped from Amazon and other places. After a while, I realize that my site has not been performing well in the search engines because of the scraped content, but I haven't seen any penalty levied or manual action taken. As of now, I have about 3000 pages in Google's index, and I want to tackle the duplicate content issue. This is what I would do to stay on the safe side of a possible future penalty like Panda:
1. First, make the top pages unique.
2. Add noindex to the rest of the duplicate-content pages.
3. Keep making pages unique in phases, removing the noindex tag from the ones that have been updated with unique content.
4. Repeat the above step until all the duplicate-content pages on the website are fixed.
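The four steps above amount to a simple recurring loop. A minimal sketch of how the phased rewrite-and-release might be tracked (page paths, traffic figures, and batch size are all hypothetical):

```python
# Sketch of the phased plan: each cycle, rewrite the highest-traffic
# duplicate pages, then lift their noindex tag (hypothetical page data).
def run_phase(pages, batch_size):
    """Rewrite the `batch_size` highest-traffic duplicate pages in place."""
    dupes = [p for p in pages if not p["unique"]]
    dupes.sort(key=lambda p: p["traffic"], reverse=True)
    for page in dupes[:batch_size]:
        page["unique"] = True    # steps 1/3: content rewritten
        page["noindex"] = False  # step 3: released back into the index
    return pages

catalog = [
    {"path": "/p/1", "traffic": 500, "unique": False, "noindex": True},
    {"path": "/p/2", "traffic": 20,  "unique": False, "noindex": True},
    {"path": "/p/3", "traffic": 90,  "unique": False, "noindex": True},
]
run_phase(catalog, batch_size=2)  # one month's batch
still_noindexed = [p["path"] for p in catalog if p["noindex"]]
print(still_noindexed)  # only the lowest-traffic page is still noindexed
```

Each subsequent call to `run_phase` works through the remaining noindexed pages until none are left.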
It greatly depends on the level of content duplication and a few other things, so we will be able to give better suggestions if we can have a look at the website in question. You can send a private message if you want any of us to take a look at it.
-
Hello,
Like I said in my first post, this has already been done. I was asking a specific question.
On another topic, 300 quality pages of content in a month is not possible. We're talking about articles that take at least an hour each to write.
That being said, I'll ask my question again: once I have done, let's say, 750 pages of unique content, should I no-index the rest or not? Is there something better to do that doesn't involve writing content for 20 000 pages?
Thanks.
-
Very true, my friend. If you look at your top pages for the last 30 days, there probably won't be more than about 2000. So you can make the content unique on those over a period of six months or a bit more, going at 300 per month. Trust me, this would be effort well spent.
-
Hello,
I agree with you that it would be the best approach, but like I said, writing content for 20 000 pages is not an option. Thanks for your answer!
-
Going off of what Devanur said, giving your product pages unique content is the way to go, and that content can include pictures, sizes, materials, etc. I am in the rug business, and this is how we pull it off, and how RugsUSA does as well. If you can't do that, I would follow Devanur's advice and start by changing the descriptions of your top-selling products.
All the best!
-
Hi,
While it's not recommended to have duplicate content on your pages that is found elsewhere, it is also not a good thing to de-index pages from Google. If I were you, I would try to beef up these duplicate pages with a little unique content, or at least rewrite the existing descriptions so that they become unique.
Please go ahead and initiate the task of rewriting the product descriptions in phases, starting with the ones that get the most traffic according to your web analytics data. Those were my two cents, my friend.
Best regards,
Devanur Rafi