Noindex a large number of product pages on a webshop to counter Panda
-
A Dutch webshop with 10,000 product pages is experiencing lower rankings and reduced indexation. The problems started last October, shortly after the Panda and Penguin updates.
One of the problems diagnosed is a lack of unique content. Many of the product pages have no description, and some are variants of each other (color, size, etc.). So one solution could be to write unique descriptions and use rel=canonical to consolidate the color/size variations onto one product page.
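To illustrate what that consolidation could look like, here is a minimal sketch of a canonical link on a variant page pointing to the main product page (the URLs are made up for the example):

<!-- in the <head> of a variant page such as /product/widget-red or /product/widget-xl -->
<link rel="canonical" href="https://www.example.com/product/widget" />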
There is, however, no capacity to do this on short notice, so I'm wondering whether the following approach would be effective.
Exclude all product pages via noindex or robots.txt, in the same way you might do with internal search pages. The only pages left for indexation would be the homepage and 200-300 category pages. We would then write unique content for the category pages and work on their rankings. Once that works, the product pages would be rewritten and slowly re-included, category by category.
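For reference, the two exclusion mechanisms mentioned here are normally written as follows; the paths are illustrative, and this sketch only shows the syntax of each, not a recommendation of one over the other.

Page-level robots meta tag, placed in the <head> of each product page:
<meta name="robots" content="noindex,follow" />

Crawler-level robots.txt rule, blocking a product URL pattern:
User-agent: *
Disallow: /product/

The two behave differently: the meta tag still lets Googlebot fetch the page and follow its links while keeping it out of the index, whereas a robots.txt Disallow stops those URLs from being crawled at all (so a noindex tag on them would not even be seen).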
My worry is the loss of rankings for the product pages, although those rankings are minimal at the moment. My second worry is the large number of links on category pages pointing to product pages that will be excluded from Google. Thirdly, I'm wondering whether this works at all: using noindex on 10,000 product pages consumes crawl budget and dilutes the internal link structure.
What do you think?
-
I see. There's a pretty thorough discussion of a very similar situation here: http://moz.com/community/q/can-i-use-nofollow-tag-on-product-page-duplicated-content. Everett endorsed Monica's answer with, "... you might consider putting a Robots Noindex,Follow meta tag on the product pages. You'll need to rely on category pages for rankings in that case, which makes sense for a site like this." Monica's long-term solution was also to work on getting specific user-generated content onto as many product pages as possible. Cheers!
-
@Ryan, thanks for your answer. The PageRank flow is indeed one of the things I worry about when deindexing large parts of the site, especially since the category pages will be full of internal links to product pages that are excluded from indexation by robots.txt or a robots meta tag.
The problem I'm trying to solve, however, has nothing to do with PageRank sculpting. I suspect an algorithmic drop due to thin, duplicate and syndicated content, and the drop is sitewide. Assuming the drop is due to Panda, I suspect the percentage of low-quality pages needs to be reduced. Would outbound linking and better DA really be sufficient to counter a suspected Panda problem, or is it necessary to improve the quality of the 10,000 product pages? I would think the latter. Since there is no budget to do so, I wonder if it's possible to drop these low-quality pages from the index (but keep them on the website). Would this strengthen the remaining pages enough to bounce back up, assuming those remaining pages are of good quality, of course?
Since SEO is not the only factor to be taken into account, I'd rather not delete these pages from the website.
-
Matt Cutts speaks to part of what you're thinking about doing here: https://www.mattcutts.com/blog/pagerank-sculpting/, and it's important to note that it's nowhere near as effective as it used to be. What I would focus on more is the DA and the quality of referrals to your site. Secondly, linking out from your pages is actually a positive strength signal when done in the right way, per Cutts in the same article: "In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites." Perhaps your product pages could be strengthened further by this as well.