Noindex vs. page removal - Panda recovery
-
I'm wondering whether there is a consensus within the SEO community as to whether noindexing pages vs. actually removing them looks different from Google Panda's perspective. When removing poor-quality content, is noindexing pages less effective than physically removing them, i.e. 301ing or 404ing each page and removing the links to it from the site?
I presume that removing pages has a positive impact on the amount of link juice that gets to some of the remaining pages deeper into the site, but I also presume this doesn't have any direct impact on the Panda algorithm?
Thanks very much in advance for your thoughts and any corrections to my assumptions.
-
I think it can get pretty complicated, but a couple of observations:
(1) In my experience, NOINDEX does work - indexation is what Google cares about primarily. Eventually, you do need to trim the crawl paths, XML sitemaps, etc., but often it's best to wait until the content is de-indexed.
(2) From an SEO perspective (temporarily ignoring Panda), a 301 consolidates link juice - so, if a page has incoming links or traffic, that's generally the best way to go. If the page really has no value at all for search, either a 404 or NOINDEX should be ok (strictly from an SEO perspective). If the page is part of a path, then NOINDEX,FOLLOW could preserve the flow of link juice, whereas a 404 might cut it off (not to that page, but to the rest of the site and deeper pages).
(3) From a user perspective, 301, 404, and NOINDEX are very different. A 301 is a good alternative to pass someone to a more relevant or more current page (and replace an expired one), for example. If the page really has no value at all, then I think a 404 is better than NOINDEX, just in principle. A NOINDEX leaves the page lingering around, and sometimes it's better to trim your content completely.
So, the trick is balancing (2) and (3), and that's often not a one-size-fits-all solution. In other words, some groups of pages may have different needs than others.
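To make the mechanics in (2) and (3) concrete, here's a minimal sketch of the three options, assuming a Python/Flask app; the routes and target URLs are hypothetical:

```python
from flask import Flask, abort, redirect, make_response

app = Flask(__name__)

@app.route("/old-promo")
def old_promo():
    # Page has inbound links or traffic: consolidate value with a permanent redirect.
    return redirect("/current-promo", code=301)

@app.route("/expired-junk")
def expired_junk():
    # Page has no search or user value: trim it completely and let it 404.
    abort(404)

@app.route("/thin-but-useful")
def thin_but_useful():
    # Page stays live for users and keeps passing link equity (follow),
    # but is kept out of the index (noindex).
    html = ('<html><head><meta name="robots" content="noindex,follow"></head>'
            "<body>Thin page kept for users.</body></html>")
    return make_response(html)
```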
-
Agreed - my experience is that NOINDEX definitely can have a positive impact on index dilution and even Panda-level problems. Google is mostly interested in index removal.
Of course, you still need to fix internal link structures that might be causing bad URLs to roll out. Even a 404 doesn't remove a crawl path, and tons of them can cause crawler fatigue.
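A quick way to spot those crawl paths is to pull the internal links from a template or category page and flag any that now 404 — a rough sketch, assuming Python with the requests library and placeholder URLs:

```python
from urllib.parse import urljoin, urlparse
from html.parser import HTMLParser
import requests

class LinkCollector(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = "https://www.example.com/category/widgets"  # placeholder page to audit
html = requests.get(page, timeout=10).text
collector = LinkCollector()
collector.feed(html)

for href in collector.links:
    url = urljoin(page, href)
    if urlparse(url).netloc != urlparse(page).netloc:
        continue  # only audit internal links
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status == 404:
        print(f"Crawl path to a removed page: {page} -> {url}")
```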
-
I disagree with everyone. The reason Panda hit you is that you were ranking for low-quality pages that you were telling Google you wanted indexed and ranked.
When you
a) remove them from your XML sitemaps,
b) block them in robots.txt, and
c) add noindex,follow or noindex,nofollow to their meta robots tags,
you are removing them from Google's index and taking them out of the equation of good-quality vs. low-quality pages indexed on your site.
That is good enough. You can still have them return a 200 and be live on your site AND be included in your user navigation.
One example is user-generated pages, where users sign up and get their own URL, e.g. www.mysite.com/tom-jones. Those pages can be live but should not be indexed, because they usually have no content other than a name.
As long as you are telling Google "don't index these, I don't want them considered among the pages that show up in the index," you are fine keeping these pages live!
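One piece of that, keeping the noindexed URLs out of your XML sitemap, could be automated along these lines — a rough sketch with a made-up page list:

```python
from xml.sax.saxutils import escape

# Made-up page inventory: each entry is (URL, should it be indexed?).
pages = [
    ("https://www.mysite.com/", True),
    ("https://www.mysite.com/guides/widgets", True),
    ("https://www.mysite.com/tom-jones", False),  # thin user-generated profile
]

entries = "\n".join(
    f"  <url><loc>{escape(url)}</loc></url>"
    for url, indexable in pages
    if indexable  # noindexed pages stay live on the site but out of the sitemap
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```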
-
Thanks guys
-
I would agree that noindex is not as good as removing the content, but it can still work as long as there are no links or sitemaps that lead Google back to the low-quality content.
I worked on a site that was badly affected by Panda in 2011. I had some success by noindexing genuine duplicates (pages that looked very alike but did need to be there) and removing low-quality pages that were old and archived. I was left with about 60 genuine pages that needed to be indexed and rank well, so I had to pay a copywriter to rewrite all of those pages (originally we had the same affiliate copy on there as lots of other sites). It then took about three months for Google to lift, or at least reduce, the penalty and for our rankings to return to the top 10.
Tom is right that just noindexing is not enough. If pages are low quality or duplicates, keep them out of sitemaps and navigation so you don't link to them either. You'll also need redirects in case anyone else links to them. In my experience, Google will eventually drop them from the index, but it doesn't happen overnight.
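If you want to verify that every low-quality URL is handled one way or the other, a quick audit script helps — a rough sketch, assuming Python with requests; the URL list is made up and the meta-robots check is a simple regex:

```python
import re
import requests

# Placeholder list of the low-quality URLs you are cleaning up.
urls_to_check = [
    "https://www.example.com/thin-page-1",
    "https://www.example.com/old-affiliate-copy",
]

# Simple check; assumes the name attribute appears before content in the tag.
meta_robots = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)', re.I
)

for url in urls_to_check:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 308, 404, 410):
        print(f"{url}: handled by status {resp.status_code}")
        continue
    header = resp.headers.get("X-Robots-Tag", "")
    match = meta_robots.search(resp.text)
    meta = match.group(1) if match else ""
    if "noindex" in header.lower() or "noindex" in meta.lower():
        print(f"{url}: live but noindexed")
    else:
        print(f"{url}: WARNING - still live and indexable")
```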
Good luck!
-
Thanks Tom
Understand your points. The idea behind noindexing is that you're telling Google not to take any notice of the page.
I guess the question is whether that works:
- Not at all
- A little bit
- A lot
- Is as good as removing the content
I believe it's definitely not as good as actually removing the content, but not sure about the other three possibilities.
We did notice a small improvement in placement when we noindexed a large portion of the site and actually took several hundred other pages down. It's hard to say which of those two things caused the improvement.
We've heard of it working for others, which is why I'm asking...
Appreciate your quick response
Phil
-
I don't see how noindexing pages would help with regards to a Panda recovery if you're already penalised.
Once the penalty is in place, my understanding is that it will remain until all offending pages have been removed or changed to unique content. Therefore, noindexing would not work, particularly if the page is still accessible via an HTML/XML sitemap or the site navigation. Even then, I would presume that Google has the URL logged, and if it remained as-is, any penalty removal would not be forthcoming.
Noindexing pages that have duplicate content but haven't been penalised yet would probably prevent (or rather postpone) any penalty, although I'd still rather avoid the issue outright where possible. Once a penalty is in place, however, I'm pretty sure it will remain until the offending content is removed, even if it's noindexed.