Noindex vs. page removal - Panda recovery
-
I'm wondering whether there is a consensus within the SEO community on whether noindexing pages, as opposed to actually removing them, makes a difference from Google Panda's perspective. When removing poor-quality content, does noindexing have less value than physically removing the page, i.e. either 301ing or 404ing it and removing the internal links to it from the site?
I presume that removing pages has a positive impact on the amount of link juice that gets to some of the remaining pages deeper into the site, but I also presume this doesn't have any direct impact on the Panda algorithm?
Thanks very much in advance for your thoughts and any corrections to my assumptions.
-
I think it can get pretty complicated, but a couple of observations:
(1) In my experience, NOINDEX does work - indexation is what Google cares about primarily. Eventually, you do need to trim the crawl paths, XML sitemaps, etc., but often it's best to wait until the content is de-indexed.
(2) From an SEO perspective (temporarily ignoring Panda), a 301 consolidates link juice - so, if a page has incoming links or traffic, that's generally the best way to go. If the page really has no value at all for search, either a 404 or NOINDEX should be ok (strictly from an SEO perspective). If the page is part of a path, then NOINDEX,FOLLOW could preserve the flow of link juice, whereas a 404 might cut it off (not to that page, but to the rest of the site and deeper pages).
(3) From a user perspective, 301, 404, and NOINDEX are very different. A 301 is a good alternative to pass someone to a more relevant or more current page (and replace an expired one), for example. If the page really has no value at all, then I think a 404 is better than NOINDEX, just in principle. A NOINDEX leaves the page lingering around, and sometimes it's better to trim your content completely.
So, the trick is balancing (2) and (3), and that's often not a one-size-fits-all solution. In other words, some groups of pages may have different needs than others.
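To make the heuristic above concrete, here's a purely illustrative sketch of that decision logic in Python. The attribute names are invented for the example, and real decisions should be made per group of pages, not mechanically:

```python
# Illustrative only: a sketch of the 301 / 404 / NOINDEX heuristic
# described above. Attribute names are hypothetical, not from any API.

def recommend_action(has_inbound_links: bool,
                     has_user_value: bool,
                     is_on_crawl_path: bool) -> str:
    """Suggest a disposition for a low-quality page.

    - Pages with inbound links or traffic: 301 to consolidate link equity.
    - Pages on a crawl path to deeper content: NOINDEX,FOLLOW so link
      equity keeps flowing even though the page is de-indexed.
    - Pages users still need but search doesn't: plain NOINDEX.
    - Everything else: 404 and trim the content entirely.
    """
    if has_inbound_links:
        return "301"
    if is_on_crawl_path:
        return "noindex,follow"
    if has_user_value:
        return "noindex"  # keep it live for users, out of the index
    return "404"
```

For example, `recommend_action(False, False, True)` returns `"noindex,follow"` for a thin paginated page that still links down to deeper content.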
-
Agreed - my experience is that NOINDEX definitely can have a positive impact on index dilution and even Panda-level problems. Google is mostly interested in index removal.
Of course, you still need to fix internal link structures that might be causing bad URLs to roll out. Even a 404 doesn't remove a crawl path, and tons of them can cause crawler fatigue.
-
I disagree with everyone. The reason Panda hit you is that you were ranking for low-quality pages you were telling Google you wanted it to index and rank.
When you
a) remove them from sitemap.xmls
b) block them in robots.txt
c) noindex,follow or noindex, nofollow them in metas
you are removing them from Google's index and from the equation of good-quality vs. low-quality pages indexed on your site.
That is good enough. You can still have them return a 200 and be live on your site AND be included in your user navigation.
One example is user-generated pages, where users sign up and get their own URL, e.g. www.mysite.com/tom-jones. Those pages can be live but should not be indexed, because they usually have no content other than a name.
As long as you are telling Google "don't index these; I don't want them considered in the equation of pages to show in the index," you are fine keeping these pages live!
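For anyone implementing option (c) above, the directive is just a meta tag in the page's `<head>` (the `follow` variant keeps link equity flowing through the page even while it's de-indexed):

```html
<!-- De-index this page but let crawlers follow its links -->
<meta name="robots" content="noindex,follow">
```

One caveat worth knowing: options (b) and (c) are generally either/or for a given URL. If a URL is blocked in robots.txt, Google can't crawl the page at all, so it will never see an on-page noindex tag.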
-
Thanks guys
-
I would agree that noindex is not as good as removing the content, but it can still work as long as there are no links or sitemaps that lead Google back to the low-quality content.
I worked on a site that was badly affected by Panda in 2011. I had some success by noindexing genuine duplicates (pages that looked really alike but did need to be there) and removing low quality pages that were old and archived. I was left with about 60 genuine pages that needed to be indexed and rank well so I had to pay a copywriter to rewrite all those pages (originally we had the same affiliate copy on there as lots of other sites). That took about 3 months for Google to lift or at least reduce the penalty and our rankings to return to the top 10.
Tom is right that just noindexing is not enough. If pages are low quality or duplicates, then keep them out of sitemaps and navigation so you don't link to them either. You'll also need redirects in case anyone else links to them. In my experience, Google will eventually drop them from the index, but it doesn't happen overnight.
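On the redirect point, a minimal sketch of what that looks like in an Apache .htaccess file (the paths and domain are invented for illustration; the syntax differs on nginx and other servers):

```apache
# 301 an old low-quality URL to the closest relevant live page,
# so any external links it has earned still pass value.
Redirect 301 /old-duplicate-page/ https://www.example.com/category/
```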
Good luck!
-
Thanks Tom
Understand your points. The idea behind noindexing is that you're telling Google not to take any notice of the page.
I guess the question is whether that works:
- Not at all
- A little bit
- A lot
- Is as good as removing the content
I believe it's definitely not as good as actually removing the content, but not sure about the other three possibilities.
We did notice a small improvement in placement when we noindexed a large portion of the site and actually took several hundred other pages down. It's hard to say which of those two things caused the improvement.
We've heard of it working for others, which is why I'm asking...
Appreciate your quick response
Phil
-
I don't see how noindexing pages would help with regards to a Panda recovery if you're already penalised.
Once the penalty is in place, my understanding is that it will remain so until all offending pages have been removed or changed to unique content. Therefore, noindexing would not work, particularly if the page is accessible via an HTML/XML sitemap or the site's navigation. Even then, I would presume that Google will have the URL logged, and if it remained as is, any penalty removal would not be forthcoming.
Noindexing pages that have duplicate content but haven't been penalised yet would probably prevent (or rather postpone) a penalty, although I'd still rather avoid the issue outright where possible. Once a penalty is in place, however, I'm pretty sure it will remain until the offending content is removed, even if noindexed.