NOINDEX content still showing in SERPS after 2 months
-
I have a website that was likely hit by Panda or some other algorithm change; the drop finally occurred in September 2011. In December my developer set the following meta tag on every page that does not have unique content:
name="robots" content="NOINDEX" />
It's been 2 months now and I feel I've been patient, but Google is still showing 10,000+ pages when I do a search for site:http://www.mydomain.com
I am looking for a quicker solution. Adding this many pages to robots.txt does not seem like a sound option, and the pages have already been removed from the sitemap (for about a month now). I am trying to determine which of the following options is best, or find a better one.
- 301 all the pages I want out of the index to a single URL based on the page type (location and product). The 301 worries me a bit because I'd have about 10,000 pages all redirecting to one or two URLs. However, I'd get some link juice to those pages, right?
- Issue an HTTP 404 code on all the pages I want out of the index. The 404 seems like the safest bet, but I wonder whether Google suddenly seeing 10,000+ 404 errors will have a negative impact on my site.
- Issue an HTTP 410 code on all the pages I want out of the index. I've never used the 410 code, and while most of those pages are never coming back, I will eventually bring a small percentage back online as I add fresh new content. This option scares me the most, but I'm interested to hear if anyone has ever used a 410 code.
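If the thin pages share a common URL pattern, the 410 option can be handled with a single server-level rule rather than by editing each page. A minimal Apache sketch, assuming (hypothetically) that the location and product pages live under /locations/ and /products/:

```apache
# .htaccess - return "410 Gone" for the thin pages
# /locations/ and /products/ are hypothetical paths; adjust to your URL structure
RewriteEngine On
RewriteRule ^(locations|products)/ - [G]
```

The [G] flag makes mod_rewrite respond with 410 Gone, which tells crawlers the pages were removed deliberately rather than being temporarily missing, and reversing it later is as simple as deleting the rule.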
Please advise and thanks for reading.
-
Just wanted to let you know that adding all the URLs I wanted removed to an XML sitemap worked. I submitted that sitemap to Webmaster Tools and listed it in robots.txt. On a "site:domain.com" query, indexed pages went from 20k+ down to 700 in a matter of days.
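For anyone trying the same approach: the removal sitemap is just a standard XML sitemap that lists only the noindexed URLs, so Google recrawls them quickly and sees the noindex tag. A sketch (the example.com URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- list only the URLs that carry the noindex tag -->
  <url><loc>http://www.example.com/locations/anytown.html</loc></url>
  <url><loc>http://www.example.com/products/old-widget.html</loc></url>
</urlset>
```

Then reference it in robots.txt with a line such as "Sitemap: http://www.example.com/removal-sitemap.xml" (the filename is illustrative) and submit it in Webmaster Tools.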
-
I could link to them, then, but what about creating a custom sitemap for just the content I want removed? Would that have the same effect?
-
If they are not linked to, then spiders will not find the noindex tag, and the pages could linger in the SERPs for months and months.
-
If all these pages are under one directory structure, then you have the option to remove the complete directory via the URL removal tool. See if that is feasible in your case.
-
I suppose I'll wait longer. Crawl rate over the last 90 days is a high of 3,285 and average of 550 with a low of 3 according to webmaster tools.
-
Yeah, the pages are low PR and are not linked to at all from the site. I've never heard of removing a page via Webmaster Tools. How do I do that? I also have several thousand to remove.
*edit: It looks like I have to remove them one at a time, which is not feasible in my case. Is there a faster way?
-
If you want a page out of the index fast, the best way is to do it through Webmaster Tools. It's easy, and the removal lasts for about six months. Then, if Google finds your page again, it will register the noindex and you should be fine.
As EGOL said, if it's a page that isn't crawled very often then it could be a LONG time before it gets deindexed.
-
I removed some pages from the index and used the same line of code...
name="robots" content="NOINDEX" />
My pages dropped from the index within 2 or 3 days - but this is a site that has very heavy spider activity.
If your site is not crawled very much, or these are low-PR pages (such as PR1 or PR2), it could take Google a while to revisit and act upon your noindex instructions - but two months seems a bit long.
Is your site being crawled vigorously? Look in webmaster tools to see if crawling declined abruptly when your rankings fell. Check there also for crawl problems.
If I owned your site and the PR of these pages is low, I would wait a while longer before doing anything. If my patience was wearing thin, I would do the 301 redirect, because that will transfer the link juice from those pages to the target URL of the redirect - however, you might wait quite a while to see the redirect take effect. That's why my first choice would be to wait longer.