NOINDEX content still showing in SERPs after 2 months
-
I have a website that was likely hit by Panda or some other algorithm change; the hit occurred in September 2011. In December, my developer set the following meta tag on all pages that do not have unique content:
name="robots" content="NOINDEX" />
It's been 2 months now and I feel I've been patient, but Google is still showing 10,000+ pages when I do a search for site:http://www.mydomain.com
I am looking for a quicker solution. Adding this many pages to robots.txt does not seem like a sound option. The pages have been removed from the sitemap (for about a month now). I am trying to determine which of the following options is best, or whether there are better ones.
- 301 all the pages I want out of the index to a single URL based on the page type (location and product). The 301 worries me a bit because I'd have about 10,000 pages all 301ing to one or two URLs. However, I'd get some link juice to that page, right?
- Issue an HTTP 404 code on all the pages I want out of the index. The 404 seems like the safest bet, but I wonder whether Google suddenly seeing 10,000+ 404 errors would have a negative impact on my site.
- Issue an HTTP 410 code on all the pages I want out of the index. I've never used the 410 code, and while most of those pages are never coming back, I will eventually bring a small percentage back online as I add fresh content. This one scares me the most, but I am interested to hear whether anyone has used a 410 code. (A rough .htaccess sketch of options 1 and 3 follows this list.)
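For reference, here is a rough .htaccess sketch of what options 1 and 3 might look like (this assumes Apache with mod_rewrite, and the /locations/, /products/ and /archive/ paths are placeholders rather than my real URLs):
# Sketch only - placeholder paths, Apache mod_rewrite assumed
RewriteEngine On
# Option 1: 301 thin location/product pages to one hub URL per page type
RewriteRule ^locations/.+$ /locations/ [R=301,L]
RewriteRule ^products/.+$ /products/ [R=301,L]
# Option 3: return 410 Gone for pages that will never come back
RewriteRule ^archive/.+$ - [G,L]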
Please advise and thanks for reading.
-
Just wanted to let you know that adding all the URLs I wanted removed to an XML sitemap worked. I then submitted that sitemap to Webmaster Tools and listed it in robots.txt. When doing a "site:domain.com" query, indexed pages went from 20k+ down to 700 in a matter of days.
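For anyone trying the same thing, a minimal sketch of what such a removal sitemap could look like (the file name and URLs here are placeholders, not my real ones):
<?xml version="1.0" encoding="UTF-8"?>
<!-- Removal sitemap: list only the noindexed URLs so Google recrawls them quickly -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.mydomain.com/locations/example-page-1</loc></url>
  <url><loc>http://www.mydomain.com/products/example-page-2</loc></url>
</urlset>
The reference in robots.txt is then a single line, e.g. Sitemap: http://www.mydomain.com/removal-sitemap.xml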
-
I could link to them, then, but what about creating a custom sitemap for just the content I want removed? Would that have the same effect?
-
If they are not linked to, then spiders will not find the noindex code, and the pages could linger in the SERPs for months and months.
-
If all these pages are under a directory structure, then you have the option to remove a complete directory via the URL removal option. See if that is feasible in your case.
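If you go that route, one common way to back up a directory-level removal request is to disallow the directory in robots.txt as well - a sketch, assuming the thin pages live under hypothetical /locations/ and /products/ directories:
# robots.txt sketch - directory names are placeholders
User-agent: *
Disallow: /locations/
Disallow: /products/
Keep in mind that once a directory is disallowed, Google will no longer crawl those pages to see the noindex tag, so this mainly makes sense alongside the removal request.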
-
I suppose I'll wait longer. According to Webmaster Tools, crawl rate over the last 90 days shows a high of 3,285, an average of 550, and a low of 3.
-
Yeah, the pages are low PR and are not linked to at all from the site. I've never heard of removing a page via Webmaster Tools. How do I do that? I also have to remove several thousand.
Edit: It looks like I have to remove them one at a time, which is not feasible in my case. Is there a faster way?
-
If you want a page out of the index fast, the best way is to do it through Webmaster Tools. It's easy, and the removal lasts for about six months. Then, if Google finds your page again, it will register the noindex and you should be fine.
As EGOL said, if it's a page that isn't crawled very often then it could be a LONG time before it gets deindexed.
-
I removed some pages from the index and used the same line of code...
name="robots" content="NOINDEX" />
My pages dropped from the index within 2 or 3 days - but this is a site that has very heavy spider activity.
If your site is not crawled very much, or these are low-PR pages (such as PR1 or PR2), it could take Google a while to revisit and act upon your noindex instructions - but two months seems a bit long.
Is your site being crawled vigorously? Look in Webmaster Tools to see if crawling declined abruptly when your rankings fell. Check there also for crawl problems.
If I owned your site and the PR of these pages were low, I would wait a while longer before doing anything. If my patience were wearing thin, I would do the 301 redirect, because that will transfer the link juice from those pages to the target URL of the redirect - however, you might wait quite a while to see the redirect take effect. That's why my first choice would be to wait longer.