Why are pages still showing in SERPs, despite being NOINDEXed for months?
-
We have thousands of pages we've been trying to get de-indexed from Google for months now. They've all got <meta name="robots" content="none">. But they simply will not go away in the SERPs.
Here is just one example....
http://bitly.com/VutCFi
If you search this URL in Google, you will see that it is indexed, yet it's had the content="none" meta tag for many months. This is just one example of thousands of pages that will not get de-indexed. Am I missing something here? Does it have to do with using content="none" instead of content="noindex, follow"?
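For reference, content="none" is shorthand that combines both restrictive directives, so the tag on those pages:

    <meta name="robots" content="none">

is equivalent to:

    <meta name="robots" content="noindex, nofollow">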
Any help is very much appreciated.
-
Thanks for your reply,
Let me know if you're able to deindex those pages - I'll wait. Also, please share what you implemented to deindex them.
-
A page can have a link pointing to it and still not be indexed, so I disagree with you on that.
But thanks for using the domain name. That will teach me to use a URL shortener...
-
Hm, that is interesting. So you're saying that it will get crawled, and thus will eventually become deindexed (as noindex is part of the content="none" directive), but since it's a dead end page, it just takes an extra long time for that particular page to get crawled?
-
Just to add to the other answers: you can also remove the URLs (or an entire directory if necessary) via the URL removal tool in Webmaster Tools, although Google prefers that you reserve it for emergencies (I've had no problems with it).
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=164734
-
No - nofollow only tells the bot that the page is a dead end, i.e. that the bot should not follow any links on the page. That means any links on those pages won't be visited by the bot, which slows the overall crawling process for those pages.
If you block a page in robots.txt and the page is already in the index, it will remain in the index: the noindex (or content="none") directive won't be seen by the bot, so the page won't be removed from the index - it just won't be visited anymore.
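To illustrate the conflict (the directory name here is hypothetical):

    # robots.txt - this blocks crawling, so the meta tag below is never fetched
    User-agent: *
    Disallow: /prints/

    <!-- on a page under /prints/ - invisible to the blocked bot -->
    <meta name="robots" content="noindex">

If the page was indexed before the block was added, it stays indexed; you have to let the bot back in so it can actually see the noindex.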
-
Ok, so nofollow stops the page from being read at all? I thought nofollow just means the links on the page will not be followed. Is meta nofollow essentially the same as blocking a page in robots.txt?
-
Hi Howard,
The page is in Google's index because you are still linking to it from your website. Here is the page that links to it:
http://www.2mcctv.com/product_print-productinfoVeiluxVS70CDNRDhtml.html
Because you are linking to that page, Google keeps it indexed - Google came to know about the "noindex" tag only after it had already indexed the page. Sorry for my bad English.
Lindsay has written an awesome post about it here:
http://www.seomoz.org/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts
After reading the blog post above, all my doubts about noindex, follow, and robots.txt were cleared up.
Thanks Lindsay
-
We always use the noindex code in our robots.txt file.
-
Hi,
In order to deindex, you should use noindex, follow - content="none" also means nofollow. You need follow for now so the bot can reach all of the other pages, see the noindex tag, and drop them from the index.
When you have all of them out of the index, you can set "none" back on.
This is the main reason the "none" value isn't widely used - it's easy to shoot yourself in the foot with it.
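A sketch of the two phases (illustrative only):

    <!-- Phase 1: until the pages have dropped out of the index -->
    <meta name="robots" content="noindex, follow">

    <!-- Phase 2 (optional): once they are deindexed -->
    <meta name="robots" content="none">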
On the other hand, you need to check whether Googlebot is actually reaching those pages:
-
first, check that you don't have any robots.txt restrictions blocking them
-
check your server logs to see when Googlebot last hit any of the pages - that will give you a good idea, and you can predict when it will come back (see the log-check sketch after this list).
If those pages are in the supplemental index, you may need to wait some time for Googlebot to revisit.
One last note: build XML sitemaps with all of those pages and submit them via WMT - that will definitely help get those pages in front of the firing squad, and it also lets you monitor them better (a minimal sitemap sketch is below).
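A minimal log-check sketch in Python (the log path and URL pattern are assumptions - adjust them to your server):

    # Print Googlebot hits on the pages in question from a standard access log
    log_path = "/var/log/nginx/access.log"   # hypothetical location
    needle = "product_print-"                # hypothetical URL pattern for the affected pages

    with open(log_path) as f:
        for line in f:
            if "Googlebot" in line and needle in line:
                print(line.strip())

And a minimal sitemap sketch to submit via WMT (the URL is a placeholder):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/page-to-deindex.html</loc>
      </url>
    </urlset>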
Hope it helps.
-