Old pages still in index
-
Hi Guys,
I've been working on an e-commerce site for a while now. Let me sum it up:
- February: the new site is launched
- Due to a lack of resources, we only started 301ing the old URLs in March
- Added rel=canonical at the end of May because of huge index numbers (the developers forgot!)
- Added noindex and robots.txt blocks to at least 1,000 URLs
- The indexed count has gone down from 105,000 to 55,000 so far, see screenshot (the actual number of URLs in the sitemap is 13,000)
Now when I do site:domain.com there are still old URLs in the index, even though those URLs have had a 301 in place since March!
I know this can take a while, but I wonder how I can speed it up, or whether I'm doing something wrong. I hope someone can help, because I simply don't understand how the old URLs can still be in the index.
-
Hi Dan,
Thanks for the answer!
Indexation is already down to 42,000, so it's slowly getting back to normal.
And thanks for the last tip, that's spot on. I just discovered that several pages were generating duplicate URLs, so by monitoring this continually we'll get it fixed!
-
Hi there,
There are a few methods to noindex or remove pages:
-
Use a meta noindex without a robots.txt block. I think the robots.txt block is why some of your pages haven't been removed: it stops crawling, so Googlebot can never see the noindex (there's a quick way to check a URL for this at the end of this answer).
-
Use a 301 redirect. This will eventually kill off the old pages, but it can definitely take a while.
-
Canonical it to another page. As Chris says, don't block the page or add extra directives on top. If you canonical the page (correctly), I find it usually drops out of the index fairly quickly after being crawled.
-
Use the URL removal tool in Webmaster Tools, combined with robots.txt or a 404. If you 404 a page or block it with robots.txt, you can then go into Webmaster Tools and request a URL removal. This is NOT recommended in most normal cases, though, as Google prefers it be reserved for "emergencies".
The only method guaranteed to remove pages within a day or two is the URL removal tool.
Since the site is new, I would also examine it for anything that is causing additional pages to be generated and indexed. I see this a lot with ecommerce sites that have lots of pagination, facets, sorting, etc.; those can generate lots of extra URLs which end up indexed. The sketch below is one quick way to spot them.
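Here is that sketch. It just pulls the links off one category page and counts the query parameters they carry; it assumes Python with the requests and beautifulsoup4 libraries installed, and the start URL is only a placeholder, not anything from your site:

```python
# Rough sketch (not production code): pull the links off one category page
# and count the query parameters they carry. Faceted navigation, sorting and
# pagination usually show up here as parameters repeated across many links.
# Assumes the requests and beautifulsoup4 libraries; the start URL is a placeholder.
from collections import Counter
from urllib.parse import urljoin, urlparse, parse_qs

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/category/shoes/"  # placeholder

html = requests.get(START_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
site_host = urlparse(START_URL).netloc

param_counts = Counter()
examples = {}

for a in soup.find_all("a", href=True):
    url = urljoin(START_URL, a["href"])
    parsed = urlparse(url)
    # Only internal links that carry a query string are interesting here
    if parsed.netloc != site_host or not parsed.query:
        continue
    for param in parse_qs(parsed.query):
        param_counts[param] += 1
        examples.setdefault(param, url)

print("Query parameters linked from this page:")
for param, count in param_counts.most_common():
    print(f"  {param}: {count} links, e.g. {examples[param]}")
```

Any parameter that shows up on lots of links (sorting, filtering, pagination and the like) is worth checking against what is actually sitting in the index.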
Again, as Chris says, you want to be careful not to mix signals; the sketch just below is one rough way to audit a single URL for that. Hope this all helps!
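The audit sketch reports every signal on one URL at once, so a robots.txt block hiding a 301 or a noindex becomes obvious. Again it assumes Python with the requests and beautifulsoup4 libraries, and the URL is only a placeholder:

```python
# Rough sketch (not production code): report every indexing signal on one URL
# at once, so conflicts are easy to spot. Assumes the requests and
# beautifulsoup4 libraries; the URL is a placeholder.
from urllib.parse import urljoin, urlparse
from urllib.robotparser import RobotFileParser

import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/old-product-url/"  # placeholder

# 1. Would robots.txt stop Googlebot from requesting this URL at all?
root = "{0.scheme}://{0.netloc}".format(urlparse(URL))
robots = RobotFileParser(urljoin(root, "/robots.txt"))
robots.read()
blocked = not robots.can_fetch("Googlebot", URL)
print("Blocked by robots.txt:", blocked)

# 2. What does the URL itself return? (no auto-follow, so a 301 stays visible)
resp = requests.get(URL, allow_redirects=False, timeout=10)
print("Status code:  ", resp.status_code)
print("Redirects to: ", resp.headers.get("Location", "-"))
print("X-Robots-Tag: ", resp.headers.get("X-Robots-Tag", "-"))

# 3. If it still serves HTML, what do meta robots and the canonical say?
if resp.status_code == 200:
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    canonical = soup.find("link", rel="canonical")
    print("Meta robots:  ", meta.get("content") if meta else "-")
    print("Canonical:    ", canonical.get("href") if canonical else "-")

if blocked:
    print("Warning: crawling is blocked, so Googlebot may never see "
          "the redirect or noindex reported above.")
```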
-Dan
-
-
Hi Chris,
Thanks for your answer.
I'm either using a 301 or noindex, not both of course.
Still have to check the server logs, thanks for that!
Another weird thing: while the old URL is still in the index, when I check the cache date it's only a week old. That's what I don't get: the cache is only a week old, yet Google still has the old URL in the index.
-
It can take months for pages to fall out of Google's index. Have you looked at your log files to verify that Googlebot is actually crawling those pages? (There is a quick log-check sketch after the list below.) Things to keep in mind:
- If you 301 a page, the rel=canonical on that page will not be seen by the bot (no biggie in your case)
- If you 301 a page, a meta noindex will not be seen by the bot
- It is suggested not to use robots.txt to block a page that is being 301 redirected, as the redirect may then never be seen by Google.
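Here is the kind of log check I mean: a rough Python sketch that assumes an Apache/nginx-style combined log format, with placeholder file names you would point at your own access log and a plain-text list of the old paths:

```python
# Rough sketch (not production code): scan an Apache/nginx-style access log
# for Googlebot requests to the old, redirected URLs and report the last hit
# and the status code that was served. File names are placeholders.
import re
from collections import defaultdict

LOG_FILE = "access.log"         # placeholder: your server's access log
OLD_URLS_FILE = "old_urls.txt"  # placeholder: one path per line, e.g. /old-page/

# Matches e.g.: [12/Jun/2014:10:15:32 +0000] "GET /old-page/ HTTP/1.1" 301
line_re = re.compile(
    r'\[(?P<date>[^\]]+)\] "(?:GET|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})'
)

with open(OLD_URLS_FILE) as f:
    old_paths = {line.strip() for line in f if line.strip()}

last_hit = defaultdict(lambda: ("never crawled", "-"))

with open(LOG_FILE) as f:
    for line in f:
        if "Googlebot" not in line:
            continue  # only interested in Googlebot requests
        m = line_re.search(line)
        if m and m.group("path") in old_paths:
            last_hit[m.group("path")] = (m.group("date"), m.group("status"))

for path in sorted(old_paths):
    date, status = last_hit[path]
    print(f"{path}  last Googlebot hit: {date}  status served: {status}")
```

If Googlebot has not requested an old URL since the redirect went in, it simply has not had a chance to see the 301 yet, which would explain why that URL is still sitting in the index.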
Related Questions
-
Why Would My Page Have a Higher PA and DA, Links & On-Page Grade & Still Not Rank?
The search term is "Alcohol Ink", and our client has better page authority, domain authority, links to the page, and on-page grade than those in positions 5-10 of the SERP, yet we're not even ranked in the top 51+ according to Moz's tracker. The only difference I can see is that our URL doesn't use the exact text the way some of those in positions 5-10 do. However, regardless of this, our on-page grade is significantly higher than the rest of them. The one thing I found was that there were two links to the page (that we never asked for), one with a spam score in the low 20s and another in the low 30s. Does anyone have any recommendations on how to get around this? Certainly, a content campaign and linking campaign around this could also help, but I'm kind of scratching my head. The client is reputable, with a solid domain age, and well recognized in the space, so it's not like it's a noob trying to get in out of nowhere.
Intermediate & Advanced SEO | | Omnisye0 -
Can't support IE 7, 8, 9, 10. Can we redirect those users to another page that's optimized for those browsers, so that our site can work on modern browsers while still providing a destination for IE browsers?
Hi, Our site can't support IE 7, 8, 9, 10. Can we redirect those users to another page that's optimized for those browsers, so that our site can work on modern browsers while still providing a destination for IE browsers? Would there be an SEO penalty? Thanks!
Intermediate & Advanced SEO | | dspete0 -
What's the best way to noindex pages but still keep backlinks equity?
Hello everyone, Maybe it is a stupid question, but I'll ask the experts... What's the best way to noindex pages but still keep the backlink equity from those noindexed pages? For example, let's say I have many pages that look similar to a "main" page which I solely want to appear on Google, so I want to noindex all pages with the exception of that "main" page... but what if I also want to transfer any link equity present on the noindexed pages to the main page? The only solution I have thought of is to add a canonical tag on those noindexed pages pointing to the main page... but will that work, or wreak havoc in some way?
Intermediate & Advanced SEO | | fablau3 -
22 Pages 7 Indexed
So I submitted my sitemap to Google twice this week. The first time everything was just peachy, but when I went back to do it again, Google only indexed 7 out of 22. The website is www.theinboundspot.com. My Moz campaign shows no issues and Google Webmaster shows none. Should I just resubmit it?
Intermediate & Advanced SEO | | theinboundspot1 -
Pages are being dropped from index after a few days - AngularJS site serving "_escaped_fragment_"
My URL is: https://plentific.com/ Hi guys, About us: We are running an AngularJS SPA for property search.
Being an SPA, an entirely JavaScript application, it has proven to be an SEO nightmare, as you can imagine.
We are currently implementing the _escaped_fragment_ approach and serving a pre-rendered version to bots using PhantomJS.
Unfortunately, pre-rendering of the pages takes some time and, even worse, on separate occasions the pre-rendering fails and the page appears to be empty. The problem: when I manually submit pages to Google using the Fetch as Google tool, they get indexed and actually rank quite well for a few days, and after that they just get dropped from the index.
Not lower in the rankings, but totally dropped.
Even the Google cache returns a 404. The questions: 1) Could this be because of serving an "_escaped_fragment_" version to the bots (keep in mind it is identical to the user-visible one)? 2) Could our use of an API to get our results lead to the pages being considered "duplicate content"? And shouldn't that just result in a lower SERP position instead of a drop? 3) Could this be a technical problem with how we serve the content, or does Google just not trust sites served this way? Thank you very much! Pavel Velinov, SEO at Plentific.com
Intermediate & Advanced SEO | | emre.kazan1 -
Google de-indexed a page on my site
I have a site which is around 9 months old. For most search terms we rank fine (including top-3 rankings for competitive terms). Recently one of our pages has been fluctuating wildly in the rankings and has now disappeared from them altogether for over a week. As a test I added a similar page to one of my other sites and it ranks fine. I've checked Webmaster Tools and there is nothing of note there. I'm not really sure what to do at this stage. Any advice would be much appreciated!
Intermediate & Advanced SEO | | deelo5550 -
Thousands of Web Pages Disappered from Google Index
The site is http://shop.riversideexports.com. We checked Webmaster Tools, nothing strange. Then we manually resubmitted using Webmaster Tools about a month ago. Now we are only seeing about 15 pages indexed. The rest of the sites on our network are heavily indexed and ranking really well, BUT the sites that are using a subdomain are not. Could this be a subdomain issue? If so, how? If not, what is causing this? Please advise. UPDATE: What we can also share is that the site was cleared twice in its lifetime (all pages deleted and re-generated). The first two times we had full indexing; now this site hovers at 15 results in the index. We have many other sites in the network that have very similar attributes (such as redundant or empty meta) and none have behaved this way. The broader question is: how do we get the indexing back?
Intermediate & Advanced SEO | | suredone0 -
Traffic drop off and page isn't indexed
In the last couple of weeks my impressions and clicks have dropped off to about half of what they used to be. I am wondering if Google is punishing me for something... I also added two new pages to my site in the first week of June and they still aren't indexed. In the past it seemed like new pages would be indexed in a couple of days. Is there any way to tell if Google is unhappy with my site? WMT shows 3 server errors, 3 access denied, and 122 not found errors. Could those not-found pages be killing me? Thanks for any advice, Greg www.AntiqueBanknotes.com
Intermediate & Advanced SEO | | Banknotes0