How to Stop Google from Indexing Old Pages
-
We moved from a .php site to a Java site on April 10th. It's almost two months later and Google continues to crawl old pages that no longer exist (225,430 Not Found errors, to be exact).
These pages no longer exist on the site, and there are no internal or external links pointing to them.
Google has crawled the site since the go-live, but it continues to try to crawl these pages.
What are my next steps?
-
All my clients are impatient with Google's crawl. I think the speed of life on the web has spoiled them. Assuming your site isn't a huge e-commerce or subject-matter site... you will get crawled, but not right away. Smaller, newer sites take time.
Put that energy toward link building to the new site so Google's crawlers find it faster (via their seed list). Get it up on DMOZ, get that Twitter account going, post videos to YouTube, etc. Some juicy high-PR inbound links could help speed up the indexing. Good luck!
-
Like Mike said above, there still isn't enough info provided for us to give you a very clear response, but I think he is right to point out that you shouldn't really care about the extinct pages in Google's index. They should, at some point, expire.
You can specify particular URLs to remove in GWT or in your robots.txt file, but that doesn't seem like the best option for you. My recommendation is to just prepare the new site in its new location, upload a good, clean sitemap.xml to GWT, and let them adjust. If much of the content is the same, Google will know from the page creation date which site is the newer and more appropriate one. I hate to say "trust the engines," but in this case, you should.
You may also consider a rel="author" tag on the new site to help Google prioritize it. But really, the best thing is a new site on a new domain, a nice sitemap.xml, and patience.
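For reference, a minimal sitemap.xml sketch (the domain, paths, and dates below are placeholders, not the poster's real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per live page on the new site; leave the retired .php URLs out entirely -->
  <url>
    <loc>http://www.example.com/widgets/blue-widget</loc>
    <lastmod>2013-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
    <lastmod>2013-06-01</lastmod>
  </url>
</urlset>
```

Keeping the old URLs out of the file means you aren't re-inviting Google to crawl them while you wait for them to expire.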
-
To further clear things up...
I can 301 every page from the old .php site to our new homepage (however, I'm concerned about Google's impression of our overall user experience).
Or
I can 410 every page from the old .php site (wouldn't this tell Google to stop trying to crawl these pages? The pages technically still exist, they just have a different URL and directory structure. There are too many to set up individual 301s, though).
Or
I can do nothing and wait for these pages to drop off Google's radar.
What is the best option?
-
After reading the further responses here I'm wondering something...
You switched to a new site, can't 301 the old pages, and have no control over the old domain... So why are you worried about pages 404ing on an unused site you don't control anymore?
Maybe I'm missing something here or not reading it right. Who does control the old domain then? Is the old domain just completely gone? Because if so, why would it matter that Google is crawling non-existent pages on a dead site and returning 404s and 500s? Why would that necessarily affect the new site?
Or is it the same site but you switched to Java from PHP? If so, wouldn't your CMS have a way of redirecting the old pages that are technically still part of your site to the newer relevant pages on the site?
I feel like I'm missing pertinent info that might make this easier to digest and offer up help.
-
Sean,
Many thanks for your response. We have submitted a new, fresh sitemap to Google, but it seems like it's taking them forever to digest the changes.
We've been keeping track of rankings, and they've been going down, but there are so many changes going on at once with the new site, it's hard to tell what is the primary factor for the decline.
Is there a way to send Google all of the pages that don't exist and tell them to stop looking for them?
Thanks again for your help!
-
You would need access to the domain to set up the 301. If you can no longer edit files on the old domain, then your best bet is to update Webmaster Tools with the new site info and a sitemap.xml and wait for their caches to expire and update.
Somebody can correct me on this if I'm wrong, but getting so many 404s and 500s has probably already impacted your rankings so significantly that you may be best served by approaching the whole effort as a new site. Again, without more data, I'm left making educated guesses here. And if you aren't tracking your rankings (you asked how much it is impacting them; you should be able to see), then I would let go of the old site completely and build search traffic fresh on the new domain. You'd probably generate better results in the long term by jettisoning a defunct site with so many errors.
I confess, without being able to dig into the site analytics and traffic data, I can't give direct tactical advice. However, the above is what I would certainly do. Resubmitting a fresh sitemap.xml to GWT and deleting all of the old site's info in there is probably your best option. I defer to anyone with better advice. What a tough position you are in!
-
Thanks all for the feedback.
We no longer have access to the old domain. How do we institute a 301 if we can no longer access the page?
We have over 200,000 pages throwing 404s and over 70,000 pages throwing 500 errors.
This probably doesn't look good to Google. How much is this impacting our rankings?
-
Like others have said, a 301 redirect and updating Webmaster Tools should be most of what you need to do. You didn't say if you still have access to the old domain (where the pages are still being crawled) or if you get a 404, 503, or some other error when navigating to those pages. What are you seeing or can you provide a sample URL? That may help eliminate some possibilities.
-
You should implement 301 redirects from your old pages to their new locations. It sounds like you have a fairly large site, which means Google has tons of your old pages in its index that it is going to continue to crawl for some time. It's probably not going to impact you negatively, but if you want to get rid of the errors sooner, I would throw in some 301s.
With the 301s you'll also get any link value that the old pages may be getting from external links (I know you said there are none, but with 200K+ pages it's likely that at least one of the pages is being linked to from somewhere).
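With 200K+ pages you don't want individual redirect rules. If the old and new URLs map to each other by a pattern, a single servlet filter on the new Java stack can issue the 301s in bulk. This is only a sketch: the assumption that each old URL is simply the new path plus a .php extension is mine, so the rewrite logic would need to match the real structure.

```java
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical filter, mapped to *.php in web.xml, that 301-redirects legacy
// PHP URLs to their assumed new location (same path, minus the .php extension).
public class LegacyUrlRedirectFilter implements Filter {

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest req = (HttpServletRequest) request;
        HttpServletResponse res = (HttpServletResponse) response;

        String path = req.getRequestURI();
        if (path.endsWith(".php")) {
            // Assumed mapping: /widgets/blue-widget.php -> /widgets/blue-widget
            String newPath = path.substring(0, path.length() - ".php".length());
            res.setStatus(HttpServletResponse.SC_MOVED_PERMANENTLY); // 301
            res.setHeader("Location", newPath);
            return;
        }
        chain.doFilter(request, response);
    }

    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void destroy() { }
}
```

Register it in web.xml with a url-pattern of *.php so it only fires for legacy requests and leaves the new pages alone.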
-
Have you submitted a new sitemap to Webmaster Tools? Also, you could consider 301 redirecting the pages to relevant new pages to capitalize on any link equity or ranking power they may have had before. Otherwise, Google should eventually stop crawling them because they return 404s. I've had a touch of success getting them to stop crawling more quickly (or at least it seems quicker) by changing some 404s to 410s.
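If you go the 410 route on a Java stack, one option is a small servlet mapped to the retired URL pattern. This is just a sketch; the *.php mapping is an assumption about what the old URLs look like.

```java
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical servlet, mapped to *.php in web.xml, that answers every request
// for a retired PHP URL with 410 Gone instead of 404 Not Found.
public class GonePagesServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse res) throws IOException {
        res.sendError(HttpServletResponse.SC_GONE); // 410: removed on purpose, stop asking
    }
}
```

A 410 signals deliberate removal, which is why it can drop out of the crawl queue a bit sooner than a plain 404.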