Is there a bulk way to remove over 600 old, nonexistent pages from the Google search results?
-
When I search Google for site:alexanders.co.nz, it still shows over 900 results.
Over 600 of those pages no longer exist, and serving 404/410 errors isn't getting them removed.
The only way I can think of is to do it manually in Search Console with the "Remove URLs" tool, but that would take ages.
Any idea how I can take down all those zombie pages from the search results?
-
Just here to add a little to Will's almost complete answer:
The 'site:' operator often shows results that won't actually be displayed in regular Google searches, and it doesn't represent, entirely or precisely, the pages that are indexed. I'd suggest the following:
1- If those pages are already serving 404 or 410, then wait a little. Google will stop showing them in search results, and eventually they won't appear in a site: search either. You can check whether those URLs are being shown in searches through Search Console.
2- There is a script made by a webmaster that helps you use the GSC URL removal tool for a big list of URLs. Please use it carefully, and try it first within a riskless GSC property; a rough sketch of the general idea is below.
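For illustration only, the general shape of that kind of browser automation is sketched below with Selenium. Every selector here is a hypothetical placeholder (the real removal-tool markup isn't documented in this thread), so inspect the live page and adapt before trying anything, ideally on that low-risk property:

```python
# Hypothetical sketch only: automate the GSC "Remove URLs" form for a list.
# All CSS selectors below are made-up placeholders; inspect the real page
# and substitute the actual ones. Test against a low-risk property first.
import time

from selenium import webdriver
from selenium.webdriver.common.by import By

urls_to_remove = [
    "https://alexanders.co.nz/old-page-1",  # hypothetical examples
    "https://alexanders.co.nz/old-page-2",
]

driver = webdriver.Chrome()
driver.get("https://search.google.com/search-console/removals")
input("Log in to Search Console in the browser window, then press Enter...")

for url in urls_to_remove:
    driver.find_element(By.CSS_SELECTOR, "#new-request").click()            # placeholder
    driver.find_element(By.CSS_SELECTOR, "input.url-field").send_keys(url)  # placeholder
    driver.find_element(By.CSS_SELECTOR, "#submit-request").click()         # placeholder
    time.sleep(2)  # crude pacing so the UI can keep up

driver.quit()
```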
Hope it helps. Best of luck.
Gaston -
What is the business issue this is causing? Are you seeing these 404 / 410 pages appearing in actual searches?
If it's just that they remain technically indexed, I'd be tempted not to be too worried about it - they will drop out eventually.
Unfortunately, most of the ways to get pages (re-)indexed are only appropriate for real pages that you want to remain in the index (e.g. including them in a new sitemap file and submitting that), or are only practical for individual pages, which has the same downside as removing them via Search Console one by one.
You can remove whole folders at a time via Search Console, if that would speed things up; are the removed pages grouped neatly into folders?
Otherwise, I would probably consider prioritising the list (using data about which are getting visits or visibility in search) and removing as many as you can be bothered to work through.
Hope that helps.
-
Hi, thanks for that. The problem is that those pages are really old and generate zero traffic, so we set up a 404 error page a long time ago, but Google is not going to remove them: without traffic there is no crawl, and without a few crawls Google is not going to learn that those pages don't exist anymore. They are literally zombie pages! Any ideas?
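One lever that is fully under your control while waiting for recrawls: a 410 ("Gone") response is a more explicit removal signal than a 404. A minimal sketch of serving 410 for a known list of dead paths, using Flask; the paths (and the Flask setup itself) are illustrative assumptions:

```python
# Minimal sketch: return 410 Gone for known-dead paths so crawlers get
# an explicit "permanently removed" signal instead of a generic 404.
from flask import Flask, abort, request

app = Flask(__name__)

GONE_PATHS = {
    "/old-catalogue/page-1",  # hypothetical dead URLs
    "/old-catalogue/page-2",
}

@app.before_request
def gone_for_dead_paths():
    if request.path in GONE_PATHS:
        abort(410)  # 410 Gone

@app.route("/")
def home():
    return "Home page"

if __name__ == "__main__":
    app.run()
```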
-
What about creating a load of 301 redirects from the nonexistent URLs to still-active ones, and/or updating your 404 pages to better inform users what happened to the "missing" pages? Regardless, Google will stop indexing them after a short while.
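A minimal sketch of that 301 approach, again in Flask with hypothetical URLs; on a production site this mapping would more often live in the web server or CMS configuration:

```python
# Sketch: map retired URLs to their closest live equivalents with 301s.
# The URL pairs are hypothetical placeholders.
from flask import Flask, redirect, request

app = Flask(__name__)

REDIRECT_MAP = {
    "/old-service-page": "/services",
    "/2014-promo": "/current-offers",
}

@app.before_request
def redirect_retired_urls():
    target = REDIRECT_MAP.get(request.path)
    if target is not None:
        return redirect(target, code=301)  # permanent redirect
```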
Related Questions
-
Suddenly Keywords Disappeared from Google Search Results
Hello guys, please help me: suddenly all of my site's keywords have disappeared from the Google search results. Most of the keywords were no. 1 on Google, but today after 6 pm I saw traffic decreasing, and when I search for my keywords, none of them appear in the results. Only the homepage keyword is showing. Please help; what is happening?
Intermediate & Advanced SEO | mianazeem418
-
Fetch as Google does not result in pages getting indexed
I run an exotic pet website which currently covers several species of reptiles. It has done well in the SERPs for the first couple of species, but I keep adding new ones, and each brings the task of getting ranked, so I need to figure out the best process. We just released our 4th species, reticulated pythons, about 2 weeks ago. I made the pages public and, in Webmaster Tools, did a "Fetch as Google" with indexing requested for this page and its child pages: http://www.morphmarket.com/c/reptiles/pythons/reticulated-pythons/index While Google immediately indexed the index page, it did not really index the couple of dozen pages linked from it, despite me checking the option to crawl child pages. I know this in two ways: first, in Google Webmaster Tools, if I look at Search Analytics with pages filtered by "retic", only 2 are listed. That at least tells me it's not showing these pages to users. More directly, though, if I search Google for "site:morphmarket.com/c/reptiles/pythons/reticulated-pythons", only 7 pages are indexed. More details: I've tested at least one of these URLs with the robots checker and they are not blocked (a scripted version of that check is sketched below). The canonical values look right. I have not really monkeyed with crawl URL parameters. I do NOT have these pages listed in my sitemap, but in my experience Google doesn't care much about that; I previously had about 100 pages there and Google didn't index some of them for more than a year. Google has indexed "105k" pages from my site, so it is clearly happy to index pages, apparently just not the ones I want (that large number is due to permutations of search parameters, something I think I've since improved with canonicals, robots, etc.). I may have some nofollow links to the same URLs, but NOT on this page, so assuming nofollow has only local effects, this shouldn't matter. Any advice on what could be going wrong here? I really want Google to index the top couple of links on this page (home, index, stores, calculator) as well as the couple dozen gene/tag links below.
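A sketch of that scripted robots check, using only the Python standard library; the child-URL list is illustrative:

```python
# Check a list of URLs against robots.txt the way Googlebot would.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://www.morphmarket.com/robots.txt")
rp.read()

urls = [
    "http://www.morphmarket.com/c/reptiles/pythons/reticulated-pythons/index",
    # ...add the child pages to be verified here
]

for url in urls:
    status = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{status}: {url}")
```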
Intermediate & Advanced SEO | jplehmann
-
Hey there, I'm working on search results in Dutch.
My biggest competitor, who is number 1 for the main keywords in Google, has almost nothing but links from link farms and blog comments. How is he ranked that high? Would it be a good idea to add a few of the best of these links to my mix while I work on really good quality content?
Intermediate & Advanced SEO | ValleyofTea
-
Our client's web property recently switched over to secure pages (https); however, their non-secure pages (http) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Our client recently switched over to https via a new SSL certificate. They have also implemented rel canonicals for most of their internal webpages (pointing to the https versions). However, many of their non-secure webpages are still being indexed by Google. We have access to their GWMT for both the secure and non-secure pages. Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure whether that will happen. We thought about requesting in GWMT that Google remove the non-secure pages, but that felt pretty drastic. Any recommendations would be much appreciated.
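Whichever route is chosen, a useful first step is verifying that every legacy http URL already returns a 301 to its https counterpart. A short sketch using the third-party requests package; the URLs are placeholders:

```python
# Sketch: confirm legacy http URLs 301-redirect to their https twins.
# Requires the third-party "requests" package; URLs are placeholders.
import requests

http_urls = [
    "http://www.example.com/page-1",
    "http://www.example.com/page-2",
]

for url in http_urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location.startswith("https://")
    print(f"{'OK   ' if ok else 'CHECK'} {url} -> {resp.status_code} {location or '(no redirect)'}")
```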
Intermediate & Advanced SEO | RosemaryB
-
Pagination and view-all pages question: we currently don't have a canonical tag pointing to the view-all page, as I don't believe it's a good user experience, so how best should we deal with this?
Hello all, I have an eCommerce site and have implemented rel="prev" and rel="next" for pagination. However, we also have a view-all page which shows all the products, but we currently don't have a canonical tag pointing to it, as I don't believe showing the user a page with shedloads of products on it is actually a good user experience, so we haven't done anything with that page. I have a sample URL from one of our categories which may help: http://goo.gl/9LPDOZ This is obviously causing me duplication issues as well. Also, the main category pages have historically been the ones that rank better, as opposed to page 2, page 3, etc. I am wondering what I should do about the view-all page: has anyone else had this same issue, and how did they deal with it? Do we just get rid of the view-all page, even though Google says it prefers you to have one? I also want to concentrate my link juice on the main category pages rather than have it diluted across all my paginated pages. Does anyone have any tips on how best to do this, and have you seen any ranking improvement from it? Any ideas greatly appreciated. Thanks, Peter
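For reference, emitting the rel="prev"/rel="next" tags is mechanical once the page position is known. A small sketch, assuming a hypothetical ?page= URL pattern rather than the site's real one:

```python
# Sketch: build rel="prev"/rel="next" link tags for page N of a category.
# The ?page= URL pattern is a hypothetical stand-in for the site's own.
def pagination_link_tags(base_url: str, page: int, total_pages: int) -> list[str]:
    tags = []
    if page > 1:
        prev_url = base_url if page == 2 else f"{base_url}?page={page - 1}"
        tags.append(f'<link rel="prev" href="{prev_url}">')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return tags

# Example: the head tags for page 2 of 5.
print("\n".join(pagination_link_tags("https://example.com/category", 2, 5)))
```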
Intermediate & Advanced SEO | PeteC12
-
Should we 301 redirect old events pages on a website?
We have a client whose events category section is filled to the brim with past-event webpages. Another issue is that these old event pages all contain duplicate meta description tags, so we are concerned that Google might be penalizing our client's website for this. Our client does not want to create unique meta description tags for these old event pages. Would it be a good idea to 301 redirect these old event landing pages to the main events category page, to pass on link equity and remove the duplicate meta description issue? This seems drastic (we even noticed that searchmarketingexpo.com is keeping its old events pages); however, these old event pages seem to offer little value to our website visitors. Any feedback would be much appreciated.
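Before deciding, it may help to quantify the duplication. A rough sketch that groups pages by meta description; it uses the third-party requests package, and the URL list and regex are simplifications:

```python
# Rough check: group old event URLs by their meta description to see
# how widespread the duplication is. URL list and regex are simplified.
import collections
import re

import requests

event_urls = [
    "https://example.com/events/2013-conference",  # hypothetical URLs
    "https://example.com/events/2014-conference",
]

META_RE = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\']([^"\']*)', re.I
)

by_description = collections.defaultdict(list)
for url in event_urls:
    html = requests.get(url, timeout=10).text
    match = META_RE.search(html)
    if match:
        by_description[match.group(1)].append(url)

for description, pages in by_description.items():
    if len(pages) > 1:
        print(f"{len(pages)} pages share: {description[:70]!r}")
```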
Intermediate & Advanced SEO | RosemaryB
-
Google Indexed Old Backups. Help!
I have the bad habit of renaming an HTML page sitting on my server before uploading a new version. I usually do this after a major change, so after the upload my server holds "product.html" as well as "product050714.html". I just stumbled on the fact that Google has been indexing these backups. Can I just delete them and produce a 404?
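Deleting them means the server will answer 404 (or 410, if configured) on the next crawl, which, per the main thread above, eventually drops them from the index. To round up stray date-suffixed backups in bulk, a sketch like this could help; the web root path and filename pattern are assumptions:

```python
# Sketch: find date-suffixed backup copies (e.g. product050714.html)
# sitting next to their live counterparts. Path and pattern are assumed.
import re
from pathlib import Path

WEB_ROOT = Path("/var/www/html")  # hypothetical document root
BACKUP_RE = re.compile(r"^[a-z][a-z0-9-]*\d{6}\.html$", re.IGNORECASE)

for path in sorted(WEB_ROOT.rglob("*.html")):
    if BACKUP_RE.match(path.name):
        print("Possible backup:", path)
```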
Intermediate & Advanced SEO | alrockn
-
Sitemap: what % of URLs make it into the Google index?
What is the average percentage of URLs from a sitemap that get included in the Google index? Obviously we want to aim for 100% of the sitemap URLs being indexed; is this realistic?
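Measuring your own ratio starts with counting the sitemap's URLs, which can then be compared against the indexed figure reported in Search Console. A minimal standard-library sketch; the sitemap URL is a placeholder:

```python
# Sketch: count the URLs in a sitemap so the total can be compared with
# the indexed count from Search Console. The sitemap URL is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

urls = [loc.text for loc in tree.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs listed in the sitemap")
```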
Intermediate & Advanced SEO | stats44