I'm looking for a bulk way to remove over 600 old, nonexistent pages from Google's search results.
-
When I search Google for site:alexanders.co.nz, it still shows over 900 results.
Over 600 of those pages no longer exist, and the 404/410 responses aren't working.
The only way I can think of is doing it manually in Search Console with the "Remove URLs" tool, but that is going to take ages.
Any idea how I can take down all those zombie pages from the search results?
-
Just here to add a bit to Will's almost-complete answer:
A 'site:' search often shows results that won't be displayed in normal Google searches, and it represents neither entirely nor precisely the pages that are indexed. I'd suggest:
1- If those pages are already serving 404 or 410, wait a little. Google won't show them in search results, and eventually they won't be seen in a site: search either. You can check whether those URLs are still being shown in searches through Search Console, and you can verify the status codes yourself with a small script like the sketch below.
2- There is a script made by a webmaster that helps you use the GSC URL removal tool on a big list of URLs. Please use it carefully, and try it first on a low-risk GSC property. Hope it helps.
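For point 1, a minimal sketch along these lines can batch-check the status codes. It assumes Node 18+ (for the global fetch), and urls.txt, with one absolute URL per line, is just an example name:

```typescript
// check-status.ts -- batch-verify that old URLs really return 404 or 410.
// A sketch, not the removal script mentioned in point 2.
import { readFileSync } from "node:fs";

const urls = readFileSync("urls.txt", "utf8")
  .split("\n")
  .map((line) => line.trim())
  .filter((line) => line.length > 0);

async function main(): Promise<void> {
  for (const url of urls) {
    try {
      // HEAD keeps requests light; redirect: "manual" reports 301/302 as-is.
      const res = await fetch(url, { method: "HEAD", redirect: "manual" });
      const gone = res.status === 404 || res.status === 410;
      console.log(`${res.status}\t${gone ? "gone (good)" : "CHECK ME"}\t${url}`);
    } catch (err) {
      console.log(`ERROR\t${url}\t${err}`);
    }
  }
}

main();
```

Anything flagged "CHECK ME" is still returning 200 or a redirect, so Google has no reason to drop it.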
Best of luck.
Gaston
-
What is the business issue this is causing? Are you seeing these 404 / 410 pages appearing in actual searches?
If it's just that they remain technically indexed, I'd be tempted not to be too worried about it - they will drop out eventually.
Unfortunately, most of the ways to get pages (re-)indexed are only appropriate for real pages that you want to remain in the index (e.g. including them in a new sitemap file and submitting that), or they work best on individual pages, which has the same downside as removing them via Search Console one by one.
If the removed pages are grouped neatly into folders, you can remove whole folders at a time via Search Console, which would speed things up; a quick way to spot folder candidates is sketched below.
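For example, a minimal sketch like this (assuming Node 18+; dead-urls.txt is a hypothetical file name) counts dead URLs per top-level folder, so you can see which prefixes a single removal request would cover:

```typescript
// group-by-folder.ts -- count dead URLs per top-level folder to spot
// candidates for a folder/prefix-level removal in Search Console.
// A sketch only; dead-urls.txt holds one URL per line.
import { readFileSync } from "node:fs";

const counts = new Map<string, number>();

for (const line of readFileSync("dead-urls.txt", "utf8").split("\n")) {
  const raw = line.trim();
  if (!raw) continue;
  try {
    // e.g. https://example.com/blog/old-post -> "/blog/"
    const firstSegment = new URL(raw).pathname.split("/")[1] ?? "";
    const folder = `/${firstSegment}/`;
    counts.set(folder, (counts.get(folder) ?? 0) + 1);
  } catch {
    console.error(`Skipping malformed URL: ${raw}`);
  }
}

// Biggest folders first: each can be handled with one prefix removal.
for (const [folder, n] of [...counts.entries()].sort((a, b) => b[1] - a[1])) {
  console.log(`${String(n).padStart(6)}  ${folder}`);
}
```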
Otherwise, I would probably consider prioritising the list (using data about which are getting visits or visibility in search) and removing as many as you can be bothered to work through.
Hope that helps.
-
Hi, thanks for that. The problem is that those pages are really old and generate zero traffic, so we set up a 404 error page a long time ago. But Google is not removing those pages, because without traffic there are no crawls, and without a few crawls Google is not going to learn that those pages don't exist anymore. They are literally zombie pages! Any ideas?
-
What about creating a load of 301 redirects from the nonexistent URLs to the still-active ones, and/or updating your 404 pages to better inform users about what happened to the "missing" pages? Either way, Google will just stop indexing them after a short while. If you have a mapping of old to new URLs, generating the redirect rules can be scripted; see the sketch below.
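A minimal sketch of that scripting idea, assuming an Apache setup and a hypothetical redirects.csv file with one "old-path,new-path" pair per line (both file names are just examples):

```typescript
// make-redirects.ts -- turn an old->new URL mapping into Apache
// "Redirect 301" rules, one per line, written to redirects.conf.
import { readFileSync, writeFileSync } from "node:fs";

const rules = readFileSync("redirects.csv", "utf8")
  .split("\n")
  .map((line) => line.trim())
  .filter((line) => line.includes(","))
  .map((line) => {
    const [oldPath, newPath] = line.split(",").map((s) => s.trim());
    return `Redirect 301 ${oldPath} ${newPath}`;
  });

writeFileSync("redirects.conf", rules.join("\n") + "\n");
console.log(`Wrote ${rules.length} rules to redirects.conf`);
```

One caveat: redirects only help where a genuinely relevant target page exists. Google tends to treat blanket redirects of dead URLs to the homepage as soft 404s, so a plain 404/410 is often the honest choice for pages with no real replacement.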
Related Questions
-
How long will old pages stay in Google's cache index? We have a new site that is two months old, but we are still seeing old pages even though we used 301 redirects.
Two months ago we launched a new website (same domain) and implemented 301 redirects for all of the pages. Two months later we are still seeing old pages in Google's cache index. How long should I tell the client it should take for them all to be removed from search?
Intermediate & Advanced SEO | Liamis
-
Pages excluded from Google's index due to "different canonicalization than user"
Hi MOZ community, A few weeks ago we noticed a complete collapse in traffic on some of our pages (7 out of around 150 blog posts in question). We were able to confirm that those pages disappeared for good from Google's index at the end of January '18; they were still findable via all other major search engines. Using Google's Search Console (previously Webmaster Tools), we found the unindexed URLs in the list of pages being excluded because "Google chose different canonical than user". Content-wise, the page that Google falsely determines as canonical has little to no similarity to the pages it thereby excludes from the index. About our setup: We are a SPA, delivering our pages pre-rendered, each with an (empty) rel=canonical tag in the HTML head that's then dynamically filled with a self-referential link to the page's own URL via JavaScript (a simplified sketch of this pattern follows below). This seemed and seems to work fine for 99% of our pages but happens to fail for one of our top-performing ones (which is why the hassle 😉 ). What we tried so far: going through every step of this handy guide: https://moz.com/blog/panic-stations-how-to-handle-an-important-page-disappearing-from-google-case-study --> inconclusive (healthy pages, no penalties, etc.); manually requesting re-indexation via Search Console --> immediately brought back some pages, while others briefly re-appeared in the index and then got kicked out again for the aforementioned reason; checking other search engines --> the pages are only gone from Google and can still be found via Bing, DuckDuckGo, and other search engines. Questions to you: How does Googlebot handle JavaScript, and does anybody know whether its setup changed in that respect around the end of January? Can you think of any other reason that would cause the behavior described above? Eternally thankful for any help!
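For illustration only, a client-side canonical injection of the kind described here looks roughly like the following; this is a simplified sketch, not the poster's actual code:

```typescript
// canonical.ts -- browser-side, self-referential canonical injection of the
// kind the question describes. A simplified illustration only.
function setSelfReferentialCanonical(): void {
  let link = document.querySelector<HTMLLinkElement>('link[rel="canonical"]');
  if (!link) {
    link = document.createElement("link");
    link.rel = "canonical";
    document.head.appendChild(link);
  }
  // Drop query string and fragment so every render points at one stable URL.
  link.href = `${window.location.origin}${window.location.pathname}`;
}

// Must run after the SPA router settles on the final route, otherwise the
// tag can be filled with an intermediate URL -- one plausible failure mode.
setSelfReferentialCanonical();
```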
Intermediate & Advanced SEO | SvenRi
-
Getting too many links in Google search results; how do I fix it?
I'm a total newbie, so I apologize for what I'm sure is a dumb question. I recently followed Moz suggestions for increasing visibility of my site for a specific keyword by including that keyword in more verbose page descriptions for multiple pages. This worked TOO well: that keyword now brings up too many results in Google for different pages on my site . . . is there a way to compile them into one result with subpages, like the attached image for a search on Apple? Do I need to change something in my robots.txt file to direct these to my main page? Basically, I am a photographer, and a search for my name now brings up each of my different photo gallery pages as multiple results; it's a little over the top. Thanks for any and all help!
Intermediate & Advanced SEO | jason5454
-
Google's 'related:' operator
I have a quick question about Google's 'related:' operator when viewing search results. Is there a reason why a website wouldn't produce related/similar sites? For example, if I use the related: operator for my site, no results appear: https://www.google.com/#q=related:www.handiramp.com. The site has been around since 1998, and it also has two good, relevant DMOZ inbound links. Any suggestions on why this is, and any way to fix it? Thank you.
Intermediate & Advanced SEO | ecomteam_handiramp.com
-
Rank Tracker Result Not Reflected In Google
I'm tracking keyword results in Rank Tracker, but I can't confirm the positions when I do a Google search for the tracked keywords. Does anybody know why RT says the site should be #23 when it doesn't actually appear in Google? Is there a way to check Google results from different data centers? If I recall, Google used to allow the option to view results from different cities, though I don't know if they still allow this.
Intermediate & Advanced SEO | alrockn
-
How long does Google index old URLs?
Hey guys, we are currently in the process of redesigning a site, but in two phases due to timeline issues. So there will be up to a four-week gap between the first and second sets of redirects; these URLs will be idle for four weeks before that phase's content is ready. What effect, if any, will this have on domain and page authority? Thanks, Rob
Intermediate & Advanced SEO | daracreative
-
Why is this page not being delivered as a Google search result?
Hey folks, figured I would try to get an expert's insight on this. In the Google search results for BLACK TITANIUM RINGS + TITANIUM-JEWELRY.COM, the page that I "think" should show up is this one: http://www.titanium-jewelry.com/black-titanium-rings.html. However, it does not. Imho, this page is highly relevant. I used Rank Tracker here on seomoz.org, and the page is not even in the top 50 of Google's search results. Our 'About Black Titanium Rings' page ranks #2 (http://www.titanium-jewelry.com/about-black-titanium.html), but the /black-titanium-rings.html page doesn't rank at all. Any suggestions on what I could look at to figure out why this page is being penalized? We are not under a manual penalty (anymore!). Thanks! Ron
Intermediate & Advanced SEO | yatesandcojewelers
-
Best way to re-order page elements for visitors coming from search engines
Both versions of the page have essentially the same content, but in a different order. One is for users coming from Google (and Googlebot), the other is for everybody else. Questions: Is this cloaking? What would be the best way to re-order elements on the page: totally different style sheets for each version, or calling in different divs within the same style sheet? Is there any better way to re-order elements based on the search engine? Let me make it clear again: the content is the same for everyone, just in a different order for visitors coming from Google versus everybody else. Don't ask me the reason behind it (executive orders!!)
Intermediate & Advanced SEO | StickyRiceSEO