Speed up the process of removing URLs from Google's index
-
Hi guys,
We have done some work to try to remove pages from Google's index. So far we have:
1. Added a noindex tag to the pages.
2. Made the pages return a 404 response.
Is there any way to notify Google about these changes so we can speed up the process of removing these pages from Google's index?
Also, regarding the URL removal tool: Google says it's used to remove URLs from search results, but does that mean the URLs are removed from their index too?
Many thanks guys
David
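For reference, the two signals described in the question can be sketched with Python's standard library. This is a rough illustration, not production code, and the paths in `REMOVED_PATHS` and `NOINDEX_PATHS` are hypothetical stand-ins for the real URLs being taken down:

```python
# Sketch of the two removal signals from the question: a hard 404 for
# deleted pages, and a noindex directive for pages that stay live but
# should leave the index. Paths below are hypothetical examples.
from http.server import BaseHTTPRequestHandler, HTTPServer

REMOVED_PATHS = {"/old-page"}      # step 2: return a 404
NOINDEX_PATHS = {"/private-page"}  # step 1: serve the page, but noindex it

class RemovalHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REMOVED_PATHS:
            # A real 404 tells Google the page is gone, so it can drop
            # the URL on its next crawl.
            self.send_response(404)
            self.end_headers()
            self.wfile.write(b"Not Found")
        elif self.path in NOINDEX_PATHS:
            # The X-Robots-Tag response header is equivalent to
            # <meta name="robots" content="noindex"> in the page <head>.
            self.send_response(200)
            self.send_header("X-Robots-Tag", "noindex")
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>Kept out of the index</body></html>")
        else:
            # Everything else is served and indexed normally.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>Normal page</body></html>")

    def log_message(self, *args):
        pass  # keep the sketch quiet
```

Run it with `HTTPServer(("localhost", 8000), RemovalHandler).serve_forever()`. Either signal on its own is enough; Google just needs to recrawl the URL to see it.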
-
Do the pages contain things like credit card or Social Security numbers, where you need them out of the SERPs right away, or is it just content you want gone for some other reason? There is a URL removal tool you can use, but Google prefers that it be reserved for things that have to be gone immediately, with what you're doing now handling the rest.
My personal experience (from a while back) is that when a robots.txt change unblocked items from being crawled, they were back in the SERPs right away. Google seems to still know about all of these URLs (rather than wiping them from memory entirely) even if it doesn't show them in the search results. I don't have any resources to point you to on that right now, though.
-
I would resubmit your sitemap after you've removed the pages. Time will make them go away.
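The resubmission step above assumes your sitemap no longer lists the removed pages. A minimal sketch of regenerating one with the standard library, using hypothetical example URLs:

```python
# Rebuild a sitemap that omits the removed pages, so the file you
# resubmit in Search Console only lists URLs that should stay indexed.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(live_urls, removed_urls):
    """Return sitemap XML containing only the URLs that should stay indexed."""
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for url in live_urls:
        if url in removed_urls:
            continue  # leave deleted pages out so Google stops recrawling them
        entry = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(entry, f"{{{NS}}}loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example URLs:
sitemap_xml = build_sitemap(
    ["https://example.com/", "https://example.com/old-page"],
    {"https://example.com/old-page"},
)
```

Write the result to `sitemap.xml` and resubmit it in Search Console as the answer suggests.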
-
Unfortunately, I don't know of any way to inform Google directly. What I think you should do is update your sitemap and also add these pages to your robots.txt file.
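If you go the robots.txt route suggested above, you can sanity-check the rule with Python's stdlib parser before deploying it (the `Disallow` paths below are hypothetical). One caveat worth hedging on: a robots.txt `Disallow` only blocks crawling, so a blocked page may never have its noindex tag seen by Google.

```python
# Verify a robots.txt rule with the standard-library parser before
# deploying it. The Disallow paths are hypothetical examples.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /old-page
Disallow: /retired-offer
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# True if Googlebot is blocked from the removed page:
blocked = not parser.can_fetch("Googlebot", "https://example.com/old-page")
```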