Is 404'ing a page enough to remove it from Google's index?
- We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404s). Is there anything else I need to do to remove these?
- Nice information, John. I hadn't thought of adding a temporary page with a noindex tag, but that sounds like the way to go for faster results. I know Google has automatically removed 404 pages in the past. I've noticed the same issue Michelle is describing, and the information you shared offers great detail on the process.
- Setting pages to 404 should be enough to remove them once Google has recrawled them enough times. Google has to be careful here: when many sites crash or go down for maintenance, they return 404 instead of 503, so Google doesn't want to drop pages from its index until it's confident the page is really gone. Google talks about removing pages from its index here. The Google Webmaster Tools URL removal tool is only intended for pages that urgently need to be removed, so I wouldn't recommend it for this. Google recommends:
  - If the page no longer exists, make sure the server returns a 404 (Not Found) or 410 (Gone) HTTP status code. This tells Google that the page is gone and that it should no longer appear in search results.
  - If the page still exists but you don't want it to appear in search results, use robots.txt to prevent Google from crawling it. Note that, in general, even if a URL is disallowed by robots.txt, Google may still index the page if it finds the URL on another site. However, Google won't index the page if it's blocked in robots.txt and there's an active removal request for it.
  - Alternatively, you can use a noindex meta tag. When Google sees this tag on a page, it will drop the page completely from its search results, even if other pages link to it. This is a good solution if you don't have direct access to the site's server (you will need to be able to edit the HTML source of the page).

  Is there a reason you are 404'ing these pages rather than redirecting them? If they have replacement pages with similar content, you should 301 redirect to those instead, to keep the link juice flowing and to take advantage of the links the old pages have already earned. If you do continue returning 404 for these pages (or even if you don't), make sure your 404 page is a useful one that helps users find what they're looking for (Google help article). Also, Ryan, I'd be interested to hear the results of using the 410 status code. I'd imagine that status code would do the trick! I'm surprised I haven't read more about it, or that it isn't mentioned in the help file linked above.
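To make the options listed above concrete, here is a minimal sketch of a server returning a 410 for a removed page and serving a noindex directive on a page that should stay live but drop out of search results. It assumes a Python/Flask app with hypothetical routes (/old-page, /hidden-page); the thread doesn't mention a specific stack, so treat this as illustrative only.

```python
# Minimal sketch (assumed Flask stack, hypothetical paths) of two of the
# removal options discussed above: a hard 410 Gone, and a live page served
# with a noindex directive.
from flask import Flask, Response

app = Flask(__name__)

@app.route("/old-page")
def old_page():
    # The page is permanently gone: 410 tells crawlers to stop indexing it.
    return "This page has been removed.", 410

@app.route("/hidden-page")
def hidden_page():
    # The page still exists for users, but should be dropped from search results.
    html = (
        "<html><head>"
        '<meta name="robots" content="noindex">'
        "</head><body>Visible to users, hidden from search.</body></html>"
    )
    resp = Response(html, mimetype="text/html")
    # Belt and braces: send the same directive as an HTTP header as well.
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp

if __name__ == "__main__":
    app.run()
```

Note that for the noindex approach the page has to remain crawlable (i.e. not blocked in robots.txt); if Google can't fetch the page, it never sees the tag.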
- I have experienced this same issue with Google. I've just begun a test by changing one of the URLs on my site. I'm bookmarking this Q&A and will try to remember to update it if I see a change. It can take Google some time to recrawl any individual URL, so it could take weeks. In case you are curious, I have set a 410 status code for one of the pages involved: 410 means the resource is gone, while 404 simply means not found. Perhaps the 410 header will send the right message to Google.
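Before waiting weeks on Google, it's worth confirming what status code the removed URLs actually return, since a "soft 404" that serves 200 will never drop out of the index. A small standard-library Python check (the URLs below are placeholders, not pages from the thread) might look like this:

```python
# Quick check of the HTTP status code a URL returns -- useful for confirming
# that a "removed" page really serves 404 or 410 rather than a soft 200.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def status_of(url: str) -> int:
    """Issue a HEAD request and return the final HTTP status code
    (redirects, if any, are followed)."""
    try:
        return urlopen(Request(url, method="HEAD")).status
    except HTTPError as err:
        # 4xx/5xx responses raise HTTPError; the code is on the exception.
        return err.code

for url in ("https://www.example.com/removed-page",
            "https://www.example.com/live-page"):
    print(status_of(url), url)
```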
Related Questions
- After hack and remediation, thousands of URLs still appearing as 'Valid' in Google Search Console. How to remedy?
  I'm working on a site that was hacked in March 2019; in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" but listed as "Valid" in the coverage report. Many of them are still hack-related URLs listed as indexed in March 2019, despite the fact that clicking on them leads to a 404. As of this Saturday, the number jumped up to 18,000, but I have no way of finding out from the Search Console reports why the jump happened or which new URLs were added; the only sort mechanism is "last crawled" and they don't show up there. How long can I expect it to take for these remaining URLs to also be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped. Is there any way to see, inside the new GSC view, why/how the number of valid URLs in the index doubled over one weekend?
- How does educational organization schema interact with Google's knowledge graph?
  Hi there! I was just wondering if the granular options of the Organization schema, like EducationalOrganization (http://schema.org/EducationalOrganization) and CollegeOrUniversity (http://schema.org/CollegeOrUniversity), work the same when it comes to pulling data into the knowledge graph. I've typically always used the Organization schema for customers, but was wondering if there are any drawbacks to going deep into the schema hierarchy. Cheers 😄
- Google Indexing of Pages as HTTPS vs HTTP
  We recently updated our site to be mobile optimized. As part of the update, we had also planned on adding SSL security to the site. However, a lot of our pages use an iframe from a third-party vendor for real estate listings, and that iframe was not SSL-friendly (the vendor does not have a solution yet), so those iframes weren't displaying their content. As a result, we had to shift gears and go back to plain HTTP instead of the HTTPS we were hoping for. However, Google seems to have indexed a lot of our pages as HTTPS, which gives a security error to any visitors. The new site was launched about a week ago, and there was code in the .htaccess file that was pushing to www and HTTPS. I have fixed the .htaccess file so it no longer forces HTTPS. My question is: will Google reindex the site once it recognizes the new .htaccess rules in the next couple of weeks?
- Forwarded vanity domains suddenly resolving to 404, with appended URLs ending in 5 random characters
  We have several vanity domains that forward to various pages on our primary domain, e.g. www.vanity.com (301) --> www.mydomain.com/sub-page (200). These forwards have been in place for months or even years and have worked fine. As of yesterday, we have seen the following problem, despite having made no changes to the forwarding settings. Now, inconsistently, the forwards sometimes resolve and sometimes they do not. When we load the vanity URL with Chrome Dev Tools (Network pane) open, it shows redirect chains like the one below, where xxxxx represents a random 5-character string of lower- and upper-case letters (e.g. VGuTD):
  www.vanity.com (302, Found) -->
  www.vanity.com/xxxxx (302, Found) -->
  www.vanity.com/xxxxx (302, Found) -->
  www.vanity.com/xxxxx/xxxxx (302, Found) -->
  www.mydomain.com/sub-page/xxxxx (404, Not Found)
  This is just one example; the number of redirects varies wildly. Sometimes there is only 1 redirect, sometimes there are as many as 5. Sometimes the request ultimately resolves on the correct mydomain.com/sub-page, but usually it does not (as in the example above). We have cross-checked across every browser and device, private/non-private, cookies cleared, on and off our network, etc., which leads us to believe it is not an issue at the device or host level. Our registrar is GoDaddy. They have not encountered this issue before and have no idea where this 5-character string comes from. I tend to believe them because, per our analytics, we have determined that this problem only started yesterday. Our primary question is: has anybody else encountered this problem, either in the last couple of days or at any time in the past? We have come up with a solution that alleviates the problem, but implementing it across hundreds of vanity domains will take us an inordinate amount of time. Really hoping to fix the cause of the problem instead of just treating the symptom.
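As a side note, a redirect chain like the one described above can also be traced outside the browser, which makes it easier to log the intermediate /xxxxx hops across many vanity domains. The following is a hypothetical diagnostic sketch using Python's standard library; www.vanity.com is the placeholder domain from the post, not a real site.

```python
# Hypothetical diagnostic: follow a redirect chain one hop at a time so each
# intermediate URL and status code can be logged.
from urllib.error import HTTPError
from urllib.parse import urljoin
from urllib.request import HTTPRedirectHandler, Request, build_opener

class NoRedirect(HTTPRedirectHandler):
    """Stop urllib from following redirects so every hop stays visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = build_opener(NoRedirect())
url = "https://www.vanity.com/"           # placeholder domain from the post

for hop in range(10):                     # safety cap on chain length
    try:
        resp = opener.open(Request(url, method="HEAD"))
        print(hop, resp.status, url)      # 2xx: the chain resolved
        break
    except HTTPError as err:
        print(hop, err.code, url)
        location = err.headers.get("Location")
        if err.code in (301, 302, 303, 307, 308) and location:
            url = urljoin(url, location)  # follow the hop manually
        else:
            break                         # 404 or no Location header: stop
```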
- What's the best way to redirect categories & paginated pages on a blog?
  I'm currently re-doing my blog and have a few categories that I'm getting rid of for housecleaning purposes and crawl efficiency. Each of these categories has many pages (some have hundreds). The new blog will also not have new relevant categories to redirect them to (1 or 2 may work). So what is the best place to properly redirect these pages to? And how do I handle the paginated URLs? The only logical place I can think of would be to redirect them to the homepage of the blog, but since there are so many pages, I don't know if that's the best idea. Does anybody have any thoughts?
- What's the best way to remove search-indexed pages on Magento?
  A new client (aqmp.com.br) called me yesterday and told me that since they moved to Magento they have dropped more than US$ 20,000 in monthly sales revenue. I've just checked Webmaster Tools and discovered that the number of crawled pages went from 3,260 to 75,000 since Magento started: Magento is creating lots of pages with queries like search and filters. Example:
  http://aqmp.com.br/acessorios/lencos.html
  http://aqmp.com.br/acessorios/lencos.html?mode=grid
  http://aqmp.com.br/acessorios/lencos.html?dir=desc&order=name
  Is adding an instruction to robots.txt the best way to remove these unnecessary pages from the search engine?
- Can too many "noindex" pages compared to "index" pages be a problem?
  Hello, I have a question for you: our website virtualsheetmusic.com includes thousands of product pages, and due to Panda penalties in the past, we have noindexed most of the product pages hoping for a sort of recovery (not yet seen, though!). So, currently we have about 4,000 "index" pages compared to about 80,000 "noindex" pages. Now, we plan to add an additional 100,000 new product pages from a new publisher to offer our customers more music choice, and these new pages will still be marked "noindex, follow". At the end of the integration process, we will end up with something like 180,000 "noindex, follow" pages compared to about 4,000 "index, follow" pages. Here is my question: can this huge discrepancy between 180,000 "noindex" pages and 4,000 "index" pages be a problem? Can this kind of scenario have or cause any negative effect on our current natural search profile, or is this something that doesn't actually matter? Any thoughts on this issue are very welcome. Thank you! Fabrizio
- Indexed Pages in Google: How Do I Find Out?
  Is there a way to get a list of pages that Google has indexed? Is there some software that can do this? I do not have access to Webmaster Tools, so I'm hoping there is another way. It would also be great if I could see whether an indexed page is a 404 or something else. Thanks for your help, and sorry if it's a basic question 😞