Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
What to do with removed pages and 404 errors
-
I recently removed about 600 'thin' pages from my site, which are now showing as 404 errors in WMT as expected. As I understand it, I should just let these pages 404 and eventually they'll be dropped from the index. There are no inbound links pointing at them, so I don't need to 301 them. They keep appearing in WMT as 404s though, so should I just 'mark as fixed' until they stop appearing? Is there any other action I need to take?
-
If they are truly gone, then a 410 would be the best option for you. Since the pages are still indexed, people can still find them based on what they are searching for, even though there are no links pointing at them. You never know when your link will show up, because you don't know how long Google will take to get rid of the links.
http://www.checkupdown.com/status/E410.html
"The 410 error is primarily intended to assist the task of Web maintenance by notifying the client system that the resource is intentionally unavailable and that the Web server wants remote links to the URL to be removed. Such an event is common for URLs which are effectively dead i.e. were deliberately time-limited or simply orphaned. The Web server has complete discretion as to how long it provides the 410 error before switching to another error such as 404"
We did this for a client who needed old defunct pages removed. Once you set the pages to return a 410 and use Google's URL removal tool, you should see them dropping off quickly (all of ours were gone within a month). Having that many pages return a 404 may also be hurting your users' experience: when they see a 404, they go right for the back button.
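The 410-vs-404 split described above can be sketched in a few lines of Python. This is a minimal illustration, not anyone's actual server code, and the paths and page sets are hypothetical: deliberately removed URLs answer 410 Gone, unknown URLs answer 404, and live pages are served normally.

```python
from http import HTTPStatus

# Hypothetical set of the deliberately deleted "thin" pages.
REMOVED_PAGES = {"/thin-page-1", "/thin-page-2"}

def status_for(path: str, live_paths: set) -> int:
    """Pick the HTTP status a server should send for `path`."""
    if path in REMOVED_PAGES:
        return HTTPStatus.GONE.value       # 410: intentionally removed
    if path in live_paths:
        return HTTPStatus.OK.value         # 200: page still exists
    return HTTPStatus.NOT_FOUND.value      # 404: never existed here
```

In practice the same decision is usually made in server config (e.g. a rewrite rule returning 410), but the logic is the same three-way check.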
-
410 is the recommended way to tell search engines the page is gone. Everything mentioned above is a facet of how you should deal with this issue. Sorry for the brevity and terrible punctuation; the Moz forum is a pretty iffy thing via mobile, and my eggs are getting cold.
-
Hi!
The reason these pages keep popping up in WMT is that they have already been indexed. You could try to remove them from Google's index using the removal tool in WMT (https://www.google.com/webmasters/tools/url-removal) or by setting up 301 redirects from them to more suitable pages.
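For the 301 option, the server-side lookup can be as simple as a map from each old URL to its replacement. A hedged Python sketch (the URLs are made up for illustration):

```python
# Hypothetical old-URL -> new-URL map for the 301 redirects.
REDIRECTS = {
    "/old-thin-page": "/guides/consolidated-page",
}

def resolve(path: str):
    """Return the (status, location) a server would send for `path`."""
    target = REDIRECTS.get(path)
    if target is not None:
        return 301, target   # permanent redirect to the better page
    return 200, path         # serve the requested page as-is
```

Note the question above says these pages have no inbound links, so a redirect map like this is only worthwhile if the old URLs still receive traffic.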
Hope this helps

Anders
-
Hi,
I would look at this from two perspectives.
1. These thin pages could have been beefed up with some unique content, or at least rewritten to make them unique. Personally, I prefer to make duplicate pages unique instead of deleting them. This of course depends on the number of pages and the level of duplication.
2. Now that these pages have been removed from the website, you should erase all links to them from within the site, including sitemaps and internal links, so that search engines do not follow a link that ends in a 404 error. You should also check whether any references to these pages remain on third-party web properties.
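One way to act on point 2 is to scan the sitemap for entries that point at removed pages. A small standard-library Python sketch (the sitemap content and removed set are examples, not the poster's data):

```python
import xml.etree.ElementTree as ET

# Standard sitemap XML namespace, as defined by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def stale_sitemap_urls(sitemap_xml: str, removed: set) -> list:
    """List sitemap <loc> entries that point at removed pages."""
    root = ET.fromstring(sitemap_xml)
    locs = (el.text.strip() for el in root.iter(SITEMAP_NS + "loc"))
    return [url for url in locs if url in removed]
```

The same membership check can be run against a crawl of the site's internal links to find stray references before search engines do.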
Best regards,
Devanur Rafi