What to do with 404 errors when you don't have a similar new page to 301 to?
-
Hi
If you have 404 errors for pages that you don't have similar content pages to 301 them to, should you just leave them (the 404 pages are optimised/good quality, with related links, branding etc.) so that they eventually get de-indexed since they no longer exist, or should you use 'Remove URL' in GWT?
Cheers
Dan
-
Thanks chaps!
Have a great weekend.
Cheers
Dan
-
If no pages are related, I typically take them to a customized 404 page. You don't need to remove them from GWT; as Google puts it:
When a page is updated or removed, it will automatically fall out of our search results. You don’t need to do anything to make this happen.
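For example, here is a minimal sketch of wiring up a custom 404 page, assuming an Apache server (the /404.html path is just a hypothetical template; adjust it for your own setup):

# .htaccess - serve a branded 404 page while still returning a 404 status
# /404.html is a placeholder for your own custom error template
ErrorDocument 404 /404.html

The key point is that the missing URLs keep returning a true 404 status, so they drop out of the index on their own, exactly as the quote above describes.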
-
If I do not have a relevant page to redirect them to, I always send them back to the homepage. That way the end user, search engines, and link equity are all brought back to the highest-level page. It isn't the best-case scenario, but it is better than losing your equity and having end users hit an error.
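For example, on an Apache server the redirect could look something like this (a rough sketch only; /old-page/ is a placeholder for a URL that no longer exists):

# .htaccess - 301 a removed URL back to the homepage
# /old-page/ is a placeholder; list each dead URL you want to redirect
Redirect 301 /old-page/ /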
Hope this helps!
- Kyle
Related Questions
-
Old pages not mobile friendly - new pages in process but don't want to upset current traffic.
Working with a new client. They have what I would describe as two virtual websites: same domain but different coding, navigation and structure. The old virtual website pages fail the mobile-friendly test; they were not designed to be responsive (there really is no way to fix them), but they are ranking and getting traffic. The new virtual website pages pass the mobile-friendly test but are not yet SEO optimized, are not ranking, and are not getting organic traffic. My understanding is that NOT mobile friendly is a "site" designation, and although the offending pages are listed it is not a "page" designation. Is this correct? If my understanding is true, what would be the best way to hold onto the rankings and traffic generated by the old virtual website pages and resolve the "NOT mobile friendly" problem until the new virtual website pages have surpassed the old pages in ranking and traffic? A proposal was made to redirect any mobile traffic on the old virtual website pages to mobile-friendly pages. What will happen to SEO if this is done? The pages would pass mobile friendly because they would go to mobile-friendly pages, I assume, but what about link equity? Would they see a drop in traffic? Any thoughts? Thanks, Toni
Technical SEO | | Toni70 -
How do I prevent duplicate page title errors from being generated by my multiple shop pages?
Our e-commerce shop has numerous pages within the main shop page. Users navigate through the shop via typical pagination, so while there may be 6 pages of products, it's all still under the main shop page. Moz keeps flagging my shop pages as having duplicate titles (i.e., shop page 2), but they're all the same page. Users aren't loading unique pages each time they go to the next page of products, and they aren't pages I can edit. I'm not sure how to prevent this issue from popping up on my reports.
Technical SEO | | NiteSkirm0 -
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using Google Maps creator and embedded it on our site. However, when I ran Fetch and Render through Search Console, it said it was blocked by our robots.txt file. I read in the Search Console Help section that: 'For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot.' I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it), does Google automatically block their maps from their own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:
User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
Allow: /maps/api/js/ElevationService.GetElevationForLine
Allow: /maps/api/js/GeocodeService.Search
Allow: /maps/api/js/KmlOverlayService.GetFeature
Allow: /maps/api/js/KmlOverlayService.GetOverlays
Allow: /maps/api/js/LayersService.GetFeature
Disallow: /
Any assistance would be greatly appreciated. Thanks, Ruben
Technical SEO | | KempRugeLawGroup1 -
Expired domain 404 crawl error
I recently purchased an expired domain from auction, and after I started my new site on it, I am noticing 500+ "not found" errors in Google Webmaster Tools, which are generated from the previous owner's content. Should I use a redirection plugin to redirect those non-existent posts to new post(s) on my site? Or should I use a 301 redirect? Or should I leave them just as they are without taking further action? Please advise.
Technical SEO | | Taswirh1 -
Moz is returning some of my pages as 404 but why when they are live?
Hi guys, I would appreciate some advice on this. Here are some example pages where I am getting a 404 status:
http://www.colourbanners.co.uk/printed-boards/correx-boards.html
http://www.colourbanners.co.uk/printed-boards/foamex-boards.html
There are quite a few, but these are live pages, so why is this happening? Also, our site has dropped in the SERPs; I was wondering if this has something to do with it? Many thanks, Gerry
Technical SEO | | gezzagregz0 -
When testing the on page report I'm having a few problems
First of all, is this test checking my SEO optimization over the whole website or just one page? I.e., when I type in www.joelolson.ca... is it also checking pages like www.joelolson.ca/realtorresources... Secondly, I have found that it won't find specific pages on my site and says they can't be found when they clearly exist.
Technical SEO | | JoelOlson0 -
Too Many On Page Links Error On Wordpress Blog
I have a WordPress blog. I am getting an error message from SEOmoz, "too many on-page links". However, SEOmoz is counting a full month of blog posts as one page. For example, 3 on-page internal links in each post times 30 different blog articles in a month is recorded as 90 on-page links. Is there any mechanism to fix this on WordPress?
Technical SEO | | wianno1680 -
Weird 404 Errors in Webmaster Tools
Hi, in a regular check with Webmaster Tools, I have noticed a sudden increase in the number of "not found" 404 errors. So I have been looking at them and noticed something weird has been going on. There are well over 100 pages with 404 errors. The funny thing is, none of the URLs are correct. For example, if the actual URL is something like www.domain.com/latest-reviews, the 404 error points to a non-existent URL like www.domain.com/latest-re. And when I checked where they were linked from, they are all from these spammy sites. Anyone know what could be causing these links? Why would anyone link on purpose to a non-existent page? Cheers,
Technical SEO | | Gamer070