Why are Pages returning 404 errors not being dropped?
-
Our Webmaster Tools account continues to report upwards of 750 pages returning 404 errors. These are pages from a previous site that is no longer in use.
However, these pages were dropped over a year ago, along with their 301 redirects. Why is Google not clearing them from Webmaster Tools, but instead re-listing them again after a three-month cycle? Is it because external sites have links to these pages?
If so, should I put a 301 in place (most of these sites are forums and potentially dodgy directories etc. from previous poor link building campaigns) or ask for a manual removal?
-
Thanks Tom for all your help.
Regards
Craig
-
Very good point you've raised - 301ing those URLs effectively makes the links to your site "live" again. If the links sit on a dodgy/spammy/poor-quality page, they could harm your site, and I wouldn't put the redirect in place.
By and large, if you're beginning to doubt whether a link is worthwhile, chances are it's not. So if you have any doubt about the link, don't put the 301 in place.
-
Hi Tom,
That more than explains it and gives me the answers. If I put 301 redirects in place, what will happen if any of these external links are bad - will it harm our site? It's taken me many months to deal with duplicate content issues, canonicalisation of the site and much more. It was a complete mess and I don't want to undo any good that has come of all this.
-
Hi Craig
You touched on one of the reasons this is happening in your post - you could have external links to these pages. Also, they could still be appearing in your sitemap.
If you go into Webmaster Tools > Health > Crawl Errors > Not Found and click on one of the URLs, you can check whether the page is in your sitemap or whether it is being linked to from somewhere.
If you have external links, you have four options. First, you could attempt to get the URLs changed on the pages that link to you. This could be difficult and/or time-consuming. Second, as you say, you could 301 redirect. This would be useful if people are still coming through those sites, as you'll be fixing their user journey. It would also pass on any link "juice" the old page has to another. Third, you could start returning a 410 response, which basically tells Googlebot to treat the URL as gone permanently. This can be a bit tricky to set up and you have to be sure you won't want to use the URL again in the future (there's a rough sketch of what the 301 / 410 / 404 responses look like below).
Finally, you could leave the 404s in place. If none of the pages have any strength, no referral traffic is coming from them and they aren't interrupting a user journey in any way, I would simply leave them. Google recognises that 404s are a natural occurrence and a normal part of how the web works, so it would only ever become a problem if you were returning tens of thousands of them.
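To make options two and three concrete, here's a minimal sketch using only Python's standard library of how a server might answer old URLs with a 301, a 410 or a plain 404. In practice you'd normally do this in your web server or CMS configuration rather than a standalone script, and the paths and targets below are hypothetical placeholders, not anything from your actual site.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical old URLs that still carry value, mapped to their new equivalents.
REDIRECTS_301 = {
    "/old-category/widget": "/new-category/widget",
}

# Hypothetical old URLs that are gone for good and won't be reused.
GONE_410 = {"/old-category/retired-page"}

class LegacyURLHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS_301:
            # 301: moved permanently - fixes the user journey and passes on link "juice".
            self.send_response(301)
            self.send_header("Location", REDIRECTS_301[self.path])
            self.end_headers()
        elif self.path in GONE_410:
            # 410: tells Googlebot the URL is gone permanently.
            self.send_response(410)
            self.end_headers()
        else:
            # Everything else falls through to an ordinary 404.
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), LegacyURLHandler).serve_forever()
```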
I would probably 301 redirect any old pages carrying strength to relevant equivalents (or, failing that, the root domain) and leave the other 404s in place. Any URL that is interrupting a user journey I would rewrite ASAP.
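That triage could be sketched as a simple lookup. The URLs and "strength" list below are made-up placeholders - in reality you'd pull them from your link data (OSE, ahrefs, etc.) - but the fall-through logic is the point:

```python
from typing import Optional

# Hypothetical old URLs known to carry link strength, and their nearest
# current equivalents. Anything not listed is left to return a 404.
EQUIVALENTS = {
    "/old-blog/seo-tips": "/blog/seo-tips",
}
STRONG_OLD_URLS = {"/old-blog/seo-tips", "/old-services"}

def redirect_target(old_path: str) -> Optional[str]:
    """Return the 301 target for an old URL, or None to leave it as a 404."""
    if old_path not in STRONG_OLD_URLS:
        return None  # no strength, no referral traffic: leave the 404 in place
    # Prefer a relevant equivalent; failing that, fall back to the root domain.
    return EQUIVALENTS.get(old_path, "/")

# Example: redirect_target("/old-blog/seo-tips")   -> "/blog/seo-tips"
#          redirect_target("/old-forum/thread-123") -> None (plain 404)
```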
Hope this helps!