Why are Pages returning 404 errors not being dropped?
-
Webmaster Tools continues to return upwards of 750 pages with 404 errors. These are pages from a previous site that is no longer used.
However, those pages were dropped, along with their 301 redirects, over a year ago. Why is Google not clearing them from Webmaster Tools, but re-listing them again after a 3-month cycle? Is it because external sites still link to these pages?
If so, should I put 301s back in place (most of these sites are forums and potentially dodgy directories etc. from previous poor link-building programmes) or ask for a manual removal?
-
Thanks Tom for all your help.
Regards,
Craig
-
Very good point you've raised - 301ing those URLs effectively makes the links to your site "live" again. If a link sits on a dodgy/spammy/poor-quality page, it could harm your site, and I wouldn't put the redirect in place.
By and large, if you're beginning to doubt whether a link is worthwhile, chances are it's not. So if you have any doubt about the link, don't put the 301 in place.
-
Hi Tom,
That more than explains it and gives me the answers. If I put 301 redirects in place, what will happen if any of these external links are bad - will it harm our site? It's taken me many months to deal with duplicate content issues, canonicalisation of the site and much more. It was a complete mess and I don't want to undo any of the good that came of all this.
-
Hi Craig
You touched on one of the reasons this is happening in your post - you could have external links to these pages. They could also still be appearing in your sitemap.
If you go into Webmaster Tools > Health > Crawl Errors > Not Found and then click on one of the URLs, you can check whether the page is in the sitemap and whether it is being linked to from somewhere.
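If you'd rather check sitemap membership in bulk than click through URLs one at a time, you could script it against your sitemap file. Here's a minimal Python sketch; the sitemap contents and URL list are invented purely for illustration:

```python
# Hypothetical sketch: find which old (404ing) URLs still appear in an XML
# sitemap, so they can be removed from it. Example data is made up.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_in_sitemap(sitemap_xml, candidate_urls):
    """Return the subset of candidate_urls that the sitemap still lists."""
    root = ET.fromstring(sitemap_xml)
    listed = {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}
    return sorted(u for u in candidate_urls if u in listed)

# Entirely hypothetical sitemap and WMT "Not Found" export:
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/current-page</loc></url>
  <url><loc>http://example.com/old-page</loc></url>
</urlset>"""

old_404s = ["http://example.com/old-page", "http://example.com/gone-page"]
print(urls_in_sitemap(sitemap, old_404s))
```

Any URL this flags is still being submitted to Google by your own sitemap, which would keep it reappearing in the crawl error report.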
If you have external links, you have four options. First, you could try to get the links on the referring pages changed to the new URLs. This could be difficult and/or time-consuming. Second, as you say, you could 301 redirect. This would be useful if people are still coming through from those sites, as you'll be fixing their user journey. It would also pass on any link "juice" the page has to another. Third, you could start returning a 410 response, which basically tells Googlebot to treat the URL as permanently gone. This can be a bit tricky to set up, and you have to be sure you won't want to use the URL again in the future.
Finally, you could leave the 404s in place. If none of the pages have any strength, no referral traffic is coming from them and they aren't interrupting a user journey in any way, I would simply leave them. Google knows that 404s are a natural part of the web, so it treats them as a normal occurrence. It would only ever be a problem if you were returning tens of thousands of them, so you may just want to leave them be.
I would probably 301 redirect any old pages carrying strength to relevant equivalents (or, failing that, the root domain) and leave the other 404s in place. I would fix, as soon as possible, any URL that is interrupting a user journey.
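That triage boils down to a simple lookup, whatever server or framework you use. The paths and destinations below are hypothetical, just to illustrate the decision logic:

```python
# Hypothetical sketch of the triage above: 301 old URLs with strength to a
# relevant equivalent, 410 URLs that are gone for good, and let everything
# else fall through to a plain 404. The path tables are made-up examples.
REDIRECTS = {  # old path -> new home (pages still carrying links/traffic)
    "/old-category/widgets": "/products/widgets",
}
GONE = {"/old-newsletter-2009"}  # permanently gone, never coming back

def status_for(path):
    """Return (status_code, redirect_target_or_None) for an old URL."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    if path in GONE:
        return 410, None
    return 404, None

print(status_for("/old-category/widgets"))
print(status_for("/some-forgotten-page"))
```

The same mapping translates directly into Apache or nginx redirect rules if you'd rather keep it out of application code.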
Hope this helps!