404's - Do they impact search ranking/how do we get rid of them?
-
Hi,
We recently ran the Moz website crawl report and saw a number of 404 pages from our site come back. These were returned as "high priority" issues to fix. My question is, how do 404's impact search ranking? From what Google support tells me, 404's are "normal" and not a big deal to fix, but if they are "high priority" shouldn't we be doing something to remove them?
Also, if I do want to remove the pages, how would I go about doing so? Is it enough to go into Webmaster Tools and list it as a link not to crawl anymore, or do we need to do work from the website development side as well?
Here are a couple of examples that came back... these are articles that were previously posted but we decided to close out:
http://loyalty360.org/resources/article/mark-johnson-speaks-at-motivation-show
Thanks!
-
Hi
As far as I know there is no way to do this in webmaster tools. You can test your robots.txt file with the Robots.txt Tester - but you need to actually update the real file to block URLs from being crawled.
At any rate, normally you would not block 404s from being crawled - Google will either stop crawling them on its own, or, if they are indexed, leaving them crawlable lets them drop out of the index.
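For reference, the robots.txt directive being discussed is a one-liner (the path below is a made-up example, not a real URL on the site):

```
User-agent: *
Disallow: /resources/article/some-closed-article
```

But as noted above, blocking a 404 URL this way also stops Google from re-crawling it and seeing the 404, so it can linger in the index longer - usually the opposite of what you want.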
-
By submit to webmaster tools, I meant submit the link so Google will not crawl it again.
-
What do you mean by "submit links to Google Webmaster Tools"? As far as I know there isn't a way to submit 404 URLs in there.
The ways to solve a 404 are:
- make the URL a real page again (if it broke by accident)
- remove links pointing at the bad page
- 301 redirect the 404 page to one that works
- you can opt to leave it alone if there was nothing important on that page and there is no good page to redirect it to
404s might hurt rankings, but only in extreme cases where it was a popular page and now you're losing the backlink value or referral traffic. I'd say in 90 out of 100 cases 404s will not hurt your rankings.
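For the 301 option, a small script can save a lot of hand-editing when there are dozens of dead URLs. Here's a rough Python sketch (the paths and mapping are invented examples, not real loyalty360.org URLs) that turns an old-to-new URL mapping into Apache `Redirect 301` lines for an .htaccess file:

```python
# Sketch: generate 301 redirect rules from a mapping of old paths to new ones.
# All paths below are hypothetical examples.
from urllib.parse import urlparse

redirects = {
    "/resources/article/old-interview": "/resources",
    "/resources/article/closed-piece": "/resources/articles",
}

def to_htaccess(mapping):
    """Return Apache 'Redirect 301' lines for each old -> new path pair."""
    lines = []
    for old, new in sorted(mapping.items()):
        # Redirect directives take a URL path, so strip any scheme/host.
        old_path = urlparse(old).path
        lines.append(f"Redirect 301 {old_path} {new}")
    return "\n".join(lines)

print(to_htaccess(redirects))
```

The same loop could just as easily emit Nginx `rewrite` rules; the point is keeping the old-to-new mapping in one place so the redirects stay auditable.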
-
Interesting - good to know! So even when we submit these links to Google Webmaster tools, that doesn't solve the problem, correct? Even if Google isn't crawling these links (eventually) will it still hurt SEO rankings overall?
-
Got it. So I guess we need to decide what makes sense work-load wise and what is best for the site. If we do 301 redirects, is that seen as more beneficial than an "engaging" 404 page that allows people to go to another page?
It seems like the 404 page would be a one-time project, whereas constantly adding in 301 redirects would be a lot of ongoing work.
-
Theoretically, a 404 error is a deleted page. To get rid of the 404 error you have to redirect the broken link, or deleted page, somewhere that works.
-
Is there no way to just completely remove or delete a page/404 or it will always exist on some level?
-
Hey There
Google's webmaster documentation says:
"Generally, 404 errors don’t impact your site’s ranking in Google, and you can safely ignore them."
When Google says "generally" this tends to mean "in most cases" or "not directly" or "there may be secondary effects"... you get the idea.
But I think they are assuming you are smart enough to know whether the 404 was intentional, and if not, why it happened. For example - if you had a really popular piece of content with backlinks pointing directly to that URL, and then the URL 404s - you may lose the "link juice" pointing into that article. So in that regard, 404s can hurt rankings secondarily.
But as others have said, you can redirect your 404s to a similar page (Google recommends not the homepage).
I am not sure why the Moz report puts them in "high priority" - perhaps they mean "high priority" from a general web best practice point of view, and not strictly SEO.
-
With that many I would suggest redirecting them to a relevant page rather than just stopping the indexing of them by submitting the links to Google Webmaster Tools. From what I've experienced, keeping the link juice flowing through your site by redirecting them is better for your overall SEO efforts.
Of course it's faster to submit the links to GWT... but that doesn't necessarily mean it's better. Regardless of what you do or how you do it, eliminating your crawl errors is very important.
-
https://www.youtube.com/watch?v=9tz7Eexwp_A
This is a video by Matt Cutts that gives some great advice. My goal is always to redirect them, even if it is back to the main article category page or even the home page. I hate the thought of losing a potential customer to a 404 error. This has to be your decision though.
Errors are not good, no matter what kind of error they are. Best practice is to remove any error you can. When your bounce rate increases you lose ranking power. When you have broken links, you lose searchers. That is the simplest way to put it.
-
Fix them, redirect them back to a relevant page and then mark them as fixed in GWT.
-
When we ran the Moz report it said we had more than a couple... probably around 50 or so. Our website has been around 5-6 years and I don't think we have ever done anything with any of them. With this many errors, what is your suggestion? Would it be faster to submit the links to Google Webmaster Tools than waiting for them to be crawled again?
-
404's can reduce your ability to rank highly for keywords when they affect your bounce rate and lower your impressions. Consider it like giving your website a bad reputation. Again, it takes a lot of them to do this.
-
We are using Expression Engine. A lot of the links are within our own site - they are articles we once posted, but then we decided to close for one reason or another, and now they are throwing a 404 error. We don't necessarily have anything to redirect them to since they are mostly just random article pieces, which is why we were looking into deleting them completely.
-
There's tons of documentation stating that 404's negatively affect SEO. It's definitely debatable and there are obviously other factors involved. My main point is that it's important to deal with any and all crawl errors.
-
adamxj2 re: "... having too many at once can negatively affect your rankings...."
???
On what testing do you base that? My own SEO world includes no such assumptions, or proof of same!
What a 404 WILL affect is conversions... no one who arrives on a site by following a link and finding a 404 will ever get a feeling other than: if a site can't fix its 404's, then why would I believe they can sell me something?
404's do NOT affect rankings... it's true they disappear on their own... but I always fix them ASAP!
-
Hello!
Although 404's will eventually stop being crawled by Google, having too many at once can negatively affect your rankings. The most important thing is that you do not want to be linking to these 404s anywhere within your own site. If you are, you definitely want to remove those links.
If I have one or two 404s in my crawl errors, I typically will just leave them be and wait for them to be dropped out of being indexed. Some other solutions I've utilized are:
1. Make an engaging 404 page so that when users find the page they will be encouraged to stay on the website. Having a search box or some of the most popular links on the page is a good place to start.
2. 301 redirect the pages to relevant pages that do exist. This will help your link juice flow and will make for a good user experience since they are reaching a relevant page.
Hope that helps!
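If you have dozens of dead article URLs and no obvious targets, one rough way to shortlist "relevant pages" for option 2 is to fuzzy-match each dead URL against the URLs that still exist. A hypothetical Python sketch (all paths are invented examples):

```python
# Sketch: suggest a redirect target for a 404 URL by comparing its path
# to the paths of pages that still exist. All paths here are hypothetical.
import difflib

live_pages = [
    "/resources/article/building-customer-loyalty",
    "/resources/article/loyalty-program-trends",
    "/about-us",
]

def suggest_redirect(dead_url, candidates, cutoff=0.4):
    """Return the live page whose path is most similar, or None."""
    matches = difflib.get_close_matches(dead_url, candidates, n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(suggest_redirect("/resources/article/customer-loyalty-tips", live_pages))
```

Anything below the cutoff gets no suggestion, which is a reasonable signal to let that 404 die out on its own rather than redirect it somewhere irrelevant.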
-
I would log in to GWT and look at your 404 errors under Crawl Errors. In there you will see where the links are still being linked from. If external sites are pointing at them, I would redirect them. I don't know what platform you are using, but you should be able to do this in the admin section of your platform.
If they aren't linked externally, you should probably still redirect them. I know Google says that 404 errors are harmless, but if you have dead links on your site and someone clicks on one, it most likely results in a lost searcher.
Hope that helps!