404s - Do they impact search ranking, and how do we get rid of them?
-
Hi,
We recently ran the Moz website crawl report and saw a number of 404 pages from our site come back. These were returned as "high priority" issues to fix. My question is: how do 404s impact search ranking? From what Google support tells us, 404s are "normal" and not a big deal to fix, but if they are "high priority," shouldn't we be doing something to remove them?
Also, if I do want to remove the pages, how would I go about doing so? Is it enough to go into Webmaster Tools and mark the link as one not to crawl anymore, or do we need to do work on the website development side as well?
Here are a couple of examples that came back... these are articles that were previously posted but that we decided to close out:
http://loyalty360.org/resources/article/mark-johnson-speaks-at-motivation-show
Thanks!
-
Hi
As far as I know there is no way to do this in Webmaster Tools. You can test your robots.txt file with the Robots.txt Tester, but you need to actually update the real file to block URLs from being crawled.
At any rate, normally you would not block 404s from being crawled - Google will either stop crawling them on its own, or, if they are indexed, leaving them crawlable lets them drop out of the index.
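For reference, blocking a path happens in the robots.txt file itself, not in Webmaster Tools. A minimal sketch, with a made-up path:

User-agent: *
Disallow: /resources/article/retired-article-example/

Just keep in mind that blocking a URL this way also stops Google from seeing the 404 response, which is exactly what lets an already-indexed URL drop out of the index - another reason not to block 404s.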
-
By "submit to Webmaster Tools," I meant submitting the link so Google will not crawl it again.
-
What do you mean by "submit links to Google Webmaster Tools"? As far as I know there isn't a way to submit 404 URLs in there.
The ways to solve 404s are:
- make the URL a real page again (if it broke by accident)
- remove links pointing at the bad page
- 301 redirect the 404 page to one that works (a sample redirect rule is sketched after this list)
- you can opt to leave it alone if there was nothing important on that page and there is no good page to redirect it to
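If you take the 301 route on an Apache server, one line in .htaccess per retired URL is usually enough. This is just a sketch with made-up paths - the syntax differs on nginx, IIS, or a CMS-level redirect module:

Redirect 301 /resources/article/retired-article-example /resources/articles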
404s might hurt rankings, but only in extreme cases where it was a popular page and you're now losing the backlink value, referral traffic, etc. I'd say in 90 out of 100 cases 404s will not hurt your rankings.
-
Interesting - good to know! So even when we submit these links to Google Webmaster Tools, that doesn't solve the problem, correct? Even once Google (eventually) stops crawling these links, will they still hurt SEO rankings overall?
-
Got it. So I guess we need to decide what makes sense workload-wise and what is best for the site. If we do 301 redirects, is that seen as more beneficial than an "engaging" 404 page that lets people go to another page?
It seems like the 404 page would be a one-time project, whereas constantly adding 301 redirects would be a lot of ongoing work.
-
In practical terms, a 404 error is a deleted page. To get rid of the 404 error you have to redirect the broken link, or the deleted page's URL, to a page that works.
-
Is there no way to just completely remove or delete a page/404, or will it always exist on some level?
-
Hey There
Google's webmaster documentation says:
"Generally, 404 errors don’t impact your site’s ranking in Google, and you can safely ignore them."
When Google says "generally" this tends to mean "in most cases" or "not directly" or "there may be secondary effects"... you get the idea.
But I think they assume you're smart enough to know whether the 404 was intentional, and if not, why it happened. For example, if you had a really popular piece of content with backlinks pointing directly at that URL, and then the URL 404s, you may lose the "link juice" pointing into that article. So in that regard, 404s can hurt rankings indirectly.
But as others have said, you can redirect your 404s to a similar page (Google recommends not redirecting to the homepage).
I am not sure why the Moz report puts them in "high priority" - perhaps they mean "high priority" from a general web best practice point of view, and not strictly SEO.
-
With that many, I would suggest redirecting them to a relevant page rather than just stopping them from being indexed by submitting the links to Google Webmaster Tools. From what I've experienced, keeping the link juice flowing through your site by redirecting them is better for your overall SEO efforts.
Of course it's faster to submit the links to GWT…but that doesn't necessarily mean it's better. Regardless of what you do or how you do it, eliminating your crawl errors is very important.
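If you want to check which of the URLs from the report still actually return a 404 before you start adding redirects, a quick script can test them in bulk. This is just a rough sketch using Python's requests library; paste in the URLs from your crawl report (the first below is the example from this thread, the second is a placeholder):

import requests

# URLs copied from the Moz crawl report / GWT crawl errors
# (first one is from this thread, second is a placeholder)
urls = [
    "http://loyalty360.org/resources/article/mark-johnson-speaks-at-motivation-show",
    "http://www.example.com/some-other-retired-article",
]

for url in urls:
    try:
        # GET rather than HEAD, since some servers answer HEAD differently;
        # allow_redirects=False shows whether a redirect is already in place
        response = requests.get(url, allow_redirects=False, timeout=10)
        print(response.status_code, url)
    except requests.RequestException as exc:
        print("error", url, exc)

Anything still coming back as a 404 is a candidate for a 301 redirect; anything already returning 200 or 301 can simply be marked as fixed.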
-
https://www.youtube.com/watch?v=9tz7Eexwp_A
This is a video by Matt Cutts that gives some great advice. My goal is always to redirect them, even if it is back to the main article category page or even the home page. I hate the thought of losing a potential customer to a 404 error. This has to be your decision, though.
Errors are not good, no matter what kind of error they are. Best practice is to remove any error you can. When your bounce rate increases, you lose ranking power. When you have broken links, you lose searchers. That is the simplest way to put it.
-
Fix them, redirect them back to a relevant page and then mark them as fixed in GWT.
-
When we ran the Moz report it said we had more than a couple... probably around 50 or so. Our website has been around for 5-6 years and I don't think we have ever done anything with any of them. With this many errors, what is your suggestion? Would it be faster to submit the links to Google Webmaster Tools than to wait for them to be crawled again?
-
404s can reduce your ability to rank highly for keywords when they affect your bounce rate and lower your impressions. Consider it giving your website a bad reputation. Again, it takes a lot of them to do this.
-
We are using ExpressionEngine. A lot of the links are within our own site - they are articles we once posted but then decided to close for one reason or another, and now they throw a 404 error. We don't necessarily have anything to redirect them to, since they are mostly just random article pieces, which is why we were looking into deleting them completely.
-
There's plenty of documentation stating that 404s negatively affect SEO. It's definitely debatable and there are obviously other factors involved. My main point is that it's important to deal with any and all crawl errors.
-
adamxj2, re: "...having too many at once can negatively affect your rankings..."
On what testing do you base that? My own SEO experience includes no such assumption, nor any proof of the same!
What a 404 WILL affect is conversions... no one who follows a link into a site and lands on a 404 will come away with any feeling other than "if this site can't fix its 404s, why would I believe they can sell me anything?"
404s do NOT affect rankings... it's true they disappear on their own, but I always fix them ASAP!
-
Hello!
Although 404s will eventually stop being crawled by Google, having too many at once can negatively affect your rankings. The most important thing is that you do not want to be linking to these 404s anywhere from within your site. If you are, you definitely want to remove those links.
If I have one or two 404s in my crawl errors, I typically just leave them be and wait for them to drop out of the index. Some other solutions I've utilized are:
1. Make an engaging 404 page so that when users land on it they are encouraged to stay on the website. A search box or some of your most popular links is a good place to start (a minimal server-config sketch for wiring this up is below).
2. 301 redirect the pages to relevant pages that do exist. This helps your link juice flow and makes for a good user experience, since visitors reach a relevant page.
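For point 1, on an Apache server the custom 404 page is wired up with a single directive - a minimal sketch, assuming your template lives at /errors/404.html (the path is made up):

ErrorDocument 404 /errors/404.html

The important detail is that the engaging 404 page must still return a real 404 status code rather than a 200, otherwise Google treats it as a "soft 404". Serving a local path with ErrorDocument, as above, keeps the 404 status.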
Hope that helps!
-
I would log in to GWT and look at your 404 errors under Crawl Errors. In there you will see where the broken URLs are still being linked from. If external sites are pointing at them, I would redirect them. I don't know what platform you are using, but you should be able to do this in the admin section of your platform.
If they aren't linked externally, you should probably still redirect them. I know Google says that 404 errors are harmless, but if you have dead links on your site and someone clicks on one, it most likely results in a lost visitor.
Hope that helps!