Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
404s - Do they impact search rankings, and how do we get rid of them?
-
Hi,
We recently ran the Moz website crawl report and saw a number of 404 pages from our site come back. These were returned as "high priority" issues to fix. My question is: how do 404s impact search ranking? From what Google support tells me, 404s are "normal" and not a big deal to fix, but if they are "high priority", shouldn't we be doing something to remove them?
Also, if I do want to remove the pages, how would I go about doing so? Is it enough to go into Webmaster Tools and mark each one as a link not to crawl anymore, or do we need to do work on the website development side as well?
Here are a couple of examples that came back; these are articles that were previously posted but that we decided to close out:
http://loyalty360.org/resources/article/mark-johnson-speaks-at-motivation-show
Thanks!
-
Hi
As far as I know there is no way to do this in Webmaster Tools. You can test your robots.txt file with the robots.txt Tester, but you need to actually update the real file to block URLs from being crawled.
At any rate, normally you would not block 404s from being crawled: Google will either stop crawling them on its own, or, if the pages are indexed, leaving them crawlable lets them drop out of the index.
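For reference, blocking paths via the live robots.txt file uses standard Disallow rules. A minimal sketch with hypothetical paths (and, per the above, not something you would normally do for plain 404s):

```
User-agent: *
Disallow: /retired-articles/
Disallow: /resources/article/some-closed-post
```

Note that Disallow only stops crawling; it does not remove already-indexed URLs from the index.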
-
By "submit to Webmaster Tools", I meant submitting the link so Google will not crawl it again.
-
What do you mean by "submit links to Google Webmaster Tools"? As far as I know there isn't a way to submit 404 URLs in there.
The ways to solve 404s are:
- make the URL a real page again (if it broke by accident)
- remove links pointing at the bad page
- 301 redirect the 404 page to one that works
- you can opt to leave it alone if there was nothing important on that page and there is no good page to redirect it to
404s might hurt rankings, but only in extreme cases where it was a popular page and you're now losing the backlink value or referral traffic. I'd say in 90 out of 100 cases, 404s will not hurt your rankings.
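If you do choose the 301 option for a batch of retired articles, one low-effort approach is to keep a simple mapping of dead paths to their closest live replacements and generate the server redirect rules from it. A minimal Python sketch (the paths, and the choice of Apache-style `Redirect 301` directives, are illustrative assumptions, not taken from the thread):

```python
# Sketch: turn a mapping of dead paths -> live replacements into
# Apache "Redirect 301" directives for an .htaccess file.
# All paths here are hypothetical examples.

def redirect_directives(mapping):
    """Return one 'Redirect 301 <old> <new>' line per entry, sorted by old path."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

dead_pages = {
    "/resources/article/retired-event-recap": "/resources/articles",
    "/resources/article/closed-interview": "/resources/articles",
}

for line in redirect_directives(dead_pages):
    print(line)
```

Keeping the mapping in one place means the same data can feed an .htaccess file, an nginx config, or a CMS redirect plugin.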
-
Interesting - good to know! So even when we submit these links to Google Webmaster Tools, that doesn't solve the problem, correct? And even if Google eventually stops crawling these links, will they still hurt SEO rankings overall?
-
Got it. So I guess we need to decide what makes sense workload-wise and what is best for the site. If we do 301 redirects, is that seen as more beneficial than an "engaging" 404 page that lets people go to another page?
It seems like the 404 page would be a one-time project, whereas constantly adding 301 redirects would be ongoing work.
-
Typically a 404 error is a deleted page. To get rid of the 404 error you have to redirect the broken link, or the deleted page, to a working URL.
-
Is there no way to just completely remove or delete a page/404, or will it always exist on some level?
-
Hey There
Google's webmaster documentation says:
"Generally, 404 errors don’t impact your site’s ranking in Google, and you can safely ignore them."
When Google says "generally" this tends to mean "in most cases" or "not directly" or "there may be secondary effects"... you get the idea.
But I think they are assuming you are smart enough to know whether the 404 was intentional, and if not, why it happened. For example, if you had a really popular piece of content with backlinks pointing directly at that URL, and then the URL 404s, you may lose the "link juice" pointing into that article. So in that regard 404s can hurt rankings secondarily.
But as others have said, you can redirect your 404s to a similar page (Google recommends not the homepage).
I am not sure why the Moz report puts them in "high priority" - perhaps they mean "high priority" from a general web best practice point of view, and not strictly SEO.
-
With that many I would suggest redirecting them to a relevant page rather than just stopping the indexing of them by submitting the links to Google Webmaster Tools. From what I've experienced, keeping the link juice flowing through your site by redirecting them is better for your overall SEO efforts.
Of course it's faster to submit the links to GWT…but that doesn't necessarily mean it's better. Regardless of what you do or how you do it, eliminating your crawl errors is very important.
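If you want to audit a crawl report yourself before deciding, re-checking what each URL currently returns is straightforward with the Python standard library. A sketch (the `opener` parameter is injectable purely so the logic can be tested without live network calls; the URLs you feed it would come from your own report):

```python
from urllib import error, request

def status_of(url, opener=request.urlopen):
    """Return the HTTP status code a URL currently responds with.

    urllib raises HTTPError for 4xx/5xx responses, so we catch it and
    report the code rather than treating it as a failure.
    """
    try:
        with opener(url) as resp:
            return resp.status
    except error.HTTPError as e:
        return e.code

def still_broken(urls, opener=request.urlopen):
    """Filter a crawl-report URL list down to those still returning 404."""
    return [u for u in urls if status_of(u, opener) == 404]
```

Running `still_broken` over the 50-odd URLs from the report would show which ones still need a redirect and which have already been fixed.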
-
https://www.youtube.com/watch?v=9tz7Eexwp_A
This is a video by Matt Cutts that gives some great advice. My goal is always to redirect them, even if it is back to the main article category page or even the home page. I hate the thought of losing a potential customer to a 404 error. This has to be your decision, though.
Errors are not good, no matter what kind they are. Best practice is to remove any error you can. When your bounce rate increases, you lose ranking power. When you have broken links, you lose searchers. That is the simplest way to put it.
-
Fix them, redirect them back to a relevant page and then mark them as fixed in GWT.
-
When we ran the Moz report it said we had more than a couple... probably around 50 or so. Our website has been around 5-6 years and I don't think we have ever done anything with any of them. With this many errors, what is your suggestion? Would it be faster to submit the links to Google Webmaster Tools than to wait for them to be crawled again?
-
404s can reduce your ability to rank highly for keywords when they affect your bounce rate and lower your impressions. Consider it giving your website a bad reputation. Again, it takes a lot of them to do this.
-
We are using Expression Engine. A lot of the links are within our own site - they are articles we once posted, but then we decided to close for one reason or another, and now they are throwing a 404 error. We don't necessarily have anything to redirect them to since they are mostly just random article pieces, which is why we were looking into deleting them completely.
-
There's tons of documentation stating that 404s negatively affect SEO. It's definitely debatable and there are obviously other factors involved. My main point is that it's important to deal with any and all crawl errors.
-
adamxj2, re: "...having too many at once can negatively affect your rankings..."
???
On what testing do you base that? My own SEO experience includes no such assumptions, nor proof of same!
What a 404 WILL affect is conversions... no one who follows a link into a site and lands on a 404 will come away with any feeling other than: if a site can't fix its 404s, why would I believe they can sell me something?
404s do NOT affect rankings... they disappear on their own, it's true... but I always fix them ASAP!
-
Hello!
Although 404s will eventually stop being crawled by Google, having too many at once can negatively affect your rankings. The most important thing is that you do not want to be linking to these 404s anywhere from within your site. If you are, you definitely want to remove those links.
If I have one or two 404s in my crawl errors, I typically just leave them be and wait for them to drop out of the index. Some other solutions I've utilized are:
1. Make an engaging 404 page so that when users land on it they will be encouraged to stay on the website. A search box or some of your most popular links is a good place to start.
2. 301 redirect the pages to relevant pages that do exist. This keeps your link juice flowing and makes for a good user experience, since visitors reach a relevant page.
Hope that helps!
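As a toy illustration of the "engaging 404 page" idea: the page body can be friendly and link-rich while the response still carries a genuine 404 status, which is what keeps it from being treated as a soft 404. A minimal sketch using Python's stdlib `http.server` (all paths and copy are hypothetical):

```python
from http.server import BaseHTTPRequestHandler

class FriendlyHandler(BaseHTTPRequestHandler):
    # Custom body used by send_error(); links and copy are hypothetical.
    # Using send_error() means the response status is still a true 404.
    error_message_format = (
        "<html><body><h1>Sorry, that page is gone</h1>"
        "<p>Try the <a href='/articles'>article index</a> "
        "or the <a href='/'>home page</a>.</p></body></html>"
    )

    def do_GET(self):
        if self.path == "/":
            body = b"<h1>Home</h1>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)  # real 404 status + friendly body

    def log_message(self, *args):
        # Silence per-request logging for this sketch.
        pass
```

Served via `http.server.HTTPServer`, any unknown path returns the friendly body with a real 404 status code; a CMS like Expression Engine does the equivalent with its own 404 template.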
-
I would log in to GWT and look at your 404 errors under Crawl Errors. In there you will see where the links are still linked from. If external sites are pointing at them, I would redirect them. I don't know what platform you are using, but you should be able to do this in the admin section of your platform.
If they aren't linked externally, you should probably still redirect them. I know Google says that 404 errors are harmless, but if you have dead links on your site and someone clicks on one, it most likely results in a lost searcher.
Hope that helps!
Related Questions
-
After hack and remediation, thousands of URLs still appearing as 'Valid' in Google Search Console. How to remedy?
I'm working on a site that was hacked in March 2019 and in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" but listed as "Valid" in the coverage report and many of them are still hack-related URLs that are listed as being indexed in March 2019, despite the fact that clicking on them leads to a 404. As of this Saturday, the number jumped up to 18,000, but I have no way of finding out using the search console reports why the jump happened or what are the new URLs that were added, the only sort mechanism is last crawled and they don't show up there. How long can I expect it to take for these remaining urls to also be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped. Is there any way to see inside the new GSC view why/how the number of valid URLs in the indexed doubled over one weekend?
Intermediate & Advanced SEO | rickyporco0
-
Readd/Reindex a page that was 410'd
A script of ours had an error that caused some pages we didn't wish 410'd to be 410'd. We caught it in about 12 hours, but for some pages it was too late. My question is: will those pages be reindexed again, and how will that affect their page ranking - will they eventually be back where they were? Would submitting a sitemap with them help, or what would be the best way to correct this error (submit the links to the Google indexer, maybe)?
Intermediate & Advanced SEO | Wana-Ryd0
-
Moved company 'Help Center' from Zendesk to Intercom, got lots of 404 errors. What now?
Howdy folks, excited to be part of the Moz community after lurking for years! I'm a few weeks into my new job (Digital Marketing at Rewind) and about 10 days ago the product team moved our Help Center from Zendesk to Intercom. Apparently the import went smoothly, but it's caused one problem I'm not really sure how to go about solving: https://help.rewind.io/hc/en-us/articles/*** is where all our articles used to sit https://help.rewind.io/*** is where all our articles now are So, for example, the following article has now moved as such: https://help.rewind.io/hc/en-us/articles/115001902152-Can-I-fast-forward-my-store-after-a-rewind- https://help.rewind.io/general-faqs-and-billing/frequently-asked-questions/can-i-fast-forward-my-store-after-a-rewind This has created a bunch of broken URLs in places like our Shopify/BigCommerce app listings, in our email drips, and in external resources etc. I've played whackamole cleaning many of these up, but these old URLs are still indexed by Google – we're up to 475 Crawl Errors in Search Console over the past week, all of which are 404s. I reached out to Intercom about this to see if they had something in place to help, but they just said my "best option is tracking down old links and setting up 301 redirects for those particular addressed". Browsing the Zendesk forms turned up some relevant-ish results, with the leading recommendation being to configure javascript redirects in the Zendesk document head (thread 1, thread 2, thread 3) of individual articles. I'm comfortable setting up 301 redirects on our website, but I'm in a bit over my head in trying to determine how I could do this with content that's hosted externally and sitting on a subdomain. I have access to our Zendesk admin, so I can go in and edit stuff there, but don't have experience with javascript redirects and have read that they might not be great for such a large scale redirection. 
Hopefully this is enough context for someone to provide guidance on how you think I should go about fixing things (or if there's even anything for me to do) but please let me know if there's more info I can provide. Thanks!
Intermediate & Advanced SEO | henrycabrown1
-
Forwarded vanity domains suddenly resolving to 404, with appended URLs ending in 5 random characters
We have several vanity domains that forward to various pages on our primary domain, e.g. www.vanity.com (301) --> www.mydomain.com/sub-page (200). These forwards have been in place for months or even years and have worked fine. As of yesterday, we have seen the following problem. We have made no changes in the forwarding settings. Now, inconsistently, they sometimes resolve and sometimes they do not. When we load the vanity URL with Chrome Dev Tools (Network pane) open, it shows the following redirect chains, where xxxxx represents a random 5-character string of lower- and upper-case letters (e.g. VGuTD). EXAMPLE:
www.vanity.com (302, Found) -->
www.vanity.com/xxxxx (302, Found) -->
www.vanity.com/xxxxx (302, Found) -->
www.vanity.com/xxxxx/xxxxx (302, Found) -->
www.mydomain.com/sub-page/xxxxx (404, Not Found)
This is just one example; the number of redirects varies wildly. Sometimes there is only 1 redirect, sometimes there are as many as 5. Sometimes the request will ultimately resolve on the correct mydomain.com/sub-page, but usually it does not (as in the example above). We have cross-checked across every browser and device, private/non-private, cookies cleared, on and off of our network, etc. This leads us to believe that it is not at the device or host level. Our registrar is GoDaddy. They have not encountered this issue before and have no idea where this 5-character string comes from. I tend to believe them because, per our analytics, we have determined that this problem only started yesterday. Our primary question is: has anybody else encountered this problem, either in the last couple of days or at any time in the past? We have come up with a solution that works to alleviate the problem, but to implement it across hundreds of vanity domains would take us an inordinate amount of time. Really hoping to fix the cause of the problem instead of just treating the symptom.
Intermediate & Advanced SEO | SS.Digital
-
Why can some websites rank for keywords they don't have on the page?
Hello guys, Yesterday, I used SEMrush to search for the keyword "branding agency" to see the SERP. Liquidagency ranks 5th on the first page. So I went to their homepage but saw no exact keyword "branding agency", even in the page source. Also, I didn't see "branding agency" as a top anchor text in the external links to the page (from the SEMrush report). I am an SEO newbie; can someone explain this to me, please? Thank you.
Intermediate & Advanced SEO | Raymondlee0
-
How much does dirty HTML/CSS etc. impact SEO?
Good Morning! I have been trying to clean up this website and half the time I can't even edit our content without breaking the WYSIWYG Editor. Which leads me to the next question. How much, if at all, is this impacting our SEO. To my knowledge this isn't directly causing any broken pages for the viewer, but still, it certainly concerns me. I found this post on Moz from last year: http://moz.com/community/q/how-much-impact-does-bad-html-coding-really-have-on-seo We have a slightly different set of code problems but still wanted to revisit this question and see if anything has changed. I also can't imagine that all this broken/extra code is helping our page load properly. Thanks everybody!
Intermediate & Advanced SEO | HashtagHustler0
-
Posing questions on Google variables "aclk", "gclid", "cd", "/aclk", "/search", "/url" etc.
I've been doing a bit of stats research prompted by reading the recent ranking blog http://www.seomoz.org/blog/gettings-rankings-into-ga-using-custom-variables. There are a few things that have come up in my research that I'd like to clear up. The below analysis has been done on my "conversions". 1/. What does "/aclk" mean in the referrer URL? I have noticed a strong correlation between this and "gclid" in the landing-page variable. Does it mean "ad click"? Although they seem to closely correlate, they don't exactly: when I have /aclk in the referrer URL I MOSTLY have gclid in the landing-page URL, but not always, and the same applies vice versa. It's pretty vital that I know the best way to monitor AdWords PPC, so which is the best variable to go on? Currently I am using "gclid", but about 25% of my referrer URLs with /aclk in them don't have "gclid", so am I underestimating my number of PPC conversions? 2/. The use of the variable "cd" is great, but it is not always present. I have noticed that 99% of my Google referrer URLs start with one of:
/aclk - no cd value
/search - no cd value
/url - always contains the cd variable.
What do I make of this?? Thanks for the help in advance!
Intermediate & Advanced SEO | James77
-
Is 404'ing a page enough to remove it from Google's index?
We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404s). Is there anything else I need to do to remove these?
Intermediate & Advanced SEO | nicole.healthline0