404s: Do they impact search ranking, and how do we get rid of them?
-
Hi,
We recently ran the Moz website crawl report and saw a number of 404 pages from our site come back. These were returned as "high priority" issues to fix. My question is: how do 404s impact search ranking? Google support tells me 404s are "normal" and not a big deal to fix, but if they are "high priority," shouldn't we be doing something to remove them?
Also, if I do want to remove the pages, how would I go about doing so? Is it enough to go into Webmaster Tools and list each one as a link not to crawl anymore, or do we need to do work on the website development side as well?
Here are a couple of examples that came back; these are articles that were previously posted but that we decided to close out:
http://loyalty360.org/resources/article/mark-johnson-speaks-at-motivation-show
Thanks!
-
Hi
As far as I know there is no way to do this in Webmaster Tools. You can test your robots.txt file with the robots.txt Tester, but you need to actually update the real file to block URLs from being crawled.
At any rate, you normally would not block 404s from being crawled. Google will either stop crawling them on its own, or, if they are indexed, leaving them crawlable lets them drop out of the index.
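If you do ever need to check whether a robots.txt rule would block a given URL, Python's standard library can parse the rules locally before you touch the real file — a quick sketch, where the `Disallow` path is illustrative (and, per the above, blocking 404s is usually not what you want):

```python
# Sanity-check a robots.txt rule locally before uploading the real file.
# The Disallow path here is illustrative, not a recommendation to block 404s.
from urllib.robotparser import RobotFileParser

rules = RobotFileParser()
rules.parse([
    "User-agent: *",
    "Disallow: /resources/article/",
])

blocked = "http://loyalty360.org/resources/article/mark-johnson-speaks-at-motivation-show"
allowed = "http://loyalty360.org/resources/"

print(rules.can_fetch("*", blocked))  # False: matches the Disallow prefix
print(rules.can_fetch("*", allowed))  # True: not covered by any rule
```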
-
By submit to webmaster tools, I meant submit the link so Google will not crawl it again.
-
What do you mean by "submit links to Google Webmaster Tools"? As far as I know there isn't a way to submit 404 URLs in there.
The ways to solve a 404 are:
- make the URL a real page again (if it broke by accident)
- remove links pointing at the bad page
- 301 redirect the 404 page to one that works
- you can opt to leave it alone if there was nothing important on that page and there is no good page to redirect it to
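Those options can be sketched as a small decision helper — the function name and inputs are hypothetical, just encoding the list above:

```python
def recommend_404_fix(broke_by_accident, has_relevant_replacement,
                      had_backlinks_or_traffic):
    """Suggest a fix for a single 404 URL, following the options above."""
    if broke_by_accident:
        return "restore the page"   # make the URL a real page again
    if has_relevant_replacement:
        return "301 redirect"       # pass link value to the closest live page
    if had_backlinks_or_traffic:
        return "301 redirect"       # salvage remaining link value anyway
    return "leave as 404"           # nothing important lost; let it drop out

# Whichever branch applies, also remove any internal links still
# pointing at the dead URL.
print(recommend_404_fix(False, False, False))  # leave as 404
```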
404s might hurt rankings, but only in extreme cases where a popular page goes missing and you lose its backlink value, referral traffic, and so on. I'd say in 90 out of 100 cases, 404s will not hurt your rankings.
-
Interesting - good to know! So even when we submit these links to Google Webmaster tools, that doesn't solve the problem, correct? Even if Google isn't crawling these links (eventually) will it still hurt SEO rankings overall?
-
Got it. So I guess we need to decide what makes sense workload-wise and what is best for the site. If we do 301 redirects, is that seen as more beneficial than an "engaging" 404 page that helps people get to another page?
It seems like the 404 page would be a one-time project, whereas constantly adding 301 redirects would be a lot of work.
-
Essentially, a 404 error is a deleted page. To get rid of the 404 error, you have to redirect the broken link (the deleted page's URL).
-
Is there no way to just completely remove or delete a page/404, or will it always exist on some level?
-
Hey There
Google's webmaster documentation says:
"Generally, 404 errors don’t impact your site’s ranking in Google, and you can safely ignore them."
When Google says "generally," that tends to mean "in most cases," "not directly," or "there may be secondary effects"... you get the idea.
But I think they are assuming you are smart enough to know whether the 404 was intentional, and if not, why it happened. For example, if you had a really popular piece of content with backlinks pointing directly at that URL, and then the URL 404s, you may lose the "link juice" pointing into that article. So in that regard, 404s can hurt rankings indirectly.
But as others have said, you can redirect your 404s to a similar page (Google recommends not the homepage).
I am not sure why the Moz report puts them in "high priority" - perhaps they mean "high priority" from a general web best practice point of view, and not strictly SEO.
-
With that many, I would suggest redirecting them to a relevant page rather than just stopping them from being indexed by submitting the links to Google Webmaster Tools. In my experience, keeping the link juice flowing through your site by redirecting them is better for your overall SEO efforts.
Of course it's faster to submit the links to GWT, but that doesn't necessarily mean it's better. Regardless of what you do or how you do it, eliminating your crawl errors is very important.
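Before redirecting in bulk, it can be worth confirming which of the reported URLs still return 404 — a rough stdlib sketch, where the URL list is illustrative (in practice you would export it from the crawl report):

```python
# Re-check the status of URLs from a crawl report before acting on them.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_status(url, timeout=10.0):
    """Return the final HTTP status code for a URL, or None if unreachable."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code        # 404, 410, etc. are raised as exceptions
    except URLError:
        return None            # DNS failure, timeout, connection refused

if __name__ == "__main__":
    # Illustrative list; export the real URLs from the crawl report.
    report_urls = [
        "http://loyalty360.org/resources/article/mark-johnson-speaks-at-motivation-show",
    ]
    for url in report_urls:
        if check_status(url) == 404:
            print("still 404:", url)
```

Note that `urlopen` follows redirects, so a URL you have already 301'd will report the status of its destination rather than 301.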
-
https://www.youtube.com/watch?v=9tz7Eexwp_A
This is a video by Matt Cutts that gives some great advice. My goal is always to redirect them, even if it is back to the main article category page or even the home page. I hate the thought of losing a potential customer to a 404 error. This has to be your decision, though.
Errors are not good, no matter what kind they are. Best practice is to fix any error you can. When your bounce rate increases, you lose ranking power; when you have broken links, you lose searchers. That is the simplest way to put it.
-
Fix them, redirect them back to a relevant page and then mark them as fixed in GWT.
-
When we ran the Moz report it said we had more than a couple; probably around 50 or so. Our website has been around 5-6 years and I don't think we have ever done anything with any of them. With this many errors, what is your suggestion? Would it be faster to submit the links to Google Webmaster Tools than to wait for them to be crawled again?
-
404s can reduce your ability to rank highly for keywords when they affect your bounce rate and lower your impressions. Consider it giving your website a bad reputation. Again, it takes a lot of them to do this.
-
We are using ExpressionEngine. A lot of the links are within our own site; they are articles we once posted but then decided to close for one reason or another, and now they throw a 404 error. We don't necessarily have anything to redirect them to, since they are mostly just one-off article pieces, which is why we were looking into deleting them completely.
-
There's plenty of documentation stating that 404s negatively affect SEO. It's definitely debatable, and there are obviously other factors involved. My main point is that it's important to deal with any and all crawl errors.
-
adamxj2, re: "...having too many at once can negatively affect your rankings..."
???
What testing is that based on? My own SEO experience includes no such assumption, nor any proof of it!
What a 404 WILL affect is conversions. No one who follows a link to a site and lands on a 404 will come away with any feeling other than: if a site can't fix its 404s, why would I believe they can sell me something?
404s do NOT affect rankings. They do disappear on their own, it's true, but I always fix them ASAP!
-
Hello!
Although Google will eventually stop crawling 404s, having too many at once can negatively affect your rankings. The most important thing is that you do not want to be linking to these 404s anywhere within your own site. If you are, you definitely want to remove those links.
If I have one or two 404s in my crawl errors, I typically will just leave them be and wait for them to be dropped out of being indexed. Some other solutions I've utilized are:
1. Make an engaging 404 page so that when users land on it they will be encouraged to stay on the website. Having a search box or some of your most popular links on the page is a good place to start.
2. 301 redirect the pages to relevant pages that do exist. This will help your link juice flow and will make for a good user experience since they are reaching a relevant page.
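Both ideas can be combined in one handler — a minimal stdlib sketch, where the redirect map and page paths are hypothetical. The key detail is that the friendly page is served with a genuine 404 status; serving it with a 200 would create "soft 404s" that crawlers treat as thin duplicate content:

```python
# Minimal sketch: 301-redirect known-moved URLs, and serve a helpful page
# (with a real 404 status) for everything else. Paths are hypothetical.
from http.server import BaseHTTPRequestHandler

REDIRECTS = {
    "/resources/article/mark-johnson-speaks-at-motivation-show": "/resources/articles/",
}

NOT_FOUND_PAGE = (b"<html><body><h1>Sorry, that page is gone.</h1>"
                  b'<p>Try the <a href="/resources/articles/">article archive</a> '
                  b'or the <a href="/">home page</a>.</p></body></html>')

class RedirectOr404Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            self.send_response(301)               # permanent: passes link value
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)               # honest status for crawlers
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(NOT_FOUND_PAGE)      # engaging page for humans
```

In practice you would configure the same behavior in your CMS or web server rather than hand-rolling a handler, but the two responses it produces are the pattern to aim for.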
Hope that helps!
-
I would log in to GWT and look at your 404 errors under Crawl Errors. In there you will see which pages still link to each URL. If external sites are pointing at them, I would redirect them. I don't know what platform you are using, but you should be able to set this up in the admin section of your platform.
Even if they aren't linked externally, you should probably still redirect them. I know Google says that 404 errors are harmless, but if you have dead links on your site and someone clicks one, it most likely results in a lost searcher.
Hope that helps!