Can 404 Errors Be Affecting Rankings?
-
I have a client for whom we recently (3 months ago) designed, developed, and launched a new site at a "new" domain. We set up redirects from the old domain to the new one and kept an eye on Google Webmaster Tools to make sure the redirects were working properly. Everything was going great, and we maintained and improved the rankings for the first 2 months or so.
In late January, I started noticing a great deal of 404 errors in Webmaster Tools for URLs from the new site. None of these URLs were actually on the current site, so I asked my client if he had previously used the domain. It just so happens that he had used the domain a while back, and none of those URLs were ever redirected or removed from the index. I've been setting up redirects for all of the 404s appearing in Webmaster Tools, but we took a pretty decent hit in rankings for February. Could those errors (72 in total) have been partially, if not completely, responsible for the hit in rankings? All other factors have been constant, so that led me to believe these errors were the culprits.
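For reference, this is roughly how I've been auditing the legacy URLs from the Webmaster Tools report — a quick Python sketch (standard library only; the example.com URLs are hypothetical stand-ins for the old URLs). `http.client` doesn't follow redirects, so a 301 shows up as a 301 rather than as the destination page's 200:

```python
from http.client import HTTPSConnection
from urllib.parse import urlparse

def classify(status):
    """Bucket a raw HTTP status code for a quick redirect audit."""
    if status in (301, 308):
        return "permanent redirect - good"
    if status in (302, 303, 307):
        return "temporary redirect - consider making it a 301"
    if status == 404:
        return "broken - needs a 301 or a real 404 page"
    return "ok" if status == 200 else "check manually"

def fetch_status(url):
    """Status code of a single HEAD request; redirects are not followed."""
    parts = urlparse(url)
    conn = HTTPSConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    status = conn.getresponse().status
    conn.close()
    return status

# Hypothetical examples of URLs from the old incarnation of the site:
legacy_urls = [
    "https://example.com/old-services",
    "https://example.com/old-about",
]

# Usage (makes live requests, so run it against your own domain):
#   for url in legacy_urls:
#       print(url, "->", classify(fetch_status(url)))
```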
-
In my opinion, 72 errors is very low. If there are no links to those URLs from within the site, then you should not worry about it.
The issue with the traffic drops lies elsewhere.
-
The site has about 30 pages in total so it's definitely on the smaller side.
We haven't set up a custom 404 yet, but it's definitely on my to-do list!
-
-There was not a significant drop-off in traffic month over month.
-We don't have any links on the current site that point to the 404 pages.
-
I doubt it - 72 errors is not a lot in my opinion. I guess it's relative to the overall size of the site in question, though, and the number of total pages that make it up. If I had a site of any great size that only had 72 404 errors, I would be very happy.
Does the new site have a custom 404 page? Sometimes it's not even worth 301 redirecting those URLs (i.e., if there isn't a natural fit for the old page on the new site, or if there is no real link equity built up against those legacy URLs). A custom 404 will allow those old legacy URLs to drop out of the search index while still providing an improved route into fresh content for the search bots in the meantime.
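To make that decision concrete, here's a rough sketch of the logic (the flags are hypothetical — you'd fill them in per URL from your own link data): 301 only when there's a natural fit on the new site or real link equity worth preserving; otherwise let the URL return a genuine 404 and drop from the index.

```python
def action_for_legacy_url(has_equivalent_page, has_link_equity):
    """Decide what to do with an old URL that currently returns a 404."""
    if has_equivalent_page:
        return "301 to the equivalent new page"
    if has_link_equity:
        return "301 to the closest relevant page"
    return "serve a custom 404 and let it fall out of the index"

# One caveat: a custom 404 only works if the server really answers with
# status 404. A pretty error page served with status 200 is a "soft 404"
# and keeps the dead URL in the index. A quick probe of a deliberately
# bogus path (hypothetical host) checks this:
from http.client import HTTPSConnection

def returns_true_404(host, bogus_path="/this-page-should-never-exist-xyz"):
    conn = HTTPSConnection(host, timeout=10)
    conn.request("GET", bogus_path)
    status = conn.getresponse().status
    conn.close()
    return status == 404

# Usage (makes a live request):
#   returns_true_404("example.com")
```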
Ultimately though I think you probably need to look deeper into the ranking drops than the 404 errors.
Best of luck with it.
Ben
-
A few questions:
-
Did the traffic drop suddenly, or was it going down progressively?
-
If it was sudden, on what day did it happen?
-
404s do affect rankings, but mainly when the 404 pages are linked from pages within the site; since that's bad for UX, it hurts rankings too. Are there any live pages on the current site with links to those 404 pages? I know you said it's a new site and the 404s are from the old site, but are there any old pages that are still online?
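A small sketch of that check: given a live page's HTML and the set of URLs known to 404 (both made up here for illustration), list any internal links that point at the dead pages. Standard library only:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href from the <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def links_to_dead_pages(html, dead_urls):
    """Return the hrefs in `html` that appear in the known-404 set."""
    parser = LinkCollector()
    parser.feed(html)
    return [href for href in parser.links if href in dead_urls]

# Example with made-up markup:
page = '<a href="/old-services">Services</a> <a href="/contact">Contact</a>'
print(links_to_dead_pages(page, {"/old-services"}))  # -> ['/old-services']
```

You'd run this over each URL in your sitemap (fetching the HTML first) to find any lingering internal links to the dead pages.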
-
-
It's hard to say if that alone is causing your ranking drop (72 is a lot!), but Google certainly isn't going to reward you for having an abundance of 404 pages on your site. That's just like telling them, "I don't care about my pages or website" - there is an error on your website and it's not fixed.
I would say the best thing to do at this point is to keep 301'ing them, or rebuild content on those pages if possible, and see what happens.
There are many blog posts on this subject, I would list them all here but it seems Google has most of them on page 1.