404 Errors
-
Do 404 errors really have a big impact on rankings and the overall authority of a site with Google? Say you have a site where all the pages have moved apart from the home page, which is exactly the same as before the move, so most of your pages are showing 404 errors.
-
Hi Adul,
Just to follow up on this in case you're wondering why the answer is being downvoted. Blocking the pages that 404 in robots.txt will only stop Google from getting a 404, because Google can't reach the pages at all. Users will still get a 404, so this isn't ideal. Also, if you don't 301 redirect the old pages to the new ones, you lose any equity that those pages built up over the years.
Hope that helps,
Craig
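To make Craig's point concrete: if the new locations are known, each old URL can be mapped server-side. As a sketch only, on an Apache server this could go in .htaccess (the paths here are placeholders, not the OP's actual URLs):

```apache
# Permanent, page-to-page redirects from old locations to new ones
Redirect 301 /old-page-1.html /new-page-1/
Redirect 301 /old-page-2.html /new-page-2/

# Or, if a whole directory moved but kept its internal structure:
RedirectMatch 301 ^/old-blog/(.*)$ /blog/$1
```

This preserves both the user experience (no dead ends) and the link equity Craig mentions.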
-
Go to Google Webmaster Tools > Crawl > Crawl Errors and download all the pages. Open the list in Excel, then paste the URLs into robots.txt in Notepad:
User-agent: *
Disallow: /page1.html
Disallow: /page2.html
-
If no one can access your site except for the home page, that is pretty bad.
As to rankings, look at it from a broad perspective. A user clicks a link in search results. That link goes to a 404. They immediately go back and find someone else's site or link to click on. Another user clicks another link for the same broken site. They get a 404 error and do the same thing. Googlebot comes along and sees that the site in question has very low on-page time, and that users frequently leave and go somewhere else. It also sees that a large quantity of the pages don't work.
If you were Google, would you give that site much weight or credit? Or would you hand it to a site that works? I don't think they openly express that it can hurt you, or that they will hurt your ranking for having 404 errors. IMO they do, it's just not as transparent as the rest of the things they state to do to improve your ranking.
-
OP, your case is an extreme one in that every page on the site but the homepage 404s. That means you moved but didn't do any 301 redirects, so that's an issue.
But generally, 404s have no impact on your site's ranking and that's been stated on record multiple times.
-
Hi, 404 errors are pretty bad from a user experience standpoint, and so Google does not like them. During domain migrations, the most important aspect is to keep the number of 404 errors as low as possible, ideally zero.
When pages are moved, you should set up one-to-one (page-to-page) server-side 301 permanent redirects from the old pages to their corresponding new locations, so that the old pages do not end up as 404 errors. With the 301s in place, Google will know that the old pages are no longer in force and have been replaced by their new destinations, and the old URLs will be replaced by the new ones in the search engine indices.
So to conclude, 404 errors are bad for both users and search engines.
Hope it helps my friend.
Best regards,
Devanur Rafi
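Devanur's one-to-one, page-to-page advice can also be sketched on nginx, for example with a map block (all URLs below are placeholders, not real ones from this thread):

```nginx
# Map each old URI to its new destination, one pair per page
map $request_uri $redirect_target {
    /old-pages/about.html      /about/;
    /old-pages/contact.html    /contact/;
}

server {
    listen 80;
    server_name example.com;

    # Issue a 301 only when a mapping exists for the requested URI
    if ($redirect_target) {
        return 301 $redirect_target;
    }
}
```

Requests for unmapped old URLs would still 404, so the map needs an entry for every moved page.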
Related Questions
-
Moved company 'Help Center' from Zendesk to Intercom, got lots of 404 errors. What now?
Howdy folks, excited to be part of the Moz community after lurking for years! I'm a few weeks into my new job (Digital Marketing at Rewind) and about 10 days ago the product team moved our Help Center from Zendesk to Intercom. Apparently the import went smoothly, but it's caused one problem I'm not really sure how to go about solving: https://help.rewind.io/hc/en-us/articles/*** is where all our articles used to sit https://help.rewind.io/*** is where all our articles now are So, for example, the following article has now moved as such: https://help.rewind.io/hc/en-us/articles/115001902152-Can-I-fast-forward-my-store-after-a-rewind- https://help.rewind.io/general-faqs-and-billing/frequently-asked-questions/can-i-fast-forward-my-store-after-a-rewind This has created a bunch of broken URLs in places like our Shopify/BigCommerce app listings, in our email drips, and in external resources etc. I've played whack-a-mole cleaning many of these up, but these old URLs are still indexed by Google – we're up to 475 Crawl Errors in Search Console over the past week, all of which are 404s. I reached out to Intercom about this to see if they had something in place to help, but they just said my "best option is tracking down old links and setting up 301 redirects for those particular addressed". Browsing the Zendesk forums turned up some relevant-ish results, with the leading recommendation being to configure javascript redirects in the Zendesk document head (thread 1, thread 2, thread 3) of individual articles. I'm comfortable setting up 301 redirects on our website, but I'm in a bit over my head in trying to determine how I could do this with content that's hosted externally and sitting on a subdomain. I have access to our Zendesk admin, so I can go in and edit stuff there, but don't have experience with javascript redirects and have read that they might not be great for such a large scale redirection.
Hopefully this is enough context for someone to provide guidance on how you think I should go about fixing things (or if there's even anything for me to do) but please let me know if there's more info I can provide. Thanks!
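One way the JavaScript-redirect suggestion could look in practice is a single snippet in the Zendesk document head that maps old article IDs to new Intercom paths. This is only a sketch: the mapping table is hypothetical (the real one would come from the exported article list), and JS redirects pass weaker signals than proper server-side 301s.

```javascript
// Hypothetical mapping from old Zendesk article IDs to new Intercom paths.
// The real table would be generated from the exported article list.
const redirects = {
  "115001902152":
    "/general-faqs-and-billing/frequently-asked-questions/can-i-fast-forward-my-store-after-a-rewind"
};

// Extract the numeric article ID from a Zendesk-style path like
// /hc/en-us/articles/115001902152-Can-I-fast-forward-... and look it up.
function newLocation(pathname) {
  const match = pathname.match(/^\/hc\/[^/]+\/articles\/(\d+)/);
  if (match && redirects[match[1]]) {
    return redirects[match[1]];
  }
  return null; // no mapping known: leave the page alone
}

// In the Zendesk <head> this would then be wired up roughly as:
//   const target = newLocation(window.location.pathname);
//   if (target) window.location.replace(target);
```

Because the ID is the stable part of the Zendesk URL, the trailing slug can change without breaking the lookup.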
Intermediate & Advanced SEO | henrycabrown1
-
Crawl and Indexation Error - Googlebot can't/doesn't access specific folders on microsites
Hi, My first time posting here, I am just looking for some feedback on an indexation issue we have with a client and any feedback on possible next steps or items I may have overlooked. To give some background, our client operates a website for the core brand and also a number of microsites based on specific business units, so you have corewebsite.com along with bu1.corewebsite.com, bu2.corewebsite.com. The content structure isn't ideal, as each microsite follows a structure of bu1.corewebsite.com/bu1/home.aspx, bu2.corewebsite.com/bu2/home.aspx and so on. In addition to this each microsite has duplicate folders from the other microsites, so bu1.corewebsite.com has indexable folders bu1.corewebsite.com/bu1/home.aspx but also bu1.corewebsite.com/bu2/home.aspx; the same with bu2.corewebsite.com, which has bu2.corewebsite.com/bu2/home.aspx but also bu2.corewebsite.com/bu1/home.aspx. There are 5 different business units, so you have this duplicate content scenario for all microsites. This situation is being addressed in the medium term development roadmap and will be rectified in the next iteration of the site, but that is still a ways out. The issue
Intermediate & Advanced SEO | ImpericMedia
About 6 weeks ago we noticed a drop-off in search rankings for two of our microsites (bu1.corewebsite.com and bu2.corewebsite.com); over a period of 2-3 weeks pretty much all our terms dropped out of the rankings and search visibility dropped to essentially 0. I can see that pages from the websites are still indexed, but oddly it is the duplicate content pages, so bu1.corewebsite.com/bu3/home.aspx or bu1.corewebsite.com/bu4/home.aspx is still indexed; similarly on the bu2.corewebsite microsite, bu2.corewebsite.com/bu3/home.aspx and bu4.corewebsite.com/bu3/home.aspx are indexed, but no pages from the BU1 or BU2 content directories seem to be indexed under their own microsites. Logging into Webmaster Tools I can see there is a "Google couldn't crawl your site because we were unable to access your site's robots.txt file." error. This was a bit odd as there was no robots.txt in the root directory, but I got some weird results when I checked the BU1/BU2 microsites in the technicalseo.com robots.txt tool. Also, due to the fact that there is a redirect from bu1.corewebsite.com/ to bu1.corewebsite.com/bu4.aspx, I thought maybe there could be something there, so consequently we removed the redirect and added a basic robots.txt to the root directory for both microsites. After this we saw a small pickup in site visibility, and a few terms popped into our Moz campaign rankings but dropped out again pretty quickly. Also, the error message in GSC persisted. Steps taken so far after that:
1. In Google Search Console, I confirmed there are no manual actions against the microsites.
2. Confirmed there are no instances of noindex on any of the pages for BU1/BU2.
3. A number of the main links from the root domain to microsites BU1/BU2 have a rel="noopener noreferrer" attribute, but we looked into this and found it has no impact on indexation.
4. Looking into this issue we saw some people had similar issues when using Cloudflare, but our client doesn't use this service.
5. Using a response/redirect header checker tool, we noticed a timeout when trying to mimic Googlebot accessing the site.
6. Following on from point 5, we got hold of a week of server logs from the client, and I can see Googlebot successfully pinging the site and not getting 500 response codes from the server... but couldn't see any instance of it trying to index microsite BU1/BU2 content.
So it seems to me that the issue could be something server side, but I'm at a bit of a loss for next steps to take. Any advice at all is much appreciated!
I need thoughts on how to chase a suspected Hosting Issue with Simple Helix and 524 errors, also some site speed data mixed in...
So the back story on this project is we've been working as PPC and SEO managers with an ecommerce site (Magento Enterprise based) that crashed in April. After the issue they fired their developer and switched hosting to Simple Helix at the recommendation of the new developer. Since the change we have seen a plummeting ecommerce conversion rate, especially on weekends. Every time something seems really bad, the developer gives us a "nothing on our end causing it." So doing more research we found site speed in GA was reporting crazy numbers of 25+ seconds for page loads; when we asked, Simple Helix gave us answers back that it was "Baidu spiders" crawling the site causing the slowdown. I knew that wasn't the issue. In all of this the developer keeps reporting back to the site owner that there is no way it is hosting. So the developer finally admitted the site could be slowing down from a DoS attack or some other form of probing. So they installed Cloudflare. Since then the site has been very fast, and we haven't seen turbulence in the GA site speed data. What we have seen though is the appearance of 524 and 522 errors in Search Console. Does anyone have experience with Cloudflare where seeing those types of errors is common? Is there any other thought on what might be causing that and what that means from the servers, because the developer reports back that Simple Helix has had no issues during this time. This has been a super frustrating project and we've tried a lot of different tests, but there is really abnormal conversion data as I said, especially during peak times on the weekend. Any ideas of what to chase would be appreciated.
Intermediate & Advanced SEO | BCutrer
-
Duplicate Content Errors new website. How do you know which page to put the rel canonical tag on?
I am having problems with duplicate content. This is a new website and all the pages have the same page and domain rank. The following is an example of the homepage. How do you know which page to use the canonical tag on? http://medresourcesupply.com/index.php http://medresourcesupply.com/ Would this be the correct way to use this? Here is another example where Moz says these are duplicates. I can't figure out why because they have different URLs and content. http://medresourcesupply.com/clutching_at_the_throat http://medresourcesupply.com/index.php?src=gendocs&ref=detailed_specfications&category=Main
Intermediate & Advanced SEO | artscube.biz
-
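For the duplicate-homepage case in the question above, the usual pattern is to pick one URL (typically the bare domain) as the preferred version and point the duplicate at it with a canonical tag, roughly like this:

```html
<!-- In the <head> of http://medresourcesupply.com/index.php,
     declare the bare domain as the preferred (canonical) URL -->
<link rel="canonical" href="http://medresourcesupply.com/" />
```

An even stronger fix, where the server allows it, is a 301 from /index.php to /, since that consolidates the two URLs outright rather than just hinting at a preference.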
Hreflang Tags with Errors in Google Webmaster Tools
Hello, Google Webmaster tools is giving me errors with Hreflang tags that I can't seem to figure out... I've double checked everything: all the alternate and canonical tags, everything seems to match yet Google finds errors. Can anyone help? International Targeting | Language > 'fr' - no return tags
Intermediate & Advanced SEO | GlobeCar
URLs for your site and alternate URLs in 'fr' that do not have return tags.
Status: 7/10/15
24 Hreflang Tags with Errors. Please see attached pictures for more info... Thanks, Karim
-
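The "no return tags" error in the question above usually means the French pages don't link back: every language version has to reference all the others (and itself) with identical hreflang annotations. A minimal sketch, with hypothetical URLs:

```html
<!-- On the English page -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page/" />

<!-- The French page must carry the identical pair (the "return tags") -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page/" />
```

If the fr page omits its annotations, or uses URLs that don't exactly match, Google reports the fr alternates as missing return tags.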
Google Seeing 301 as 404
Hi all, We recently migrated a few small sites into one larger site and generally we had no problems. We read a lot of blogs before hand, 301'd the old links etc and we've been keeping an eye on any 404s. What we have found is that Webmaster is picking up quite a few 404s, yet when we investigate these 404s they are 301'd and work fine. This isn't for every url, but Google is finding more and I just want to catch any problems before they get out of hand. Is there any reason why Google would count a 301 as a 404? Thanks!
Intermediate & Advanced SEO | HB17
-
VisitSweden indexing error
Hi all Just got a new site up about weekend travel for VisitSweden, the official tourism office of Sweden. Everything went just fine except some issues with indexing. The site can be found here at weekend.visitsweden.com/no/ For some weird reason the "frontpage" of the site does not get indexed. What I have done myself to find the issue: Added sitemaps.xml Configured and added site to Webmaster Tools Checked 301s so they are not faulty By doing a simple site:weekend.visitsweden.com/no/ you can see that the frontpage is simply not in the index. Also by doing a cache:weekend.visitsweden.com/no/ I see that Google tries to index the page without the trailing /no/ for some reason. http://webcache.googleusercontent.com/search?q=cache:http://weekend.visitsweden.com/no/ Any smart ideas to get this fixed or where to start looking? All help greatly appreciated Kind regards Fredrik
Intermediate & Advanced SEO | Resultify
-
An affiliate website uses datafeeds and around 65.000 products are deleted in the new feeds. What are the best practices to handle the product pages? 404 ALL pages, or 301 redirect to the upper category?
Note: All product pages are on INDEX, FOLLOW. Right now this is happening with the deleted product pages: 1. When a product is removed from the new datafeed, the pages stay online and show similar products for 3 months. The product pages are removed from the category pages but not from the sitemap! 2. Pages receiving more than 3 hits after the first 3 months keep on existing, also in the sitemaps. These pages are not shown in the categories. 3. Pages from deleted datafeeds that receive 2 hits or less get a 301 redirect to the upper category for another 3 months. 4. After the last 3 months all 301 redirects get a customized 404 page with similar products. Any suggestions or comments about this structure? 🙂 Issues to think about:
Intermediate & Advanced SEO | Zanox
- The amount of 404 pages Google is warning about in GWT
- Right now all productpages are indexed
- Use as much value as possible in the right way from all pages
- Usability for the visitor Extra info about the near future: Because of the duplicate content issue with datafeeds, we are going to put all product pages on NOINDEX, FOLLOW and focus only on category and subcategory pages.
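The NOINDEX, FOLLOW approach mentioned above is just a meta robots tag in the head of each product page, for example:

```html
<!-- Keep this page out of the index, but let crawlers follow
     its links so equity still flows to category pages -->
<meta name="robots" content="noindex, follow">
```

Note this only works on pages crawlers can actually reach: a page blocked in robots.txt will never have its noindex seen.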