404 Errors
-
Do 404 errors really have a lot of impact on rankings and the overall authority of a site with Google? Say you have a site where all the pages have moved apart from the home page, which is exactly the same as it was before the move, so most of your pages are showing 404 errors.
-
Hi Adul,
Just to follow up on this in case you're wondering why the answer is being downvoted. Blocking the pages that 404 in robots.txt will only stop Google from getting a 404, because it can't reach the pages at all. Users will still get a 404, so this isn't ideal. Also, if you don't 301 redirect the old pages to the new ones, you lose any equity that those pages built up over the years.
Hope that helps,
Craig
-
Go to Google Webmaster Tools > Crawl > Crawl Errors and download all the pages. Open the list in Excel, then paste the URLs into robots.txt in Notepad:
User-agent: *
Disallow: /page1.html
Disallow: /page2.html
-
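For what it's worth, the steps described in that answer can be scripted. A minimal sketch (the function name and paths are just for illustration) that turns an exported list of 404'd paths into Disallow rules; note that, as the follow-up points out, this only hides the 404s from crawlers rather than fixing them:

```python
# Turn a list of 404'd paths (e.g. from a Webmaster Tools
# Crawl Errors export) into robots.txt Disallow rules.
# This hides the URLs from crawlers; users still get a 404.

def build_robots_txt(paths):
    lines = ["User-agent: *"]
    for path in paths:
        if not path.startswith("/"):
            path = "/" + path  # robots.txt rules are root-relative
        lines.append("Disallow: " + path)
    return "\n".join(lines) + "\n"

print(build_robots_txt(["/page1.html", "page2.html"]))
```
-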
If no one can access your site except for the home page, that is pretty bad.
As to rankings, look at it from a broad perspective. A user clicks a link in search results. That link goes to a 404. They immediately go back and find someone else's site or link to click on. Another user clicks another link for the same broken site. They get a 404 error and do the same thing. Googlebot comes along and sees that the site in question has a very low on-page time, and users frequently leave and go somewhere else. It also sees that a large quantity of the pages don't work.
If you were Google, would you give that site much weight or credit? Or would you hand it to a site that works? I don't think they openly express that it can hurt you, or that they will lower your ranking for having 404 errors. IMO they do; it's just not as transparent as the rest of the things they state to do to improve your ranking.
-
OP, your case is an extreme one in that every page on the site but the homepage 404s. That means you moved but didn't do any 301 redirects, so that's an issue.
But generally, 404s have no impact on your site's ranking and that's been stated on record multiple times.
-
Hi, 404 errors are pretty bad from a user experience standpoint, and so Google does not like them. During domain migrations, the most important thing is to keep the number of 404 errors as low as possible, ideally zero.
When pages are moved, you should set up one-to-one (page-to-page) server-side 301 permanent redirects from the old pages to the corresponding new locations, so that the old pages do not end up as 404 errors. With the 301s in place, Google will know that the old pages are no longer in force and have been replaced by their new destinations, and the old URLs will be replaced by the new ones in the search engine indices.
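As a rough illustration of the one-to-one mapping (the paths and domain below are hypothetical), a redirect map can be turned into Apache `Redirect 301` rules like so:

```python
# Generate one-to-one server-side 301 redirects (Apache
# "Redirect 301" lines for .htaccess or a vhost config)
# from an old-URL -> new-URL mapping. Paths are examples only.

def build_redirects(mapping):
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in mapping.items()
    ) + "\n"

rules = build_redirects({
    "/old-page.html": "https://www.example.com/new-page/",
    "/old-category/": "https://www.example.com/new-category/",
})
print(rules)
```

The same mapping works for nginx (`return 301`) or any other server; the point is that each old URL gets its own corresponding destination rather than a blanket redirect to the home page.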
So to conclude, 404 errors are bad for both users and search engines.
Hope it helps my friend.
Best regards,
Devanur Rafi