Are thousands of 404s a problem?
-
An ecommerce site I work on has around 16,000 URLs that are 404s in Webmaster Tools. The vast majority are for products that are no longer stocked by the site, which is a natural occurrence in ecommerce.
But my question is, could these possibly be harming rankings?
-
It's not necessarily an SEO problem, as long as you don't redirect all 404s to your homepage at least; I've seen that be an issue in the past.
However, use it as an opportunity. With that many 404s, make the 404 page a sales page. It becomes a lander for "the product you were searching for is out of stock - here are the search results on our site for similar products", etc. That way you turn having that many 404s into something productive.
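For example, here's a minimal sketch of that idea, assuming a Python/Flask stack; search_similar_products() is a hypothetical stand-in for your own site search:

```python
# Minimal sketch: serve the 404 page as a "similar products" lander.
# Assumes Flask; search_similar_products() is a placeholder for your own site search.
from flask import Flask, request

app = Flask(__name__)

def search_similar_products(query):
    # Placeholder: swap in your real catalogue or site-search lookup
    return []

@app.errorhandler(404)
def product_not_found(error):
    # Guess the intended product from the requested path, e.g. /widgets/blue-widget
    query = request.path.rstrip("/").split("/")[-1].replace("-", " ")
    suggestions = search_similar_products(query)
    links = "".join(f'<li><a href="{p["url"]}">{p["name"]}</a></li>' for p in suggestions)
    body = (f"<h1>Sorry, that product is no longer stocked</h1>"
            f"<p>Similar products matching '{query}':</p><ul>{links}</ul>")
    # Keep the real 404 status code so search engines still see the page as gone
    return body, 404
```

The important detail is the trailing 404 in the return value: the page can sell, but it should still report itself as not found.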
From WMT, mark them as fixed, and then any time one pops up, find out what is linking to the product and try to remove that link. You don't want to be linking to 404s internally if you can help it (bad user experience = Google's nightmare), so having thousands of 404s isn't bad in itself, but try to break any internal links to them. (Xenu can help as well if you don't feel like waiting for WMT.)
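If you don't want to wait for WMT or Xenu, a rough sketch of that internal-link check, assuming Python with the requests and beautifulsoup4 packages and a hypothetical site root:

```python
# Rough sketch: report internal links on key pages that resolve to a 404.
# SITE and PAGES_TO_CHECK below are hypothetical examples; adjust to your own site.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

SITE = "https://www.example-shop.com"                        # hypothetical site root
PAGES_TO_CHECK = [SITE + "/", SITE + "/category/widgets"]    # pages whose outgoing links to audit

for page in PAGES_TO_CHECK:
    html = requests.get(page, timeout=10).text
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(page, a["href"])
        # Only audit links that stay on our own domain
        if urlparse(link).netloc != urlparse(SITE).netloc:
            continue
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
        if status == 404:
            print(f"{page} links internally to a 404: {link}")
```

Run it against your category and navigation pages and you get a list of internal links to clean up without waiting for the next WMT crawl.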
-
As Kevin stated, they do not directly impact your ranking.
The user experience might be harmed if the 404 page is not optimized for the user. Using a custom 404 page you can help visitors explore further the website they have arrived on; if this is not done, they might immediately leave and visit another one.
Google now uses usage metrics in its rankings, and therefore a lot of 404 pages might indirectly impact rankings.
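Whatever you put on that custom 404 page, it's worth checking that it still returns a real 404 status code rather than a 200, otherwise Google can flag it as a "soft 404". A quick sketch, assuming Python's requests package and a hypothetical dead URL:

```python
# Quick sanity check: confirm a known-dead URL is served with a real 404 status,
# not a 200. The URL below is a hypothetical example.
import requests

dead_url = "https://www.example-shop.com/discontinued-product"
resp = requests.get(dead_url, timeout=10)

print("Status code:", resp.status_code)
print("Custom 404 page served correctly" if resp.status_code == 404
      else "Warning: dead URL does not return a 404 status")
```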
-
Related Questions
-
SEO Problems with Loading to a Subfolder?
A client has a single page app website that shows https://example.com/example when you visit https://example.com . I don't think this is a redirect; I think it's a URL rewrite. My questions: Is this setup common with single page apps? What are the SEO benefits or drawbacks of having a domain's homepage load, rewrite, or redirect to a subfolder?
Technical SEO | Kevin_P
-
Thousands of links coming from an iframe
We have an iframed calculator on one website (www.renewablesguide.co.uk) which has a text link back to another of our websites (www.solarguide.co.uk), where the calculator originates. We allow other sites to embed the calculator, which gives us the benefit of a followed link back to our site. However, in the case of renewablesguide (which we own) we've added a tab with the calculator to every page, which GWT shows up as 24,000 links from this site hitting the Solar Guide homepage. As the link is held within an iframe, would this amount of links be seen as spammy?
Technical SEO | holmesmedia
-
Problem with Markup (Price, MaxPrice, MinPrice)
Hi, I've inserted some markup on my homepage for a vacation flat. It looks like this: As there is not just one price for the whole season, I used minPrice and maxPrice to define it. Unfortunately the Google testing tool (https://developers.google.com/webmasters/structured-data/testing-tool/) says that the value "price" also has to be in the code, but I have no clue what value I should give it. Does someone have advice? Thank you very much! Best regards André
Technical SEO | Andre-S
-
Are W3C Validators too strict? Do errors create SEO problems?
I ran a HTML markup validation tool (http://validator.w3.org) on a website. There were 140+ errors and 40+ warnings. IT says "W3C Validators are overly strict and would deny many modern constructs that browsers and search engines understand." What a browser can understand and display to visitors is one thing, but what search engines can read has everything to do with the code. I ask this: if the search engine crawler is reading through the code and comes upon an error like this: …ext/javascript" src="javaScript/mainNavMenuTime-ios.js"> </script>');} or this: …t("?");document.write('>');} with the validator explaining "The element named above was found in a context where it is not allowed. This could mean that you have incorrectly nested elements -- such as a "style" element in the "body" section instead of inside "head" -- or two elements that overlap (which is not allowed). One common cause for this error is the use of XHTML syntax in HTML documents. Due to HTML's rules of implicitly closed elements, this error can create cascading effects. For instance, using XHTML's "self-closing" tags for "meta" and "link" in the "head" section of a HTML document may cause the parser to infer the end of the "head" section and the beginning of the "body" section (where "link" and "meta" are not allowed; hence the reported error)." - does this mean that the crawlers don't know where the code ends and the body text begins, and what it should be focusing on and not?
Technical SEO | INCart
-
Problem With Video Sitemap Because All Videos Are at the Same URL
Hi, I created a video sitemap and now I'm getting an error in Webmaster Tools because the location for some of the videos is the same. It says: "Duplicate URL - This URL is a duplicate of another URL in the sitemap. Please remove it and resubmit." What can I do if all my videos are located at the same URL? Thanks
Technical SEO | Tug-Agency
-
Can URL rewrites fix the problem of critical content too deep in a site's structure?
Good morning from Wetherby UK 🙂 Ok, imagine this scenario. You ask the developers to design a site where "offices to let" is on level two of a site's hierarchy, so the URL would look like this: http://www.sandersonweatherall.co.uk/office-to-let. But yikes, when it goes live it ends up like this: http://www.sandersonweatherall.co.uk...s/residential/office-to-let Is a fix to this a URL rewrite? Or is the only fix relocating the office-to-let content further up the site structure? Any insights welcome 🙂
Technical SEO | Nightwing
-
How to publish duplicate content legitimately without Panda problems
Let's imagine that you own a successful website that publishes a lot of syndicated news articles and syndicated columnists. Your visitors love these articles and columns, but the search engines see them as duplicate content. You worry about being viewed as a "content farm" because of this duplicate content and getting the Panda penalty. So, you decide to continue publishing the content and use... <meta name="robots" content="noindex, follow"> This allows you to display the content for your visitors, but it should stop the search engines from indexing any pages with this code. It should also allow robots to spider the pages and pass link value through them. I have two questions... If you use "noindex", will that be enough to prevent your site from being considered a content farm? Is there a better way to continue publication of syndicated content but protect the site from duplicate content problems?
Technical SEO | EGOL
-
Different TLDs, same content - duplicate content? And a problem in foreign Googles?
Hi, Operating from the Netherlands with customers throughout Europe, we have the same content for some countries. In the Netherlands and Belgium Dutch is spoken, and in Germany and Switzerland German is spoken. For these countries the same content is provided. Does Google see this as duplicate content? Could it be possible that a German customer gets the Swiss website as a search result when googling in the German Google? Thank you for your assistance! Kind regards, Dennis Overbeek Dennis@acsi.eu
Technical SEO | SEO_ACSI