Google says 404s don't cause ranking drops, but what about a lot of them?
-
Hello,
According to Google here, 404s don't cause rankings to go down. Yet our rankings are dropping, and we have about 50 404s (though some may have been deindexed by now). The site is an e-commerce store with about 300 main products and roughly 9,000 pages overall.
There's no link equity to be gained by 301 redirecting the 404s, a custom 404 page has been set up that links to the home page, and nothing on the site links to the pages that return 404s anymore.
Provided that no more 404s are created, can I just ignore them and find the real reason our rankings are going down?
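One thing I do plan to verify first is that our custom 404 page actually returns a 404 status code and not a 200 (a "soft 404"), since Google treats those differently. Here's a rough check in Python - just a sketch using the requests library, and the URLs are placeholders for our own dead pages:

```python
import requests

# Placeholder URLs - swap in real dead paths from the site.
test_urls = [
    "https://www.example.com/discontinued-product",
    "https://www.example.com/any-gibberish-path",
]

for url in test_urls:
    # allow_redirects=False so a 301 to the home page shows up as a 301,
    # not as the home page's 200.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url)
```

If a missing page answers 200, that's a soft 404 and worth fixing regardless of rankings.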
-
Hi Dana,
Thank you. I thought of that too, but these pages aren't linked to from anywhere on the site anymore, so am I correct that Screaming Frog won't find them?
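If so, I suppose I could pull them out of our server access logs instead, with something like this - a rough sketch in Python, assuming the common Apache/Nginx combined log format, and the filename is illustrative:

```python
import re
from collections import Counter

LOG_FILE = "access.log"  # illustrative path - point this at the real log

# Matches the request path and status code in a combined-format line,
# e.g. '... "GET /old-product HTTP/1.1" 404 1234 ...'
pattern = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+" (\d{3})')

hits_404 = Counter()
with open(LOG_FILE) as fh:
    for line in fh:
        m = pattern.search(line)
        if m and m.group(2) == "404":
            hits_404[m.group(1)] += 1

# Most-requested missing URLs first - the ones most worth dealing with.
for path, count in hits_404.most_common(50):
    print(count, path)
```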
-
Hi Bob,
I use Screaming Frog for this. If you don't already have it, it's $99 very well spent. Once your site is crawled, it's very easy to pull the 404s into an Excel spreadsheet and deal with them from there.
Hope that helps!
Dana
-
What's the best way to find the 404s across 9,000 pages if they don't show up in Google Webmaster Tools (GWT)?
-
I agree with Stewart's point that a 404 by itself might not affect rankings, but a lot of 404s will drive up your bounce rate, and that can definitely contribute to a ranking drop.
When I work with clients, one of the first jobs I do is reduce the number of 404s to a minimum. It isn't necessary to get to zero, but reduce it as much as you can!
-
I agree with Stewart, but it's also a tricky thing. Are you 100% positive there are no rogue links out on the web pointing to any of these pages? Have you reviewed your sitemap - are any of them hanging out in there? Could there be a 301 somewhere pointing to them? It's not so much bounce rate I'd be concerned about as usability, which is something Google is taking a look at.
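A quick way to check the sitemap angle is to fetch every URL it lists and flag anything that doesn't come back as a 200. Here's a rough sketch in Python - it assumes a standard sitemap.xml and the requests library, and the sitemap URL is illustrative:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP = "https://www.example.com/sitemap.xml"  # illustrative URL

# Standard sitemap namespace per sitemaps.org.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)

for loc in root.findall(".//sm:loc", ns):
    url = loc.text.strip()
    # allow_redirects=False also exposes any 301s still pointing at them.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if resp.status_code != 200:
        print(resp.status_code, url)
```

Anything showing a 404 here should come out of the sitemap, and anything showing a 301 is worth tracing to its destination.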
-
Per se they don't, but a high bounce rate will harm your site - no one knows the extent of this, but an adverse bounce rate is thought to be part of the quality ranking signals (more than 200 of them) Google uses to evaluate a site in the SERPs.
Therefore, having several 404s can affect your bounce rate figure negatively and ultimately harm a site to a greater or lesser degree.
However, I don't think 50 is high relative to the number of visitors who navigate on to a second page, which is a positive metric - if that's the case for you, I wouldn't worry. But 50 404s on a website with only 200 pages would be bad.