Increased 404 and Blocked URL Notifications in Webmaster Tools
-
Over the last 45 days, I've been receiving an increasing number of 404 alerts in Google Webmaster Tools.
When I audit the notifications, they are not "new" broken links; they are all links that have been pointing to non-existent pages for years, and for some reason Google is only now notifying me about them. This has coincided with roughly a 30% drop in organic traffic from late April to early May.
The site is www.petersons.com. It's been around for a while and attracts a fair amount of natural links, so in the two years I've managed the campaign I've done very little link building.
I'm in the process of setting up redirects for these URLs, but why is Google now notifying me about years-old broken links, and could that be one of the reasons for my drop in traffic?
My second issue: I am being notified that I am blocking over 8,000 URLs in my robots.txt file when I am not. Here is a link to a screenshot: http://i.imgur.com/ncoERgV.jpg
-
I doubt very much that an increase in old 404s resulted in a 30% organic traffic drop. I'd look closely at your backlink profile, competition and page quality to try and diagnose why you saw that drop in traffic.
As for the 404s, I'd fix those that are fixable and 301 redirect the rest to relevant pages (or the home page). If the number is extremely large, you should put a high priority on fixing this. Otherwise, I haven't met a site that Google couldn't find a 404 error on. And yeah, they keep telling you about the same ones!
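If you're on Apache, the redirects can go in an .htaccess file. Here's a minimal sketch, assuming mod_alias is enabled, with hypothetical paths standing in for your real ones (if you're on nginx or IIS instead, the equivalent is a rewrite/redirect rule in the server config):

    # Map a single retired page to its closest current equivalent (hypothetical paths)
    Redirect 301 /old-guide.html http://www.petersons.com/new-guide/
    # Or map a whole retired directory in one regex rule
    RedirectMatch 301 ^/old-section/(.*)$ http://www.petersons.com/new-section/$1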
Hope that helps!
Jacob
-
Hi!
As Lynn points out, there could be some issues with your uptime. Do you see a lot of 404 errors reported in Analytics as well? If so, perhaps your hosting provider (or IT department) should have a look at this?
Also, adding the redirects seems like a good idea, as Google could be re-indexing sites/pages that link to the old, deleted URLs.
Do you have a custom crawl frequency set up in Google Webmaster Tools? It's worth checking whether Googlebot is slowing your site down.
Good luck.
Anders
-
The Moz scan is not showing the same errors, and we haven't made any technology changes. These are incoming links pointing to pages that don't exist anymore. It looks like it's been that way for years; I just started getting notified about them, and I'm wondering if it is somehow hurting the site.
As for the robots.txt file, I just don't know. I've decided to make it blank and reassess in a few days.
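For reference, the permissive version I'm planning to upload would look something like this (an empty file behaves the same way, allowing everything to be crawled):

    # Allow all crawlers to fetch everything while we re-assess
    User-agent: *
    Disallow: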
-
Hi,
A bit difficult to say without some more details. Some of it might be outdated information. See http://moz.com/blog/how-to-fix-crawl-errors-in-google-webmaster-tools for a rundown on how to check it, if you haven't already. What URLs is it flagging from the robots.txt? Are they still valid URLs? As for the 404s, 28,000 is quite a lot. Has your system changed or been updated recently? Maybe there is a systemic fault creating these errors? Is the Moz scan flagging the same errors?
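One quick way to see what the live robots.txt actually blocks is Python's built-in robotparser. A rough sketch, with hypothetical paths standing in for whatever Webmaster Tools is flagging:

    # Check which URLs the live robots.txt blocks for Googlebot (Python 3)
    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://www.petersons.com/robots.txt")
    rp.read()

    # Hypothetical paths -- substitute the URLs flagged in Webmaster Tools
    for url in ["http://www.petersons.com/",
                "http://www.petersons.com/some-flagged-page/"]:
        print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED")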
It is tough to say whether the errors have any connection to the drop in visits, but it is certainly something you want to get to the bottom of. I threw your site into Xenu (http://home.snafu.de/tilman/xenulink.html) and it was timing out on quite a few of the pages. Is it possible the site is timing out under heavy load? That might account for the drop in organic visits as well...
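If you want to test the timeout theory yourself, a few lines of Python with the requests library will spot-check response times. A sketch only; the sample URLs are hypothetical, so feed it the pages Xenu reported:

    # Spot-check response times; anything hitting the timeout is worth investigating
    import requests

    # Hypothetical sample -- replace with the pages Xenu flagged
    for url in ["http://www.petersons.com/", "http://www.petersons.com/colleges/"]:
        try:
            r = requests.get(url, timeout=10)
            print(f"{url}: {r.status_code} in {r.elapsed.total_seconds():.1f}s")
        except requests.exceptions.Timeout:
            print(f"{url}: timed out after 10s")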
Lots of questions, not many answers!
Related Questions
-
Google Search Console: 404 and soft 404 without any backlinks. Redirect needed?
Hi Moz community, We can see the 404 and soft 404 errors in Google Search Console. Usually these are non-existent pages that Google found somewhere on the internet. I can see that some of these reported URLs don't have any backlinks (checked in the Ahrefs tool). Do we need to redirect each and every URL reported here, or ignore them, or mark them as fixed? Thanks
Algorithm Updates | vtmoz
-
Have you ever seen a page get indexed from a website that is blocked by robots.txt?
Hi all, We use the robots.txt file and meta robots tags to block bots from crawling a website or specific pages. Mostly robots.txt is used site-wide, with the expectation that none of the pages will get indexed. But any page from a blocked site can still be indexed by Google if the crawler finds a link to it somewhere on the internet, as stated in the last paragraph here. I wonder if this is really how some pages have ended up indexed. And if we use meta tags at the page level, do we still need to block via robots.txt? Can we use both techniques at the same time? Thanks
Algorithm Updates | vtmoz
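A note on the mechanism in the question above: robots.txt blocks crawling, not indexing, so a disallowed URL can still end up in the index via external links. A meta noindex tag only works if the page is not blocked in robots.txt, because the crawler has to fetch the page to see the tag, so the two shouldn't be combined on the same URL. The standard tag, for reference:

    <!-- In the page's <head>: tells compliant crawlers not to index this page -->
    <meta name="robots" content="noindex">
-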
Do subdomain backlinks count for the main domain and increase its authority?
Hi all, I just wonder whether backlinks to different subdomains are counted toward ranking the main domain better, or whether they only benefit the subdomain's pages. Many websites have multiple subdomains that receive backlinks. So do backlinks to the main domain and to a subdomain carry the same weight with Google? Thanks
Algorithm Updates | vtmoz
-
Keyword in URL - To Include or Exclude?
Hi Moz community, Keyword inclusion in URLs has been discussed a fair bit on here, and I'm curious for feedback on two URL structure options. Rand's #3 tip from his recent '15 SEO Best Practices for Structuring URLs' states that keyword inclusion still has some value, but I'm not sure whether we're going too far with the examples below. We sell footwear, and only footwear, for women, men and kids, and we use those words as our main menu headings. Under each main heading in a mega menu, the user can then 'shop by style', 'shop by brand', etc. The key question is whether to include the word 'shoes' in my URLs, as many top-ranking competitors do, e.g. /womens-shoes-heels, /womens-shoes-sandals or /womens-shoes/heels, /womens-shoes/sandals. I think Google is smart enough to determine that we have a shoe store, and I'm not sure of the value, from an SEO or user-experience perspective, of adding the extra word. Thoughts on going with option A or B would be valued: Option A - http://shopname.com/womens/sandals, http://shopname.com/womens/heels OR Option B - http://shopname.com/womens-shoes/sandals, http://shopname.com/womens-shoes/heels Thanks
Algorithm Updates | chewythedog
-
Shortened URLs?
Anyone have any insight into how shortened URLs affect SEO? I use Bitly occasionally for shortened links and was curious whether this matters at all. I basically use it so I can fit links in places where long URLs look absurd, mostly on social media platforms. I know there's some debate over whether the domain name affects ranking or not. Frankly, that all just goes over my head. Any thoughts welcomed!
Algorithm Updates | adamxj2
-
Choosing domain name - ccTLD vs Vanity URL
I have to choose between a country-specific domain name that is long and difficult to remember, or a .me domain that is short and contains the exact keywords I'm optimising for. The challenge is that I'm only targeting local search traffic for the service I am advertising. Does a country-specific domain name have any benefit in terms of weighting when I'm only interested in traffic from that country?
Algorithm Updates | flashie
-
Increasing Brands/Products, Thus Increasing Pages - Does It Improve SEO?
We currently have 5 brands on our website and roughly 200 pages. Does increasing the number of products you stock, and thus the number of pages, improve your SEO?
Algorithm Updates | babski
-
Accidentally blocked our site for an evening?
Yesterday at about 5pm I switched our site to a new server and accidentally blocked our site from Google for the evening. Our domain is posnation.com, and we are ranked in the top 3 for almost all POS-related keywords. When I got in this morning I realized the mistake, so I went to Google Webmaster Tools, saw that the site was blocked, and used Fetch as Googlebot to correct it. Now the message says: "Check to see that your robots.txt is working as expected. (Any changes you make to the robots.txt content below will not be saved.)" The status line reads: robots.txt file: http://www.posnation.com/robots.txt | Downloaded: 1 hour ago | Status: 200 (Success). When you go to Google and type "pos systems" we are still #2, so I assume all is still OK. My question is: will this potentially hurt our rankings, should I be worried, and is there anything else I can do?
Algorithm Updates | POSNation