Increased 404 and Blocked URL Notifications in Webmaster Tools
-
Over the last 45 days, I have been receiving an increasing number of 404 alerts in Google Webmaster Tools.
When I audit the notifications, they are not "new" broken links; they are links that have been pointing to non-existent pages for years, and for some reason Google is only now notifying me about them. This has coincided with roughly a 30% drop in organic traffic from late April to early May.
The site is www.petersons.com. It's been around for a while and attracts a fair amount of natural links, so in the two years I've managed the campaign I've done very little link building.
I'm in the process of setting up redirects for these URLs, but why is Google only now notifying me of years-old broken links, and could that be one of the reasons for my drop in traffic?
My second issue: I am being notified that I am blocking over 8,000 URLs in my robots.txt file when I am not. Here is a screenshot: http://i.imgur.com/ncoERgV.jpg
-
I doubt very much that an increase in old 404s caused a 30% organic traffic drop. I'd look closely at your backlink profile, your competition, and your page quality to try to diagnose why you saw that drop in traffic.
As for the 404s, I'd fix those that are fixable and 301 redirect the rest to relevant pages (or the home page). If the number is extremely large, put a high priority on fixing this. Otherwise, I haven't met a site that Google couldn't find a 404 error on. And yeah, they keep telling you about the same ones!
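The "fix what's fixable, 301 the rest" approach is easy to script once you have the list of dead URLs from Webmaster Tools. A minimal sketch, assuming a hypothetical mapping of dead paths to their closest live replacements (the paths below are made up, not petersons.com's real URLs):

```python
# Hypothetical map of dead URL paths to their best live replacements.
REDIRECT_MAP = {
    "/old-colleges-page": "/colleges",
    "/2009/test-prep-guide": "/test-prep",
}
FALLBACK = "/"  # unmapped dead URLs go to the home page, as suggested above


def redirect_target(path):
    """Return the 301 target for a dead path, falling back to the home page."""
    return REDIRECT_MAP.get(path, FALLBACK)


def apache_rules(mapping):
    """Render the mapping as Apache mod_alias Redirect directives.

    Targets here are site-relative paths; older Apache versions may
    require absolute URLs in the second argument.
    """
    return [f"Redirect 301 {old} {new}" for old, new in mapping.items()]
```

Generating the rules from a spreadsheet-style mapping keeps the redirect file reviewable and makes it trivial to send everything unmapped to one fallback page.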
Hope that helps!
Jacob
-
Hi!
As Lynn points out, there could be some issues with regard to your uptime. Do you see a lot of 404 errors reported in Analytics as well? If so, perhaps your hosting provider (or IT department) should have a look at this?
Also, adding the redirects seems like a good idea, as Google could be re-crawling sites/pages linking to the old, deleted URLs.
Do you have a custom crawl rate set up in Google Webmaster Tools? It's worth checking whether Googlebot is slowing down your site.
Good luck!
Anders
-
The Moz scan is not showing the same errors, and we haven't made any technical changes. These are incoming links pointing to pages that don't exist anymore. It looks like it's been that way for years; I just started getting notified about these errors, and I'm wondering if they are somehow hurting the site.
About the robots.txt file, I just don't know. I've decided to make it blank and reassess in a few days.
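Before blanking the file, it's worth checking what a parser actually makes of the live rules, since Webmaster Tools' "blocked URLs" count reflects its interpretation of robots.txt, not your intent. A quick sanity check with Python's standard-library parser, using placeholder rules and URLs rather than petersons.com's real file:

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt content; paste your live file here instead.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Feed in a few of the URLs Webmaster Tools claims are blocked and see
# whether the parser agrees.
print(rp.can_fetch("Googlebot", "http://www.example.com/colleges"))     # True
print(rp.can_fetch("Googlebot", "http://www.example.com/admin/login"))  # False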
-
Hi,
A bit difficult to say without some more details. Some of it might be outdated information; see http://moz.com/blog/how-to-fix-crawl-errors-in-google-webmaster-tools for a rundown on how to check it, if you haven't already. What URLs is it flagging from the robots.txt? Are they still valid URLs? Regarding the 404s, 28,000 is quite a lot. Has your system changed or been updated recently? Maybe a systemic fault is creating these errors? Is the Moz scan flagging the same errors?
It is tough to say whether the errors have any connection to the drop in visits, but it is certainly something you want to get to the bottom of. I threw your site into Xenu (http://home.snafu.de/tilman/xenulink.html) and it was timing out on quite a few of the pages. Is it possible the site is timing out under heavy load? That might account for the drop in organic visits as well...
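The timeout question is easy to spot-check yourself without Xenu. A minimal sketch using only the standard library, with the URL list left as an assumption you'd fill in from your own pages:

```python
import socket
from urllib.error import HTTPError, URLError
from urllib.request import urlopen


def classify(status):
    """Map an HTTP status code (or None for no response) to a verdict."""
    if status is None:
        return "timeout/error"
    return "ok" if status == 200 else f"http {status}"


def probe(url, timeout=10.0):
    """Fetch a URL with a short timeout and return a verdict; never raises."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return classify(resp.status)
    except HTTPError as err:                 # 4xx/5xx responses
        return classify(err.code)
    except (URLError, socket.timeout):       # DNS failure, refused, timeout
        return classify(None)

# Example usage (fill in your own URLs):
#   print(probe("http://www.petersons.com/"))
```

Running this over a sample of important pages at busy times of day would confirm or rule out the intermittent-timeout theory fairly quickly.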
Lots of questions, not many answers!