Webmaster tools crawl errors
-
Hi there,
I've been tracking my Webmaster Tools crawl errors for a while now (6 months), and I'm noticing that some pages that have been 404 for a long time keep popping up in the crawl errors. Those pages have no data under XML linking, and the remote links listed for them come from pages that are also long-gone 404s.
The pages return a 404 error page plus a redirect to the homepage, yet Google still reports them with old cached content.
Does anyone have a clue why this is happening?
-
Thank you very much, Dana, for the superb answer!
Any idea how critical these errors are for my website's SEO? (Is this problem worth fixing?)
-
Hi!
Have you verified that there is a proper 301 redirect from the old URLs? If, for example, there is a 302 instead, I wouldn't be surprised if Google kept the old info in the index, as you have sort of said "I'll be back soon at this old address, just wait a minute (or year)."
Do you have an example URL for us that we could take a look at?
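If you want to check this yourself before posting a URL, here's a minimal sketch using only Python's standard library. It fetches a URL without following redirects, so you can see the raw status code Google sees (the example URL and the function names are my own, purely for illustration):

```python
import http.client
from urllib.parse import urlsplit

def redirect_status(url):
    """Fetch `url` WITHOUT following redirects; return (status, location).

    Googlebot distinguishes 301 ("moved permanently" - drop the old URL)
    from 302 ("moved temporarily" - keep the old URL in the index).
    """
    parts = urlsplit(url)
    Conn = (http.client.HTTPSConnection if parts.scheme == "https"
            else http.client.HTTPConnection)
    conn = Conn(parts.netloc, timeout=10)
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    conn.request("HEAD", path)          # HEAD is enough; we only need headers
    resp = conn.getresponse()
    status = resp.status
    location = resp.getheader("Location")
    conn.close()
    return status, location

def is_permanent(status):
    """True only for redirect codes search engines treat as permanent."""
    return status in (301, 308)

# Usage (hypothetical old URL):
#   status, location = redirect_status("http://example.com/old-page")
#   print(status, location)
```

If `is_permanent()` comes back False for your old URLs, that would explain Google hanging on to the stale cache.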
-
Hi Dana,
If you're looking for someone to confirm what Dana said, she is 100% right.
If you're using a database-driven CMS like WordPress, it is possible for it to hold old pages and keep serving them. Make sure your host knows about your problem if you are using a CMS.
Hope this helps,
Thomas
-
Yes, I understand this issue very well and have seen it many times. Most often it happens because another of your indexed pages is still referencing those 404 pages. Please forgive Google; it is but a humble, not-so-intelligent bot (despite what the world would have you think). If you keep referencing a page that doesn't exist, Googlebot will say to itself "but it does exist! It does!" and keep indexing it, despite the 404 error. I mean, for all Google knows, it is your intention to bring that page back, and maybe you just screwed up and it is 404-ing, you know?
Here's the remedy:
Do a content audit. Here's a great post on how to do that: http://www.distilled.net/blog/seo/how-to-perform-a-content-audit/
You will discover many things, without a doubt, including pages that are linking to these 404 pages. Decide what to do with those pages, i.e. nix them, rebuild them, whatever. If you have pages in Google's index that you really do want removed, and those same pages are 404-ing, simply go to your Google Webmaster Tools account, select "Google Index" and then "Remove URLs" in the left nav, click "Create a new removal request," and enter the desired URL into that box. Provided your page meets the requirements (and if it's producing a 404 error, it does), Google will prioritize its removal. Under no circumstances should this be confused with the disavow tool, just to make that clear.
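As part of that audit, finding which of your own pages still link to the dead URLs doesn't need a third-party crawler. Here's a minimal sketch using only Python's standard library (the class and function names are my own invention): it extracts every link from a page's HTML so you can then check each one's status.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href value found in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all hrefs in `html`, in document order."""
    parser = LinkCollector()
    parser.feed(html)
    return parser.links
```

To hunt down the culprit pages you would fetch each page of your site, run `extract_links()` on its HTML, and flag any href that returns a 404 when requested; those flagged pages are the ones keeping Googlebot convinced your dead URLs still exist.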
I hope this helps. Any questions and I'm happy to help further if I can.
Related Questions
-
How to Diagnose "Crawled - Currently Not Indexed" in Google Search Console
The new Google Search Console gives a ton of information about which pages were excluded and why, but one that I'm struggling with is "crawled - currently not indexed." I have some clients that have fallen into this pit, and I've identified one reason why it's occurring on some of them - they have multiple websites covering the same information (local businesses) - but for others I'm completely flummoxed. Does anyone have any experience figuring this one out?
-
Does the new Google Analytics Search Console Beta tool use API to pull more data?
So my client has been asking for definitive proof of why the search query data provided in the Google Analytics Search Console reports does not exactly match up with the data presented directly in Search Console itself. The simple answer is that Google Search Console is limited to 1,000 rows of data. However, our client is requesting Google documentation of why the new Search Console beta tool has no row limit (hence much more data for a big website). I know that the Google Search Console API was available before Google announced the new Search Console beta tool in Google Analytics. I also know this API could pull in more data than the 1,000-row limit. However, is there any article available (preferably from Google) confirming that Google Analytics is pulling this Search Console data via the API? Thanks!
-
Google Webmaster Tools shows that my site has been crawled, but search results show old title tags, etc.
Does the index report reflect what will be displayed on Google results? Google seems to be indexing my site every Sunday...
-
Webmaster Tools vs. Google Trends data doesn't add up
I am investigating a two-month 25% drop in organic traffic from Google to a client's site. When I turned to the Webmaster Tools data for the site, there is a clear, gradual drop over the course of a couple months both in impressions and clicks. In general, the drop occurred across many pages and for a large number of queries; there wasn't a core group of keywords or pages that saw the drop...it was more sitewide. Yet, the average rankings reported by WMT were, for the top 100 or so landing pages, not significantly different. The site hosts information about medical conditions, and I wouldn't expect any time-related variations in search volume, and this was confirmed by looking at Google Trends data for a number of the top keywords. I started to look at the data by query for all the top keywords (all ranked in the top 10), and saw the following general trend: impressions were down, rankings stayed in the top 10, and Google Trends showed either flat or rising volumes. So I am trying to make sense of that. If the search volume trend did not decline and rankings held inside the top 10, then how could the number of impressions drop significantly? Am I trusting the WMT data too much? But the reality is that the volume of traffic measured by Google Analytics from Google organic did indeed drop the way Webmaster Tools show it.
-
Webmaster Tools Records
Hey, I'm trying to find a way of getting data from Google's Webmaster Tools and recording it in either a database or spreadsheet. Is there any software out there that I can use to do this, or would I have to look at developing it myself? Thanks, Luke.
-
Spike in 404 Errors from a redirecting domain
All, The non-www version of a site I own redirects to the www version. Recently, WMT began showing a big spike in 404 errors (1,000+) on the non-www version of the site. Site traffic is off about 15% since the spike in 404 errors. There was also a brief period about two weeks ago where the site went down due to an issue with the code that has now been resolved. Any ideas how WMT is showing 404 errors on redirected pages? Thanks, John
-
Subdomains and SEO - Analytics & Webmaster Tools Setup Help
Any advice on the following greatly appreciated: How to get multiple subdomain data into 1 Google Analytics profile? Can we get multiple subdomain data into Google Webmaster Tools (and if so how?) or do we need to set GWT up per subdomain?
-
What factors does the Keyword Difficulty Tool measure?
Hi guys, I am new to this community and finding loads of informative stuff here. The Keyword Difficulty tool looks very interesting. I think the Google Keyword Tool's competition field is based on PPC competition? Am I right in saying that the Moz Keyword Difficulty tool measures the top 10 ranking pages, their age, backlinks, etc.?