Webmaster Tools crawl errors
-
Hi there,
I've been tracking my Webmaster Tools crawl errors for a while now (6 months), and I'm noticing that some pages that have been 404 for a long time still keep popping up in the crawl errors. Those pages show no data for XML sitemap linking, and the external links pointing to them are from pages that are long-gone 404s as well.
Those pages return a 404 error page plus a redirect to the homepage, and Google still lists them with old cached content.
Does anyone have a clue why this is happening?
-
Thank you very much, Dana, for the superb answer!
Any clue how critical these errors are for my website's SEO? (Is this problem worth fixing?)
-
Hi!
Have you verified that there is a proper 301 redirect from the old URLs? If, for example, there is a 302 instead, I wouldn't be surprised if Google kept the old info in the index, since you have effectively said, "I'll be back soon at this old address, just wait a minute (or a year)."
Do you have an example URL for us that we could take a look at?
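To check this yourself, you need to see the status code of the first hop without letting your client follow the redirect (a browser or a default HTTP library will chase it to the destination and hide the code). A minimal sketch using only the Python standard library; the `interpret` wording is my own rough summary of how Google treats each code, not an official reference:

```python
import http.client
from urllib.parse import urlsplit

def fetch_status(url):
    """Issue a single HEAD request WITHOUT following redirects,
    so we see the exact status code Googlebot sees on the first hop."""
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection
                if parts.scheme == "https" else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location

def interpret(status):
    """Rough reading of what each code signals to a search engine."""
    if status in (301, 308):
        return "permanent redirect - old URL should be replaced in the index"
    if status in (302, 307):
        return "temporary redirect - Google may keep the old URL indexed"
    if status in (404, 410):
        return "gone - URL should eventually drop out of the index"
    return "check manually"
```

For example, `fetch_status("http://example.com/old-page")` returning `(302, "http://example.com/")` would be exactly the "I'll be back soon" situation described above.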
-
Hi Dana,
If you're looking for someone to confirm what Dana said, she is 100% right.
If you're using a database-driven CMS like WordPress, it is possible for it to hold old pages and serve them. Make sure your host knows about your problem if you are using a CMS.
Hope this helps,
Thomas
-
Yes, I understand this issue very well and have seen it many times. Most often it happens because another of your pages that is still being indexed is referencing those 404 pages. Please forgive Google; it is but a humble, not-so-intelligent bot (despite what the world would have you think). If you keep referencing a page that doesn't exist, Googlebot will say to itself, "But it does exist! It does!" and keep indexing it despite the 404 error. I mean, for all Google knows, it is your intention to bring that page back, and maybe you just screwed up and it is 404-ing, you know?
Here's the remedy:
Do a content audit. Here's a great post on how to do that: http://www.distilled.net/blog/seo/how-to-perform-a-content-audit/
You will discover many things, without a doubt, including pages that are linking to these 404 pages. Decide what to do with those pages, i.e. nix them, rebuild them, whatever. If you have pages in Google that you really do want out of the index and those same pages are 404-ing, simply go to your Google Webmaster Tools account, select "Google Index" and then "Remove URLs" in the left nav, click "Create a new removal request," and enter the desired URL into that box. Provided your page meets the requirements (and if it's producing a 404 error, it does), Google will prioritize its removal. Under no circumstances should this be confused with the disavow tool. Just to make that clear.
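The "pages linking to these 404 pages" part of the audit is easy to automate. A minimal sketch using only the Python standard library; the helper names are illustrative, and the dead-URL set would come from your crawl-errors export rather than being hard-coded:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every anchor href found in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def dead_links_in(page_html, dead_urls):
    """Return the links in a page that point at known-404 URLs.

    dead_urls: set of URLs taken from the crawl-errors report.
    """
    parser = LinkExtractor()
    parser.feed(page_html)
    return [link for link in parser.links if link in dead_urls]
```

Run `dead_links_in` over each page of your site; any page that returns a non-empty list is one of the internal references keeping the 404 URL alive in Google's eyes.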
I hope this helps. Any questions and I'm happy to help further if I can.
Related Questions
-
Search Console Crawl Errors/Not Found - Strange URLs
Hello, in Google Search Console under Crawl > Crawl Errors > Not found I have strange URLs like the following: https://www.domain.com//UbaOZ/ https://www.domain.com//UPhXZ/ https://www.domain.com//KaUpZ/WYdhZ/SnQZZ/MOcUZ/ There is no info in the Linked From tab. Have you seen this type of error? Does anyone know what's causing it? How should it be fixed? Thanks for reading and the help!
Reporting & Analytics | chuck-layton
-
Cannot divide a domain into different properties in Search Console (Webmaster Tools)
Dear Moz Community, I hope you can give me a hand with the following questions. I'm in charge of SEO for an ecommerce site in LATAM. Its service is available in several countries, so each country has its own subdirectory, e.g. /ar /pe /co /bo /cl /br, etc. (in the future we will move to different ccTLDs). I have been recommended to split or create different Search Console (Webmaster Tools) properties, one for each subdirectory, but when I'm creating a new property for a subdirectory, let's say www.domain.com/ar, Webmaster Tools starts creating a property for www.domain.com/ar/ (NOTICE THE LAST SLASH) and it returns an error since that page doesn't exist. What do you recommend I do? Best wishes, Pablo López C
Reporting & Analytics | pablo_carrara
-
Mismatch between Analytics and Webmaster Tools
Hi, I've noticed that the Queries report in Analytics gives a completely different answer (for impressions, clicks, CTR, avg. position) than Webmaster Tools. What gives? Which would you use? Thanks, Amelia
Reporting & Analytics | CommT
-
In Google Analytics, pages that were 301 redirected are still being crawled. What's the issue here?
URLs that we redirected are still being crawled according to Google Analytics. Since they don't exist, they have high bounce rates. What could the issue be?
Reporting & Analytics | prestigeluxuryrentals.com
-
Changing URL Parameters in Webmaster Tools
We have a bit of a conundrum. Webmaster Tools is telling us that they are crawling too many URLs: "Googlebot found an extremely high number of URLs on your site: http://www.uncommongoods.com/" In their list of example URLs, all of the URLs have tons of parameters. We would probably be OK telling Google not to index any of the URLs with parameters. We have a great URL structure: all of our category and product pages have clean links (no parameters). The parameters come only from sorts and filters. We don't have a need for Google to index all of these pages. However, Google Analytics is showing us that over the last year we received a substantial amount of search revenue from many of these URLs (800+ of them converted). So, Google is telling us they are unhappy. We want to make Google happy by ignoring all of the parameter URLs, but we're worried this will kill the revenue we're seeing. Two questions here: 1. What do we have to lose by keeping everything as-is? Google is giving us errors, but other than that, what are the negative repercussions? 2. If we were to de-index all of the parameter URLs via Webmaster Tools, how much of the revenue would likely be recovered by our non-parameter URLs? I've linked to a screenshot from Google Analytics: ArxMSMG.jpg
Reporting & Analytics | znotes
-
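For the parameter-URL question above, one common middle ground (rather than de-indexing wholesale) is to point each sort/filter URL at its clean counterpart with rel=canonical, so the parameter pages can keep converting while their signals consolidate. A minimal sketch of deriving the clean target, assuming (as the question states) that the sorts and filters live entirely in the query string; the example URL is taken from the question itself:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_without_params(url):
    """Strip the query string (sort/filter parameters) from a URL,
    leaving the clean path that should carry the ranking signals."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
```

For example, `canonical_without_params("http://www.uncommongoods.com/category?sort=price&page=2")` yields `"http://www.uncommongoods.com/category"`, the URL you would emit in the page's rel=canonical tag.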
Webmaster Tools Error: Unreachable page
Hi all, when I try to use the "Fetch as Google" feature in Webmaster Tools, I get the error "Unreachable page." I checked the Google Analytics code, and everything seems to be OK. What should I do?
Reporting & Analytics | fisniks
-
Schema Rich Snippet markup tool is validating the markup, but Webmaster Tools says it isn't
All my Schema rich snippets are being validated in the Webmaster Tools Rich Snippets testing tool here: http://www.google.com/webmasters/tools/richsnippets But when I take a peek at the new tab in Webmaster Tools while signed into my Analytics account, it shows there are no active markups on my site. Has anyone else been having this issue? If it says it's working in the Google tool at the link above, should I even be worried that it's not showing up when I'm signed into my Analytics account? Thanks.
Reporting & Analytics | DCochrane
-
Google Webmaster Backlinks
How can I view/download all my backlinks in Google Webmaster Tools? Currently, when I try to download or view them, it shows only the "Top 14 domains linking to" information and not ALL the links. Any thoughts?
Reporting & Analytics | krishru