Webmaster tools crawl errors
-
Hi there,
I've been tracking my Webmaster Tools crawl errors for a while now (about six months), and I'm noticing that some pages that have long been 404 are still popping up in the crawl errors. Those pages have no data for XML sitemap linking, and the remote links pointing to them come from pages that are long-gone 404s as well.
Those pages return a 404 error page plus a redirect to the homepage, yet Google still shows them with old cached content.
Does anyone have a clue why this is happening?
-
Thank you very much, Dana, for the superb answer!
Any clue as to how critical these errors are for my website's SEO? (Is this problem worth fixing?)
-
Hi!
Have you verified that there is a proper 301 redirect from the old URLs? If, for example, there is a 302 instead, I wouldn't be surprised if Google kept the old info in the index, since you have effectively said, "I'll be back soon at this old address, just wait a minute (or a year)."
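If it helps, you can check this yourself. Here is a minimal sketch (Python standard library only; the example URL is a placeholder) that makes a single request without following redirects, so you see the same first status code Googlebot sees:

```python
# Check the raw HTTP status an old URL returns, without following
# redirects -- a 302 here is the "temporary" signal described above.
from http import client
from urllib.parse import urlparse

def fetch_status(url):
    """One HEAD request; redirects are NOT followed, so this returns
    the first status code a crawler sees, plus any Location header."""
    parts = urlparse(url)
    conn_cls = client.HTTPSConnection if parts.scheme == "https" else client.HTTPConnection
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location", "")

def describe(status):
    """Classify a status code the way this thread discusses them."""
    if status == 301:
        return "permanent redirect (good for moved pages)"
    if status == 302:
        return "temporary redirect (Google may keep the old URL indexed)"
    if status == 404:
        return "not found"
    return "other"
```

Usage would be something like `fetch_status("http://example.com/old-page")`; if the first status is 302 rather than 301, that alone could explain the stale index entries.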
Do you have an example URL for us that we could take a look at?
-
Hi Dana,
If you're looking for someone to confirm what Dana said, she is 100% right.
If you're using a database-driven CMS like WordPress, it is possible for it to hold onto old pages and keep serving them. Make sure your host knows about your problem if you are using a CMS.
Hope this helps,
Thomas
-
Yes, I understand this issue very well and have seen it many times. Most often, it happens because another of your pages that is still being indexed keeps referencing those 404 pages. Please forgive Google; it is but a humble, not-so-intelligent bot (despite what the world would have you think). If you keep referencing a page that doesn't exist, Googlebot will say to itself, "But it does exist! It does!" and keep indexing it despite the 404 error. For all Google knows, it is your intention to bring that page back, and maybe you just screwed something up and it is 404-ing, you know?
Here's the remedy:
Do a content audit. Here's a great post on how to do that: http://www.distilled.net/blog/seo/how-to-perform-a-content-audit/
You will discover many things, without a doubt, including pages that are linking to these 404 pages. Decide what to do with those pages, i.e., nix them, rebuild them, whatever. If you have pages in Google that you really do want out of the index and those same pages are 404-ing, go to your Google Webmaster Tools account, select "Google Index" in the left nav, and then "Remove URLs." Click "Create a new removal request" and enter the desired URL in the box. Provided your page meets the requirements (and if it's producing a 404 error, it does), Google will prioritize its removal. Under no circumstances should this be confused with the disavow tool, just to make that clear.
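The "find pages linking to the 404s" part of the audit can also be automated. Here's a minimal crawler sketch (Python standard library only; the start URL and page limit are placeholder assumptions) that walks one domain and reports which pages link to URLs returning 404:

```python
# Breadth-first crawl of a single domain; collects (source_page, broken_link)
# pairs for every internal link that returns a 404.
from html.parser import HTMLParser
from urllib.error import HTTPError
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_links_to_404s(start_url, max_pages=50):
    """Return (source, broken_target) pairs found within one domain."""
    domain = urlparse(start_url).netloc
    to_visit, seen, broken = [start_url], set(), []
    while to_visit and len(seen) < max_pages:
        page = to_visit.pop(0)
        if page in seen:
            continue
        seen.add(page)
        try:
            html = urlopen(page, timeout=10).read().decode("utf-8", "replace")
        except HTTPError:
            continue
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            target = urljoin(page, href)
            if urlparse(target).netloc != domain:
                continue  # only audit internal links
            try:
                urlopen(target, timeout=10)
                to_visit.append(target)
            except HTTPError as err:
                if err.code == 404:
                    broken.append((page, target))
    return broken
```

This is a sketch, not a replacement for a full content audit, but the `broken` list it returns is exactly the set of internal references keeping Googlebot interested in the dead pages.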
I hope this helps. Any questions and I'm happy to help further if I can.
Related Questions
-
Is there such a tool?
Morning, Moz! Is there such a tool to check whether your site has the Google Analytics script, page by page (i.e., crawling an entire site)? It's a 500+ page website, so I'm trying to avoid viewing the source of each individual page. Thank you!
Reporting & Analytics | | Whittie0 -
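For what it's worth, this check can be scripted without a dedicated tool. A minimal sketch (Python standard library only; the marker list is an assumption covering the classic ga.js and Universal analytics.js snippets) that takes a list of URLs, e.g. from your sitemap, and reports pages missing the tag:

```python
# Flag pages whose source contains no known Google Analytics marker.
from urllib.request import urlopen

GA_MARKERS = (
    "google-analytics.com/ga.js",        # classic Analytics
    "google-analytics.com/analytics.js", # Universal Analytics
)

def has_ga(html):
    """True if any known Analytics marker appears in the page source."""
    return any(marker in html for marker in GA_MARKERS)

def pages_missing_ga(urls):
    missing = []
    for url in urls:
        html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        if not has_ga(html):
            missing.append(url)
    return missing
```

On a 500-page site this is a few minutes of runtime versus viewing source by hand; extend `GA_MARKERS` if your tag is loaded some other way.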
Google Tag Assistant showing Error
Hello, I am using the Google Tag Assistant extension in Chrome, and it is giving me one error for Google Tag Manager at my checkout step 1. The error is -
Reporting & Analytics | | devdan0 -
Google Webmaster indicates robots.txt access error
Seems that Google has not been crawling due to an access issue with our robots.txt. Late 2013 we migrated to a new host, WPEngine, so things might have changed; however, this issue appears to be recent. A quick test shows I can access the file. This is the Google Webmaster Tools message: http://www.growth trac dot com/: Googlebot can't access your site. January 17, 2014. Over the last 24 hours, Googlebot encountered 62 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 8.8%. Note the above message says "over the last 24 hours", however the date is Jan-17. This is the response from our host:
Thanks for contacting WP Engine support! I looked into the suggestions listed below and it doesn't appear that these scenarios are the cause of the errors. I looked into the server logs and I was only able to find 200 server responses on the /robots.txt. Secondly, I made sure that the server wasn't overloaded. The last suggestion doesn't apply to your setup on WP Engine. We do not have any leads as to why the errors occurred. If you have any other questions or concerns, please feel free to reach out to us.
Google is crawling the site -- should I be concerned? If so, is there a way to remedy this? By the way, our robots file is very lean, only a few lines, not a big deal. Thanks!
Reporting & Analytics | | jmueller0823 -
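One way to sanity-check this from outside the host: fetch robots.txt with Googlebot's User-Agent string, since some servers or firewalls treat bot traffic differently from browser traffic. A minimal sketch (Python standard library only; the hostname is a placeholder), plus a small helper matching the percentage figure GWT reports:

```python
# Fetch /robots.txt as Googlebot and report the status code; a browser
# seeing 200 while Googlebot sees errors would explain the discrepancy.
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_robots_status(host):
    """Return the HTTP status for /robots.txt fetched with Googlebot's UA,
    or None on a DNS/connection failure (which Googlebot also counts as
    an 'unreachable' error)."""
    req = Request("http://%s/robots.txt" % host,
                  headers={"User-Agent": GOOGLEBOT_UA})
    try:
        return urlopen(req, timeout=10).status
    except HTTPError as err:
        return err.code
    except URLError:
        return None

def error_rate(errors, attempts):
    """Error rate as a percentage, like the figure GWT reports."""
    return round(100.0 * errors / attempts, 1)
```

Running `fetch_robots_status("yoursite.com")` in a loop for a while would show whether the failures are intermittent, which server logs full of 200s can hide if the request never reaches the web server at all.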
Tools For Analysing Backlinks
Does anyone know of a good tool that analyses backlinks? I have already tried Link Research Tools and found it very useful, but it is very expensive. Does anyone know of an alternative that does the same thing, i.e., will tell you which links are causing harm to your site?
Reporting & Analytics | | AAttias0 -
Google Webmaster. Backlinks
GWMT only shows that there are 3 domains pointing to a site of mine (I'm looking under "Links to site"). But this can't be true, because the site is pretty old and I know there are hundreds of domains that point to it. What would explain this discrepancy? And is there some other free tool that will show all the backlinks? I've used Open Site Explorer, but that tool isn't close to as comprehensive as GWMT usually is (based on other sites I've analyzed).
Reporting & Analytics | | priceseo0 -
Solving link and duplicate content errors created by WordPress blog and tags?
SEOmoz tells me my site's blog (a WordPress site) has two big problems: a few pages with too many links, and duplicate content. The problem is that these pages seem legit the way they are, but obviously I need to fix the problem, sooooo...
Duplicate content error: the error is a result of being able to browse the blog by tags. Each blog post has multiple tags, so the url.com/blog/tag pages occasionally show the same articles. Does anyone know a way to not get penalized for this? Should I exclude these pages from being crawled/sitemapped?
Too many links error: SEOmoz tells me my main blog page has too many links (both url.com/blog/ and url.com/blog-2/); these pages have excerpts of the 6 most recent blog posts. I feel like this should not be an error... does anyone know of a solution that will keep the site from being penalized for these pages? Thanks!
Reporting & Analytics | | RUNNERagency0 -
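One common remedy for tag-archive duplication (a sketch of one option, not the only one; many WordPress SEO plugins expose this as a setting) is to serve a `noindex, follow` robots meta tag on tag pages, so crawlers can still follow the links in the archives without indexing the duplicate listings:

```html
<!-- Emitted in the <head> of /blog/tag/* archive pages only; the blog's
     main pages and individual posts keep their normal, indexable heads. -->
<meta name="robots" content="noindex, follow">
```

The `follow` part matters: it keeps link equity flowing through the archives even though the archive pages themselves drop out of the index.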
How serious are the duplicate page content and tag errors?
I have a travel booking website which reserves flights, cars, hotels, vacation packages, and cruises. I encounter a huge number of Duplicate Page Title and Duplicate Page Content errors. This is expected because of the nature of my website: if you look for flights between Washington DC and London Heathrow, you will get at least 60 different options with the same content and title tags. How can I go about reducing the harm, if any, of duplicate content and meta tags on my website, knowing that I will invariably have multiple pages with the same content and tags? Would appreciate your advice. S.H
Reporting & Analytics | | sherohass0 -
500 errors and impact on Google rankings
Since the launch of our newly designed website about 6 months ago, we have been experiencing a high number of 500 server errors (>2,000). Attempts to resolve these errors have been unsuccessful to date. We have just started to notice a consistent and sustained drop in rankings despite our hard-fought efforts to correct them. Two questions: can very high levels of 500 errors adversely affect our Google rankings? And, if so, what type of specialist (what are they called) has the expertise to investigate and fix this issue? I should also mention that the sitemap also goes down on a regular basis, which some have said is due to the size of the site (500+ pages). I don't know if that's part of the same problem. Thanks.
Reporting & Analytics | | ahw0
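To size a problem like this, one approach is to pull every URL from the sitemap and count 5xx responses. A minimal sketch (Python standard library only; the sitemap URL would be your own):

```python
# Parse a sitemap's <loc> entries and count URLs returning 5xx errors.
import xml.etree.ElementTree as ET
from urllib.error import HTTPError
from urllib.request import urlopen

SITEMAP_TAG = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

def sitemap_urls(sitemap_xml):
    """Extract the <loc> entries from sitemap XML text."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_TAG)]

def count_server_errors(urls):
    """Return (url, status) pairs for every URL answering with a 5xx."""
    errors = []
    for url in urls:
        try:
            urlopen(url, timeout=10)
        except HTTPError as err:
            if 500 <= err.code < 600:
                errors.append((url, err.code))
    return errors
```

Feeding it `urlopen("http://yoursite.com/sitemap.xml").read()` would give a concrete list of failing URLs to hand to whoever investigates, which is far more actionable than an aggregate error count.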