Issue with 'Crawl Errors' in Webmaster Tools
-
I have an issue with a large number of 'Not Found' pages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10.
Furthermore, I have checked a large number of the source pages to confirm that the links no longer exist, and, as I expected, they don't.
Firstly, I am concerned that Google thinks there are a vast number of broken links on this site when in fact there are not.
Secondly, if the errors do not actually exist (and never have), why do they remain listed in Webmaster Tools, which claims they were found again this month?
Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors report. Its guidance is that if a URL keeps returning a 404, it will eventually be dropped automatically. I don't know how many 404s Google needs to see before it gets rid of a URL and link that haven't existed for 18-24 months.
Thanks.
-
Thanks both for your responses. It's a strange one, and I can only assume that these pages remain in Google's index. I have checked many link sources and found that the links do not exist, and presumably haven't since the pages were deleted. It seems ridiculous that you should have to 301 every page you delete; there are literally 500+ of these phantom links to non-existent URLs, and the site is changing all the time.
I have opted to add a 'noindex' meta tag to the 404 pages and to encourage Google to drop them from the index by adding the URLs to the robots.txt file.
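For reference, a minimal sketch of the two changes described above (the paths shown are placeholders, not the site's real URLs):

In the <head> of the deleted pages' template:
<meta name="robots" content="noindex">

And in robots.txt:
User-agent: *
Disallow: /old-section/
Disallow: /deleted-page.html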
Let's see if it works - I'll post on here when I know for sure so other people with the same question can see the outcome.
Thanks again, Damien and Steven.
-
Completely agree with Damien. If the pages don't exist but Webmaster Tools is showing them, 301 them; there has to be a link somewhere on the internet that is causing Google to think they still exist. I would also go through the server logs to see if there is any additional information, such as a referring page, for requests to the non-existent URLs.
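If it helps, here is a rough sketch of that log check in Python. It assumes an Apache-style combined log format and a placeholder log path, so adjust both to your own setup:

# List the referrers behind 404 hits in a combined-format access log.
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder path; point this at your real log
# Matches the tail of a combined log line: "GET /page HTTP/1.1" 404 1234 "http://referrer/"
pattern = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ "(?P<referer>[^"]*)"')

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = pattern.search(line)
        if m and m.group("status") == "404":
            hits[(m.group("url"), m.group("referer"))] += 1

for (url, referer), count in hits.most_common(20):
    print(count, url, "<-", referer or "(no referrer)")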
-
Hey,
I guess if you've exhausted all other possibilities, you can either let them return a 404 and leave them be, which will most likely do you no harm, or 301 each URL to another relevant page on your site.
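If you go the redirect route on an Apache server, a single .htaccess line per URL is enough; this is just a sketch with placeholder paths:

Redirect 301 /old-deleted-page.html http://www.example.com/relevant-page/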
Make sure they are actually returning a 404 first, though, via a header response check.
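Any header checker will do for this (curl -I, a browser extension, or an online tool); a quick Python sketch with a placeholder URL might look like:

# Check the HTTP status code a URL actually returns.
import urllib.request
import urllib.error

def status_of(url):
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

print(status_of("http://www.example.com/deleted-page.html"))  # expect 404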
DD