Issue with 'Crawl Errors' in Webmaster Tools
-
Have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, clicking into the 'Linked From' column shows that all of the link sources are old, many from 2009-10.
Furthermore, I have checked a large number of the source pages to confirm that the links no longer exist, and as I expected, they don't.
Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not.
Secondly, if these errors do not actually exist (and never have), why do they remain listed in Webmaster Tools, which claims they were found again this month?!
Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors; the guidance is that if a URL keeps returning a 404, it will eventually be removed automatically. Well, how many 404s does a URL that hasn't existed for 18-24 months need before it's removed?!
Thanks.
-
Thanks both for your responses. It's a strange one, and I can only assume that these pages remain in Google's index. I have checked many link sources and found that the links do not exist, and therefore haven't since the pages were deleted. It seems ridiculous that you should have to 301 every page you delete; there are literally 500+ of these phantom links to non-existent URLs, and the site is changing all the time.
I have opted to add a 'noindex' meta tag to the 404 pages and to also encourage Google to drop them from the index by adding the pages to the robots.txt file.
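For anyone following along, a minimal sketch of those two changes, with a placeholder path; note the caveat in the robots.txt comment, since the two can work against each other.

    <!-- meta tag on each deleted page; the page must remain crawlable for this to be seen -->
    <meta name="robots" content="noindex">

    # robots.txt (placeholder path; a URL blocked here generally won't be
    # recrawled, so the crawler may never see the noindex tag on that page)
    User-agent: *
    Disallow: /deleted-page/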
Let's see if it works - I'll post on here when I know for sure so other people with the same question can see the outcome.
Thanks again, Damien and Steven.
-
Completely agree with Damien. If the pages don't exist but Webmaster Tools is showing them, 301 them; there has to be a link somewhere on the internet that is making Google think they exist. I would also go through the server logs to see if there is any additional information, such as a referring page, for the non-existent URLs.
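If the site is on Apache (an .htaccess file comes up later in this thread), the 301 is a one-liner per URL; both paths below are placeholders, not the poster's actual pages:

    # .htaccess: permanently redirect a deleted page to the most relevant live page
    Redirect 301 /old-deleted-page/ http://www.example.com/relevant-live-page/

And a rough way to pull 404s with their referrers out of a combined-format access log (the log path and format are assumptions about the server setup):

    awk -F'"' '$3 ~ / 404 / {print $2, $4}' /var/log/apache2/access.log | sort | uniq -c | sort -rn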
-
Hey,
I guess if you've exhausted all other possibilities, you can either let them return a 404 and leave them be, which will most likely do you no harm, or 301 each URL to another relevant page on your site.
Make sure they are actually returning a 404 first, though, via a header response check.
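For example, a quick check from the command line (placeholder URL); the first line of output shows the status, and a 'soft 404', meaning a 200 that serves an error page, is exactly what this catches:

    # fetch headers only; expect "HTTP/1.1 404 Not Found" on the first line
    curl -I http://www.example.com/deleted-page/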
DD
-
Related Questions
-
Webmaster Tools not showing links but Moz OSE is showing links. Why can't I see them in Google Search Console?
Hi, please see the attached photos. I have a website that shows external follow links when performing a search on Open Site Explorer. However, they are not recognised or visible in Search Console. This is the case for both internal and external links. The internal links are 'nofollow', which I am getting the developer to rectify. Any ideas why I can't see the 'follow' external links? Thanks in advance to those who help me out. Jesse
Technical SEO | jessew
-
Crawl Diagnostics: Duplicate Content Issues
The Moz crawl diagnostic is showing that I have some duplicate content issues on my site. For the most part, these are variations of the same product that are listed individually (e.g. size/color). What would be the best way to deal with this? Choose one variation of the product and add a canonical tag? Thanks
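For illustration, a hedged sketch of that approach with placeholder URLs: each size/color variation page carries a canonical tag pointing at the one URL you want indexed.

    <!-- in the <head> of /product-name-red/ and /product-name-blue/ -->
    <link rel="canonical" href="http://www.example.com/product-name/">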
Technical SEO | inhouseseo
-
Are image pages considered 'thin' content pages?
I am currently doing a site audit. The total number of pages on the website is around 400; 187 of them are image pages that come up with a 'zero' word count in the Screaming Frog report. I need to know whether they will be considered 'thin' content by search engines. Should I include them as an issue? An answer would be most appreciated.
Technical SEO | MTalhaImtiaz
-
Sitemap as Referrer in Crawl Error Report
I have just downloaded the SEOmoz crawl error report, and I have a number of pages listed which all show FALSE. The only common denominator is the referrer: the sitemap. I can't find anything wrong; should I be worried that this is appearing in the error report?
Technical SEO | ChristinaRadisic
-
Error Reporting
http://pro.seomoz.org/campaigns/33868/issues/18
Rel Canonical, found about 16 hours ago. Tag value: http://www.geeks.com/. Description: "Using rel=canonical suggests to search engines which URL should be seen as canonical." We do have rel=canonical on some of the pages this report is recommending that we "fix" this issue on.
Rel Canonical, found about 16 hours ago. Tag value: http://www.geeks.com/products.asp?cat=MBB. Description: "Using rel=canonical suggests to search engines which URL should be seen as canonical."
Technical SEO | JustinGeeks
-
How do I Organize an XML Sitemap for Google Webmaster Tools?
OK, so I used an XML sitemap generator tool, xml-sitemaps.com, for Google Webmaster Tools submission. The problem is that the priorities are all out of whack. How on earth do I organize them across thousands of pages? Should I be spending hours organizing it?
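For reference, priority is just an optional 0.0-1.0 hint per URL (0.5 is the default), so one rough approach is to raise it for key pages and leave everything else at the default rather than hand-tuning thousands of entries. A sitemap fragment with placeholder URLs:

    <url>
      <loc>http://www.example.com/</loc>
      <priority>1.0</priority>
    </url>
    <url>
      <loc>http://www.example.com/category/deep-page/</loc>
      <priority>0.5</priority>
    </url>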
Technical SEO | schmeetz
-
Issue with Joomla Site not showing in SERPs
Site: simpsonelectricnc dot com. I'm working on a Joomla website for a local business that isn't ranking at all for any relevant keyword, including the business name. The site is only about six months old and has relatively few links. I realize it takes time to compete for even low-volume keywords, but I think something else may be preventing the site from showing up.
The site is not blocked by robots.txt (which includes a valid reference to the sitemap).
There is no duplicate content issue; the .htaccess is redirecting all non-www traffic to the www version.
Every page has a unique title and H1 tag.
The URLs are search-engine friendly (not dynamic either).
The XML sitemap is live and submitted to Google WM Tools; Google shows that it is indexing about 70% of the submitted URLs.
The site has essentially no domain authority (0.02) according to Moz; I'm assuming this is due to the lack of links and its short life on the web. Until today, 98% of the pages had identical meta descriptions. Again, I realize six months is not an eternity, but the site will not even show up for "business name + city, state" searches in Google. In fact, the only way I've seen it in organic results is to search for the exact URL. I would greatly appreciate any help.
Technical SEO | CGR-Creative
-
Should I have a 'more' button for links?
I have a website that has a page for each town. Rather than listing all the towns with a link to each, I want to show only the most popular towns and have a 'more' button that shows all of them when you click it. I know that the search engines can always see the full list of links, and even though the visitor can't, this doesn't go against Google guidelines because there is no deception involved; the 'more' button is quite clear. However, my colleague is concerned that this is 'making life hard' for the search engines and so the pages are less likely to be indexed. I disagree. Is he right to worry about this?
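For illustration, a minimal sketch of that pattern (town names and IDs are placeholders): the full list of links stays in the HTML so crawlers can follow it, and the button only toggles visibility for visitors.

    <ul id="popular-towns">
      <li><a href="/towns/oxford/">Oxford</a></li>
      <li><a href="/towns/cambridge/">Cambridge</a></li>
    </ul>
    <ul id="all-towns" style="display:none">
      <li><a href="/towns/st-albans/">St Albans</a></li>
      <li><a href="/towns/woking/">Woking</a></li>
    </ul>
    <button onclick="document.getElementById('all-towns').style.display='block'">More towns</button>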
Technical SEO | mascotmike