Remove more than 1000 crawl errors from GWT in one day?
-
In Google Webmaster Tools there is a "Crawl Errors" feature that displays the top 1,000 crawl errors Google has found on your site.
I have around 16k crawl errors at the moment, all of which are already fixed, but I can only mark 1,000 of them as fixed each day/each time Google crawls the site. (It only displays the top 1,000 errors, and once I have marked those as fixed, it won't show the other errors for a while.)
Does anyone know if it's possible to mark ALL errors as fixed in one operation?
-
Google indexed around 20k useless URLs because of the enormous number of URL variants MediaWiki generates when "Short URLs" are not configured.
It was resolved when we moved the wiki to another location and enabled the short URLs.
We just 301-redirected everything.
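If you want to sanity-check that the old URLs really do return 301s in bulk, a short script like this works (a minimal sketch; the URLs below are placeholders, not our actual wiki paths):

```python
import requests

# Placeholder examples of the old long-form MediaWiki URLs
old_urls = [
    "http://wiki.example.com/index.php?title=Main_Page",
    "http://wiki.example.com/index.php?title=Special:RecentChanges",
]

for url in old_urls:
    # allow_redirects=False shows the raw redirect status, not the target page
    resp = requests.head(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "(no Location header)")
    flag = "OK " if resp.status_code == 301 else "FIX"
    print(flag, resp.status_code, url, "->", target)
```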
-
So Google indexed more than 16,000 pages on your site, and then what did you do?
Did you just remove them (404) or redirect them (301)?
-
No problem at all. We had a wiki up and running without the "short URLs", so Google had ~19k errors on it because of overly long/complicated URLs. We removed that setup; problem solved, and all errors resolved.
-
Hi Host1,
It is not possible! You can only mark 1,000 errors as fixed a day.
May I ask how you fixed 16,000 errors at once?
Regards,
Alsvik
Related Questions
-
No index and Crawl Budget
Hello, if we noindex pages, will it improve crawl budget? For example, pages like these:
https://x-z.com/2012/10/
https://x-y.com/2012/06/
https://x-y.com/2013/03/
https://x-y.com/2019/10/
https://x-y.com/2019/08/
Should we delete/redirect such pages? Thanks
Technical SEO | Johnroger
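Before choosing between noindex, deletion, or redirects, it can help to see what robots directives those archive pages already send. A minimal sketch (assuming the requests and BeautifulSoup packages; the URLs are the placeholders from the question):

```python
import requests
from bs4 import BeautifulSoup

# Archive URLs from the question (placeholders for the real site)
urls = [
    "https://x-y.com/2012/06/",
    "https://x-y.com/2013/03/",
    "https://x-y.com/2019/10/",
]

for url in urls:
    resp = requests.get(url, timeout=10)
    meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
    print(url,
          "| meta robots:", meta.get("content", "") if meta else "none",
          "| X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "none"))
```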
-
Will putting a one-page site up for all other countries stop Googlebot from crawling my UK website?
I have a client that only wants UK users to be able to purchase from the UK site. Currently, customers from the US and other countries are purchasing from the UK site. The client wants a single webpage displayed to users outside the UK who try to access the UK site. This is fine, but what impact would this have on Google's bots trying to crawl the UK website? I have scoured the web for an answer but can't find one. Any help will be greatly appreciated. Thanks 🙂
Technical SEO | lbagley
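One thing to watch: Googlebot crawls mostly from US IP addresses, so a blanket block on non-UK visitors would also hide the site from Google. A common pattern is to exempt verified crawlers from the geo-logic using the reverse-then-forward DNS check that Google documents; a rough sketch:

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS check for Googlebot, per Google's guidance."""
    try:
        host = socket.gethostbyaddr(ip)[0]
        # The reverse lookup must land in Google's crawler domains
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # The hostname must resolve back to the same IP (forward confirmation)
        return ip in {info[4][0] for info in socket.getaddrinfo(host, None)}
    except OSError:
        return False

# Only geo-redirect visitors who are NOT verified crawlers
if not is_verified_googlebot("203.0.113.7"):  # example visitor IP
    print("apply the geo-redirect here")
```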
-
Pages appear fine in a browser but return a 404 error when crawled?
I am working on an eCommerce website that has been written in WordPress, with the shop pages in E commerce Plus PHP v6.2.7. All the shop product pages appear to work fine in a browser, but 404 errors are returned when the pages are crawled. WMT also returns a 404 error when 'Fetch as Google' is used. Here is a typical page: http://www.flyingjacket.com/proddetail.php?prod=Hepburn-Jacket Why is this page returning a 404 error when crawled? Please help.
Technical SEO | Web-Incite
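A page that returns 200 in a browser but 404 to 'Fetch as Google' often means the server is treating bot user agents differently. A quick way to test, using the URL from the question:

```python
import requests

url = "http://www.flyingjacket.com/proddetail.php?prod=Hepburn-Jacket"

agents = {
    "browser":   "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for label, ua in agents.items():
    resp = requests.get(url, headers={"User-Agent": ua}, allow_redirects=False, timeout=10)
    # If these status codes differ, something server-side is sniffing the user agent
    print(f"{label:9s} -> HTTP {resp.status_code}")
```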
-
Moving a website from one domain to another
Hi guys, I figured I'd investigate this fully before potentially ruining a client's traffic. The rundown: two websites; one is an ecommerce store and the other is just a brochure website that references the ecommerce store. The ecommerce store is hosted on a server we control, whereas the brochure one isn't. The URL for the brochure site is nice and simple, which is the reason for the switch, as the ecommerce URL is very long and hard to remember. From an SEO point of view, will it be a case of 301-redirecting every URL from the old domain name to the new one, or is there an easier option? Any tips or links to more information would be much appreciated. Thanks, Dan
Technical SEO | Sparkstone
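In practice this is usually one pattern rule in the web server config rather than thousands of individual redirects: same path, new host, 301. The logic, shown here as an illustrative Flask sketch with a hypothetical domain:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def catch_all(path):
    # Keep the path and query string, swap the host, send a permanent redirect
    new_url = "https://nice-short.example/" + path
    if request.query_string:
        new_url += "?" + request.query_string.decode()
    return redirect(new_url, code=301)
```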
-
Odd URL errors upon crawl
Hi, I see this in Google Webmaster Tools, and am now also seeing it here... when a crawl is performed on my site, I get many 500 server error codes for URLs that I don't believe exist. It's as if it sees a normal URL but adds this to it: %3Cdiv%20id= It's like this for hundreds of URLs.
Good URL that actually exists: http://www.ffr-dsi.com/food-retailing/supplies/
URL that causes an error, and I have no idea why: http://www.ffr-dsi.com/food-retailing/supplies/%3Cdiv%20id=
Thanks!
Technical SEO | Matt1
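%3Cdiv%20id= is the URL-encoded form of <div id=, which usually points to a malformed link somewhere in the page (typically an unclosed or unquoted href attribute swallowing the markup after it); crawlers then resolve that fragment as a relative URL. A rough sketch to hunt for the offending link:

```python
import re
import requests

page = "http://www.ffr-dsi.com/food-retailing/supplies/"
html = requests.get(page, timeout=10).text

# Flag href values containing raw '<' markup, a common sign of a broken attribute
for match in re.finditer(r'href\s*=\s*"([^"]*<[^"]*)"', html):
    print("suspect href:", match.group(1))
```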
-
Best Practice to Remove a Blog
Note: Re-posting since I accidentally marked this as answered. Hi, I have a blog with thousands of URLs; the blog is part of my site. I would like to retire the blog. I think the best choices are:
1. 404 them. The problem is the large number of 404s; I know this is OK, but it makes me hesitant.
2. Meta tag noindex, nofollow. This would be great, but the question is that they are already indexed.
Thoughts? Thanks
PS: A 301 redirect to the main page would be flagged as a soft 404.
Technical SEO | Bucky
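If you go the noindex route, the pages have to stay crawlable (not blocked in robots.txt) so Google can see the directive and drop the already-indexed URLs on recrawl. Rather than editing thousands of templates, the directive can also be sent as a response header; a minimal WSGI-style sketch (the /blog/ prefix is an assumption):

```python
class NoindexBlog:
    """WSGI middleware: adds an X-Robots-Tag header to every /blog/ response."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        def _start(status, headers, exc_info=None):
            if environ.get("PATH_INFO", "").startswith("/blog/"):
                headers = list(headers) + [("X-Robots-Tag", "noindex, nofollow")]
            return start_response(status, headers, exc_info)
        return self.app(environ, _start)
```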
-
Issue with 'Crawl Errors' in Webmaster Tools
I have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and they don't, as I expected. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors; the guidance is that if the URLs keep returning 404s, they will eventually be removed automatically. Well, I don't know how many times they need to get that 404 in order to drop a URL and link that haven't existed for 18-24 months?! Thanks.
Technical SEO | RiceMedia
-
Should there be a canonical tag on my 404 error page?
In my crawl diagnostics, I notice some 4xx client errors. They are appearing for pages that no longer exist, so I'm not sure what the problem is. Shouldn't they just be dealt with as 404s? Anyway, on closer inspection I noticed that my 404 error page contains a canonical tag which points to the missing page. Could this be the issue? Is it a good idea to remove the canonical tag from this error page? Thanks.
Technical SEO | Leighm
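The usual advice is to remove it: a canonical tag on a 404 template is contradictory, since the status code says the page is gone while the tag asks Google to index that URL. A quick sketch to confirm a missing URL both returns a real 404 and carries no canonical (the URL is hypothetical):

```python
import requests
from bs4 import BeautifulSoup

# Any URL that should not exist on the site
missing = "https://example.com/definitely-not-a-real-page-12345"

resp = requests.get(missing, timeout=10)
canonical = BeautifulSoup(resp.text, "html.parser").find("link", rel="canonical")

print("status:", resp.status_code)  # should be 404, never 200
print("canonical:", canonical.get("href") if canonical else "none (correct for a 404)")
```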