Webmaster Tools 404s
-
We try to keep the 404s in our Google Webmaster Tools account to a minimum, but in recent months the volume has simply exploded to over 500k errors. 99.95% of these are complete spam: links to pages that never existed.
We've tried marking them as resolved, but they eventually end up back in the list, and we don't like the idea of 301ing so many links when the pages never existed in the first place.
We could just ignore them all, but that makes it hard to identify the legitimate 404s that need redirecting, as there is only so much data we can export out of WT.
Has anyone had experience with returning 410s? Does Google eventually drop these from WT?
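If you do go the 410 route, it can be handled at the server level for the spam URL patterns; a minimal sketch, assuming an Apache server with mod_alias enabled, and with purely illustrative paths:

```apache
# .htaccess sketch (Apache mod_alias) -- paths here are illustrative examples,
# not URLs from the actual site
Redirect gone /single-spam-page

# RedirectMatch takes a regex, so one rule can cover a whole pattern of spam URLs
RedirectMatch gone ^/spam-prefix/
```

With no target URL and the `gone` status keyword, Apache answers these requests with 410 instead of 404.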
-
That seems to be a common problem: GWT is slow to remove old 404s, even ones that haven't existed for ages. I see it particularly with URLs that temporarily showed up in an XML sitemap; six months later, they're still there...
I guess, as usual, we'll just have to wait for the bug to be fixed...
Related Questions
-
URL Inspector, Rich Results Tool, GSC unable to detect Logo inside Embedded schema
I work on a news site and we updated our schema set-up last week. Since then, valid Logo items have been dropping like flies in Search Console. Both the URL Inspector and the Rich Results test seem unable to detect Logo on articles. Is this a bug, or can Googlebot really not see schema nested within other schema?

Previously, we had both Organization and Article schema, separately, on all article pages (with Organization repeated inside the publisher attribute). We removed the separate Organization, and now just have Article with Organization inside the publisher attribute. The code is valid in the Structured Data Testing Tool, but URL Inspection etc. cannot detect it. Example: https://bit.ly/2TY9Bct

By comparison, we also have Organization schema (un-nested) on our homepage, and interestingly enough, the tools can detect that no problem. That leads me to believe either that nested schema is unreadable by Googlebot, or that this is not an accurate representation of Googlebot and it's only unreadable by the testing tools.

The new schema set-up has the same Article schema, but the separate script for Organization has been removed. We made the change to embed our schema for a couple of reasons: first, because Google's best practices say that if multiple schemas are used, Google will choose the best one, so it's better to just have one script; second, Google's Codelabs tutorial for schema uses a nested structure to indicate hierarchy of relevancy to the page.

My question is: does nesting schemas like this make it impossible for Googlebot to detect a schema type that's two or more levels deep? Or is this just a bug with the testing tools?
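As a hedged illustration of the nested arrangement described above (Organization appearing only inside Article's publisher attribute), the JSON-LD would look roughly like this; all names and URLs are purely illustrative, not taken from the site in question:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "datePublished": "2019-01-01",
  "publisher": {
    "@type": "Organization",
    "name": "Example Publisher",
    "logo": {
      "@type": "ImageObject",
      "url": "https://example.com/logo.png"
    }
  }
}
```

Here the Logo item sits two levels deep (Article → publisher → logo), which is the nesting depth the question is about.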
Technical SEO | ValnetInc
-
Webmaster Tools not showing links, but Moz OSE is showing links. Why can't I see them in Google Search Console?
Hi, please see the attached photos. I have a website that shows external follow links when performing a search in Open Site Explorer. However, they are not recognised or visible in Search Console. This is the case for both internal and external links. The internal links are 'nofollow', which I am getting our developer to rectify. Any ideas why I can't see the 'follow' external links? Thanks in advance to those who help me out. Jesse
Technical SEO | jessew
-
Webmaster Tools and Domain registration
Hi, I have a travel project to manage and a question about how to register its domains. Should I register in Webmaster Tools all of the domains that lead to this travel company's webpage, like abctravel.com, a-b-c-travel.com, adventure-bahamas-crew-travel.com and adventurebahamascrewtravel.com, or only the main domain, abctravel.com? Thanks for your advice.
Technical SEO | reisefm
-
Google Webmaster News Errors resolution
Hello to the community. I had a sudden increase from just a couple to fifty-something Google Webmaster News errors. The two areas affected are content of article and date of article. I found a very good article on SEOmoz about Google Webmaster Tools, but it was published before the changes made to it early last year: http://www.seomoz.org/blog/how-to-fix-crawl-errors-in-google-webmaster-tools The people who have been asking the same question on the internet have not yet received replies from Google, and the Google support pages don't make it really clear: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93994 Any views or experiences with this? My site is in Google News, but we do not have a Google News sitemap. Thanks, Polar
Technical SEO | Polarstar
-
Massive amount of 404s after forum prune
So I pruned my vBulletin forum the other week, and now Webmaster Tools detects over 4,000 URLs as not found. Is there a solution to this? Is this something that could negatively affect rankings? Any ideas?
Technical SEO | TheTippingPoint
-
Is there a great tool for URL mapping old to new web site?
We are implementing a new design, removing some pages and adding new content. The task is to correctly map and redirect the old pages that no longer exist.
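Short of a dedicated tool, a small script can turn a spreadsheet of old-to-new URL pairs into server redirect rules; a minimal sketch in Python, where the CSV column layout and the Apache-style output are assumptions, not anything the question specifies:

```python
import csv
import io

def build_redirect_rules(mapping_csv: str) -> list[str]:
    """Turn 'old-path,new-path' CSV rows into Apache 301 Redirect lines."""
    rules = []
    for row in csv.reader(io.StringIO(mapping_csv)):
        if len(row) != 2:
            continue  # skip blank or malformed rows
        old, new = (field.strip() for field in row)
        rules.append(f"Redirect 301 {old} {new}")
    return rules

# Hypothetical mapping exported from a crawl of the old site
mapping = "/old-about.html,/about\n/old-tours.html,/tours"
for rule in build_redirect_rules(mapping):
    print(rule)
```

Pasting the generated lines into the site's Apache configuration (or translating them to the equivalent for another server) then handles the mapping in one place.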
Technical SEO | KnutDSvendsen
-
Crawl Tool Producing Random URLs
For some reason, SEOmoz's crawl tool is returning duplicate-content URLs that don't exist on my website. It is returning pages like "mydomain.com/pages/pages/pages/pages/pages/pricing". Nothing like that exists as a URL on my website. Has anyone experienced something similar, or know what's causing it or how I can fix it?
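One common cause of this pattern (an assumption here, since the thread doesn't confirm it) is a relative link written without a leading slash, such as href="pages/pricing": it resolves against the directory of whatever page the crawler is currently on, so each crawl step nests one more "pages/" segment. Python's standard urljoin shows the effect, with a hypothetical starting URL:

```python
from urllib.parse import urljoin

# A link written as "pages/pricing" (no leading slash) resolves relative
# to the directory of the current page, so following it from a page that
# is already under /pages/ nests the path one level deeper each time.
url = "http://mydomain.com/pricing"  # hypothetical starting page
for _ in range(3):
    url = urljoin(url, "pages/pricing")
    print(url)
```

If this is the cause, changing the link to a root-relative "/pages/pricing" stops the recursion.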
Technical SEO | MyNet
-
Issue with 'Crawl Errors' in Webmaster Tools
I have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st-15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and, as I expected, they don't.

Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors, and that if they keep returning 404, they will automatically be removed. Well, I don't know how many more times they need to get that 404 in order to drop a URL and link that haven't existed for 18-24 months! Thanks.
Technical SEO | RiceMedia