Webmaster Tools keeps showing old 404 errors but doesn't show a "Linked From" URL. Why is that?
-
Hello Moz Community.
I have a question about 404 crawl errors in Webmaster Tools. A while ago we had an internal linking problem: some links were being formed in the wrong way (a loop was generating them on the fly). The error was identified and fixed at the time, but before the fix Google managed to index a lot of those malformed pages. Recently we've noticed in our Webmaster Tools account that some of these links still appear as 404s, even though the issue is gone and we have no internal links pointing to any of those URLs. What confuses us even more is that Webmaster Tools doesn't show anything in the "Linked From" tab, where it usually does for this type of error. So we're wondering what this means: could it be that the URLs are still in Google's cache or memory? We're not really sure.
If anyone has an idea of what these errors showing up now might mean, we would really appreciate the help. Thanks.
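For anyone who wants to double-check what those old URLs actually return before assuming it's just Google's memory, a quick script can confirm the status codes Google sees. A minimal sketch, assuming Python with the requests library; the example URLs are hypothetical stand-ins for the crawl errors export from Webmaster Tools:

```python
import requests

# Hypothetical stand-ins for the malformed URLs listed in the crawl errors report.
old_error_urls = [
    "https://www.example.com/category/category/category/page",
    "https://www.example.com/page//page/page",
]

for url in old_error_urls:
    # allow_redirects=False shows the raw status code Googlebot receives first.
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(response.status_code, url)
```

If these still come back as 404, the reports are most likely Google re-checking URLs it remembers from its index, not evidence of any live link, which would also explain the empty "Linked From" tab.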
-
Hi Jane, thanks for the follow-up. Every time we see errors showing up in WMT (mainly 404s) we remove the URLs right away, and indeed we see the errors going down every 4-5 days (under HTML Improvements).
I'm just surprised at how long it would take Google to actually remove 404s from their index if we did not use the URL removal tool. I know the higher the PR, the more often they crawl and, I guess, the faster they remove these 404s, but still.
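One thing that can speed this up, if waiting is painful and the removal tool feels too manual: serve a 410 Gone instead of a 404 for the known malformed URL pattern, since 410 is a stronger "permanently removed" signal and Google tends to drop those faster. A minimal sketch, assuming a Python/Flask app and a hypothetical /old-loop/ path pattern; the same idea applies to any server or framework:

```python
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical route matching the malformed URLs the old linking loop generated.
@app.route("/old-loop/<path:rest>")
def removed_loop_urls(rest):
    # 410 Gone tells Google the page was removed on purpose and won't return.
    abort(410)

@app.route("/")
def home():
    return "Normal pages are unaffected."

if __name__ == "__main__":
    app.run()
```

Not required, since Google does eventually drop plain 404s on its own, but it usually shortens the wait.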
-
Hi again,
Four months seems abnormally long, but it could have something to do with how many 404s there are - 400 is pretty high. Is this number at least going down every few weeks in WMT?
Cheers,
Jane
-
Hi Jane, we solved the cause of these errors more than four months ago at this point. There is no path to these URLs anymore, but they keep showing up, so it is taking Google quite a long time to clean up. Our estimate is that there are about 400 more of these 404 errors, so we still have some time to go, I guess.
-
Hi,
How long have these errors been appearing since you fixed the issue? It could be a case of Google looking for URLs on the site that it has seen in the past, even though there is no path to them anymore. With the pathway gone, it should eventually stop looking, but I'm curious how long ago the issue was fixed.
-
I hate to speculate on anything involving SEO, but I've always taken those 404s as visits Google has been able to grab data for. If Webmaster Tools can catch the data for a visit that hits a 404, it will let you know about it.
What led me to this (admittedly shaky) assumption was how similar those 404s were to existing pages, as if someone had tried to type in a URL and got it wrong, or deleted part of it and hit "enter".
Take the info for what it's worth, which isn't fact, just an idea to get you rolling.
-
I've had those too and they are quite annoying (I love seeing 0 errors, hehe). I just mark them as fixed and hope they don't show up again (they usually stop appearing after doing that once or twice).
If anyone has any other insight into this, please share!
Related Questions
-
My backlinks are not showing in webmaster tools? Why
Hi experts, I have had followed backlinks from a domain for 6 months, but they do not appear in the "Links to Your Site" report (Search Console). That domain has 302k pages indexed in Google! Could you please explain why Google is not showing this type of backlink?
Technical SEO | denakalami
-
404 or rel="canonical" for empty search results?
We have search on our site, using the URL, so we might have: example.com/location-1/service-1, or example.com/location-2/service-2. Since we're a directory we want these pages to rank. Sometimes, there are no search results for a particular location/service combo, and when that happens we show an advanced search form that lets the user choose another location, or expand the search area, or otherwise help themselves. However, that search form still appears at the URL example.com/location/service - so there are several location/service combos on our website that show that particular form, leading to duplicate content issues. We may have search results to display on these pages in the future, so we want to keep them around, and would like Google to look at them and even index them if that happens, so what's the best option here? Should we rel="canonical" the page to the example.com/search (where the search form usually resides)? Should we serve the search form page with an HTTP 404 header? Something else? I look forward to the discussion.
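One pattern worth considering here (a sketch, not the only valid answer): keep the URL live with a 200 but send a noindex, follow directive while the combo has no results, then drop the directive once listings exist. That avoids both 404ing a URL you want to rank later and canonicalising lots of distinct location/service pages to one generic search form. A minimal sketch, assuming a Python/Flask app and a hypothetical has_results() lookup:

```python
from flask import Flask, make_response

app = Flask(__name__)

def has_results(location: str, service: str) -> bool:
    """Hypothetical database lookup: does this location/service combo have listings?"""
    return False  # stand-in for the real query

@app.route("/<location>/<service>")
def directory_page(location, service):
    if has_results(location, service):
        # Normal, indexable results page.
        return f"Results for {service} in {location}"
    # No results yet: keep the page live but ask search engines not to index it.
    resp = make_response("No results yet - try the advanced search form.", 200)
    resp.headers["X-Robots-Tag"] = "noindex, follow"
    return resp
```

Once results appear, the directive disappears on the next crawl and the page becomes indexable, with no duplicate-content issue in the meantime.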
Technical SEO | 4RS_John
-
Changes to 'links to your site' in WebMaster Tools?
We're writing more out of curiosity... Clicking on "Download latest links" within 'Links to your site' in Google's WebMaster Tools would usually bring back links discovered recently. However, the last few times (for numerous accounts) it has brought back a lot of legacy links - some from 2011 - and includes nothing recent. We would usually expect to see a dozen at least each month. ...Has anyone else noticed this? Or, do you have any advice? Thanks in advance, Ant!
Technical SEO | AbsoluteDesign
-
Google Cache can't keep up with my 403s
Hi Mozzers, I hope everyone is well. I'm having a problem with my website and the 403 errors shown in Google Webmaster Tools. The problem comes about because we "unpublish" one of the thousands of listings on the site every few days - this then creates a link that returns a 403. At the same time we also run some code that removes any links to these pages. So far so good. Unfortunately Google doesn't notice that we have removed these internal links and so tries to access the pages again, which results in a 403. These errors show up in Google Webmaster Tools, and when I click on "Linked From" I can verify that there are no links to the 403 page - it's just Google's cache being slow. My question is: a) How much is this hurting me? b) Can I fix it? All suggestions welcome and thanks for any answers!
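On the "can I fix it" part, one option is to stop answering 403 for unpublished listings: a 403 means "access denied", so Google keeps retrying, while a 410 Gone (or 404) says the listing was removed and can be dropped. A minimal sketch, assuming a Python/Flask app and a hypothetical is_published() check:

```python
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical set of listing IDs that are still published.
PUBLISHED_LISTINGS = {101, 102, 205}

def is_published(listing_id: int) -> bool:
    return listing_id in PUBLISHED_LISTINGS

@app.route("/listing/<int:listing_id>")
def listing(listing_id):
    if not is_published(listing_id):
        # 410 Gone: removed deliberately, so crawlers can stop requesting it.
        abort(410)
    return f"Listing {listing_id}"
```

The errors won't vanish overnight either way, since Google re-checks URLs it remembers for a while, but a 410 usually clears them out of the report faster than a 403.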
Technical SEO | HireSpace
-
Leveraging "Powered by" and link spam
Hi all, For reference: The SaaS guide to leveraging the "Powered By" tactic. My product is an embeddable widget that customers place on their websites (see the example referenced in the link above). A lot of my customers have great domain authority (big brands, .govs, etc.). I would like to use a "Powered By" link on my widgets to create high-quality backlinks. My question is: if I have identical link text on potentially hundreds of widgets, will this look like link spam to Google? If so, would setting the link text randomly on each widget to one of a few different phrases (to create some variation) avoid this? Hope this makes sense, thanks in advance.
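On the variation question, one way to get a few different phrases without the text changing on every page load is to pick the anchor deterministically per customer, so each installed widget always shows the same phrase but the phrases vary across the overall link profile. A minimal sketch, assuming Python; the phrases and customer_id are hypothetical:

```python
import hashlib

# Hypothetical pool of "Powered by" phrases to spread across customer sites.
PHRASES = [
    "Powered by ExampleWidget",
    "Widget by ExampleWidget",
    "Reviews widget from ExampleWidget",
]

def powered_by_anchor(customer_id: str) -> str:
    """Pick a stable phrase per customer: varied across sites, constant on each site."""
    digest = hashlib.sha256(customer_id.encode("utf-8")).hexdigest()
    return PHRASES[int(digest, 16) % len(PHRASES)]

print(powered_by_anchor("customer-123"))
```

This only removes the identical-anchor footprint; whether widget links are treated as natural is still Google's call, so it's worth keeping an eye on the guidelines for embedded widgets.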
Technical SEO | NoorHammad
-
I am getting an error message from Google Webmaster Tools and I don't know what to do to correct the problem
The message is:
"Dear site owner or webmaster of http://www.whitegyr.com/, We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support. Sincerely, Google Search Quality Team" I have always tried to follow Google's guidelines and don't know what I am doing wrong, I have eight different websites all getting this warning and I don't know what is wrong, is there anyone you know that will look at my sites and advise me what I need to do to correct the problem? Website with this warning:
artistalaska.com
cosmeticshandbook.com
homewindpower.ws
montanalandsale.com
outdoorpizzaoven.net
shoes-place.com
silverstatepost.com
www.whitegyr.com
Technical SEO | whitegyr
-
NoIndex/NoFollow pages showing up when doing a Google search using "Site:" parameter
We recently launched a beta version of our new website on a subdomain of our existing site. The existing site is www.fonts.com, with the beta living at new.fonts.com. We do not want Google to crawl the new site until it's out of beta, so we have added a noindex, nofollow robots meta tag on all pages. However, one of our team members noticed that Google is displaying results from new.fonts.com when doing a "site:new.fonts.com" search (see attached screenshot). Is it possible that Google is indexing the content despite the noindex, nofollow tags? We have double-checked the syntax and it seems correct, except for the trailing "/". I know Google still crawls noindexed pages, but the fact that they're showing up in search results using the site: search syntax is unsettling. Any thoughts would be appreciated!
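One belt-and-braces option while the beta is live, sketched below: send the same directive as an HTTP header (X-Robots-Tag) from the beta subdomain, which covers every response even if a template misses the meta tag, and make sure new.fonts.com is not blocked in robots.txt, since Google can't see a noindex on a page it isn't allowed to fetch and may keep listing the bare URL in a site: search. A minimal sketch, assuming a Python/Flask front end on the beta subdomain:

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def add_noindex_header(response):
    # Hypothetical blanket rule, applied only to the beta subdomain.
    response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

@app.route("/")
def home():
    return "Beta home page"
```

Even with everything correct, already-discovered URLs can linger in a site: query until they are recrawled, so some lag is normal.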
Technical SEO | ChrisRoberts-MTI
-
"Too Many On-Page Links" Issue
I'm being docked for "too many on-page links" on every page of the site, and I believe it is because the drop-down nav has about 130 links in it. That's because we have a few levels of drop-downs, so you can get to any page from the main page. The site is here: http://www.ibethel.org/. Is what I'm doing just a bad practice, and should the drop-downs not give as much information? Or is there something different I should do with the links? Maybe a nofollow on the last tier of the drop-down?
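Before deciding between trimming tiers and adding nofollow, it can help to confirm that the drop-down really accounts for the count. A minimal sketch, assuming Python with requests and BeautifulSoup; the nav selector is a hypothetical stand-in for the real menu markup:

```python
import requests
from bs4 import BeautifulSoup

html = requests.get("http://www.ibethel.org/", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

all_links = soup.find_all("a", href=True)

# Hypothetical selector; replace with the actual drop-down container on the site.
nav = soup.select_one("nav") or soup
nav_links = nav.find_all("a", href=True)

print("Total links on the page:", len(all_links))
print("Links inside the nav:", len(nav_links))
```

If the nav alone accounts for most of the roughly 130 links, the fix is a navigation decision (expose fewer tiers in the markup) rather than anything about the rest of the page.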
Technical SEO | BethelMedia