Help with Google 'not found' errors in Webmaster Tools
-
Hi,
Google Webmaster Tools recently sent me a few messages about a jump in the number of 'not found' errors.
From 0 to 290 errors, ouch.
I know what caused it, but I think Google is seeing things. We developed another page/subdomain we're working on with links back to the root domain.
Basically, it's a complete list-of-articles page that lists each article and links back to the root domain.
I'm not sure what Google is crawling, but the links that would produce a 'not found' error aren't there.
Will these disappear over time?
Thanks for the help!
-
Hi Aaron,
Can you give some examples of those links?
-
Thanks. Just curious: where would they be in my sitemap if they aren't on the site itself?
The links that Google is 'crawling' simply don't exist, so there's nothing to 301. That's what I'm confused about.
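A quick way to confirm whether the URLs Google is reporting really return 404s is to check their HTTP status directly. A minimal sketch (the `crawl_errors.txt` filename is hypothetical; you would export the URL list from Webmaster Tools yourself):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def check_status(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code for `url`.

    4xx/5xx responses raise HTTPError, which we catch and
    report as the status code instead of crashing.
    """
    req = Request(url, method="HEAD",
                  headers={"User-Agent": "status-check/1.0"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code

# Hypothetical usage with a Webmaster Tools export, one URL per line:
# for line in open("crawl_errors.txt"):
#     url = line.strip()
#     print(check_status(url), url)
```

If every reported URL genuinely returns 404 and nothing on the live site links to it, the errors usually age out of the report on their own.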
Thanks again.
Aaron
-
Hi,
Download the 'not found' URLs from Google Webmaster Tools, then follow these steps:
1. Remove those URLs from your sitemap.
2. Remove any internal links pointing to those URLs.
3. Set up an appropriate 301 redirect for each URL.
4. Submit the URLs under "Remove URLs" in Google Webmaster Tools.
If you follow these steps, you should quickly be able to get the URLs out of Google's index.
Hope this helps!
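Step 1 above can be scripted rather than done by hand. A minimal sketch (the exact sitemap contents and bad-URL list are yours to supply) that strips any `<url>` entries matching the 'not found' list from a standard sitemap.xml:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace per the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

def clean_sitemap(sitemap_xml: str, not_found: set) -> str:
    """Return the sitemap XML with every <url> entry whose <loc>
    appears in `not_found` removed."""
    root = ET.fromstring(sitemap_xml)
    for url in list(root.findall(f"{{{NS}}}url")):
        loc = url.findtext(f"{{{NS}}}loc", default="").strip()
        if loc in not_found:
            root.remove(url)
    return ET.tostring(root, encoding="unicode")
```

You would feed it the URLs exported from the Webmaster Tools error report, write the cleaned file back out, and resubmit the sitemap.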