Google Webmaster Tools sitemap errors for phantom URLs?
-
Two weeks ago we changed our URLs so that the correct addresses are all lowercase; everything else 301 redirects to those (a quick sketch of how we spot-check those redirects is below). We have submitted our updated sitemap and confirmed that Google has downloaded it several times since.
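In case it helps, this is roughly the kind of check we run to confirm that an old mixed-case address 301s to its lowercase version (a minimal Python sketch, not our exact tooling; the product URL below is made up for illustration):

```python
# Minimal sketch of a redirect spot-check (not our exact tooling).
# The URL below is a made-up mixed-case example, not a real product page.
import requests

def check_lowercase_redirect(url):
    # Don't follow redirects, so we can inspect the first hop ourselves.
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    print(f"{url}\n  status {response.status_code} -> {location or '(no redirect)'}")
    # A correct old URL should 301 to a Location whose path is already lowercase.
    return response.status_code == 301 and location != "" and location == location.lower()

if __name__ == "__main__":
    check_lowercase_redirect(
        "http://www.aquinasandmore.com/Catholic-Gifts/Example-Mixed-Case-Product/sku/12345"
    )
```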
Even so, Webmaster Tools is reporting 33,000+ sitemap errors for URLs that are no longer in our sitemap and haven't been for weeks. It claims to have found the errors within the last couple of days, but the sitemap was updated a couple of weeks ago and Google has downloaded it at least three times since.
Here is our sitemap: http://www.aquinasandmore.com/urllist.xml
Here are a couple of URLs that Webmaster Tools says are in the sitemap:
http://www.aquinasandmore.com/catholic-gifts/Caroline-Gerhardinger-Large-Sterling-Silver-Medal/sku/78664 (Redirect error, unavailable, Oct 7, 2011)
http://www.aquinasandmore.com/catholic-gifts/Catherine-of-Bologna-Small-Gold-Filled-Medal/sku/78706 (Redirect error, unavailable, Oct 7, 2011)
How long does the actual data usually take to catch up with what WMT says is current?
-
I have not experienced any delay before. There should only be one sitemap record for your site at any time. That record could be composed of multiple files, but it is one collection of records.
When Google identifies crawl errors, those errors should be generated from the sitemap on file at the time of the error. There is a view-sitemap option in Google WMT you can use to see the sitemap they have on file; that would be my next step. If you can confirm the bad URLs do not appear in that sitemap, I would then wait to see whether the issue reappears after today, October 11th.
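Confirming that is easy to script. Here is a minimal Python sketch; it assumes your urllist.xml is a flat XML sitemap with one <loc> element per URL, so adjust it if you are actually serving a sitemap index or a plain URL list:

```python
# Minimal sketch: fetch the live sitemap and check whether a flagged URL is in it.
# Assumes a flat XML sitemap (no sitemap index) with one <loc> element per URL.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.aquinasandmore.com/urllist.xml"

def sitemap_urls(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=30).content)
    # Collect every <loc> value regardless of XML namespace.
    return {el.text.strip() for el in root.iter() if el.tag.endswith("loc") and el.text}

if __name__ == "__main__":
    urls = sitemap_urls(SITEMAP_URL)
    flagged = ("http://www.aquinasandmore.com/catholic-gifts/"
               "Caroline-Gerhardinger-Large-Sterling-Silver-Medal/sku/78664")
    print(f"{len(urls)} URLs in sitemap; flagged URL present: {flagged in urls}")
```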
I know this is frustrating, but the system is very straightforward. I cannot explain why a URL not included in your sitemap would appear on your sitemap crawl errors tab. The only two possibilities I can come up with are that you have made an error when sharing some information, or that there is an unusual glitch on Google's end.
With all the above noted, working with sitemaps is not a good investment of your time. If your site navigation is properly designed, your sitemap offers no benefit whatsoever.
-
"then these links should not appear going forward." - They are showing up now even though Google says they have our latest sitemap and that the errors were found yesterday. How long does the actual data usually take to catch up with what WMT says is current?
The image URLs are built from the actual title on the fly and don't 301, so those aren't a problem. The other one you mentioned does need to be cleaned up in the sitemap. Thanks for catching that.
These errors are showing up when I go to the crawl errors section and click the sitemap tab. Yes, the sitemap I shared is the same one in WMT.
-
I was unable to locate the URLs you listed in your sitemap. If your Google WMT settings are correct and the sitemap you have shared is the same one listed in your Google WMT account, then these links should not appear going forward.
You would need to examine your Google WMT account closely to determine the exact source of these errors.
Where exactly within Google WMT are you seeing these errors? How are you determining that these URLs come from your sitemap?
"Two weeks ago we changed our urls so the correct addresses are all lowercase."
There are many URLs in your sitemap that are not lowercase. An example:
http://www.aquinasandmore.com/title/Brian-Kolodiejchuk/FuseAction/store.AuthorSearch/Author/2337/
Also, you include a lot of image URLs that are not lowercase either.
I would not necessarily advise cleaning up the entire site, but at least establish the best practice going forward.
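As a rough starting point, something like this Python sketch (same assumption of a flat, <loc>-based sitemap) will list every entry whose path is not already lowercase, so you can at least see the scope of it:

```python
# Minimal sketch: list sitemap entries whose path is not already lowercase.
# Assumes a flat XML sitemap with <loc> elements; adjust if a sitemap index is used.
import requests
import xml.etree.ElementTree as ET
from urllib.parse import urlsplit

SITEMAP_URL = "http://www.aquinasandmore.com/urllist.xml"

def mixed_case_entries(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=30).content)
    urls = [el.text.strip() for el in root.iter() if el.tag.endswith("loc") and el.text]
    # Hostnames are case-insensitive, so only flag URLs whose *path* has uppercase letters.
    return [u for u in urls if urlsplit(u).path != urlsplit(u).path.lower()]

if __name__ == "__main__":
    offenders = mixed_case_entries(SITEMAP_URL)
    print(f"{len(offenders)} mixed-case URLs found")
    for url in offenders[:20]:  # print a small sample
        print(" ", url)
```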