GWT giving me 404 errors based on an old, deleted sitemap
-
I'm getting a bunch of 404 crawl errors in Google Webmaster Tools because we just moved our site to a new platform with a new URL structure. We 301 redirected all the relevant pages, submitted a new sitemap, and then deleted all the sitemaps for the old URL structure.
However, Google keeps crawling the OLD URLs and reporting the 404 errors. It says that the website is linking to these 404 pages via an old, outdated sitemap (which, if you go to it, returns a 404 as well, so it's not as if Google is reading these old sitemaps now). Instead, it's as if Google has cached the old sitemap and continues to use it to crawl these non-existent pages.
Any thoughts?
-
How long has it been since you deleted the old sitemap and provided the new one?
I think your Google account may just need some extra time to update correctly. It seems to me that updates in my WMT account have a little lag time. I think it will update correctly after a few weeks to a month. I don't think you have an actual problem so much as Google is not yet finding and updating the new info correctly.
I would wait and see what happens after a short while.
Joe
-
As long as it's 301'ing the important ones, that's OK. The 404s will regularly pop up in this scenario whenever Google encounters a page with a link to an old URL, or follows a URL from its index to a 404. Just mark them as fixed.
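To sanity-check that the old URLs really do 301 to their replacements (and that anything deliberately dropped returns a 404 rather than a 200), you can fetch each old URL without following redirects. This is a minimal sketch using only Python's standard library; the URLs in the usage comment are placeholders, not addresses from the thread:

```python
import urllib.request
from urllib.error import HTTPError

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the 301 itself is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # urllib then raises HTTPError with the 3xx code

_opener = urllib.request.build_opener(NoRedirect)

def check_url(url):
    """Return (status_code, Location header or None) for one GET request."""
    try:
        resp = _opener.open(url, timeout=10)
        return resp.status, None
    except HTTPError as err:
        # 3xx and 4xx both land here; for a 301 the Location header
        # shows where the redirect points.
        return err.code, err.headers.get("Location")

# Hypothetical usage with placeholder URLs:
# for old in ["https://example.com/old-page", "https://example.com/gone"]:
#     status, location = check_url(old)
#     print(old, status, location)
```

Running this over the URL list from the old sitemap makes it easy to spot any old page that slipped through without a redirect.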
Related Questions
-
Google Search Console showing 404 errors for product pages not in sitemap?
We have some products with URL changes over the past several months. Google is showing these as having 404 errors even though they are not in the sitemap (the sitemap shows the correct NEW URL). Is this expected? Will these errors eventually go away / stop being monitored by Google?
Technical SEO | woshea
-
SEOmoz pages error
Hi, I have a problem with SEOmoz: it is saying my website http://www.clearviewtraffic.com has page errors on 19,680 pages. Most of the errors are for duplicate page titles. The website itself doesn't even have 100 pages. Does anyone know how I can fix this? Thanks, Luke
Technical SEO | looktouchfeel
-
Client error 404
I have a lot (100+) of 404s. I got more last time, so I rearranged the whole site. I even changed it from .php to .html, and I went to the web host to delete all of the .php files from the main server. Still, after yesterday's crawl I got 404s on my (deleted) .php pages. There are also other links that return an error but aren't there. Maybe those pages existed before the site's remodelling, but I don't think so, because .html pages are also affected. How can this be happening?
Technical SEO | mato
-
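For the .php-to-.html migration described above, one common way to stop the 404s on the deleted .php files is a blanket server-side 301 from each old .php path to its .html counterpart. A minimal Apache .htaccess sketch, assuming mod_rewrite is enabled and the filenames map one-to-one:

```apache
# Hypothetical: 301 any request for /foo.php to /foo.html
RewriteEngine On
RewriteRule ^(.*)\.php$ /$1.html [R=301,L]
```

This only makes sense if every old .php page has an .html equivalent at the same path; old URLs with no replacement should be left to return 404 (or 410).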
Getting 404 error when opening the cached link of my site
My site is hazanstadservice.se, and when I try to open it to check the cache date I get a 404 error from Google. I don't know why. The cache page URL is http://webcache.googleusercontent.com/search?q=cache:j99uW96RuToJ:www.hazanstadservice.se/+&cd=1&hl=en&ct=clnk.
Technical SEO | Softlogique
-
Index page 404 error
Crawl results show there is a 404 error page, index.htmk, under my root: http://mydomain.com/index.htmk. I have checked my index page on the server and it is index.html, not index.htmk. Please help me fix it.
Technical SEO | semer
-
How to write a robots.txt file to point to your sitemap
Good afternoon from still wet & humid Wetherby, UK... I want to write a robots.txt file that instructs the bots to index everything and gives a specific location for the sitemap. The sitemap URL is: http://business.leedscityregion.gov.uk/CMSPages/GoogleSiteMap.aspx Is this correct:
User-agent: *
Disallow:
SITEMAP: http://business.leedscityregion.gov.uk/CMSPages/GoogleSiteMap.aspx
Any insight welcome 🙂
Technical SEO | Nightwing
-
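For reference, the robots.txt asked about above would conventionally be written as follows. The Sitemap directive comes from the sitemaps protocol (the field name is not case-sensitive, but "Sitemap:" is the usual spelling), and an empty Disallow line permits crawling of everything:

```text
User-agent: *
Disallow:

Sitemap: http://business.leedscityregion.gov.uk/CMSPages/GoogleSiteMap.aspx
```

The Sitemap line is independent of the User-agent group, so it can appear anywhere in the file.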
How to set up Tumblr blog.site.com to give juice to site.com
Is it possible to get a subdomain blog.site.com that is on Tumblr to count toward site.com? I hoped I could point it in Webmaster Tools like we do www, but alas, no. Any help would be greatly appreciated.
Technical SEO | oznappies
-
Old product pages - eComm site
Hello, Geeks.com currently has approx. 194k pages in Google's index (approx. 30k supplemental). http://www.google.com/search?q=site%3Ageeks.com+inurl%3Aadditem&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a#sclient=psy&hl=en&client=firefox-a&hs=Ltp&rls=org.mozilla:en-US%3Aofficial&source=hp&q=site:www.geeks.com%2F&aq=f&aqi=&aql=&oq=&pbx=1&fp=876898a2ea0c82c7&biw=1512&bih=641 We have many thousands of old product URLs which have gone out of stock, never to "see the light of day" again. 14 years' worth! Should we be 301'ing all old product pages that go out of stock, if we know for certain we will never carry that SKU again? If we were to do a "mass" 301 of 30k+ URLs, how would Google or other SEs react to that? Could there be any negative implications to doing so? What is considered best practice for eComm sites, as I imagine we are not alone with this type of situation. Thank you in advance. Michael B.
Technical SEO | JustinGeeks
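A mass 301 like the one described in the last question is usually driven by an old-URL-to-new-URL mapping file rather than hand-written rules. As a hedged sketch, assuming a hypothetical CSV of old,new path pairs, this generates Apache `Redirect 301` directives (from mod_alias) from that mapping:

```python
import csv
import io

def redirect_directives(mapping_csv):
    """Turn 'old_path,new_path' CSV rows into Apache 'Redirect 301' lines.

    mapping_csv: CSV text with one old,new pair per row (hypothetical format).
    """
    lines = []
    for row in csv.reader(io.StringIO(mapping_csv)):
        if len(row) != 2:
            continue  # skip blank or malformed rows
        old, new = (field.strip() for field in row)
        lines.append(f"Redirect 301 {old} {new}")
    return lines

# Example with made-up SKU paths:
rules = redirect_directives("/old-sku-1001,/products/widget\n"
                            "/old-sku-1002,/products/gadget\n")
```

For 30k+ URLs, generating the map once and serving it from the web server config keeps the redirects auditable; discontinued SKUs with no close replacement are often better left as 404/410 than redirected somewhere irrelevant.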