GWT giving me 404 errors based on an old, deleted sitemap
-
I'm getting a bunch of 404 crawl errors in Google Webmaster Tools because we just moved our site to a new platform with a new URL structure. We 301 redirected all the relevant pages. We submitted a new sitemap and then deleted all the sitemaps that pointed to the old site's URL structure.
However, Google keeps crawling the OLD URLs and reporting 404 errors. It says the site is linking to these 404 pages via an old, outdated sitemap (which itself returns a 404 if you go to it, so it's not as if Google is reading these old sitemaps now). Instead, it's as if Google has cached the old sitemap and keeps using it to crawl these non-existent pages.
Any thoughts?
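For what it's worth, a quick way to confirm what Google actually sees when it fetches the old sitemap path and the old URLs is to request them directly and look at the raw status codes. A minimal sketch, assuming Python with the requests library and placeholder URLs:

```python
import requests

# Placeholder URLs -- substitute the real old/new sitemap paths and a few old page URLs.
OLD_SITEMAP = "https://www.example.com/old-sitemap.xml"
NEW_SITEMAP = "https://www.example.com/sitemap.xml"
SAMPLE_OLD_URLS = [
    "https://www.example.com/old-structure/some-page",
    "https://www.example.com/old-structure/another-page",
]

def check(url):
    # allow_redirects=False so we see the first status code Googlebot gets.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url, "->", resp.headers.get("Location", ""))

check(OLD_SITEMAP)           # expect 404 (or 410), since the old sitemaps were deleted
check(NEW_SITEMAP)           # expect 200, the sitemap that was actually submitted
for url in SAMPLE_OLD_URLS:
    check(url)               # expect 301 pointing at the new URL structure
```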
-
How long has it been since you deleted the old sitemap and provided the new one?
I think your Google account may just need some extra time to update correctly. It seems to me that the Google updates in my WMT account have a little lag time. I think it will update correctly after a few weeks to a month. I don't think you have an actual problem so much as Google just hasn't found and updated the new info correctly yet.
I would wait and see what happens after a short while.
Joe
-
As long as it's 301'ing the important ones, that's OK. The 404s will regularly pop up in this scenario whenever Google crawls a page that still contains one of the old links, or pulls an old URL from its index, and follows it to a 404. Just mark them as fixed.
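If it helps, you can bulk-check the reported URLs before marking them fixed. A rough sketch, assuming Python with the requests library and a hypothetical CSV export of the crawl errors that has a "URL" column:

```python
import csv
import requests

# Hypothetical file: an export of the 404 crawl errors from Webmaster Tools.
ERRORS_CSV = "crawl_errors.csv"

with open(ERRORS_CSV, newline="") as f:
    urls = [row["URL"] for row in csv.DictReader(f)]

for url in urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 308):
        print("OK  ", url, "->", resp.headers.get("Location"))
    else:
        # Still returning an error: worth adding a 301 if the page mattered.
        print("FIX ", url, "returned", resp.status_code)
```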
Related Questions
-
Unsolved: URL dynamic structure issue for a new global site to which I will redirect multiple well-performing sites.
Dear all, We are working on a new platform called https://www.piktalent.com, where basically we aim to redirect many smaller sites we have with quite a lot of SEO traffic related to internships. Our previous sites are ones like www.spain-internship.com, www.europe-internship.com and other similar sites we have (around 9). Our idea is to smoothly redirect many of the sites to this new platform bit by bit; it is a custom-made site in Python and Node, much more scalable, and we plan to develop an app, etc., to become a bigger platform. For the new site, we decided to create 3 areas for the main content: piktalent.com/opportunities (all the vacancies), piktalent.com/internships and piktalent.com/jobs, so we can categorize the different types of pages and things we have, and under opportunities we have all the vacancies. The problem comes when the site generates the different static landings and dynamic searches. We have static landing pages generated like www.piktalent.com/internships/madrid, but dynamically it also generates www.piktalent.com/opportunities?search=madrid. Also, most of the searches will generate that type of URL, not following the structure of domain name / type of vacancy / city / name of the vacancy. I have been thinking of 2 potential solutions for this: either applying canonicals, or marking the suffix in Webmasters as non-indexed... but... What do you think is the right approach for this? I am worried about potential duplicate content and conflicts between the static content and the dynamic one. My CTO insists that the dynamic has to be like that, but... I am not 100% sure. Can someone provide input on this? Is there a way to block the dynamic URLs generated? Someone with a similar experience? Regards,
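Since the question says the new platform is built in Python, here is a minimal sketch of the canonical option (a hypothetical Flask-style route with made-up city names, not the actual Piktalent code): the dynamic search URL declares the matching static landing page as its canonical when one exists.

```python
from flask import Flask, request

app = Flask(__name__)

# Hypothetical set of cities that have a static landing page.
STATIC_LANDINGS = {"madrid", "barcelona", "valencia"}

@app.route("/opportunities")
def opportunities():
    query = (request.args.get("search") or "").strip().lower()
    # If the search matches a static landing, canonicalize to it;
    # otherwise the dynamic page canonicalizes to itself.
    if query in STATIC_LANDINGS:
        canonical = f"https://www.piktalent.com/internships/{query}"
    else:
        canonical = "https://www.piktalent.com/opportunities"
    return (
        "<html><head>"
        f'<link rel="canonical" href="{canonical}">'
        "</head><body>...search results...</body></html>"
    )
```

Marking the ?search= parameter as non-indexable is the other route mentioned; the canonical approach has the advantage that any links pointing at the dynamic URLs are consolidated onto the static landings.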
Technical SEO | Jose_jimenez
-
Canonical error from Google
Moz couldn't explain this properly and I don't understand how to fix it. Google emailed this morning saying "Alternate page with proper canonical tag." Moz also kinda complains about the main URL and the main URL/index.html being duplicates. Of course they are; the main URL doesn't work without the index.html page. What am I missing? How can I fix this to eliminate this duplicate problem, which to me isn't a problem?
Technical SEO | RVForce
-
Site hacked in Jan. Redeveloped new site. Still not ranking. Should we change domain?
Our top-ranking site in the UK was hacked at the end of 2014: http://www.ultimatefloorsanding.co.uk/ The site was the subject of a manual spam action from Google. After several unsuccessful attempts to clean it up (using Securi.net, reinstating old versions of the site, changing passwords, etc.), we took the decision to redevelop the site. We also changed hosting provider, as we had received absolutely no support from them whatsoever in resolving the issue. So far we have: removed the old website files from the server; developed a new website, implementing 301s for all the old URLs (except the spam ones); submitted a reconsideration request for the manual spam action, which was accepted; disavowed all the spammy inbound links through Webmaster Tools; and implemented custom URL parameters through Google so the spam URLs (which were using parameters) are not indexed. Our organic traffic is down by 63% compared to last year, and we are no longer ranking for most of our target keywords. Is there anything I am missing in the actions I have taken so far? We were advised that at this stage changing domain and starting again might be the way to go. However, the current domain has been used by us since 2007, so it would be a big call. Any advice is appreciated, thanks. Sue - http://www.ultimatefloorsanding.co.uk/
Technical SEO | galwaygirl
-
403 error
Hey guys, I know that a 403 is not a terrible thing, but is it worthwhile fixing? If so, what is the best way to approach it? Cheers
Technical SEO | Adamshowbiz
-
302 error removing site from results
I have a client who had a screwy URL structure based on parameters and all. They hired a developer who added the keyword to the end of the URL and set up 302 redirects to the new keyword-included URLs. Since then the entire site has virtually gone missing from the results, but it is not penalized. I put in a request with Webmaster Tools for reconsideration and they said there was no penalty. I only just found the 302 problem today and think this is probably the cause. Could this remove a site from the search results?
Technical SEO | webfeatseo
-
404 errors and what to do
Hi, I am fairly new to the whole SEO thing and am still getting a bit confused as to what to do to sort things out. I've checked the help pages but I cannot seem to find the issue. I've just signed up, so my site was crawled for the first time and is coming up with more than 1,000 404 errors. I checked a couple of the links via the report I downloaded and they do indeed show 404 errors, but when I check the pages everything seems to work fine. I did find one issue where an image, if clicked on twice, was pointing to a URL with 'title=' at the end. I have tried to get rid of that but couldn't find anything wrong. I'm a bit lost as to where to start!
Technical SEO | junglefrog
-
Moving Duplicate Sites
Apologies in advance for the complexity. My client, company A, has purchased company B in the same industry, with A and B having separate domains. The current hosting arrangement combines registrar and hosting functions in one account so as to allow both domains to point to a common folder, with the result that identical content is displayed for both A and B. The current site is kind of an amalgam of A and B. Company A has decided to rebrand and completely absorb company B. The problem is that link value overwhelmingly favours B over A. The current (only) hosting package is Windows, and I am creating a new site and moving them to Linux with another hosting company. I can use 301s for A, but not for B, as it is a separate domain and currently shares a hosting package with A. How can I best preserve the link juice that domain B has? The only conclusion I can come up with is to set up separate Linux hosting for B, which will allow for the use of 301s. Does anyone have a better idea?
Technical SEO | waynekolenchuk