Sitemaps during a migration - which is the best way of dealing with them?
-
Many SEOs I know simply upload the new sitemap once the new site is launched - some keep the old site's URLs in the new sitemap (for a while) to facilitate the migration - others upload both the old and the new sitemaps together to support the migration. Which is the best way to proceed? Thanks, Luke
-
Very much appreciated, CleverPhD!
-
Found this while looking for an answer to another question (I could not find it the other day) - straight from Google: do not include pages that do not exist in XML sitemaps.
http://googlewebmastercentral.blogspot.com/2014/10/best-practices-for-xml-sitemaps-rssatom.html
URLs
URLs in XML sitemaps and RSS/Atom feeds should adhere to the following guidelines:
- Only include URLs that can be fetched by Googlebot. A common mistake is including URLs disallowed by robots.txt — which cannot be fetched by Googlebot, or including URLs of pages that don't exist.
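A quick way to check your own sitemap against that guideline is a small script along these lines (a minimal sketch in Python, assuming the requests library; the sitemap URL is a placeholder you would swap for your own) - it flags anything blocked by robots.txt and anything that does not answer with a clean 200:

```python
# Minimal sketch: flag sitemap URLs that Googlebot could not fetch cleanly.
# Assumes the `requests` library; SITEMAP_URL is a hypothetical placeholder.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse
from urllib import robotparser

import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(sitemap_url):
    # Pull the sitemap and parse out every <loc> entry.
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

    # Build a robots.txt parser for the sitemap's host.
    origin = urlparse(sitemap_url)
    rp = robotparser.RobotFileParser()
    rp.set_url(f"{origin.scheme}://{origin.netloc}/robots.txt")
    rp.read()

    for url in urls:
        if not rp.can_fetch("Googlebot", url):
            print(f"BLOCKED by robots.txt: {url}")
            continue
        # Don't follow redirects: a sitemap entry should answer 200 directly,
        # not 301/302 to somewhere else, and certainly not 404.
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
        if status != 200:
            print(f"NOT a clean 200 ({status}): {url}")

if __name__ == "__main__":
    check_sitemap(SITEMAP_URL)
```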
-
Mate nailed it completely!
-
I would say make sure that your new sitemap has all the latest URLs. The reason people say that you should have old URLs in the sitemap is so that Google can quickly crawl the old URLs to find the 301s to the new URLs.
I am not convinced that this helps. Why?
Google already has all your old URLs in its systems. You would be shocked how far back Google keeps data on your site's old URLs. I have a site that is over 10 years old, and I still see URL structures from 7 years ago, with 301s in place, referenced in Google. Why is this?
Google's reasoning is essentially, "Well, I know this URL is a 301 or a 404, but I am going to crawl it every once in a while just to make sure the webmaster did not do this by mistake." You can see this in Search Console error and link reports when you set up 301s or 404s: URLs may stay in those reports for months and even come back after they have dropped out of the error list. I had a case where old URLs kept showing up in the SERPs and in various Search Console reports for two years after proper 301s were in place. Why was this happening?
This is a large site, and some old content still linked to the old URLs. The solution was to delete those links in the old content and set up a self-referencing canonical on all the pages to give Google a definitive directive. Google then finally replaced the old URLs with the new URLs in the SERPs and in the Search Console reports. The point is that our site had been sending signals (links) telling Google that some of the old URLs were still valid, and Google was giving us the benefit of the doubt.
If you want the new URLs seen by Google, show them in your sitemap. Google already has all the old URLs, and it will check them, find the 301s, and sort everything out. I would also recommend a self-referencing canonical on the new pages. Don't send Google any signal that your old URLs are still valid by linking to them in any way, especially in your sitemap. I would even go so far as to reach out to any important sites that link to old URLs and ask them to update the link.
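If you want to sanity-check those signals after launch, something like the following works (a minimal sketch in Python, assuming requests and beautifulsoup4; the old-to-new URL mapping is hypothetical and would come from your own migration plan) - it confirms each old URL answers with a single 301 to its new counterpart and that each new page carries a self-referencing canonical:

```python
# Minimal sketch: verify the migration signals described above.
# Assumes `requests` and `beautifulsoup4`; REDIRECT_MAP is a hypothetical
# old-URL -> new-URL mapping exported from your own migration plan.
import requests
from bs4 import BeautifulSoup

REDIRECT_MAP = {
    "https://www.example.com/old-page": "https://www.example.com/new-page",  # hypothetical
}

def check_migration(redirect_map):
    for old_url, new_url in redirect_map.items():
        # The old URL should answer with a 301 straight to the new URL.
        resp = requests.head(old_url, allow_redirects=False, timeout=10)
        if resp.status_code != 301 or resp.headers.get("Location") != new_url:
            print(f"Redirect problem: {old_url} -> "
                  f"{resp.status_code} {resp.headers.get('Location')}")

        # The new page should declare a self-referencing canonical.
        html = requests.get(new_url, timeout=10).text
        canonical = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
        if canonical is None or canonical.get("href") != new_url:
            print(f"Canonical problem on {new_url}: "
                  f"{canonical.get('href') if canonical else 'missing'}")

if __name__ == "__main__":
    check_migration(REDIRECT_MAP)
```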
As I mentioned above, I do not think there is any "advantage" to getting the new URLs indexed quicker by putting old URLs that 301 to the new URLs in the sitemap. Just watch your Google Search Console crawl stats. After a major overhaul you will see Google crawl your site like crazy, and it will update things pretty quickly. Putting the old URLs in the sitemap sends a conflicting signal in that process and has the potential to slow Google down, IMHO.