Sitemaps during a migration - which is the best way of dealing with them?
-
Many SEOs I know simply upload the new sitemap once the new site is launched; some keep the old site's URLs in the new sitemap (for a while) to facilitate the migration; others submit sitemaps for both the old and the new site together to support the migration. Which is the best way to proceed? Thanks, Luke
-
Very much appreciated CleverPhD!
-
Found this while looking for an answer to another question (I could not find it the other day). It comes right from the mouth of Google: do not include pages that do not exist in XML sitemaps.
http://googlewebmastercentral.blogspot.com/2014/10/best-practices-for-xml-sitemaps-rssatom.html
URLs
URLs in XML sitemaps and RSS/Atom feeds should adhere to the following guidelines:
- Only include URLs that can be fetched by Googlebot. A common mistake is including URLs disallowed by robots.txt — which cannot be fetched by Googlebot, or including URLs of pages that don't exist.
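If you want to check your own sitemap against that guideline, here is a minimal Python sketch (my own illustration, not a Google tool) that flags URLs blocked by robots.txt or not returning HTTP 200. The sitemap URL is a hypothetical placeholder, and it assumes the `requests` library is installed:

```python
import xml.etree.ElementTree as ET
from urllib import robotparser
from urllib.parse import urljoin

import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Load the site's robots.txt so we can test URLs the way Googlebot would.
robots = robotparser.RobotFileParser(urljoin(SITEMAP_URL, "/robots.txt"))
robots.read()

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)

for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    if not robots.can_fetch("Googlebot", url):
        print(f"BLOCKED by robots.txt: {url}")
        continue
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:  # 404s and redirects don't belong in a sitemap
        print(f"HTTP {status}: {url}")
```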
-
Mate nailed it completely!
-
I would say make sure that your new sitemap has all the latest URLs. The reason people say that you should have old URLs in the sitemap is so that Google can quickly crawl the old URLs to find the 301s to the new URLs.
I am not convinced that this helps. Why?
Google already has all your old URLs in its systems. You would be shocked how far back Google's data on your site's old URLs goes. I have a site that is over 10 years old, and I still see URL structures from 7 years ago referenced in Google even though a 301 is in place. Why is this?
Google will assume, "Well, I know that this URL is a 301 or 404, but I am going to crawl it every once in a while just to make sure the webmaster did not do this by mistake." You can see this in Search Console error or link reports: when you set up 301s or 404s, the URLs may stay in there for months and even come back after they fall out of the error list. In one case, I had old URLs showing up in the SERPs and various Search Console reports for two years after proper 301s were in place. Why was this happening?
This is a large site, and some old content was still linking to the old URLs. The solution was to delete those links in the old content and set up a self-referencing canonical on all the pages to give Google a definitive directive. Google then finally replaced the old URLs with the new URLs in the SERPs and in the Search Console reports. The point here is that our site had been sending signals (links) telling Google that some of the old URLs were still valid, and Google was giving us the benefit of the doubt.
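A quick way to audit that "canonical to self" fix is to fetch each new page and confirm its rel="canonical" link points back at itself. Here is a minimal sketch using requests and BeautifulSoup (both assumptions on my part; any HTTP client and HTML parser will do), with a hypothetical URL list:

```python
import requests
from bs4 import BeautifulSoup

NEW_URLS = ["https://www.example.com/new-page/"]  # hypothetical list of new URLs

for url in NEW_URLS:
    html = requests.get(url, timeout=10).text
    # Look for the canonical link element in the fetched page.
    tag = BeautifulSoup(html, "html.parser").select_one('link[rel="canonical"]')
    canonical = tag.get("href") if tag else None
    if canonical != url:
        print(f"{url}: canonical is {canonical!r}, expected a self-reference")
```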
If you want Google to see the new URLs, show them in your sitemap. Google already has all the old URLs; it will check them, find the 301s, and fix everything. I would also recommend the self-referencing canonical on the new pages. Don't send Google any signal that your old URLs are still valid by linking to them in any way, especially in your sitemap. I would even go so far as to reach out to any important sites that link to the old URLs and ask them to update their links.
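Before relying on Google to find the 301s, it is worth verifying them yourself. A minimal sketch, assuming you keep a CSV mapping of old URL to new URL (the file name and columns here are hypothetical):

```python
import csv

import requests

# Confirm each old URL answers with a single 301 pointing at the expected
# new URL, so stale entries in Search Console reflect Google's memory of
# the old URLs rather than a broken redirect. Assumes absolute URLs in the map.
with open("redirect_map.csv", newline="") as f:  # columns: old_url,new_url
    for old_url, new_url in csv.reader(f):
        resp = requests.head(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        if resp.status_code != 301 or location != new_url:
            print(f"{old_url}: got {resp.status_code} -> {location or 'no Location'}")
```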
As I mentioned above, I do not think there is an "advantage" to getting the new URLs indexed more quickly by putting old URLs that 301 to the new URLs in the sitemap. Just watch your Google Search Console crawl stats. Once you do a major overhaul, you will see Google crawl your site like crazy, and it will update things pretty quickly. Putting the old URLs in the sitemap is a conflicting signal in that process and has the potential to slow Google down, IMHO.
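If it helps, here is a minimal Python sketch of that takeaway: build the post-launch sitemap from the new URLs only, so the sitemap never contradicts the 301s. The URL list is a hypothetical placeholder; swap in your real list.

```python
import xml.etree.ElementTree as ET

NEW_URLS = ["https://www.example.com/new-page/"]  # hypothetical; use your full new URL list

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in NEW_URLS:  # deliberately no old, redirected URLs in here
    ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url

# Writes the file Google will fetch; submit it in Search Console after launch.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```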
Related Questions
-
Migration to a new domain
Hi everyone, I have one project where I'm planning to move the current content to a new domain, for two reasons: 1. The current domain seems to have a Google penalty (backlink related, not manual). 2. The client wants a rebranding and already has a domain with the new brand name. Since the content is high quality and there is no content-related penalty from Google, what would be the best way to migrate the existing content without passing along any penalty AND without Google treating it as duplicate content? If I do a 301, I suspect any penalty there is might follow; if I just copy the existing content, it won't be original content. What is the best solution here? Thanks
Intermediate & Advanced SEO | joelsemy
-
Sitemap Folders on Search Results
Hello! We are managing the SEO campaign of a video website, and we have an issue with sitemap folders. I have sitemaps like **/xml/sitemap-name.xml**, but Google is indexing my /xml/ folder and the sitemaps themselves, and they appear in search results. If I add Disallow: /xml/ to my robots.txt and remove the /xml/ folder from Webmaster Tools, will Google still be able to see my sitemaps, or will it ignore them? Will my site be affected negatively after removing the /xml/ folder completely from search results? What should I do?
Intermediate & Advanced SEO | roipublic
-
Sitemap Disappearance??
Greetings Mozzers, On my standard run through Webmaster Tools I discovered that up to 30% of my sitemaps no longer exist. Has anyone else experienced this recent loss of sitemaps, or can anyone suggest reasons why this may have happened? I'm re-submitting all sitemaps now but am concerned this might become an ongoing issue...
Intermediate & Advanced SEO | RobertChapman
-
Post your 3 best ways to rank well on Google
Hi, Anyone care to share your 3 best ways to rank well on Google? As for me, I think: 1.) Link building & social media 2.) On-site optimization 3.) Quality content What about you?
Intermediate & Advanced SEO | chanel27
-
Submitting sitemaps every 7 days
Question, if you had a site with more than 10 million pages (that you wanted indexed) and you considered each page to be equal in value, how would you submit sitemaps to Google? Would you submit them all at once: 200 sitemaps of 50K URLs each in a sitemap index? Or would you submit them slowly? For example, would it be a good idea to submit 300,000 at a time (in 6 sitemaps of 50K each), leave those 6 sitemaps available for Google to crawl for 7 days, then delete them and add 6 more with 300,000 new links? Then repeat this process until Google has crawled all the links? If you implemented this process you would never have more than 300,000 links available for Google to crawl in sitemaps at one time. I read somewhere that eBay does something like this; it could be bogus info though. Thanks David
Intermediate & Advanced SEO | zAutos
-
What's the best way to manage content that is shared on two sites and keep both sites in search results?
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?
Intermediate & Advanced SEO | BostonWright
-
Best way to re-order page elements based on search engine users
Both versions of the page have essentially the same content, but in a different order. One is for users coming from Google (and Googlebot), and the other is for everybody else. Questions: Is this cloaking? What would be the best way to re-order elements on the page: totally different style sheets for each version, or calling different divs in the same style sheet? Is there any better way to re-order elements based on the search engine? Let me make it clear again: the content is the same for everyone, just in a different order for visitors coming from Google versus everybody else. Don't ask me the reason behind it (executive orders!!)
Intermediate & Advanced SEO | StickyRiceSEO
-
What is the best Keyword Research Process and Tool?
I'm trying to refine my keyword research process and will take any pointers you can give. Also, please share the tools you use these days 🙂 I need to make my process fast and efficient; right now it feels bulky.
Intermediate & Advanced SEO | Hyrule