Sitemaps during a migration - which is the best way of dealing with them?
-
Many SEOs I know simply upload the new sitemap once the new site is launched; some keep the old site's URLs in the new sitemap for a while to facilitate the migration; others submit both the old and the new sitemaps together to support it. Which is the best way to proceed? Thanks, Luke
-
Very much appreciated, CleverPhD!
-
Found this while looking for an answer to another question (I couldn't find it the other day). It's straight from the mouth of Google: do not include pages that don't exist in XML sitemaps.
http://googlewebmastercentral.blogspot.com/2014/10/best-practices-for-xml-sitemaps-rssatom.html
URLs
URLs in XML sitemaps and RSS/Atom feeds should adhere to the following guidelines:
- Only include URLs that can be fetched by Googlebot. A common mistake is including URLs disallowed by robots.txt — which cannot be fetched by Googlebot, or including URLs of pages that don't exist.
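Following that guideline, a migrated site's sitemap would list only live, fetchable URLs on the new site. A minimal sketch with hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only URLs that return 200 on the new site: no redirected or
       removed pages, and nothing disallowed by robots.txt -->
  <url>
    <loc>https://www.example.com/new-category/new-page/</loc>
    <lastmod>2015-01-15</lastmod>
  </url>
</urlset>
```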
-
Mate nailed it completely!
-
I would say make sure that your new sitemap has all the latest URLs. The reason people say that you should have old URLs in the sitemap is so that Google can quickly crawl the old URLs to find the 301s to the new URLs.
I am not convinced that this helps. Why?
Google already has all your old URLs in its systems. You would be shocked how far back Google's data on your site goes. I have a site that is over 10 years old, and I still see URL structures from 7 years ago referenced in Google even though a 301 is in place. Why is this?
Google will assume, "Well, I know that this URL is a 301 or 404, but I am going to crawl it every once in a while just to make sure the webmaster did not do this by mistake." You can see this in Search Console error and link reports: when you set up 301s or 404s, the URLs may stay in there for months and even come back after they fall out of the error list. In one case I had old URLs showing up in the SERPs and various Search Console reports for two years after proper 301s were in place. Why was this happening?
It is a large site and some old content still linked to the old URLs. The solution was to delete the links in that old content and set up a canonical to self on all the pages to give Google a definitive directive. Google then finally replaced the old URLs with the new URLs in the SERPs and in the Search Console reports. The point is that our site had been sending signals (links) telling Google that some of the old URLs were still valid, and Google was giving us the benefit of the doubt.
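The "canonical to self" mentioned above is just a link element in each page's head pointing at that page's own new URL. A sketch with a hypothetical URL:

```html
<!-- In the <head> of each new page, referencing the page's own
     new address (hypothetical URL) -->
<link rel="canonical" href="https://www.example.com/new-category/new-page/" />
```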
If you want the new URLs seen by Google, show them in your sitemap. Google already has all the old URLs and will check them, find the 301s, and fix everything. I would also recommend the canonical to self on the new pages. Don't give Google any signal that your old URLs are still valid by linking to them in any way, especially in your sitemap. I would even go so far as to reach out to any important sites that link to old URLs and ask for an updated link to your site.
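For completeness, the 301s themselves can be declared with Apache's mod_alias (a sketch with hypothetical paths; nginx and other servers have equivalents):

```apache
# Hypothetical old/new paths: permanently redirect each retired URL
# to its replacement on the new site.
Redirect permanent /old-category/old-page.html https://www.example.com/new-category/new-page/
```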
As I mentioned above, I do not think there is an "advantage" to getting the new URLs indexed quicker by putting old URLs that 301 to the new URLs in the sitemap. Just watch your crawl stats in Google Search Console. Once you do a major overhaul, you will see Google crawl your site like crazy and update things pretty quickly. Putting the old URLs in the sitemap is a conflicting signal in that process and, IMHO, has the potential to slow Google down.
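One way to catch that conflicting signal before submitting: parse the sitemap and flag any entry that does not answer 200. A rough sketch with hypothetical URLs; the status lookup is injected as a callable so in practice it could be backed by something like `requests.head(url, allow_redirects=False).status_code`:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract the <loc> values from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def flag_non_200(urls, status_for):
    """Return the URLs whose HTTP status (per `status_for`) is not 200."""
    return [u for u in urls if status_for(u) != 200]

# Hypothetical sitemap mixing a live page with a redirected old URL.
example = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/new-page/</loc></url>
  <url><loc>https://example.com/old-page/</loc></url>
</urlset>"""

# Stand-in for live HTTP checks.
statuses = {"https://example.com/new-page/": 200,
            "https://example.com/old-page/": 301}

print(flag_non_200(sitemap_urls(example), statuses.get))
# -> ['https://example.com/old-page/']
```

Anything the check flags is a URL that should either be updated to its new address or dropped from the sitemap entirely.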