Sitemaps during a migration - which is the best way of dealing with them?
-
Many SEOs I know simply upload the new sitemap once the new site is launched; some keep the old site's URLs in the new sitemap for a while to facilitate the migration; others upload sitemaps for both the old and the new website together, to support the migration. Which is the best way to proceed? Thanks, Luke
-
Very much appreciated, CleverPhD!
-
Found this while looking for an answer to another question (I could not find it the other day). Straight from the mouth of Google: do not include pages that do not exist in XML sitemaps.
http://googlewebmastercentral.blogspot.com/2014/10/best-practices-for-xml-sitemaps-rssatom.html
URLs
URLs in XML sitemaps and RSS/Atom feeds should adhere to the following guidelines:
- Only include URLs that can be fetched by Googlebot. A common mistake is including URLs disallowed by robots.txt — which cannot be fetched by Googlebot, or including URLs of pages that don't exist.
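In practice, that guideline means a post-migration sitemap should list only final, live URLs. A minimal sketch of what that looks like (the example.com URLs are placeholders, not from the thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only URLs that return 200 and are not disallowed by robots.txt -->
  <url>
    <loc>https://www.example.com/new-category/new-page/</loc>
    <lastmod>2015-01-15</lastmod>
  </url>
</urlset>
```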
-
Mate nailed it completely!
-
I would say make sure that your new sitemap has all the latest URLs. The reason people say that you should have old URLs in the sitemap is so that Google can quickly crawl the old URLs to find the 301s to the new URLs.
I am not convinced that this helps. Why?
Google already has all your old URLs in its systems. You would be shocked at how far back Google's data on your site's old URLs goes. I have a site that is over 10 years old, and I still see URL structures from 7 years ago, with 301s in place, referenced in Google. Why is this?
Google will assume, "Well, I know that this URL is a 301 or 404, but I am going to crawl it every once in a while just to make sure the webmaster did not do this by mistake." You can see this in Search Console error and link reports when you set up 301s or 404s: URLs may stay in there for months and even come back after they fall out of the error list. I had an occurrence where old URLs kept showing up in the SERPs and various Search Console reports for two years after proper 301s were set up. Why was this happening?
This is a large site, and some of our old content was still linking to the old URLs. The solution was to delete the links in that old content and set up a canonical to self on all the pages, to give Google a definitive directive. Google then finally replaced the old URLs with the new URLs in the SERPs and in the Search Console reports. The point here is that our site was previously giving signals (links) that told Google some of the old URLs were still valid, and Google was giving us the benefit of the doubt.
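The "canonical to self" described above is just a link element in each page's head pointing at that page's own URL. A sketch (the URL is a hypothetical placeholder):

```html
<head>
  <!-- Self-referencing canonical: declares this URL as the definitive version of the page -->
  <link rel="canonical" href="https://www.example.com/new-category/new-page/" />
</head>
```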
If you want to have the new URLs seen by Google, show them in your sitemap. Google already has all the old URLs and will check them and find the 301s and fix everything. I would also recommend the canonical to self on the new pages. Don't give any signals to Google that your old URLs are still valid by linking to them in any way, especially your sitemap. I would even go so far as to reach out to any important sites that link to old URLs to ask for an updated link to your site.
As I mentioned above, I do not think there is an "advantageage" to be corrected: I do not think putting old URLs that 301 to the new URLs in the sitemap gets the new URLs indexed any quicker. Just watch your Google Search Console crawl stats. Once you do a major overhaul, you will see Google crawl your site like crazy, and they will update things pretty quickly. Putting the old URLs in the sitemap is a conflicting signal in that process and, IMHO, has the potential to slow Google down.
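For completeness, the 301s themselves belong server-side rather than in the sitemap. A hedged sketch of how they are commonly set up in an Apache .htaccess file (the paths are hypothetical):

```apache
# Permanently redirect an old URL pattern to its new equivalent
RewriteEngine On
RewriteRule ^old-category/old-page/?$ /new-category/new-page/ [R=301,L]

# Or a simple one-off mapping without mod_rewrite
Redirect 301 /old-about.html /about/
```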
Related Questions
-
Site Migration Question
Hi Guys, I am preparing for a pretty standard site migration: a small business website moving to a new domain, with new branding and a new CMS. Pretty much a perfect storm. Right now the new website is being designed and will need another month; however, the client is pretty antsy to get her new brand out over the web. We cannot change the current site, which has the old branding. She wants to start passing out business cards and hanging banners with the new domain and brand. However, I don't want to be messing with any redirects and potentially screw up a clean migration from the old site to the new. To be specific, she wants to redirect the new domain to the current domain and then, when the new site launches, flip the redirect. I'm a little apprehensive about that because a site migration from the current site to the new one is already so intricate, and I don't want to leave any possibility of error. I'm trying to figure out the best solution; these are the 2 options I am thinking of:
1. DO NOT market the new domain. Reprint all marketing material and wait until the new domain is up, then start marketing it. (At cost to the client.)
2. Create a one-pager on the new domain saying the site is being built, with a nofollow link to the current site. No redirects added, just the nofollow link.
I'd like option 2 so that the client could start passing out material, but my number one concern is messing with any part of the migration. We are about to submit a sitemap index to Google Search Console for the current site, so we are just starting the site migration. What do you guys think?
Intermediate & Advanced SEO | Khoo
-
What is best practice for "Sorting" URLs to prevent indexing and for best link juice ?
We are now introducing 5 links in all our category pages for different sorting options of the category listings. The site has about 100,000 pages, and with this change the number of URLs may go up to over 350,000. Until now Google has been indexing our site well, but I would like to prevent the "sorting URLs" from leading to less complete crawling of our core pages, especially since we are planning a further huge expansion of pages soon. Apart from blocking the parameter in Search Console (which did not really work well for me in the past to prevent indexing), what do you suggest to minimize indexing of these URLs, also taking link juice optimization into consideration? On a technical level, the sorting is implemented in a way that reloads the whole page, for which there may be better options as well.
Intermediate & Advanced SEO | lcourse
-
Sitemap Indexation
When we use an HTML sitemap, many times I have seen the sitemap itself rank for keywords it shouldn't. So should we set the HTML sitemap to noindex, follow, or does anyone have a better solution so that the sitemap doesn't show up for keyword terms it isn't actually representing?
Intermediate & Advanced SEO | welcomecure
-
Is it worth creating an Image Sitemap?
We've just installed the server-side script 'XML Sitemaps' on our eCommerce site. The script gives us the option of (easily) creating an image sitemap, but I'm debating whether there is any reason for us to do so. We sell printer cartridges, so all the images will be pretty dry (a brand-name printer cartridge in front of a box being a favourite). I can't see potential customers searching for an image as a route into the site, and Google appears to be picking up our images of its own accord, so I wonder if we'd just be crawling the site and submitting this information for no real reason. From a quality perspective, would Google give us any kind of kudos for providing an image sitemap? Would it potentially increase their crawl frequency or, indeed, reduce the load on our servers, as they wouldn't have to crawl for all the images themselves? I can't stress how little of a hardship it will be to create one of these automatically each day, but am wondering if, like meta keywords, there is any benefit to doing so?
Intermediate & Advanced SEO | ChrisHolgate
-
Best support site software to use
Hi Guys, we currently use Desk to run our company support site. It seems OK (I don't administer it); however, it is very template-driven and doesn't allow useful tools such as adding metadata to each page (hence in our Moz crawl tests we get a large number of missing-metadata errors, which seems like a lost opportunity for us to optimise the site). Our support team is looking to implement MadCap Flare as an information management tool; however, this tool outputs HTML as iframes, which obviously makes it hard for Google to crawl the content. We recently implemented HubSpot as our content marketing platform, which is great, and we'd love to have the support site hosted on it (great for tracking traffic etc.); however, as far as I'm aware, MadCap Flare doesn't integrate directly with HubSpot. So I'm looking for suggestions on what others are successfully using to host/manage their SEO-optimised support sites? Cheers, Matt
Intermediate & Advanced SEO | SnapComms
-
XML Sitemap Indexation Rate Decrease
On September 28th, 2013 I saw the indexation rate decrease on the XML sitemap I've submitted through GWT. I've since cleaned my sitemap and removed all URLs returning 404 and 400 errors (which only made up ~5% of the entire sitemap). Any idea why Google randomly started indexing less of my XML sitemap on that date? I updated my sitemap 2 weeks before that date and had an indexation rate of ~85%; now I'm below 35%. Thoughts, ideas, experiences? Thanks!
Intermediate & Advanced SEO | RobbieWilliams
-
Best way to consolidate link juice
I've got a conundrum I would appreciate your thoughts on. I have a main container page listing a group of products and linking out to individual product pages. The problem is that all the product pages target exactly the same keywords as the main container page listing all the products. Initially all my product pages were ranking much higher than the container page, as there was little individual text on the container page, and I believe it was being hit with a duplicate content penalty. To get round this, I have incorporated into the container page a chunk of text from each product listed on it. However, that now means "most" of the content on an individual product page is also on the container page; therefore I am worried that I will get a duplicate content penalty on the product pages, as the same content (or most of it) is on the container page. Effectively I want to consolidate the link juice of the product pages back to the container page, but I am not sure how best to do this. Would it be wise to rel=canonical all the product pages back to the container page? Rel=nofollow all the links to the product pages? Or possibly some other method? Thanks
Intermediate & Advanced SEO | James77
-
Which is best structure for Multiple XML Sitemap?
I read some great blog posts on multiple XML sitemaps a week ago, on the following websites: SEOmoz, Distilled, the Google Webmaster Central Blog, Search Engine Land, and SEO Inc. I have created multiple XML sitemaps for my eCommerce website with the following structure and submitted them to Google Webmaster Tools:
- http://www.vistastores.com/main_sitemap.xml
- http://www.vistastores.com/products_sitemap.xml
But I am not satisfied with my second XML sitemap, because it contains more than 7K+ product page URLs and Google seems to crawl it very slowly! I want to split my XML sitemap with one of the following structures.
By root-level category:
- http://www.vistastores.com/outdoor_sitemap.xml
- http://www.vistastores.com/furniture_sitemap.xml
- http://www.vistastores.com/kitchen_dining_sitemap.xml
- http://www.vistastores.com/home_decor_sitemap.xml
Or by end-level category:
- http://www.vistastores.com/table_lamps_sitemap.xml
- http://www.vistastores.com/floor_lamps_sitemap.xml
- etc.
So, which is the best structure for multiple XML sitemaps?
Intermediate & Advanced SEO | CommercePundit
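For reference, either structure would be tied together with a sitemap index file, which is what gets submitted to Webmaster Tools. A sketch using sitemap files named in the question:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.vistastores.com/main_sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.vistastores.com/outdoor_sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.vistastores.com/furniture_sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```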