301 redirecting old URLs to new URLs - when should you update your sitemap?
-
Hello Mozzers,
If you are amending your URLs - 301 redirecting old URLs to new ones - at what point in the process should you update your sitemap to reflect the new URLs?
I have heard some suggest you should submit a new sitemap alongside the old sitemap to support indexing of the new URLs, but I have no idea whether that advice is valid.
Thanks in advance, Luke
-
Hi Luke
To bring the suggestion from searchenginewatch.com into this conversation, it says:
Submit an updated sitemap to Google Webmaster tools and use the change of address function if moving to a new domain. Remember to initially keep the old URLs in your XML sitemap to facilitate Google crawling those links and processing the changes in their index.
Well, it would be interesting to hear others' feedback on that. Personally, having old URLs in a sitemap (URLs that, without a redirect, would return a 404 Not Found error) doesn't seem correct to me.
Presumably, you had the URL in the sitemap previously, when the page at that URL was live. But by setting up a 301 redirect, you are telling Google that the page at the URL it has in its index has now permanently moved to a new URL.
When you submit a sitemap to Google, you are submitting a list of all the URLs on your site that you are asking Google to crawl. Including the old URL in your sitemap alongside the new URL is essentially asking Google to crawl two URLs pointing to the same page.
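To put that concretely, here is a minimal sketch (the URLs are made up for illustration) of generating a sitemap that drops any URL that now 301s elsewhere, so Google is only ever asked to crawl one URL per page:

```python
# Hypothetical sketch: build a sitemap listing only *current* URLs.
# Any old URL that now 301-redirects is left out entirely.
import xml.etree.ElementTree as ET

def build_sitemap(pages, redirected):
    """Emit sitemap XML for live pages; skip URLs that 301 elsewhere."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in pages:
        if url in redirected:
            continue  # old URL now redirects -- it no longer belongs here
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Example data (hypothetical): one retired URL 301s to its replacement.
old_to_new = {"https://example.com/old-page": "https://example.com/new-page"}
pages = ["https://example.com/new-page", "https://example.com/old-page"]
xml = build_sitemap(pages, old_to_new)
```

The generated sitemap contains only `/new-page`; the retired `/old-page` is handled purely by its 301 redirect, not by the sitemap.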
I'm not sure Google would necessarily consider that to be a canonical issue (because the old URL is now not current) but for me it's a misuse of the sitemap.
But as I say, it would be interesting to hear others' feedback on this.
Peter
-
Thanks Peter - I note this article:
http://searchenginewatch.com/article/2115729/10-Steps-to-a-Successful-SEO-Migration-Strategy
It suggests that sites keep old URLs in their sitemaps, but I've heard others suggest otherwise. There seems to be a fair bit of conflicting advice out there.
-
Hi Luke
My advice would be to submit a single sitemap containing only the new URLs; it should replace the old sitemap entirely.
The 301 (permanent) redirects exist to send outdated links and search engine index entries to the new URLs. They should remain in place even after search engines have updated their indexes, so that any outdated backlinks are always redirected correctly.
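As a sketch of that idea (the paths here are hypothetical), the redirect map is kept as a permanent lookup: retired paths always answer with a 301 to their replacement, regardless of whether search engines have long since updated their indexes:

```python
# Hypothetical sketch of a permanent redirect map: old paths 301 to
# their new counterparts indefinitely, so any stale backlink still
# lands on the right page. All paths are made up for illustration.
REDIRECTS = {
    "/old-page": "/new-page",
    "/old-category/widget": "/products/widget",
}

def resolve(path):
    """Return (status, location): 301 for retired paths, 200 otherwise."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent -- never remove these entries
    return 200, path

status, location = resolve("/old-page")
```

In practice this mapping would live in your web server config (e.g. rewrite rules) rather than application code, but the principle is the same: the entries stay in place for as long as the old URLs might be linked to anywhere.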
I hope that helps,
Peter