Static XML Sitemap
-
I performed a change of address for one of our sites, moving it to a new domain. In the process we left out submitting the old site's sitemap at the new property in Google Webmaster Console, and realize now that we need to do this step. The old site still has all these URLs getting indexed: https://www.google.com/#q=site:citychurchfamily.org . I believe I should create a static XML sitemap file, upload it to the new domain's root directory, and then test/submit it to Google under the new domain's GWM property. My question: should the XML sitemap contain entries for all the old domain's links that are currently still being indexed, and what is the fastest way to generate this sitemap? Any insight is greatly appreciated.
-
Hi Paul,
Thank you for sharing that. My question: over the past few weeks we've seen our new domain climb back up to the spot the old domain held before we made the domain change. I still haven't added a sitemap of all the old URLs to the new Google Webmaster property for the new domain. At this point I'm questioning whether I should do this at all, since I'm concerned that doing so at this stage may produce some sort of negative result. I can see the number of indexed old-site URLs slowly declining, but there is still a decent amount of them: https://www.google.com/#q=site:citychurchfamily.org . Does it still make sense to submit a sitemap of all the old URLs at this stage? If I don't, will all the old URLs eventually stop being indexed by Google? Should I remove the old site entirely at citychurchfamily.org but keep my 301 redirects in place on the server? I appreciate your first response and any help/insight you can offer here.
Sincerely,
Andrew
-
One of the quick things you can do to help the process along is to ensure you've used the Change of Address tool in Google Search Console to alert the search engine that the address has officially changed. (You'll need to do this from the verified GSC account of the original website.)
But yes, uploading a sitemap from the old site and leaving it in place for a short while can speed up the process of the search engines detecting the changeover and reindexing under only the new URLs.
The temp sitemap should definitely include ALL the old site's URLs if at all possible. If you don't have a backup of the original site's XML sitemap to work with, there are a number of workarounds.
- Install a backup of the old site on a staging server and crawl it to create the new sitemap (you'll need to adjust the output to correct the domain name, but that'll be easy using basic search/replace tools) [Most complete method]
- Use a tool like Screaming Frog (or a static sitemap creation tool like xml-sitemaps.com that Robert mentions) to crawl the Wayback Machine's best-archived version of the site. [Dependent on level of detail on Wayback]
- Scrape the results of that site: Google search and construct a sitemap from those URLs [Likely least complete, but quickest access.]
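Whichever of those methods you use, you'll end up with a plain list of URLs that needs to become a sitemap with the right hostname. A minimal sketch in Python, assuming a crawled URL list from a staging server — `staging.example.com` is a hypothetical placeholder for whatever host your crawl actually returned:

```python
# A minimal sketch: turn a crawled URL list into a static sitemap.xml,
# rewriting the crawl hostname (e.g. a staging server) back to the
# live old domain. STAGING_DOMAIN is a hypothetical placeholder.
from xml.sax.saxutils import escape

STAGING_DOMAIN = "https://staging.example.com"   # placeholder crawl host
LIVE_DOMAIN = "https://citychurchfamily.org"     # the old live domain

def build_sitemap(urls):
    """Return sitemap XML for the given URLs, hostname corrected."""
    entries = []
    for url in urls:
        # Basic search/replace to swap the staging host for the live one
        live_url = url.replace(STAGING_DOMAIN, LIVE_DOMAIN, 1)
        entries.append("  <url><loc>%s</loc></url>" % escape(live_url))
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

if __name__ == "__main__":
    crawled = [
        STAGING_DOMAIN + "/",
        STAGING_DOMAIN + "/about",
        STAGING_DOMAIN + "/events",
    ]
    with open("sitemap.xml", "w") as f:
        f.write(build_sitemap(crawled))
```

This is the same basic search/replace the first method calls for, just scripted; the resulting file can be uploaded to the new domain's root and submitted in GSC.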
You'll only want to leave the alternate sitemap live for a short period of time, or it can impact the crawling of the legit new sitemap. Search engines don't like sitemaps with a lot of "dirt" or redirected pages in them. Once your monitoring shows a large number of the old URLs dropping out of the index, remove the sitemap and let the Change of Address and the 301 redirects do the rest of the job they're designed for.
Hope that all makes sense?
Paul