New domain's sitemap.xml file loaded to old domain - how does this affect SEO?
-
I have a client who recently changed their domain when they redesigned their site. The client wanted the old site to remain live for existing customers, with links to the new domain. I guess as a workaround, the developer uploaded the new domain's sitemap.xml file to the old domain. What SEO ramifications, if any, would this have on the primary (new) domain?
-
I don't believe any. Like Jesse said, change it and move on (Google is fairly forgiving).
-
That is a great point. I've just never seen something like that before - questionable, very very questionable.
-
Whatever ramifications it had, nothing a sitemap reports is permanent. So just fix it and move on, and it will quickly refresh. Be sure to resubmit it in GWMT (Google Webmaster Tools) just to be safe.
-
Chances are the client didn't have the foresight to match URLs, which means the sitemap contains many invalid URLs. I don't know exactly what effect this will have, but it can't be good.
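If it helps to see how bad the mismatch is before fixing it, here is a rough sketch (Python standard library only) that lists which sitemap entries point at the other domain or no longer resolve; the sitemap URL and expected host are placeholders to swap for the real ones.

```python
# Rough audit of the misplaced sitemap: flag entries that point at a different
# host than the domain the sitemap is served from, plus any that no longer
# resolve. The sitemap URL and expected host are placeholders.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse
from urllib.request import urlopen

SITEMAP_URL = "https://old-domain.example/sitemap.xml"  # placeholder
EXPECTED_HOST = "old-domain.example"                    # host the sitemap lives on
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

for loc in tree.findall(".//sm:loc", NS):
    url = (loc.text or "").strip()
    if urlparse(url).netloc != EXPECTED_HOST:
        print(f"WRONG HOST: {url}")  # the sitemap is advertising another domain's URLs
        continue
    try:
        urlopen(url)                 # raises HTTPError on 4xx/5xx responses
    except Exception as exc:
        print(f"BROKEN    : {url} ({exc})")
```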
Related Questions
-
Some bots excluded from crawling client's domain
Hi all! My client is in healthcare in the US and, for HIPAA reasons, blocks traffic from most international sources. (a) I don't think this is good for SEO. (b) The site won't allow the Moz bot or the Screaming Frog bot to crawl it. It's so frustrating. We can't figure out what mechanism they are using to do this. Any help as we start down the rabbit hole to remedy it is much appreciated. Thank you!
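One way to narrow down the mechanism is to request the same page with different User-Agent headers from the same IP: if a browser UA gets a 200 while the crawler UAs get blocked, the filter is likely UA-based at a WAF or CDN; if everything fails only from non-US IPs, it is more likely geo filtering. A hedged sketch is below; the URL is a placeholder and the crawler UA strings are approximations to replace with the documented ones.

```python
# Request the same page with different User-Agent headers to see whether the
# block is UA-based. The URL is a placeholder and the crawler UA strings are
# approximations; substitute the exact strings from each tool's documentation.
import requests

URL = "https://client-site.example/"  # placeholder
USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "moz": "rogerbot",                             # approximation of Moz's crawler UA
    "screamingfrog": "Screaming Frog SEO Spider",  # approximation
}

for name, ua in USER_AGENTS.items():
    try:
        r = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
        print(f"{name:13} -> HTTP {r.status_code}")
    except requests.RequestException as exc:
        print(f"{name:13} -> failed ({exc})")
```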
Technical SEO | | SimpleSearch0 -
What's the best way to test Angular JS heavy page for SEO?
Hi Moz community, Our tech team has recently decided to try switching our product pages to be JavaScript dependent; this includes links, product descriptions, and things like breadcrumbs being rendered in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Angular Universal, but they said the lift was too great, so we're testing to see if this works. I've read a lot of the articles in this guide to all things SEO and JS and am fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed: https://sitebulb.com/resources/guides/javascript-seo-resources/ However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl the pages using Screaming Frog, but that generally shows what a crawler should be able to crawl, not what Googlebot will actually be able to crawl and index. Any thoughts on this? Is the concern valid? Thanks!
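One check that does not depend on Google reaching the QA environment is to look at what is present in the raw, unrendered HTML, since that is all a non-rendering crawler sees. A minimal sketch, assuming the QA login can be passed as a cookie; the URL, cookie, and strings being checked are placeholders.

```python
# Fetch the raw, unrendered HTML of a QA product page and check whether the
# elements that matter for SEO exist before any JavaScript runs. The URL,
# session cookie, and strings being checked are all placeholders.
import requests

URL = "https://qa.example.com/product/some-product"  # placeholder QA URL
COOKIES = {"session": "PASTE-QA-SESSION-COOKIE"}      # placeholder auth cookie

html = requests.get(URL, cookies=COOKIES, timeout=10).text

checks = {
    "product title": "Example Product Name",
    "meta description": '<meta name="description"',
    "breadcrumb markup": "BreadcrumbList",
    "internal links": '<a href="/product/',
}

for label, needle in checks.items():
    print(f"{label:18} in raw HTML: {needle in html}")
```

With pure client-side Angular and no prerendering, these will usually all come back False, which is a fair preview of what a non-rendering crawler gets from the page.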
Technical SEO | | znotes0 -
Does sitemap size affect SEO?
So I've noticed that the sitemap I use has a capacity of 4500 URLs, but my website is much larger. Is it worth paying for a commercial sitemap that encompasses my entire site? I also notice that of the 4500 URLs that have been submitted, only 104 are indexed. Is this normal? If not, why is the index rate so low?
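For what it is worth, the sitemap protocol itself allows up to 50,000 URLs per file, and any number of files can be grouped under a sitemap index, so a larger site usually just needs more files rather than a paid product. A bare-bones generator sketch; the domain and URL list are placeholders, and real URLs should be XML-escaped before writing.

```python
# Bare-bones generator: split a large URL list into several sitemap files and
# tie them together with a sitemap index. The domain and URL list are
# placeholders; real URLs should be XML-escaped.
from datetime import date

DOMAIN = "https://www.example.com"  # placeholder
URLS_PER_FILE = 45000               # stay under the protocol's 50,000-per-file limit
urls = [f"{DOMAIN}/page-{i}" for i in range(120000)]  # stand-in for your real URL list

chunks = [urls[i:i + URLS_PER_FILE] for i in range(0, len(urls), URLS_PER_FILE)]

index_entries = []
for n, chunk in enumerate(chunks, start=1):
    filename = f"sitemap-{n}.xml"
    body = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
    with open(filename, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{body}\n</urlset>\n")
    index_entries.append(f"  <sitemap><loc>{DOMAIN}/{filename}</loc>"
                         f"<lastmod>{date.today().isoformat()}</lastmod></sitemap>")

with open("sitemap_index.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + "\n".join(index_entries) + "\n</sitemapindex>\n")
```

Submitting the index file in Search Console then covers all of the child sitemaps. Note that listing a URL in a sitemap does not guarantee indexing, so a low indexed count is usually a separate question about page quality or duplication rather than sitemap capacity.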
Technical SEO | | moon-boots0 -
Redirect to a new domain and SEO effects
I created a one-page Blogger site with a listing of several affiliated websites. It gained some visibility on Google, but it was very plain, so I decided to create a more complex and polished WordPress site and try to reach the top of the search positions. For the moment I decided to keep the listing on Blogger and add some links on the page saying "I've moved to a new website. Click for more info" that point to my new page. But I don't get many clicks to my new site, so I was thinking of creating a full redirect from my Blogger site to my WordPress site, or an iframe to fetch the WordPress content, but I'm afraid it may hurt the SEO of my Blogger site. What should I do? Thanks in advance.
Technical SEO | | cardealpt0 -
SEMRush's Site Audit Tool "SEO Ideas"
Recently SEMRush added a feature to its site audit tool called "SEO Ideas." In the case of the specific site I'm looking at, its ideas consist mostly of suggesting words to add to a page so that the page/my phrase(s) will perform better. It suggests this even when the term(s) or phrase(s) it's looking at are #1. Has anybody used this tool, or something similar, and found it to be valuable, and if so, how valuable? The reason I ask is that it would be a fair amount of work to go through these pages and find ways to add the selected words and phrases and, frankly, it feels kind of 2005 to me. Your thoughts? Thanks... Darcy
Technical SEO | | 945010 -
Site Migration between CMSs
Hi There, I have a technical question about migrating CMSs but not servers. My client has site A on a Joomla install. He wants to migrate to WordPress, and we will call this site B. As he has a lot of old content on site A that he doesn't want to lose, he has put site B (the WordPress install) in a subdirectory, site.com/siteb (for example), and will use an .htaccess rule to forward the root domain to this WordPress site. Therefore anyone going to www.site.com will see the new WordPress site, while the old content and the Joomla install will sit at the root of the server. Will Google have an issue with this? Will it even find the old content? What are the issues for the new site and new content? I look forward to getting your input.
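For reference, a minimal sketch of the kind of .htaccess forward being described, assuming Apache with mod_rewrite and the WordPress install living in /siteb; the domain and directory names are placeholders. Note that this only handles the bare root: WordPress permalinks under the root and the old Joomla URLs would each need their own rules, which is where the crawl questions above start to matter.

```apache
# Minimal sketch: internally rewrite requests for the bare domain root to the
# WordPress install in /siteb, leaving every other path (the old Joomla
# content) untouched. Domain and directory names are placeholders.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?site\.com$ [NC]
RewriteRule ^$ /siteb/index.php [L]
```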
Technical SEO | | nezona1 -
Old domain still being crawled despite 301s to new domain
Hi there, We switched from the domain X.com to Y.com in late 2013 and, for the most part, the transition was successful. We were able to 301 most of our content over without too much trouble. But when I do a site:X.com search in Google, I still see about 6240 URLs of X listed. But if you click on a link, you get 301d to Y. Maybe Google has not re-crawled those X pages to know of the 301 to Y, right? The home page of X.com is shown in the site:X.com results. But if I look at the cached version, the cached description will say: "This is Google's cache of Y.com. It is a snapshot of the page as it appeared on July 31, 2014." So, Google has freshly crawled the page. It does know of the 301 to Y and is showing that page's content. But the X.com home page still shows up on site:X.com. How is the domain for X showing rather than Y, when even Google's cache is showing the page content and URL for Y? There are some other similar examples. For instance, you would see a deep URL for X, but just looking at the title in the SERP, you can see Google has crawled the Y equivalent. Clicking on the link gives you a 301 to the Y equivalent. The cached version of the deep URL to X also shows the content of Y. Any suggestions on how to fix this, or whether it's even a problem? I'm concerned that some SEO equity is still being sequestered in the old domain. Thanks, Stephen
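A quick sanity check for this situation is to confirm that a sample of X.com URLs return a single 301 hop (not a 302 or a chain) whose Location points at the matching Y.com URL; a sketch with placeholder domains and paths is below. If those all look right, the lingering site: results are usually just pages Google has not recrawled yet, and they tend to drop out over time.

```python
# Confirm that a sample of old-domain URLs return a single 301 pointing at the
# matching new-domain URL (not a 302 or a redirect chain). Domains and paths
# are placeholders.
import requests

OLD = "https://www.x-example.com"  # placeholder for X.com
NEW_HOST = "y-example.com"         # placeholder for Y.com
SAMPLE_PATHS = ["/", "/some-deep-page/", "/another-page/"]  # placeholder paths

for path in SAMPLE_PATHS:
    r = requests.get(OLD + path, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "(no Location header)")
    ok = r.status_code == 301 and NEW_HOST in location
    print(f"{path:20} -> {r.status_code} {location}  {'OK' if ok else 'CHECK'}")
```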
Technical SEO | | fernandoRiveraZ1 -
Domain redirection and SEO implications
We have an existing site that is a subdomain, but we recently acquired an exact match domain. Will building links to the exact match domain and having it point at our existing subdomain work, or should we convert the entire site and redirect our existing subdomain to the new domain? What I'm trying to figure out is how to maximize the benefit here and how the existing mass of links pointing to our existing subdomain (shop.domain.com) can be used. New domain: keywordshop.com Existing URL: shop.domain.com
Technical SEO | | CHarkins0