Sitemap with References to Second Domain
-
I have just discovered that a client site is serving content from a single database into two separate domains, and has created XML sitemaps which contain references to both domains in an attempt to avoid being tagged for duplicate content.
I always thought that a sitemap was intended to show the files inside a single domain and the idea of multiple domains in the sitemap had never occurred to me...
The sites are both very large storefronts, and one of them (the larger of the two) has recently seen a 50% drop in search traffic and the loss of some 600 search terms from top-50 positions in Google. My first instinct is that each sitemap should be altered to show only files within its own domain, but I am worried about causing further loss of traffic.
Is it possible that the inclusion of URLs for the second domain in the sitemap may in fact be signalling duplicate content to search engines? Does anyone have a definitive view on whether these sitemaps are good, bad or irrelevant?
-
Aran has pretty much hit the nail on the head: the main purpose of a sitemap, from an SEO point of view, is to submit it to Google Webmaster Tools to help Google find all of the pages on your website. Of course, sitemaps are also useful from a user's point of view, helping people quickly find a page on your site.
I would definitely recommend you stick to one sitemap per domain.
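For illustration, a minimal single-domain sitemap might look like the sketch below (the domain and paths are hypothetical placeholders, not the client's actual URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Every <loc> stays on the one domain this sitemap is submitted for -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/widget</loc>
  </url>
</urlset>
```

Pages on the second domain would go in that domain's own sitemap, hosted and submitted from that domain.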
Regarding redirects - these should help:
http://www.seomoz.org/blog/cross-domain-canonical-the-new-301-whiteboard-friday
http://www.seomoz.org/blog/301-redirect-or-relcanonical-which-one-should-you-use
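As a rough sketch of the 301 option discussed in those posts, an Apache .htaccess rule redirecting a duplicated page on one domain to its counterpart on the other could look like this (the domain names and path are hypothetical, and this assumes mod_rewrite is enabled):

```apache
RewriteEngine On
# Redirect the duplicated page on the main site to its home on the niche site
RewriteCond %{HTTP_HOST} ^(www\.)?main-site\.com$ [NC]
RewriteRule ^niche/widget$ https://www.niche-site.com/widget [R=301,L]
```

A 301 removes the duplicate from circulation entirely; visitors and crawlers both end up on the target URL.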
-
Hi Aran,
Thanks ... that is pretty much what I was thinking, just blew my mind when I saw it so wanted to check my reasoning.
The second domain is basically a part of the other site (which represents a niche), so I am thinking that those pages in the main site should reference the same pages in the niche site.
Your suggestion of using canonicals is slightly different from my initial solution - I thought I should use 301 redirects so that juice is passed to the pages on the second domain. Is there a reason why you would use canonicals rather than 301s?
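For reference, a cross-domain canonical is just a link element in the head of the duplicate page pointing at the preferred URL on the other domain (hypothetical URLs again):

```html
<!-- On https://www.main-site.com/niche/widget, in the <head>,
     pointing at the preferred copy on the niche site -->
<link rel="canonical" href="https://www.niche-site.com/widget" />
```

Unlike a 301, the duplicate page itself remains accessible to visitors; the tag is only a hint to search engines about which URL should be indexed.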
-
Hi, as far as I'm aware, a sitemap is exactly that: a site map. It shouldn't be used to map URLs for multiple domains.
To avoid duplicate content, canonical references should be used.
Use a separate site map for each domain.
Related Questions
-
.xml sitemap showing in SERP
Our sitemap is showing in Google's SERP. It's only for very specific queries that don't seem to have much value (it's a healthcare website; when a doctor who isn't with us is searched alongside the brand name, e.g. 'John Smith Brand', the sitemap shows if there's a first or last name that matches the query). Is there a way to keep the sitemap from being indexed so it doesn't show in the SERP? I've seen the "X-Robots-Tag: noindex" header as a possible option, but before taking any action I wanted to see if this is still true and whether it would work.
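If the X-Robots-Tag route is taken, one common approach - sketched here assuming an Apache server with mod_headers enabled and a sitemap named sitemap.xml - is an .htaccess rule like this:

```apache
<FilesMatch "sitemap\.xml$">
  # Keep the sitemap fetchable by crawlers but out of the search index
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Note that the sitemap should not be blocked in robots.txt instead: search engines still need to be able to fetch it, which is exactly what the noindex header allows while keeping it out of the SERP.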
Domain Name Character Amount
I know that best practice is to keep your domain short, but is there an actual suggested character ballpark when coming up with one? Any thoughts are appreciated.
Parked Domains
I have a client who has a somewhat odd situation for their domains. They've been really inconsistent with how they've used them over the years, which makes for a slightly sticky situation. The client has two domains: compname.com and fullcompanyname.com. Right now, their website is just HTML (no CMS) and all of the URLs are relative, so both domains work. Since the new website will be in WordPress, they need to commit to one domain as the primary. Right now, it looks like compname.com is the one they've used the most in ads and such, so I'm going to recommend they go with that. However, the client has also used fullcompanyname.com a lot. They don't want to have to set up individual 301 redirects for everything. I think it's ridiculous, but you can lead a horse to water... Our developer has done some research and he may have found a solution that will satisfy the client. I just want to find out if there are any SEO implications. The possible plan is to use compname.com as the primary domain and to park fullcompanyname.com. That way, if someone visits fullcompanyname.com/products/my-favorite-product, it will still work without having to set up 301 redirects. Since the domain is parked, Google won't recognize it as duplicate content, correct? Just to be clear on the whole situation, I'm insisting that all of the website URLs need 301 redirects, regardless of the domain. The primary concern is with a lot of other stuff on the server that isn't related to the site (email campaign landing pages, image files, assets that are pulled in by the client's software, etc.). The client's concern is about redirecting all that other stuff (and there is a lot of it--thousands of files). The parked domain would seem to fix that, but I want to make sure that the client won't get Google slapped.
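For what it's worth, individual redirects aren't usually necessary for this: a single blanket rule can 301 every path on the secondary domain to the same path on the primary. A hedged Apache sketch, assuming mod_rewrite and substituting the real domain names:

```apache
RewriteEngine On
# Send every request on the secondary domain to the same path on the primary
RewriteCond %{HTTP_HOST} ^(www\.)?fullcompanyname\.com$ [NC]
RewriteRule ^(.*)$ https://www.compname.com/$1 [R=301,L]
```

This preserves the path, so fullcompanyname.com/products/my-favorite-product lands on compname.com/products/my-favorite-product without a per-URL mapping.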
Why are there duplicates of my domain
When viewing crawl diagnostics in SEOmoz I can see both "www.website.com" and a truncated version "website.com". Is this normal, and why is it showing? (I do not have duplicates of my site on the server.) E.g.: http://www.klinehimalaya.com/ and http://klinehimalaya.com/
My Old Domain is Not Changing in Google
I have taken over the following domain www.choice-cottages.co.uk; part of the contract was to redirect the old site www.choicecottages.info to the new site. Unfortunately I am only a middle man in the arrangement, as the website is hosted with another company. The switch was done well over 4 weeks ago, and the redirect itself is working fine. However, if you Google "choice cottages" you will see the first listing is www.choicecottages.info, then my new site appears below for a few listings. Google is definitely updating something, as before the old domain had lots of sitelinks but this has reduced to a few. Does anyone know anything about this? In the past it has only taken a couple of days to update. Many thanks, Andy
Change of domain name?
Hello, We are currently developing a new site for an existing online clothing retailer. The existing site is on a .co.uk domain, however we are targeting a global market and wondered whether we could/should launch the new site under a .com address and whether this would be beneficial? Most of our back links come from Affiliate blogs and we could quite easily change these to the new URL. Thanks Bilal
Different domains
Firstly, apologies for the very brief question, as I am mainly looking for your thoughts as opposed to specific help with a specific problem. I am working on a site which has two separate domains and, within one domain, two subdomains. The two different sites both have a high PageRank (PR6 each); one is the corporate site and the other is the company blog. There are also two subdomains within the corporate site; again, both have high PR and tons of content. My question is: would it be better to consolidate all the assets under one domain, or is it better to keep the sites separate, from an SEO perspective?
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
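Since each host serves its own robots.txt, one pattern that avoids touching www.domain.com at all is to serve a different robots.txt file only on the staging host. A sketch for Apache, assuming mod_rewrite and a hypothetical file name:

```apache
RewriteEngine On
# Only the staging host gets the disallow-everything file;
# www.domain.com continues to serve its normal /robots.txt
RewriteCond %{HTTP_HOST} ^staging\.domain\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-staging.txt [L]
```

where robots-staging.txt would contain just the two lines "User-agent: *" and "Disallow: /". The main site's robots.txt is never altered, so there is no risk of accidentally blocking it.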