Sitemap with References to Second Domain
-
I have just discovered that a client site is serving content from a single database into two separate domains, and has created XML sitemaps that contain references to both domains in an attempt to avoid being flagged for duplicate content.
I always thought that a sitemap was intended to show the files inside a single domain and the idea of multiple domains in the sitemap had never occurred to me...
The sites are both very large storefronts, and one of them (the larger of the two) has recently seen a 50% drop in search traffic and the loss of some 600 search terms from top-50 positions in Google. My first instinct is that the sitemaps should be altered to show only files within each domain, but I am worried about causing further loss of traffic.
Is it possible that the inclusion of URLs for the second domain in the sitemap may in fact be signalling duplicate content to search engines? Does anyone have a definitive view on whether these sitemaps are good, bad or irrelevant?
-
Aran has pretty much hit the nail on the head: the main purpose of a sitemap from an SEO point of view is to submit it to Google Webmaster Tools to help Google find all of the pages on your website. Of course, sitemaps are also useful from a user's point of view, helping people quickly find a page on your site.
I would definitely recommend you stick to one sitemap per domain.
Regarding redirects - these should help:
http://www.seomoz.org/blog/cross-domain-canonical-the-new-301-whiteboard-friday
http://www.seomoz.org/blog/301-redirect-or-relcanonical-which-one-should-you-use
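For illustration only (the domain and path below are hypothetical), the cross-domain canonical discussed in those articles is just a single tag placed in the `<head>` of the duplicate page, telling search engines which version is the preferred one:

```html
<!-- On the main site's copy of the page, declaring the niche site's copy
     as the canonical (preferred) version; domains here are made up -->
<link rel="canonical" href="http://www.niche-store.com/widgets/blue-widget" />
```

A 301, by contrast, is configured at the server level and physically sends both visitors and crawlers to the other domain; the canonical tag leaves the page in place for users and only hints to search engines which URL should be indexed.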
-
Hi Aran,
Thanks ... that is pretty much what I was thinking, just blew my mind when I saw it so wanted to check my reasoning.
The second domain is basically a part of the other site (it represents a niche), so I am thinking that those pages in the main site should reference the same pages on the niche site.
Your suggestion of using canonicals is slightly different from my initial solution - I thought I should use 301 redirects so that juice is passed to the pages on the second domain. Is there a reason why you would use canonicals rather than 301s?
-
Hi, as far as I'm aware, a sitemap is exactly that: a site map. It shouldn't be used to map URLs for multiple domains.
To avoid duplicate content, canonical references should be used.
Use a separate site map for each domain.
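As a minimal sketch of the "one sitemap per domain" fix (the URLs below are hypothetical), the combined sitemap's URL list can be split by host, producing a separate XML sitemap for each domain:

```python
from urllib.parse import urlparse
from xml.sax.saxutils import escape

def split_sitemap(urls):
    """Group URLs by hostname so each domain gets its own sitemap XML."""
    by_host = {}
    for url in urls:
        host = urlparse(url).netloc
        by_host.setdefault(host, []).append(url)

    sitemaps = {}
    for host, host_urls in by_host.items():
        # Build one <url><loc>...</loc></url> entry per page on this host
        entries = "\n".join(
            "  <url><loc>%s</loc></url>" % escape(u) for u in host_urls
        )
        sitemaps[host] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + entries + "\n</urlset>"
        )
    return sitemaps

# Hypothetical mixed URL list like the one described in the question
combined = [
    "http://www.main-store.com/page1",
    "http://www.niche-store.com/page1",
    "http://www.main-store.com/page2",
]
maps = split_sitemap(combined)
print(sorted(maps))  # one sitemap per domain
```

Each resulting file would then be submitted to Webmaster Tools under its own domain, so neither sitemap references the other site.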