Using a single sitemap for multiple domains
-
We have a possible duplicate content issue because we run a number of websites from the same code base across .com / .co.uk / .nl / .fr / .de and so on. We want to update our sitemaps alongside using hreflang tags to ensure Google knows we've got different versions of essentially the same page to serve different markets.
Google has written an article on tackling this: https://support.google.com/webmasters/answer/75712?hl=en, but my question remains whether having a single sitemap accessible from all the international domains is the best approach here, or whether we should have individual sitemaps for each domain.
-
Jon -
If the different websites run in different languages (e.g. .com is English, .nl is Dutch, .fr is French, etc.), then you should probably have a separate sitemap for each site.
If they are all the same language and you just have the same site loading on .com / .nl / .fr, then Google will see this as duplicate content and you should likely make changes to keep them a bit more separate...
Thanks,
-- Jeff
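For reference, the hreflang annotations Google describes can live directly in each domain's own sitemap; a minimal sketch with placeholder domains, where every URL lists itself plus all of its alternates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- Entry in the sitemap submitted for the .co.uk property -->
  <url>
    <loc>https://www.example.co.uk/widgets</loc>
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets"/>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets"/>
    <xhtml:link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/widgets"/>
    <xhtml:link rel="alternate" hreflang="fr-fr" href="https://www.example.fr/widgets"/>
  </url>
</urlset>
```

The .com, .nl, and .fr sitemaps would each carry the mirror-image entry for their own URL; the format works the same whether you use one shared sitemap or one per domain.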
Related Questions
-
Image Sitemap
I currently use a program to create our XML sitemap. It doesn't offer creating an image sitemap. Can someone suggest a program that would create an image sitemap? Thanks.
Technical SEO | Kdruckenbrod
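For reference, an image sitemap is just a standard sitemap that adds Google's image extension namespace, so most generators (or a short script) can emit it; a minimal sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <!-- One <url> per page; each page may list the images it embeds -->
  <url>
    <loc>https://www.example.com/sample-page</loc>
    <image:image>
      <image:loc>https://www.example.com/images/photo-1.jpg</image:loc>
    </image:image>
    <image:image>
      <image:loc>https://www.example.com/images/photo-2.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```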
Domain not ranking in Google
https://www.buitenspeelgoed.nl/ is a domain acquired by our client. Previously this website was on http://www.buitenspeelgoed-keupink.nl. With the old domain they were ranking in the top 30 for 'buitenspeelgoed' in google.nl. With the new exact-match domain they haven't been ranking at all for months. However, the website is indexed, as you can see at http://1l1.be/nz. I don't know what to do anymore and need some advice. What we have already done over the last months:
- Made adjustments to the 301 redirects (these were originally set up wrong by the web designer)
- (De)optimized the homepage for 'buitenspeelgoed' (strangely, the Moz robot can't access the site)
- Checked robots.txt to see if the website was blocked for Google
- Checked the meta robots tags to see if the website was blocked for Google
- Disavowed some spammy (old) links which linked to the old domain
- Used Search Console > Fetch as Google to check for malware of some kind (and to see if Google can access the site)
- Checked Search Console to see if there are manual spam actions (there aren't)
- Checked for duplicate content by copy/pasting some texts into Google to see if any other results show up (not the case for most of the texts)
Please let me know what we can do.
Technical SEO | InventusOnline
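As a quick sanity check on several of the points above (the 301 setup from the old domain, and whether the new domain serves crawlable, indexable responses), the raw HTTP headers can be inspected directly; a sketch using curl with the domains from the question:

```bash
# Old domain should answer with a single 301 hop pointing at the new domain
curl -I http://www.buitenspeelgoed-keupink.nl/

# New domain should answer 200, with no unexpected "X-Robots-Tag: noindex" header
curl -I https://www.buitenspeelgoed.nl/
```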
Migrating domains from a domain that will have new content.
We have a new URL. The old URL is being taken over by someone else. Is it possible to still have a successful redirect/migration strategy if we are redirecting from our old domain, which is now being used by someone else? I see a big mess, but I'm being told we can redirect all the links to our old content (which is now used by someone else) to our new URL. Thoughts? Craziness? Insanity? Or am I just not getting it? :)
Technical SEO | CC_Dallas
Can You Use More Than One Google Local Rich Snippet on a Single Site / a Single Page?
I am currently working on a website for a business that has multiple office locations. As I am trying to target all four locations, I was wondering if it is okay to have more than one local rich snippet on a single page. (For example, they list all four locations and addresses within their footer, and I was wondering if I could mark these up as local rich snippets.) What about having more than one on a single website? For example, if a company has multiple offices located in several different cities and has set up individual contact pages for these cities, can each page have its own local rich snippet? Will Google look at these multiple local rich snippets as spamming, or will it recognize the multiple locations and count them towards local SEO?
Technical SEO | webdesignbarrie
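For reference, one common way to mark up an individual office on its own contact page is schema.org LocalBusiness structured data; a minimal JSON-LD sketch, with all business details below as placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Co - Downtown Office",
  "url": "https://www.example.com/contact/downtown",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  }
}
</script>
```

Each location's contact page would carry its own block with that location's real name, address, and phone number.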
Domain Aliases
Hi there, I've got two sites: mysite.com and mysite.org. The .org is indexed by Google; the .com doesn't seem to be. The .com is used for some material that is sent out, and accounts for about 20% of incoming visitors (80% end up on .org). Is there any positive or negative effect from this? Would I benefit from 301'ing the .com to .org?
Technical SEO | dencreative
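If the decision is to consolidate, a site-wide 301 from the .com to the matching path on the .org is the usual approach; a minimal sketch, assuming Apache with mod_rewrite on the .com host (domain names taken from the question, target host should match the .org's canonical hostname):

```apache
RewriteEngine On
# Send every request on mysite.com (with or without www) to the same path on mysite.org
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.com$ [NC]
RewriteRule ^(.*)$ https://mysite.org/$1 [R=301,L]
```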
Using Robots.txt
I want to block or prevent pages from being accessed or indexed by Googlebot. Please tell me whether Googlebot will NOT access any URL that begins with my domain name, followed by a question mark, followed by any string, when using the robots.txt below.
Sample URL: http://mydomain.com/?example
User-agent: Googlebot
Disallow: /?
Technical SEO | semer
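For reference, robots.txt rules are prefix matches against the path plus query string, so the directive above should block the sample URL but not URLs where the "?" appears later in the path; note also that it only stops crawling, not indexing of URLs Google discovers via links. A sketch of both patterns:

```text
User-agent: Googlebot
# Blocks URLs whose path+query starts with "/?", e.g. http://mydomain.com/?example
Disallow: /?

# If the intent is to block any URL containing a query string, e.g. /page?x=1,
# a wildcard pattern would be needed instead:
# Disallow: /*?
```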
Impact on domain when using a subdomain for the majority of site content
Hello, We're looking to use a subdomain for a bookings engine that will also host the majority of our site content, as it will house the details of the courses that we'll be selling online. All content is currently available on www.existingdomain.co.uk. A few pages will remain there, but the majority will ultimately be hosted on a different IP address under a subdomain: courses.existingdomain.co.uk. I am a little concerned about search engine reaction to this content separation. Would this approach dilute the rankings of www.existingdomain.co.uk? Is there anything else we need to be mindful of? We have alternative options if this is a real SEO faux pas. Thanks
Technical SEO | Urbanfox
Redirecting root domains to subfolders
Mozzers: We have an instance where a client is looking to 301 www.example.com to www.example.com/shop. I know of several issues with this, but wondered if anyone could chip in with any previous experiences of doing so, and what outcomes, positive and negative, came out of this. Issues I'm aware of:
- The root domain URL is the most linked page, and an HTTP 301 redirect only passes about 90% of the value, so you'll lose 10-15% of the link value of these links.
- Navigational queries (i.e. the "domain part" of "domain.tld") are less likely to produce Google sitelinks.
- Less deep crawling: Google crawls top-down, starting with the most linked page, which will most likely be your domain URL. As this no longer exists as a page, you waste this zero level of crawling depth.
- robots.txt is only allowed on the root of the domain.
Your help as always is greatly appreciated. Sean
Technical SEO | Yozzer
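If it goes ahead anyway, the redirect would normally cover only the bare root URL so that robots.txt and other root-level paths keep resolving; a minimal sketch, assuming Apache with mod_rewrite in the root .htaccess (path taken from the question):

```apache
RewriteEngine On
# Redirect only the empty root path to /shop; all other URLs are left alone
RewriteRule ^$ /shop [R=301,L]
```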