Content question about 3 sites targeted at 3 different countries
-
I am new here, and this is my first question. I was hoping to get help with the following scenario:
I am looking to launch 3 sites in 3 different countries, using 3 different domains: the .com for the USA, the .co.uk for the UK, and a slightly different .com for Australia, as I could not purchase the .com.au because I am not a registered business in Australia. I am looking to set the geographic target for each in Google Webmaster Tools. So, for example, I have set the .com to target the USA only, the .co.uk needs no setting (its ccTLD already ties it to the UK), and I will set the other Australian .com to target Australia.
Now, initially the 3 sites will be "brochure" websites explaining the service that we offer, and I fear that at the beginning they will most likely have almost identical content. In the long term, however, I am looking to publish unique content for each site, almost on a weekly basis, so over time they would each have different content. These are small sites to begin with: each site in its "brochure" form will have around 10 pages, growing to hundreds of pages over time.
My question, or rather my worry, is: will Google look negatively at the fact that I have the same content across 3 sites, even though they are specifically targeted at different countries? Will it penalise my sites?
-
Thank you, much appreciated.
Believe it or not, I joined Moz for the forum: I wanted somewhere I could post SEO questions and hopefully get answers from "credible" people. So thanks.
It's a shame that Moz removed the email-the-experts feature they used to have a few years ago. Anyway, onwards...
-
Yes, you should be fine in that case. Google is not going to penalize you.
-
Hi Tekeshi,
Thanks for your response. I am not expecting Google to show all the sites in the same SERPs, because they are clearly targeted at different countries and different Google search engines. So I expect the geo-targeted Australian site to come up on google.com.au, the geo-targeted UK site to appear in the results of google.co.uk, and so on.
Over time, and I am talking about a matter of a couple of months, they will each have their own unique blog posts and articles, which I am hoping will help them differentiate from each other, although the broad subject matter will still be the same.
So all the sites might be about "chocolates", but over time the Australian site will talk about Australian chocolates and the different things affecting chocolate sales in Australia, while the US site may be talking about how to make chocolate-flavoured ice cream, and so on. I am hoping this approach will help me in the long term.
-
Google's Matt Cutts recently responded to a similar question. Basically, Google does not generally penalize sites for having duplicate content unless they are being spammy about it. However, Google will only show one version of the same content to searchers. So your sites won't be penalized, but Google isn't going to show the US, UK, and Australian sites in the same SERPs for the same query with the same content. The geo-targeting should also help.
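On top of the geographic targets, hreflang annotations can make the relationship between the three domains explicit, so Google treats them as alternates for different English-speaking markets rather than as competing duplicates. Below is a minimal sketch, assuming three hypothetical placeholder domains (not your actual sites), that generates the link tags you would place in the head of each corresponding page:

```python
# Hypothetical placeholder domains standing in for the three real sites.
ALTERNATES = {
    "en-us": "https://www.example.com",     # the .com targeted at the USA
    "en-gb": "https://www.example.co.uk",   # the .co.uk for the UK
    "en-au": "https://www.example-au.com",  # the second .com targeted at Australia
}

def hreflang_tags(path: str) -> str:
    """Build the <link rel="alternate"> block for one page.

    Every version of the page must carry the full set of tags,
    including a self-referencing one, or Google may ignore them.
    """
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{domain}{path}" />'
        for lang, domain in ALTERNATES.items()
    )

if __name__ == "__main__":
    # The same block goes into the <head> of /services/ on all three domains.
    print(hreflang_tags("/services/"))
```

Paired with the targets set in Webmaster Tools, annotations like these tell Google which version to serve in which country instead of leaving it to guess from the content.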
Related Questions
-
Indexed page count differs when I perform a "site:" search on Google - why?
My client has an ecommerce website with approx. 300,000 URLs (a lot of these are parameter URLs blocked from indexing through the meta robots tag). There are 9,000 "true" URLs being submitted to Google Search Console, and Google says it is indexing 8,000 of them. Here's the weird part: when I do a "site:" search for the website in Google, it says Google is indexing 2.2 million pages on the domain, but I am unable to view past page 14 of the SERPs. It just stops showing results, and I don't even get a "the next results are duplicates" message. What is happening? Why does Google say it is indexing 2.2 million URLs, but then won't show me more than 140 of the results it claims to be indexing? Thank you so much for your help. I tried looking for the answer, and I know this is the best place to ask!
Intermediate & Advanced SEO | accpar
-
Translated Content on Country Domains
Hi, We have blogs set up in each of our markets, for example http://blog.telefleurs.fr, http://blog.euroflorist.nl and http://blog.euroflorist.be/nl. Each blog is localized correctly: FR has fr-FR, NL has nl-NL, and BE has nl-BE and fr-BE. All our content is created or translated by our Content Managers. The question is: is it safe for us to use a piece of content on both Telefleurs.fr and the French-translated Euroflorist.be/fr, or Dutch content on both Euroflorist.nl and Euroflorist.be/nl? We want to avoid canonicalising, as neither site should take preference. Is there a solution I've missed until now? Thanks, Sam
Intermediate & Advanced SEO | seoeuroflorist
-
Same language, different countries: what would be the best way to introduce it?
Hello, We have a .com Magento store with US geo-targeting. We're going to launch different versions soon: one for the US and another for Canada (we're going to add Spanish and French versions later as well). The stores' content will be the same, except for currency and the contact-us page. What would be the better strategy to introduce this to Google, and what is the better URL structure: example.com/ca/, example.com/en-ca/, or ca.example.com/? Should we stay with the original www.example.com and just close access to /ca/ and /us/, use rel=canonical, or use hreflang "alternate" annotations to avoid duplicate content issues? Thanks in advance
Intermediate & Advanced SEO | Meditinc.com
-
Indexing process for a multi-language website targeting different countries
We are in charge of a website with 7 languages for 16 countries. There are only slight content differences between countries (much like google.de vs. google.co.uk). The website is set up with the correct language and country annotations, e.g. de/DE/ | de/CH/ | en/GB/ | en/IE/. All unwanted annotations are blocked by robots.txt, and the hreflang "alternate" tags are also set. The objective is to make the website visible in local search engines, so we have submitted an overview sitemap connected to a sitemap per country. The sitemaps have been submitted for quite a while now, but Google has indexed only 10% of the content. We are looking for suggestions to boost the indexing process.
Intermediate & Advanced SEO | imsi
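For an architecture like the one described in this question, the "overview sitemap" is typically a sitemap index pointing at one sitemap per country, with each per-country sitemap listing its URLs together with their hreflang alternates. Here is a minimal sketch, assuming hypothetical placeholder paths modeled on the annotations mentioned above, of how one such entry could be generated:

```python
# Minimal sketch: build one <url> entry of a per-country sitemap, listing
# every language/country alternate via xhtml:link hreflang annotations.
# Paths and domain are hypothetical placeholders; the enclosing <urlset>
# must also declare the xmlns:xhtml namespace.
ALTERNATES = {
    "de-DE": "https://www.example.com/de/DE/",
    "de-CH": "https://www.example.com/de/CH/",
    "en-GB": "https://www.example.com/en/GB/",
    "en-IE": "https://www.example.com/en/IE/",
}

def sitemap_url_entry(canonical: str) -> str:
    """Return a <url> element with hreflang alternates for one page."""
    links = "\n".join(
        f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{href}" />'
        for lang, href in ALTERNATES.items()
    )
    return f"  <url>\n    <loc>{canonical}</loc>\n{links}\n  </url>"

if __name__ == "__main__":
    # The same alternate block is repeated in every country's sitemap,
    # each time with that country's URL as the <loc>.
    print(sitemap_url_entry("https://www.example.com/de/DE/"))
```

Note that hreflang only takes effect when the annotations are consistent and reciprocal across all country versions; pairs that do not point back at each other are silently ignored.
-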
How should I manage duplicate content caused by a guided navigation for my e-commerce site?
I am working with a company which uses Endeca to power the guided navigation for our e-commerce site. I am concerned that the duplicate content generated by having the same products served under numerous refinement levels is damaging the site's ability to rank well, and I was hoping the Moz community could help me understand how much of an impact this type of duplicate content could be having. I would also love to know if there are any best practices for how to manage this type of navigation. Should I nofollow all of the URLs which have more than 1 refinement applied to a category, or should I allow the search engines to go deeper than that to preserve the long tail? Any help would be appreciated. Thank you.
Intermediate & Advanced SEO | FireMountainGems
-
Two sites, similar content?
I just started working at this company last month. We started to add new content to pages like http://www.rockymountainatvmc.com/t/49/-/181/1137/Bridgestone-Motorcycle-Tires. This is their main site. Then I realized it also put the new content on their sister site, http://www.jakewilson.com/t/52/-/343/1137/Bridgestone-Motorcycle-Tires. The first site is the main site and, I think, will get credit for the unique new content; the second one I do not think will get credit and will more than likely be counted as duplicate content. We are changing this so it will no longer be the same. However, I am curious to hear how people think we could fix this issue. Also, is it affecting both sites or just the second one?
Intermediate & Advanced SEO | DoRM
-
Our Site's Content on a Third-Party Site: Best Practices?
One of our clients wants to use about 200 of our articles on their site, and they're hoping to get some SEO benefit from using this content. I know the standard best practice is to canonicalize their pages to our pages, but then they wouldn't get any benefit, since a canonical tag will effectively de-index the content from their site. Our thoughts so far: add a paragraph of original content to each article, and link to our site as the original source (to help mitigate the risk of our site getting hit by any penalties). What are your thoughts on this? Do you think adding a paragraph of original content will matter much? Do you think our site will be free of penalty, since we were the first place to publish the content and there will be a link back to our site? They are really pushing against using a canonical, so this isn't an option. What would you do?
Intermediate & Advanced SEO | nicole.healthline
-
Site #2 beats site #1 in every aspect?
Hey guys, loving SEOmoz so far and will definitely continue my subscription after the free trial. I have a question, however, which I am really confused about. When researching my primary keyword, I have found that the second-ranked site beats the top site in every single aspect apart from domain age, which is almost 6 years for the top one and 6 months for the second. When I say every single aspect, I mean everything: more authority for the page and domain, more links, more anchor text links, more authoritative links, more social signals, more relevant links, a better domain (although the second-ranked site is a .net), better MozRank, better MozTrust, etc. I have noticed, though, that in the UK SERPs those sites are switched, so #2 is actually #1. Could it be that the US SERPs just haven't updated yet, or am I missing something completely different?
Intermediate & Advanced SEO | darrenspeed