Sites in multiple countries using same content question
-
Hey Moz,
I am looking to target international audiences, but I may end up with duplicate content. For example, I have article 123 on each domain listed below. Will each copy rank separately (in the US, UK and Canada) because of the domain?
The idea is to rank well in several different countries. But should I never have an article duplicated? Should we start from the ground up, creating articles for each country? Some articles may apply to more than one country! I guess this whole duplicate content thing is quite confusing to me.
I understand that I can submit to GWT, set geographic targeting and add the rel="alternate" tag, but will that allow all of them to rank separately?
Please help and thanks so much!
Cole
-
Are you sure, eyepaq?
** Yes. I have the same setup implemented across several projects, big and small, and it works well. In a few cases the domains even help each other out: when a new country site is deployed, it gets a small boost in that geo location thanks to the others. The approach has also been confirmed several times in the Google Webmaster forum, in at least one Google hangout, and in various articles across the web.
If I had 5 domains, say .uk, .fr, .de, .ie and .es, and pasted the same 1,000 words on each, I would assume it would be duplicate content and wouldn't get equal rankings across all 5 domains - but I may be wrong?
** It won't be duplicate content if the .de version is in German and the .uk version is in English. It carries the same message, but it is not duplicate. Of course you won't get the same rankings, since the competition in Germany and the UK is different, and the signals, mainly links, are counted differently for each country. A link from x.de counts towards the .de domain in a different way than y.co.uk linking to your .uk domain.
I don't think Cole is talking about recreating the same article in different languages (in that case I would understand the use of the hreflang tag); I think he means the exact same article on separate domains. I could be wrong here as well.
*** If I understand correctly, he is mainly concerned about English content on different English-language geo domains (.co.uk, .com, .ca, .co.nz, .com.au, let's say). For that, if it's the same content, he needs hreflang set across those versions and he is safe. Google will then rank the .co.uk domain and content in the UK rather than the Canadian domain. He will also be safe from any "duplicate content issues", although even without hreflang there won't be any.
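To make that concrete, here is a minimal sketch of what the hreflang annotations could look like in the <head> of each English version of the article (the domains and path below are just placeholders for illustration):

```html
<!-- Placed on every English version of article 123; each page lists itself plus all alternates -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/article-123" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/article-123" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/article-123" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/article-123" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/article-123" />
```

The same full set of tags, including the self-reference, goes on every version; the annotations have to point back at each other, otherwise Google may ignore them.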
-
@Colelusby - Is a sub-domain for each location on one domain out of the question? So uk.example.com, fr.example.com, etc.
You can then tell Webmaster Tools that the UK sub-domain targets the UK, the fr sub-domain targets France, and so on.
-
Yes, that's it
The use of hreflang has a lot of benefits and overall it is very straightforward: Google will understand how the structure is set up, and you are safe.
Cheers.
-
Is that it?
The same article will rank in two different geographic locations and duplicate content won't hurt me?
I feel like that's too easy. Maybe I'm overthinking it.
Thanks!
-
Hi,
In this case the use of hreflang is needed:
https://support.google.com/webmasters/answer/189077?hl=en
In summary, each version will have rel="alternate" hreflang annotations set, with hreflang="en-ca" for Canada, hreflang="en-us" for the US, and so on (the first part is the language, the second the geo location). So even if the language is the same, each version targets a particular region, since in some cases there may be small differences between the UK, AU or CA versions, etc.
When you have a domain like example.ch, the hreflang will be hreflang="de-ch".
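If adding the tags to every page template is impractical, the same relationships can also be declared in an XML sitemap instead of the page <head>. A rough sketch, with made-up domains and paths:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/article-123</loc>
    <!-- Every alternate, including this URL itself, is listed for the entry -->
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/article-123"/>
    <xhtml:link rel="alternate" hreflang="en-ca" href="https://www.example.ca/article-123"/>
    <xhtml:link rel="alternate" hreflang="de-ch" href="https://www.example.ch/artikel-123"/>
  </url>
  <!-- Repeat a <url> block for example.ca and example.ch with the same set of links -->
</urlset>
```

Either method works; just pick one and apply it consistently across all versions.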
Hope it helps.