Sites in multiple countries using the same content
-
Hey Moz,
I am looking to target international audiences, but I may have duplicate content. For example, I have article 123 on each domain listed below. Will each copy rank separately (in the US, the UK, and Canada) because of the domain?
The idea is to rank well in several different countries. But should I never have an article duplicated? Should we start from the ground up, creating articles per country? Some articles may apply to more than one country! I guess this whole duplicate content thing is quite confusing to me.
I understand that I can set geographic targeting in GWT and add rel="alternate" hreflang tags, but will that allow all of them to rank separately?
Please help and thanks so much!
Cole
-
Just asking.
-
Are you sure, eyepaq?
** Yes. I have the same format implemented across several projects, big and small. All is perfect. I have a few cases where some domains are helping each other out, so when a new country is deployed it gets a small boost in that geo location thanks to the others. The approach has also been confirmed several times in the Google webmaster forum, in at least one Google hangout, and in various articles across the web.
If I had 5 domains, say .uk, .fr, .de, .ie and .es, and pasted the same 1,000 words on each, I would assume that would be duplicate content and the 5 domains wouldn't rank equally, but I may be wrong?
** It won't be duplicate content if the .de content is in German and the .uk content is in English. It carries the same message, but it is not duplicate. Of course you won't have the same rankings, since the competition in Germany and the UK is different, and the signals, mainly links, are counted differently for each country. A link from x.de will count towards the .de domain in a different way than y.co.uk linking to your .uk domain.
I don't think Cole is talking about recreating the same article in different languages (in that case I would understand the use of the hreflang tag); I think he means the exact same article on separate domains. I could be wrong here as well.
*** If I understand correctly, he is mainly concerned about English content on different English-language geo domains (.co.uk, .com, .ca, .co.nz, .com.au, let's say). For that, if it's the same content, he needs hreflang set for those pages and he is safe. Google will then rank the .co.uk domain and content in the UK rather than the Canadian domain. He will also be safe from any "duplicate content issues", although even without hreflang there wouldn't be any.
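To illustrate (a minimal sketch with made-up example.* domains and URLs): each English version of the article would carry the full set of hreflang annotations in its <head>, pointing at every version including itself, for example:
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/article-123/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/article-123/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/article-123/" />
<!-- optional fallback for searchers outside the targeted countries -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/article-123/" />
The same block is repeated on the .com and .ca pages so the annotations stay reciprocal.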
-
@Colelusby - Is a subdomain for each location on one domain out of the question? For example uk.example.com, fr.example.com, etc. You can then tell WMT that the uk subdomain targets the UK, the fr subdomain targets France, and so on.
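If you went that route, the hreflang markup works the same way, just with subdomain URLs instead of separate ccTLDs (again a hypothetical sketch using example.com):
<link rel="alternate" hreflang="en-gb" href="https://uk.example.com/article-123/" />
<link rel="alternate" hreflang="fr-fr" href="https://fr.example.com/article-123/" />
<!-- default version for everyone else -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/article-123/" />
Each subdomain can also be geo-targeted individually in WMT, since Google lets you verify subdomains as separate properties there.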
-
Yes, that's it
The use of hreflang has a lot of benefits and overall it is very straightforward: Google will understand how the structure is set up, and you are safe.
Cheers.
-
Is that it?
The same article will rank in two different geographic locations and duplicate content won't hurt me?
I feel like that's too easy. Maybe I'm overthinking it.
Thanks!
-
Hi,
In this case the use of hreflang is needed:
https://support.google.com/webmasters/answer/189077?hl=en
To summarise, each version will have rel="alternate" hreflang set, with hreflang="en-ca" for Canada, hreflang="en-us" for the US, and so on (the first part is the language, the second the geo location). So even if the language is the same, each version targets a particular region, as in some cases you might have small differences between the UK, AU, or CA versions.
When you have a domain like example.ch, the hreflang will be hreflang="de-ch".
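For instance (a hypothetical sketch; the .de counterpart and URLs are my own example), the Swiss page and its German counterpart would pair the same language with different regions:
<link rel="alternate" hreflang="de-ch" href="https://www.example.ch/artikel-123/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.de/artikel-123/" />
Both pages carry both tags, so each version references itself and its alternate.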
Hope it helps.