Can We Publish Duplicate Content on Multi-Regional Websites / Blogs?
-
Today, I was reading Google's official article on multi-regional websites and the use of duplicate content. Right now, we are working on four different blogs for the following regions, and we're writing unique content for each blog. But I am thinking of using one piece of content for all four regional blogs.
USA: http://www.bannerbuzz.com/blog/
UK: http://www.bannerbuzz.co.uk/blog/
AUS: http://www.bannerbuzz.com.au/blog/
CA: http://www.bannerbuzz.ca/blog/
Let me give you a very clear idea of it. Recently, we published an article on the USA website.
http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/
And we want to publish this article on the UK, AUS & CA blogs without making any changes.
I have read the following paragraph in Google's official guidelines, and it inspired me to make this happen.
Which is the best solution for it?
Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers.
-
Hi Gianluca,
Your comment made me doubt my research, so I started a new question about it. Do you have a minute to give your view on my situation? I would really appreciate it.
https://moz.com/community/q/duplicated-content-multi-language-regional-websites
Best regards,
Bob
-
The answer is very simple, regardless of whether you are publishing the post in subfolders, on subdomains, or even on country-code domain names:
-
The canonical URL of the copies must be the URL of the original article. This must be so because the content of all the articles is identical. If you don't use rel="canonical" as I suggest, you'll run into a duplicate content issue;
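Using the URLs from the question, a minimal sketch of that cross-domain canonical (assuming the article keeps the same slug on every blog) is a link element in the head of each duplicate copy pointing at the original USA post:

```html
<!-- In the <head> of the duplicate article on the .co.uk, .com.au and .ca blogs -->
<link rel="canonical"
      href="http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/" />
```

The original USA article itself would carry a self-referencing canonical to its own URL.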
-
Use the hreflang annotations to tell Google which post URL to show to users searching in English from each of the blogs' countries.
Remember: hreflang is not meant to solve duplicate content issues, but only to suggest which URL to show to users in a given language and country.
For duplicate content issues, the solution is rel="canonical".
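As an illustrative sketch (again assuming the slug matches across the four blogs), the hreflang cluster for the English-language regional versions could look like this; the same block goes in the head of every version, including a self-referencing entry:

```html
<!-- Repeated in the <head> of all four regional copies of the article -->
<link rel="alternate" hreflang="en-us" href="http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-gb" href="http://www.bannerbuzz.co.uk/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-au" href="http://www.bannerbuzz.com.au/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-ca" href="http://www.bannerbuzz.ca/blog/choosing-the-right-banner-for-your-advertisement/" />
```

Note that hreflang annotations must be reciprocal: if page A lists page B as an alternate, page B must list page A back, or Google may ignore the annotations.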
-
Thank you very much for the answer. Rel=canonical will help me with the cross-domain duplication, but I would also like to get indexed in all four regions.
-
And, We're writing unique content for each blog. But, I am thinking to use one content / subject for all 4 region blogs.
I found the problem: you were writing well and following the Google guidelines, and all was well, but you've decided to get a bit more grey hat. It pays off to take the time to write better content for each region.
Well, a few things you could try:
- Cross-domain rel=canonical - this will stop the duplication, but it would only assign the ranking to one site.
- Attempt to rewrite the article for each region.
- Hide the duplicates via robots.txt or noindex the pages.
- Try rel-alternate-hreflang; however, if they are all in the same language, I'm not sure that would work as well, not to mention you have three duplicate regions, not four.
Hope some of that helps.
-
It seems like you probably don't want 4 different TLDs, but instead would want to use hreflang to indicate to Google that you are showing different regions different content. See here for more of an explanation - https://support.google.com/webmasters/answer/189077?hl=en
Alternatively, if you do want the unique TLDs, you could put a canonical on whatever content is duplicated, pointing back to the main site (probably the .com).
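If you go the hreflang route, Google's guidelines also support an x-default annotation for searchers who don't match any of the listed regions; a hedged sketch, assuming the .com blog should be that fallback:

```html
<!-- Added alongside the regional hreflang annotations on each version -->
<link rel="alternate" hreflang="x-default" href="http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/" />
```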