Can We Publish Duplicate Content on Multi-Regional Websites / Blogs?
-
Today, I was reading Google's official article on multi-regional websites and the use of duplicate content. Right now, we are working on four different blogs for the following regions, and we're writing unique content for each blog. But I am thinking of using the same content / subject for all four regional blogs.
USA: http://www.bannerbuzz.com/blog/
UK: http://www.bannerbuzz.co.uk/blog/
AUS: http://www.bannerbuzz.com.au/blog/
CA: http://www.bannerbuzz.ca/blog/
Let me give you a very clear example. Recently, we published an article on the USA website:
http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/
We want to publish this article on the UK, AUS & CA blogs without making any changes.
I have read the following paragraph in Google's official guidelines, and it inspires me to make this happen. Which is the best solution here?
Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers.
-
Hi Gianluca,
Your comment made me doubt my research, so I started a new question about it. Do you have a minute to give your view on my situation? I would really appreciate it.
https://moz.com/community/q/duplicated-content-multi-language-regional-websites
Best regards,
Bob
-
The answer is very simple, regardless of whether you are publishing the posts in subfolders, on subdomains, or even on country-code domain names:
-
The canonical URL of the copies must be the URL of the original article. This must be so because the content of all the articles is identical. If you don't use rel="canonical" as I suggest, you'll run into duplicate-content issues.
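For example, here is a minimal sketch of what this looks like in the head of the UK, AUS and CA copies, assuming the post keeps the same path on each domain:

```html
<!-- In the <head> of each duplicate copy (UK, AUS, CA): point Google -->
<!-- at the original US article as the canonical version. -->
<link rel="canonical"
      href="http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/" />
```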
-
You use hreflang annotations to tell Google which post URL to show to users searching in English from each of the blogs' countries.
Remember: hreflang is not meant to solve duplicate-content issues; it only suggests which URL to show to users in a given language and country. For duplicate-content issues, the solution is rel="canonical".
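As a sketch, and again assuming each blog publishes the post at the same path, the annotations would look something like this. Note that the full set, including a self-reference, has to appear on every one of the four versions:

```html
<!-- hreflang annotations: the complete set, self-reference included, -->
<!-- goes in the <head> of all four copies of the post. -->
<link rel="alternate" hreflang="en-us"
      href="http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-gb"
      href="http://www.bannerbuzz.co.uk/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-au"
      href="http://www.bannerbuzz.com.au/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-ca"
      href="http://www.bannerbuzz.ca/blog/choosing-the-right-banner-for-your-advertisement/" />
```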
-
Thank you very much for the answer. The cross-domain rel=canonical will help me with the duplication, but I would like the article to get indexed in all four regions.
-
"And, we're writing unique content for each blog. But I am thinking of using the same content / subject for all four regional blogs."
There's the problem: you were writing well and following Google's guidelines, and all was well, but you've decided to go a bit more grey hat. It pays off to take the time to write better content for each region.
Well, a few things you could try:
- Cross-domain rel=canonical: this will stop the duplication, but it assigns the ranking to only one site.
- Attempt to rewrite the article for each region.
- Hide the duplicates via robots.txt or a noindex meta tag (see the sketch after this list).
- Try rel-alternate-hreflang, though all four blogs are in the same language, just different regions, so I'm not sure it would work as well.
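If you do go the hide-it route, the noindex option is a one-liner on each duplicate copy; a minimal sketch:

```html
<!-- In the <head> of each duplicate copy: keep the page out of the -->
<!-- index while still letting crawlers follow its links. -->
<meta name="robots" content="noindex, follow" />
```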
Hope some of that helps.
-
It seems like you probably don't want four different TLDs; instead, you would want to use hreflang to indicate to Google that you are showing different regions different content. See here for more of an explanation: https://support.google.com/webmasters/answer/189077?hl=en
Alternatively, if you do want the unique TLDs, you could point a canonical from whatever content is duplicated back to the main site (the .com, probably).
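If maintaining link elements on every page is a hassle, the Google help page above also documents supplying the hreflang annotations via an XML sitemap instead. A sketch for one post, again assuming identical paths on each domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <!-- One <url> entry per regional version; each entry repeats the full set of alternates. -->
    <loc>http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/</loc>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/" />
    <xhtml:link rel="alternate" hreflang="en-gb"
                href="http://www.bannerbuzz.co.uk/blog/choosing-the-right-banner-for-your-advertisement/" />
    <xhtml:link rel="alternate" hreflang="en-au"
                href="http://www.bannerbuzz.com.au/blog/choosing-the-right-banner-for-your-advertisement/" />
    <xhtml:link rel="alternate" hreflang="en-ca"
                href="http://www.bannerbuzz.ca/blog/choosing-the-right-banner-for-your-advertisement/" />
  </url>
  <!-- ...repeat the <url> entry for the .co.uk, .com.au and .ca URLs... -->
</urlset>
```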