Can We Publish Duplicate Content on Multi-Regional Websites / Blogs?
-
Today, I was reading Google's official article on multi-regional websites and the use of duplicate content. Right now, we are working on four different blogs for the following regions, and we're writing unique content for each blog. But I am thinking of using one piece of content for all four regional blogs.
USA: http://www.bannerbuzz.com/blog/
UK: http://www.bannerbuzz.co.uk/blog/
AUS: http://www.bannerbuzz.com.au/blog/
CA: http://www.bannerbuzz.ca/blog/
Let me give you a very clear idea of it. Recently, we published one article on the USA website.
http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/
And we want to publish this article on the UK, AUS & CA blogs without making any changes.
I read the following paragraph in Google's official guidelines, and it inspired me to try to make this happen.
Which is the best solution for it?
Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers.
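For reference, the rel-alternate-hreflang annotations that guideline mentions might look something like this across the four blogs (a sketch only; the URLs are built from the domains listed in the question, and it is assumed the article uses the same slug on each regional blog):

```html
<!-- Placed in the <head> of the article page on every regional blog.
     Each page lists all four regional alternates, including itself. -->
<link rel="alternate" hreflang="en-us" href="http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-gb" href="http://www.bannerbuzz.co.uk/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-au" href="http://www.bannerbuzz.com.au/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-ca" href="http://www.bannerbuzz.ca/blog/choosing-the-right-banner-for-your-advertisement/" />
```

The full set of annotations must appear on all four copies, not just the original; hreflang tags that don't point back at each other are ignored.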
-
Hi Gianluca,
Your comment made me doubt my research, so I started a new question about it. Do you have a minute to give your view on my situation? I would really appreciate it.
https://moz.com/community/q/duplicated-content-multi-language-regional-websites
Best regards,
Bob
-
The answer is very simple, regardless of whether you are publishing the posts in subfolders, on subdomains, or even on country-code domain names:
-
The canonical URL of the copies must be the URL of the original article. This must be so because the content of all the articles is identical. If you don't use rel="canonical" as I suggest, you'll run into a duplicate content issue;
-
Use the hreflang annotations to tell Google which post URL to show to English-speaking users searching from each blog's country.
Remember: hreflang is not meant to solve duplicate content issues; it only suggests which URL to show to users in a given language and country.
For duplicate content issues, the solution is rel="canonical".
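As a sketch of what this answer describes, the copies would each carry a canonical pointing at the original .com article (shown here for the UK copy; the URL is the one from the question, and the UK slug is assumed to match):

```html
<!-- In the <head> of the UK copy:
     http://www.bannerbuzz.co.uk/blog/choosing-the-right-banner-for-your-advertisement/ -->
<!-- rel=canonical points cross-domain at the original article on the .com blog -->
<link rel="canonical" href="http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/" />
```

Note that a cross-domain canonical asks Google to consolidate indexing onto the .com URL, so the canonicalized copies themselves may not stay indexed, which is worth weighing against the goal of ranking in all four regions.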
-
Thank you very much for the answer. A cross-domain rel=canonical will help me with the duplication, but I would like to get indexed in all four regions.
-
And, We're writing unique content for each blog. But, I am thinking to use one content / subject for all 4 region blogs.
I found the problem: you were writing well and following the Google guidelines, and all was well, but you've decided to go a bit more grey hat. It pays off to take the time to write better content for each region.
Well a few things you could try:
- Cross-domain rel=canonical - this will stop the duplicate content issue, but it will only assign the ranking to one site.
- Attempt to rewrite the article for each region.
- Hide the copies via robots.txt, or noindex the pages.
- Try rel-alternate-hreflang; however, since they are all in the same language, I'm not sure that would work as well, not to mention you have three regions, not four.
Hope some of that helps.
-
It seems like you probably don't want four different TLDs, but would instead want to use hreflang to indicate to Google that you are showing different regions different content. See here for more of an explanation - https://support.google.com/webmasters/answer/189077?hl=en
Alternatively, if you do want the unique TLDs, you could point a canonical from whatever content is duplicated back to the main site (probably the .com).