Duplicate blogs across different domains
-
Hi,
I am running a blog for several English-speaking websites (e.g. Australia, UK) and I plan to do SEO for these websites. I want to know: is it a must to rewrite all the blogs for these countries in order to avoid duplicate content?
-
You really shouldn't be creating duplicate content of any kind for multiple clients. Even on different domains for businesses in different countries, this is not the best SEO strategy for you or the businesses. Work hard to create valuable, unique content that the business's customers may actually want to read, digest, and share.
If there's really no content for the business or industry, don't have a blog. Find other ways to include great content on the main pages to ensure there's robust, strong content for users to see and search engines to crawl and index appropriately.
-
I would like to use the blog posts I have for each country (which are duplicated) to target that specific country with the same content. What do you think? What would you suggest regarding this?
Best,
Andrei
-
Hi Andrei,
It won't strictly be necessary, but it also depends on the structure you'll be using to set up your SEO campaigns. You could use the same content and pages and do link building for the other countries, but beyond that you could also adapt the content to make sure it is better targeted to the audiences in the different countries.
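A side note the answer above doesn't cover: the usual way to tell search engines that the same English content is intended for different countries is hreflang annotations, which mark the country versions as alternates of each other rather than competing duplicates. A minimal sketch, assuming hypothetical example.com.au and example.co.uk domains for the Australian and UK blogs:

```html
<!-- In the <head> of a post on BOTH country versions (hypothetical URLs).
     Each version lists every alternate, including itself, and the
     annotations must point back at each other to be valid. -->
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/blog/sample-post/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/blog/sample-post/" />
```

With reciprocal hreflang in place, Google can serve the right country version in each local index, which fits the answer that rewriting every post isn't strictly necessary.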
Related Questions
-
Will I have duplicate content on my own website?
Hello Moz community, We are an agency providing services to various industries, and among them the hair salon industry. On our website, we have our different service pages in the main menu, as usual. These service pages are general information and apply to any industry. We also have a page on the website that is only intended for the hair salon industry. On this page, we would like to link new service pages: they will be the same services as our “general” services, but specialized for hair salons.
My questions relate to duplicate content:
1. Do we have to write the new individual service pages for hair salons with completely different text, even though it’s the same service, in order to avoid having duplicate content?
2. Can we just change a few words from the “general service” page to specifically target hair salons, and somehow avoid Google seeing it as duplicate content?
As a reminder, these pages will be internal links inside the hair salon industry page. Thank you in advance for your answers, Gaël
On-Page Optimization | Gael_Regnault
-
Is my domain holding me back in the SERPs?
Even after a good year or so, my site intensivedrivingschoolmiltonkeynes.co.uk does not rank in the top 10 (google.co.uk) for "Intensive Driving School Milton Keynes", and is nowhere for "Driving School Milton Keynes". Do you think the domain name is being penalised, or are there other factors contributing to the poor performance?
On-Page Optimization | Buffalo-Mobile
-
Should blog tags be location specific?
I understand the concept of organizing blogs with categories, but how specific should the tags on blog posts be? E.g. "cosmetic dentist" vs. "cosmetic dentist san francisco". I'm specifically using Squarespace, if that helps. Thanks!
On-Page Optimization | ReaderSam
-
Duplicate Content Identification Tools
Does anyone have a recommendation for a good tool that can identify which elements on a page are duplicated content? I use Moz Analytics to determine which pages have the duplicated content on them, but it doesn't say which pieces of text or on-page elements are in fact considered to be duplicate. Thanks Moz Community in advance!
On-Page Optimization | EmpireToday
-
Duplicate Page Title issues
Hello, I have a duplicate page title problem. Crawl Diagnostics reported that my website got sample URLs with this duplicate page title, between http://www.vietnamvisacorp.com/faqs.html and these URLs below:
http://www.vietnamvisacorp.com/faqs/page-2
http://www.vietnamvisacorp.com/faqs/page-3
http://www.vietnamvisacorp.com/faqs/page-4
http://www.vietnamvisacorp.com/faqs/page-5
I don't know why, because I have already implemented rel="next" and rel="prev" with canonical tags on these pages. Please give me advice!
On-Page Optimization | JohnHuynh
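For illustration only (this markup is not from the original post): the head of one page in the series described above might look like the sketch below. The per-page title and the self-referencing canonical are my assumptions; if every paginated URL carried an identical title, or a canonical pointing at /faqs.html alongside rel="next"/rel="prev", a crawler could still report duplicate page titles.

```html
<!-- Hypothetical <head> markup for http://www.vietnamvisacorp.com/faqs/page-3 -->
<head>
  <!-- Identical <title> tags across the series are what trigger a
       duplicate-page-title warning, so each page gets its own -->
  <title>FAQs | Page 3 of 5</title>
  <!-- Self-referencing canonical (an assumed choice, not confirmed by the post) -->
  <link rel="canonical" href="http://www.vietnamvisacorp.com/faqs/page-3" />
  <!-- Neighbours in the paginated series -->
  <link rel="prev" href="http://www.vietnamvisacorp.com/faqs/page-2" />
  <link rel="next" href="http://www.vietnamvisacorp.com/faqs/page-4" />
</head>
```
-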
Are these considered duplicates?
http://www.domain.com/blog/sample-blog-post/#more-0001
http://www.domain.com/blog/sample-blog-post/
The first URL comes from a "click here" hyperlink in the excerpt of the second URL on my homepage. Thanks in advance!
On-Page Optimization | esiow2013
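As an aside, not from the original thread: search engines generally ignore everything after the # in a URL, so both addresses above resolve to the same document, and a fragment link alone shouldn't create a duplicate. A self-referencing canonical on the post, sketched here with the placeholder domain from the question, makes the preferred form explicit:

```html
<!-- In the <head> of the post page (placeholder domain from the question) -->
<link rel="canonical" href="http://www.domain.com/blog/sample-blog-post/" />
```
-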
Locating Duplicate Pages
Hi, Our website consists of approximately 15,000 pages; however, according to our Google Webmaster Tools account, Google has around 26,000 pages for us in its index. I have run through half a dozen sitemap generators and they all only discover the 15,000 pages that we know about. I have also thoroughly gone through the site to attempt to find any sections where we might be inadvertently generating duplicate pages, without success.
It has been over six months since we did any structural changes (at which point we did 301s to the new locations), so I'd like to think that the majority of these old pages have been removed from the Google index. Additionally, the number of pages in the index doesn't appear to be going down by any discernible amount week on week.
I'm certain it's nothing to worry about, but for my own peace of mind I'd like to confirm that the additional 11,000 pages are just old results that will eventually disappear from the index and that we're not generating any duplicate content. Unfortunately there doesn't appear to be a way to download a list of the 26,000 pages that Google has indexed so that I can compare it against our sitemap. Obviously I know about site:domain.com, but this only returns the first 1,000 results, which all check out fine.
I was wondering if anybody knew of any methods or tools we could use to identify these 11,000 extra pages in the Google index, so we can confirm that they're just old pages which haven't fallen out of the index yet and aren't going to cause us a problem? Thanks guys!
On-Page Optimization | ChrisHolgate
-
Duplicate page content
Hi, in my campaign's crawl diagnostics I have a lot of duplicate page content, but we use canonicalization, and I used Webmaster Tools to make sure the campaign parameters are not considered by the Google bot. Can you see what my problem could be, or do you have a tip for me or things to look at? Thank you, VB
On-Page Optimization | Vale7