Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Duplicate Content - Blog Rewriting
-
I have a client who has requested a rewrite of 250 blog articles for his IT company. The blogs are dispersed on a variety of platforms: his own website's blog, a business innovation website, and an IT website.
He wants to have each article optimised with keyword phrases and then posted onto his new website thrice weekly. All of this is in an effort to attract some potential customers to his new site and also to establish his company as a leader in its field.
To what extent would I need to rewrite each article so as to avoid duplicating the content?
Would there even be an issue if I did not rewrite the articles and merely optimised them with keywords?
Would the articles need to be completely taken by all current publishers?
Any advice would be greatly appreciated.
-
Hi guys, I have a client in a similar situation and am working through the best option for them... would appreciate any comments or feedback.
Current status: the client has two websites, each targeting a different country: .co.nz and .com.au.
With the exception of a few products offered separately in NZ and AU, the sites are the same: in essence, duplicate content. This is due to current platform limitations (the way their web company has built it, the same site is served in each region on a separate domain, with products varied between regions via an integrated inventory tool).
The great news is they are currently rebuilding their websites on a new platform with two unique versions of the site, which will be great for ongoing SEO: we can really drill into creating separate sets of page, product and template content, metadata, etc.
They also have a magazine running as a WordPress blog on sub-domains of each regional root domain, e.g.
magazine.domain.co.nz and magazine.domainname.com.au. Again, with a few exceptions, this is duplicated for both countries, i.e. two sub-domains assigned to the same site. Again, duplicate content.
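For context, the standard way to tell search engines that two regional sites are alternates rather than duplicates is hreflang annotations; a minimal sketch, assuming placeholder domains and paths on an English-language site:

```html
<!-- In the <head> of every NZ page, with the mirrored set on the AU equivalent -->
<link rel="alternate" hreflang="en-nz" href="https://www.domain.co.nz/some-page" />
<link rel="alternate" hreflang="en-au" href="https://www.domain.com.au/some-page" />
<!-- Fallback for searchers outside NZ/AU -->
<link rel="alternate" hreflang="x-default" href="https://www.domain.co.nz/some-page" />
```

Each page must reference itself as well as its alternate, and the annotations need to be reciprocal across both domains for Google to honour them.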
Question: The magazine has to date been geared towards FAQ-type engagement with visitors: they submit questions via a module, which are then answered in WordPress blog posts. There are also links from the main site menu away to the magazine, so it's not ideal for conversion. The client wants to bring this FAQ feature back onto the two main sites and can now do so during the rebuilds.
There is also some SEO equity in the magazine, as it is in essence a large WordPress blog. I am trying to work out the best option for transferring all of the FAQ answers/articles (content) from the magazine to the two new main sites, so that over time the new sites pick up that SEO strength.
Option 1
Leave the magazine as it is, so the main sites continue to get the benefit of referral traffic and the sales that result from those referrals. This also retains the links from the magazine to the main site (although they come from a sub-domain of the same domain).
Rewrite a brand new version of each magazine article for new NZ site
Rewrite a brand new version of each magazine article for new AU site
(Bearing in mind the stringent Panda rules etc.: titles would need to be mixed up so they are unique, and the content itself made unique, to avoid a Panda penalty.)
Option 2
Take down the magazine site and implement 301 redirects, plus one new version of the articles.
Move all magazine articles across to the highest-performing region (NZ, by far) and 301 redirect from the NZ magazine to the corresponding articles on the new NZ site. The 301 redirects take care of the indexed pages, retaining traffic and rankings for the NZ magazine articles.
Rewrite a brand-new version of each magazine article for the new AU site, and 301 redirect the AU magazine articles to the new versions. The 301s take care of any indexed AU magazine articles, but there may be some fluctuation in rankings as the content is now completely different (brand new).
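As a sketch of how Option 2's redirects might be wired up, assuming the magazine runs on Apache and using hypothetical article paths:

```apacheconf
# .htaccess on magazine.domain.co.nz: map each article to its new home
# on the main NZ site (one explicit rule per article)...
Redirect 301 /faq/choosing-a-widget https://www.domain.co.nz/faq/choosing-a-widget

# ...or a single pattern, if the URL structure carries over unchanged.
RedirectMatch 301 ^/faq/(.*)$ https://www.domain.co.nz/faq/$1
```

The same approach on the AU magazine would point each old article at its rewritten counterpart on the new AU site.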
Could there be any issue with the loss of internal backlinks, i.e. would it affect whatever SEO strength the magazine sub-domain currently passes to the main site?
Other Options?
Appreciate any thoughts or comments... thanks in advance...
-
I would steer clear of removing 250 blog posts from the other web properties. They may be driving traffic to those websites.
The client is requesting 250 particular blog posts to be rewritten. This isn't the best content strategy in the world, but that's what you're being asked to do, so the BEST way to handle it is to completely rewrite every post so they are 100% unique.
If you were to remove the blog posts from the other websites and simply post them on the new website, you're running the risk of taking traffic away from the already established websites.
"Would google pick up on the fact that these blogs are already appearing elsewhere on the web and thereby penalise the new site for posting material that is already indexed by Google?" -- Yes, you run the risk of being penalized by Panda with such a large amount of duplicate content. Google wants to rank websites that provide value to visitors. If a website is entirely made up of content that already exists on another website, you're providing no added value to visitors. Again, you could remove the content from the other websites and 301 redirect to the new one... but you're taking a lot of value away from those websites if you do that.
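If the client won't remove the originals, one partial mitigation worth mentioning is a cross-domain canonical: each syndicated copy declares which URL should rank. A sketch, with a placeholder URL:

```html
<!-- In the <head> of each duplicate copy on the other websites -->
<link rel="canonical" href="https://www.newsite.com/blog/original-article" />
```

Google treats cross-domain canonicals as a hint rather than a directive, and this only works if you control, or can request changes to, the other publishers' pages.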
-
Hi Phillip,
Sorry - I meant to write: Would all of the blogs need to be removed from the website on which they are appearing?
So is the best course of action to have the articles taken off the platforms on which they appear before going ahead and putting them up on the new site?
Also could you explain how the new site might get hit by panda i.e. would google pick up on the fact that these blogs are already appearing elsewhere on the web and thereby penalise the new site for posting material that is already indexed by Google?
Thanks a million Phillip.
-
If you don't make them VERY unique from the originals, the new site won't perform very well. If the new site consists of nothing but 250 blog posts that were already discovered on other websites, you won't get good results. Simply keyword optimizing the posts won't be enough. They should be entirely re-written to avoid potential problems with Panda.
I'm not sure what you mean by this -- Would the articles need to be completely taken by all current publishers?
Related Questions
-
Duplicate content in sidebar
Hi guys. So I have a few sentences (about 50 words) of duplicate content across all pages of my website (a repeatable text block in the sidebar). Each page of my website contains about 1,300 words of unique content in total, plus the 50 words of duplicate content in the sidebar. Does having duplicate content of this length in the sidebar affect the rankings of my website in any way? Thank you so much for your replies.
On-Page Optimization | AslanBarselinov
How does Indeed.com make it to the top of every single search despite having aggregated or duplicate content
How does Indeed.com make it to the top of every single search despite having duplicate content? Google says it will prefer original content and give preference to those who have it, but that statement seems contradicted when I look at Indeed.com: it aggregates content from other sites yet still ranks higher than the original content providers.
On-Page Optimization | vivekrathore
Duplicate page titles and content in WooCommerce
Hi guys, I'm new to Moz and really liking it so far! I run an eCommerce site on WordPress + WooCommerce and of course use Yoast for SEO optimisation. I've got a question about my first crawl report, which showed over 600 issues! I've read that this is something that happens quite often (http://moz.com/blog/setup-wordpress-for-seo-success). Most of them are categorised under:
1. Duplicate page titles, or
2. Duplicate page content.
Duplicate page titles: these are almost exclusively product category pages and product tags. Is this problem solved by giving them the right SEO snippet? I see that a lot of categories don't have a proper SEO snippet set up in Yoast. Do I need to add this to clear the issue, or do I need to change the actual titles? And how about the product tags? Another, slightly more off-topic point: I've read here (http://moz.com/community/q/yoast-seo-plugin-to-index-or-not-to-index-categories) that it's advised to noindex/follow categories and tags, but isn't that a weird thing to do for an eCommerce site?
Duplicate page content: same story here, almost exclusively product categories and product tags flagged as duplicate page content. When I check the results I can click a blue button, for example "+ 17 duplicates", which shows me (in this case) 17 URLs, but they are not related to the first in any way, so I'm not sure where to start. Thanks for taking the time to help out! Joost
On-Page Optimization | jeeyer
Duplicate Content for Men's and Women's Version of Site
So, we're a service where you can book different hairdressing services from a number of different salons (the site is being worked on). We're doing both a male and a female version of the site on the same domain, which users can select between on the homepage. The differences are largely cosmetic (allowing the designers to be more creative and have a bit of fun, and giving us dedicated male grooming landing pages), but I was wondering about duplicate pages. While most of the pages on each version of the site will be unique (i.e. [male service] in [location] vs [female service] in [location], with the female version taking precedence where there are duplicates), what should we do about the likes of the "About" page? Pages like this would both be unique in wording but essentially offer the same information, and does it make sense to index two different "About" pages, even if the titles vary? My question is whether, for these duplicate pages, you would set the more popular one as the preferred version canonically, leave them both to be indexed, or noindex the lesser version entirely? Hope this makes sense, thanks!
On-Page Optimization | LeahHutcheon
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and besides that I also have some secondary websites which have the same content as the primary website. This leads to duplicate content errors. Because there are so many duplicate URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | JohnHuynh
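One caveat worth flagging on this approach: robots.txt blocks crawling, not indexing, so URLs that are linked from elsewhere can still appear in search results. A sketch of the file the question describes, placed at the root of each secondary site:

```
# robots.txt on a secondary site: stops compliant bots from crawling,
# but does NOT guarantee removal from the index.
User-agent: *
Disallow: /
```

A meta robots noindex tag (which requires the pages to remain crawlable) or a cross-domain canonical pointing at the primary site are generally more reliable ways to resolve the duplication.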
Does schema.org assist with duplicate content concerns
The issue of duplicate content has been well documented, and there are lots of articles suggesting noindexing archive pages on WordPress-powered sites. Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply: is noindexing archive (category/tag) pages still relevant when considering duplicate content? These pages are in essence a list of articles, each of which can be marked up as an article or blog posting, with the URL of the main article and all the other useful properties the schema gives us. Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, thereby removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances; I'm just interested in whether the search engines can handle this appropriately.
On-Page Optimization | MarkCA
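For reference, the kind of mark-up the question describes can be expressed as JSON-LD on the archive page, with each listing pointing at the main article's URL; a minimal sketch with placeholder values:

```json
{
  "@context": "https://schema.org",
  "@type": "Blog",
  "blogPost": [
    {
      "@type": "BlogPosting",
      "headline": "Example article title",
      "url": "https://www.example.com/blog/example-article"
    },
    {
      "@type": "BlogPosting",
      "headline": "Another article title",
      "url": "https://www.example.com/blog/another-article"
    }
  ]
}
```

Whether search engines treat this as a substitute for noindexing archives is exactly the open question here: the mark-up describes the relationship between listing and article, but it is not documented as a duplicate-content signal.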
Percentage of duplicate content allowable
Can you have ANY duplicate content on a page, or will the page get penalized by Google? For example, if you used a paragraph of Wikipedia content for the definition/description of a medical term, but wrapped it in unique content, is that OK, or will that land you in the Google/Panda doghouse? If some level of duplicate content is allowable, is there a general rule-of-thumb ratio of unique to duplicate content? Thanks!
On-Page Optimization | sportstvjobs
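Google has never published an allowable ratio, but if you want to measure how much of a page overlaps with a source, a common rough-and-ready approach is word-shingle Jaccard similarity; a minimal sketch (the 5-word shingle size is an arbitrary choice, not any published threshold):

```python
def shingles(text: str, n: int = 5) -> set:
    """Return the set of n-word shingles (overlapping word windows) in text."""
    words = text.lower().split()
    if len(words) < n:
        return {tuple(words)} if words else set()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(a: str, b: str) -> float:
    """Jaccard similarity of the two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A wholly copied paragraph inside otherwise unique text pushes the ratio up only modestly, which matches the intuition in the question: a quoted definition wrapped in original content is a different situation from a fully duplicated page.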
Best practice for franchise sites with duplicated content
I know that duplicated content is a touchy subject, but I work with multiple franchise groups, and each franchisee wants their own site; however, almost all of the sites use the same content. I want to make sure that Google sees each of these sites as unique and does not penalise them for the following: all sites are hosted on the same server, and therefore share the same IP address; all sites use generally the same content across their product pages (which are very, very important pages), i.e. templated content approved by corporate; and almost all sites have the same design (a few of the groups we work with have multiple design options). Any suggestions would be greatly appreciated. Thanks again, Aaron
On-Page Optimization | Shipyard_Agency