Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Duplicate Content - Blog Rewriting
-
I have a client who has requested a rewrite of 250 blog articles for his IT company. The blogs are dispersed on a variety of platforms: his own website's blog, a business innovation website, and an IT website.
He wants to have each article optimised with keyword phrases and then posted onto his new website thrice weekly. All of this is in an effort to attract some potential customers to his new site and also to establish his company as a leader in its field.
To what extent would I need to rewrite each article so as to avoid duplicating the content?
Would there even be an issue if I did not rewrite the articles and merely optimised them with keywords?
Would the articles need to be completely taken by all current publishers?
Any advice would be greatly appreciated.
-
Hi guys, I have a client in a similar situation and am working through the best option for them... I'd appreciate any comments or feedback.
Current Status - the client has two websites, each targeting a different country: .co.nz and .com.au
With the exception of a few products that are offered separately between NZ and AU, the sites are the same - in essence, duplicate content. This is due to current platform limitations: the way their web company has built it, the same site shows in each region on separate domains, with the option to vary products between regions via an integrated inventory tool.
The great news is they are currently rebuilding their websites onto a new platform with two unique versions of the site, which will be great for ongoing SEO - i.e. we can really drill into creating separate sets of page, product, and template content, metadata, etc.
They also have a magazine running on a WordPress blog, using sub-domains associated with each regional root domain, e.g. magazine.domain.co.nz and magazine.domainname.com.au. Again, with a few exceptions, this is also duplicated for both countries - i.e. two sub-domains assigned to the same site. Again, duplicate content.
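Side note: since the two regional sites are intentional country variants rather than accidental copies, hreflang annotations are the standard way to tell Google they are alternates of each other. A minimal sketch, using the placeholder domains above and an assumed page path:

```html
<!-- Placed in the <head> of BOTH regional versions of the page.
     The domains and the /example-product/ path are placeholders. -->
<link rel="alternate" hreflang="en-nz" href="https://www.domain.co.nz/example-product/" />
<link rel="alternate" hreflang="en-au" href="https://www.domainname.com.au/example-product/" />
<link rel="alternate" hreflang="x-default" href="https://www.domain.co.nz/example-product/" />
```

Each page should reference itself and every alternate, and the annotations need to be reciprocal across both sites to be honoured.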
Question: The magazine, built on WordPress, has to date been geared at offering an "FAQ" type engagement with visitors: visitors can submit questions via a module, which are then answered in WordPress blog posts. There are also links from the main site menu away to the magazine, so it's not ideal for conversion. The client wants to bring this FAQ feature back to the two main sites and can now do so during the new site rebuilds.
There is also some SEO juice in the magazine, as in essence it is a large WordPress blog. I am trying to work out the best option for transferring all of the FAQ answers/articles (content) from the magazine to the two new main sites, so that over time the two new main sites obtain that SEO strength.
Option 1
Leave the magazine as it is, so the main sites continue to get the benefit of referral traffic and the sales that result from it. This also retains the links from the magazine to the main site (although those links come from a sub-domain of the same domain).
Rewrite a brand new version of each magazine article for new NZ site
Rewrite a brand new version of each magazine article for new AU site
(Bearing in mind the stringent Panda rules etc. - mixing up titles so each is unique, writing unique content, and so on, to avoid Panda penalties.)
Option 2
Take down the magazine site and implement 301 redirects, plus one new version of the articles.
Move all magazine articles across to the highest-performing region (NZ, by far) and 301 redirect from the NZ magazine to the corresponding articles on the new NZ site. The 301 redirects take care of the indexed pages, retaining traffic and rankings for the NZ magazine articles.
Rewrite a brand new version of each magazine article, add it to the new AU site, and 301 redirect from the AU magazine articles to the new versions on the AU site. The 301 redirects take care of any indexed AU magazine articles, but there may be some fluctuation in rankings as the content is now completely different (brand new).
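For what it's worth, the redirects in Option 2 are simple at the server level. A hedged sketch, assuming an Apache host and the placeholder domains used in this thread (the real rules depend on how article URLs map between the old and new structure):

```apache
# In the NZ magazine's virtual host / .htaccess (placeholder domains).
# Permanently redirect each magazine article to the matching FAQ path
# on the new main site; R=301 passes most of the accrued link equity.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^magazine\.domain\.co\.nz$ [NC]
RewriteRule ^(.*)$ https://www.domain.co.nz/faq/$1 [R=301,L]
```

A one-to-one mapping like this only works if the path structure carries over; otherwise each old URL needs its own RedirectMatch rule or a rewrite map.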
Could there be any issue with the loss of the internal backlinks? Would it impact the SEO strength that the magazine sub-domain might pass to the main site?
Other Options?
Appreciate any thoughts or comments... thanks in advance...
-
I would steer clear of removing 250 blog posts from the other web properties. They may be driving traffic to those websites.
The client is requesting 250 particular blog posts to be rewritten. This isn't the best content strategy in the world, but that's what you're being asked to do, so the BEST way to handle it is to completely rewrite every post so they are 100% unique.
If you were to remove the blog posts from the other websites and simply post them on the new website, you're running the risk of taking traffic away from the already established websites.
"Would google pick up on the fact that these blogs are already appearing elsewhere on the web and thereby penalise the new site for posting material that is already indexed by Google?" -- Yes, you run the risk of being penalized by Panda with such a large amount of duplicate content. Google wants to rank websites that provide value to visitors. If a website is entirely made up of content that already exists on another website, you're providing no added value to visitors. Again, you could remove the content from the other websites and 301 redirect to the new one.... but you're taking a lot of value away from those websites if you do that.
-
Hi Phillip,
Sorry - I meant to write: Would all of the blogs need to be removed from the website on which they are appearing?
So is the best course of action to have the articles taken off the platforms on which they appear before going ahead and putting them up on the new site?
Also could you explain how the new site might get hit by panda i.e. would google pick up on the fact that these blogs are already appearing elsewhere on the web and thereby penalise the new site for posting material that is already indexed by Google?
Thanks a million Phillip.
-
If you don't make them VERY unique from the originals, the new site won't perform very well. If the new site consists of nothing but 250 blog posts that already exist on other websites, you won't get good results. Simply keyword-optimizing the posts won't be enough; they should be entirely rewritten to avoid potential problems with Panda.
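"100% unique" is hard to eyeball across 250 posts. As a rough sanity check (my own sketch, not a Moz tool), comparing word shingles between the original and the rewrite approximates the kind of near-duplicate detection search engines perform - a rewrite that still scores high has probably only been lightly reworded:

```python
def shingles(text, k=3):
    """Return the set of k-word shingles in a text (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(original, rewrite, k=3):
    """Jaccard similarity of word shingles: 0.0 = fully unique, 1.0 = identical."""
    a, b = shingles(original, k), shingles(rewrite, k)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = "our managed it services keep your business network secure and online"
light_edit = "our managed it services keep your company network secure and online"
rewrite = "we monitor and maintain business networks so downtime never hurts you"

print(round(similarity(original, original), 2))    # identical text scores 1.0
print(round(similarity(original, light_edit), 2))  # a light edit still scores high
print(round(similarity(original, rewrite), 2))     # a true rewrite scores near 0.0
```

Anything much above roughly 0.1-0.2 on 3-word shingles suggests whole sentences survived the rewrite; that threshold is my own rule of thumb, not a published Google number.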
I'm not sure what you mean by this -- Would the articles need to be completely taken by all current publishers?