Is Noindex Enough To Solve My Duplicate Content Issue?
-
Hello SEO Gurus!
I have a client who runs 7 web properties: 6 of them are satellite websites, and the 7th is his company's main website. For a long while, my company has, among other things, blogged on a hosted blog at www.hismainwebsite.com/blog, and when we were optimizing for one of the satellite websites, we would simply link to it in the article.
Now, however, the client has gone ahead and set up separate blogs on every one of the satellite websites as well, and he has a nifty plug-in set up on the main website's blog that pipes in articles that we write to their corresponding satellite blog as well.
My concern is duplicate content.
In a sense, this is like autoblogging -- the only thing that keeps it from being heinous is that the client is autoblogging himself. He thinks it will be a great feature, giving users of his satellite websites some fresh content to read -- and I agree, as I think the combination of publishing and e-commerce is a thing of the future -- but I really want to avoid a duplicate content issue and a possible SEO/SERP hit.
I am thinking that noindexing each of the satellite websites' blog pages might suffice. But I'd like to hear from all of you whether you think even this may not be a foolproof solution.
Thanks in advance!
Kind Regards,
Mike
-
Definitely deal with the security issues! Good find there...
Regarding the client who wants to republish the same article on multiple sites, I think that noindexing it on all but the original site is perfectly fine.
Or, alternatively, place a canonical tag on the duplicate sites to let Google know where the true source lies.
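For reference, here's roughly what those two options look like in the `<head>` of each satellite copy of a post (the canonical URL below is a placeholder standing in for wherever the original article lives):

```html
<!-- Option 1: keep the duplicate copy out of the index entirely -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: leave the page indexable, but point search engines at the
     original article with a cross-domain canonical (placeholder URL) -->
<link rel="canonical" href="https://www.hismainwebsite.com/blog/original-article/">
```

You'd use one or the other on each duplicate, not both, and the canonical has to point at the specific original post, not just the main blog's homepage.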
-
Good thread, and I agree with everything Brian has already said. One additional option that hasn't been mentioned is Repost.us. If your client's blogs are on WordPress, there is a nifty Repost.us plugin that's very easy to install. He could then use this to repost the content on his satellite blogs without creating duplicate content issues or problems for his SEO. It would get the content where he wants it, preserve authorship, and give a link back to his main site. He would also have the opportunity to monetize his posts if that was something he wanted to do. Hope this is helpful!
Dana
-
Wow, that's new! Yes, I wouldn't be surprised if the plug-in is at fault.
Well, as usual, issues compound into new issues.
My many thanks for your help and insight, Brian.
Kind Regards,
Mike
-
Wasn't able to visit the site, got this warning, attached.
Kinda poignant that this warning from the Fiji site gave me a warning referencing the Pacific site, which is exactly the kind of thing we're talking about.
Wonder if the very plugin your client is using is causing this issue too.
-
Sure, here's an example: the main website is beautifulpacific.com, with the blog located at beautifulpacific.com/blog. One of the satellite sites is beautifulfiji.com, with its blog at beautifulfiji.com/blog.
-
_To me, the best-case scenario would be to use these blogs to pump out fresh, authoritative content for each satellite site blog -- a more intensive undertaking, to be sure, but a best practice -- and include an RSS feed._
Agreed. Also, there's no reason he can't write a post for one audience that references a post he made on another domain. It's hard to get a good feel for the whole situation without viewing the sites and blogs themselves.
-
Many thanks for your reply, Brian.
The satellite websites are not where conversations/sales take place; they feed his main site. I agree that providing a feed via the blog's RSS would make more sense. And when you say, "but if the point of the content is to be consumed, enjoyed, attract social shares and links, build traffic and then convert, then there's really little if any gain to be had in [noindexing]," I wholeheartedly agree. Even if it were to solve the duplicate content issue, it would preclude us from being able to put fresh content up on that blog and leverage it accordingly.
I can tell you that there is nothing nefarious in the client's idea here: his intentions are purely to give users fresh content to explore on the satellite sites. But as he relies on me to guide him in terms of SEO implications, I don't think he thought through how duplicate content could hurt him.
To me, the best-case scenario would be to use these blogs to pump out fresh, authoritative content for each satellite site blog -- a more intensive undertaking, to be sure, but a best practice -- and include an RSS feed.
-
Have you suggested he use an iframe to host the content from one site into the satellites?
Or maybe simply a feed to show the fresh content to visitors?
Does he convert on those satellite sites, or are they micros meant to drive traffic to the main site?
The thing is, it is definitely going to be duplicate content, and since the host is presumably the same... well... not good.
I would ask: "why?" Is he expecting to get links to this content on one site one day, and to the same content on another site the next? If it's a good post, what happens when someone shares it socially from one domain and the people exposed to it then see it elsewhere?
I think noindexing is a good half measure, but if the point of the content is to be consumed, enjoyed, attract social shares and links, build traffic and then convert, then there's really little if any gain to be had in even doing that. A noindexed blog post getting links? A noindexed blog category getting social buzz?
Force your client to understand the end goal. If he just wants something for them to read, add a feed. Then the social shares and links will do some good to at least the most important domain.
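To sketch the iframe/feed idea on a satellite page (domains taken from this thread; the post slug and markup are illustrative assumptions, not the client's actual setup):

```html
<!-- Pull the post in from the main blog via an iframe; iframed content is
     generally attributed to the source domain rather than treated as this
     page's own duplicate content (slug below is hypothetical) -->
<iframe src="https://beautifulpacific.com/blog/example-post/"
        width="100%" height="600"
        title="Latest from the main blog"></iframe>

<!-- Or simply point visitors at the main blog's feed for fresh content -->
<a href="https://beautifulpacific.com/blog/feed/">Latest from the main blog</a>
```

Either way, the links and social shares consolidate on the main domain instead of being split across seven copies.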