Possible to recover from Thin/Duplicate content penalties?
-
Hi all, first post here so sorry if in wrong section.
I'm after a little advice, if I may, from more experienced SEOs than myself with regard to writing off domains or keeping them.
To cut a long story short, I do a lot of affiliate marketing. Back in the day (until the past 6 months or so) you could just take a merchant's datafeed and, with some SEO, outrank them for individual products. However, since the Google Panda update this hasn't worked as well and it's now much harder to do, which is better for the end user.
The issue I have is that I got lazy and tried to see if I could still get some datafeeds to rank with only duplicate content. The sites ranked very well at first but after a couple of weeks died massively: they went from 0 to 300 hits a day in a matter of 24 hours, then back to 2 hits a day. The sites now don't rank for anything, which is obviously because they are duplicate content.
The question I have is: are these domains dead, or can they be saved? I'm not talking about the duplicate content but about the domains themselves. I used about 10 domains to test things; they ranged from DA 35 to DA 45 (one of the tests being whether a domain with reasonable DA can rank with duplicate content).
Seeing as the test didn't work, I want to use the domains for proper sites with proper unique content. However, so far, although the new unique content is getting indexed, it is suffering from the same ranking penalties the duplicate (and now deleted) pages had.
Is it worth trying to use these domains? Will Google finally remove the penalty when they notice that the bad content is no longer on the site, or are the domains very much dead?
Many thanks
-
Thanks, that's along the lines of what I was thinking. Given the age and the score of the domains I would rather work to get them back than start again with brand-new domains.
-
I think it's worth a try. If you were hit for duplicate and thin content, you were likely hit with a Panda penalty. Panda refreshes from time to time, and if Google sees that the problem is no longer there, the penalty is lifted.
So, if you build a good-quality site on the same URL, by the next Panda refresh (which is usually within 1-2 months) you should be able to rank.
Related Questions
-
How to deal with lot of old content that doesn't drive traffic - delete?
Hi community, I hope someone can help me with this. We are migrating our e-commerce site next February and I'm preparing the content migration. For a large part, exact copies of our product listing and product detail pages will be migrated.
However, we also have a lot of old blog content which, because of seasonality and trendiness, is outdated and doesn't drive traffic anymore. It actually is just worthless content: not only as a traffic driver, it also gets extremely little internally driven traffic (both internal search and internal navigation). We have about 4,000+ blogs, of which about 100 drive the most traffic (mostly driven by e-mail and social campaigns and internal navigation promoted on important category landing pages during some period). Is it a bad signal to search engines to delete these old content pages, i.e. going from a content-rich to a content-poor site? Of course I will migrate the top 100 traffic-earning pieces of content and provide proper redirects to them.
Content Development | Marketing-Omoda
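For the migrated top-100 posts, a per-URL 301 is the standard approach, and for the posts you deliberately delete, a 410 tells crawlers the removal is intentional (they tend to drop 410s faster than 404s). A minimal Apache mod_alias sketch, with made-up paths purely for illustration:

```
# Hypothetical paths - substitute your real old/new URLs.
# 301: permanent redirect for a post that survives the migration.
Redirect 301 /blog/winter-trends-2014 /blog/winter-trends

# 410: tell crawlers this post was removed on purpose.
Redirect gone /blog/outdated-seasonal-post
```

The same mapping can of course be expressed in nginx or at the application level; what matters is that each kept URL gets a one-to-one permanent redirect rather than a blanket redirect to the homepage.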
Duplicate Content & Tags
I've recently added tags to my blog posts so that related blog posts are suggested to visitors. My understanding was that my robots.txt was handling duplicate content, so I thought it wouldn't be an issue, but after Moz crawled my site this week it reported 56 issues of duplicate content in my blog. I'm using Shopify, so I can edit the robots.txt file, but is my understanding correct that if there are 2 or more tags then they will be ignored? I've searched the Shopify documents and forum and can't find a straight answer. My understanding of SEO is fairly limited.
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
Content Development | Tangled
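One thing worth noting about the rules quoted in the Shopify tag question above: `Disallow: /blogs/+` only matches URLs that literally begin with `/blogs/+`, while real multi-tag listings have the "+" (or its URL encodings `%2B`/`%2b`) in the middle of the path. A hedged robots.txt sketch using wildcards instead, assuming Shopify's usual `/blogs/<blog>/tagged/<tag>+<tag>` URL pattern (verify against your store's actual tag URLs before relying on it):

```
# Assumption: multi-tag blog listings contain "+" mid-path,
# e.g. /blogs/news/tagged/red+blue. The "*" wildcard lets the
# rule match anywhere in the path, not just as a literal prefix.
User-agent: *
Disallow: /blogs/*+*
Disallow: /blogs/*%2B*
Disallow: /blogs/*%2b*
```

Note that major crawlers such as Googlebot honour the `*` wildcard, but robots.txt only controls crawling; a rel="canonical" on the tag pages is the more direct fix for duplicate-content reports.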
Duplicate Content
I am wondering what is the best way to show Google that there is duplicate content on the page. For example, our product pages are unique content except that we give the same guarantee and promise on every product, creating some duplicate content. What is the best way to fix this issue?
Content Development | DoRM
With the structure of WordPress when multiple tags are selected, SEOMoz reports show each URL/tag as duplicated content? What to do?
wordpress.com/blogpost/tag/word1, wordpress.com/blogpost/tag/word2, etc. Same page, but WP generates multiple URLs for each tag. In reports, this shows as duplicate content. Is it something to worry about? If yes, what is the best fix?
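A common fix for the WordPress tag-URL pattern above is a rel="canonical" link on each tag variant pointing at one preferred URL, so the duplicates consolidate onto it. A minimal sketch using the hypothetical URLs from the question (most WordPress SEO plugins can emit this automatically):

```html
<!-- Served in the <head> of wordpress.com/blogpost/tag/word1,
     wordpress.com/blogpost/tag/word2, and any other tag variant.
     All variants declare the same preferred URL as canonical,
     so only that URL accumulates ranking signals. -->
<link rel="canonical" href="https://wordpress.com/blogpost/" />
```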
Content Development | VividImage
Prevent average users from copying content and pasting into their websites
Please do not respond with a "you can't stop them" comment; I understand this. Most of our pages have content that is duplicated across multiple domains. The recent Google algorithm update focused on penalizing pages that have duplicate content, and it could be one of the reasons that we have been seeing traffic loss. I'm looking for some type of JavaScript/PHP code that will help minimize this issue. It could be code that does not allow you to copy and paste without turning off JavaScript, or a dialog box that pops up and says "this content is copyright protected; anyone copying this content is subject to legal action". I've found one script that might work: http://www.ioncube.com/html_encoder.php My questions are still the same: 1. What is the best method to achieve my objective? 2. Will this additional code affect how the web crawlers see our site and/or affect rankings? I know that anyone can figure out how to get the code; I am trying to mitigate by providing a warning about copyright infringement and making it more challenging to copy our content. Thank you for your comments!
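Rather than encoding the whole page (the ioncube tool mentioned above), a lighter-weight deterrent is to let copying happen but append a copyright notice and source link to the copied text. This is only a sketch with hypothetical naming, but it has one nice property for question 2: crawlers execute none of it when reading the markup, so the visible content is unchanged.

```javascript
// Pure helper: append a notice and source URL to copied text.
// Kept separate from the DOM wiring so it can be tested outside a browser.
function withCopyrightNotice(selectedText, sourceUrl) {
  return selectedText +
    "\n\nSource: " + sourceUrl +
    "\nThis content is copyright protected.";
}

// Browser-only wiring (guarded so the file also loads under Node).
if (typeof document !== "undefined") {
  document.addEventListener("copy", function (event) {
    var selection = window.getSelection().toString();
    if (selection.length > 0 && event.clipboardData) {
      event.clipboardData.setData(
        "text/plain",
        withCopyrightNotice(selection, window.location.href)
      );
      event.preventDefault(); // use the modified clipboard payload instead
    }
  });
}
```

A determined visitor can still disable JavaScript or read the page source, so treat this as the warning mechanism the question asks for, not real protection.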
Content Development | 4RealLocal
Keeping web site fresh re content
We currently use a WP blog but we do not host it on the web domain. What are the advantages of moving the blog to the domain? The only one I can think of is that every time we update the blog, it should help keep the website fresh with new content.
Content Development | NotThatFast
Blogger & Blogspot Content - Move Across To Own Domain?
Hey, a few new clients have blogs hosted on Blogger/Blogspot, and my first advice is to set up a blog hosted on their company domain. It's usually easy to convince them of the benefits. What should happen to all the content on the existing blog? One blog in question has over 100 entries: good content with a lot of links back to the business domain. The blog itself has fewer than 10 links pointing in but a domain mozRank of 3.5. In this example, my gut is telling me to leave it as is and start fresh on their own domain. What about if there are fewer than 10 posts? At what point should the content be moved over to the new blog? Thanks for your thoughts.
Content Development | LukeyJamo
Does content have a shelf life for link building efforts?
Do you think that content (that doesn't have a date attached) has a shelf life? Especially content that is effectively timeless such as a quiz? I've noticed in my link building efforts that most links are achieved within the first couple of weeks, and that there seems to be a point of diminishing returns. Why do you think that may be?
Content Development | nicole.healthline