How to Not Scrape Content but Still Be a Hub
-
Hello SEOmoz members. I'm relatively new to SEO, so please forgive me if my questions are a little basic.
One of the sites I manage is GoldSilver.com. We sell gold and silver coins and bars, but we also have a very important news component to our site.
For about 2-3 years now we have been a major hub as a gold and silver news aggregator. About 1.5 years ago (before we knew much about SEO), we switched from linking out to the original news sites to scraping their content and putting it on our site. The chief reason was that users would click outbound to read an article, see an ad for a competitor, and then buy elsewhere. We were trying to avoid this (a relatively stupid decision in hindsight).
We have realized that the search engines are penalizing us for having this scraped content on our site, and I don't blame them.
So I'm trying to figure out how to move forward from here. We would like to remain a hub for news related to gold and silver without being penalized by search engines, but we also need to sell bullion and would like to avoid losing clients to competitors through ads on the news articles.
One of the solutions we are considering is using an iFrame to display the original URL, but within our experience. An example is how trap.it does this (see attached picture). This way we can still control the experience somewhat, while still remaining a hub.
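To make the idea concrete, here is a rough sketch of what such a wrapper page would do (browser-side TypeScript; the element ID and example URL are hypothetical):

```typescript
// Minimal sketch of the iframe wrapper idea (all names hypothetical).
// Our page keeps its own header and branding, and the original
// publisher's article loads in a frame below it.
function showArticleInFrame(articleUrl: string): void {
  const container = document.getElementById("article-container");
  if (!container) return;

  const frame = document.createElement("iframe");
  frame.src = articleUrl; // the original publisher's URL
  frame.style.width = "100%";
  frame.style.height = "80vh";
  frame.style.border = "none";
  container.appendChild(frame);
}

showArticleInFrame("https://example-news-site.com/gold-demand-rises");
```

One practical caveat: many publishers send an X-Frame-Options or Content-Security-Policy frame-ancestors header that blocks framing entirely, so this wouldn't even work for every source.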
Thoughts?
Thank you,
nick
-
I honestly can't offer any short-term suggestions; it's a big challenge to know what the best short-term path is. Ultimately, you'll need to remove all the scraped content. Do that without replacing it and you won't see any gains in the short term; you may even see some short-term losses, since it's possible you're not being purely penalized (some of that scraped content may still be earning you rankings).
-
Alan,
Thank you for your thoughts. I agree we need to change our strategy and move away from scraped content. Any technical workarounds we try (like the iFrame) may work now, but ultimately we would just be delaying the inevitable.
Since that strategy will take a while to implement, what would you recommend for the shorter term?
-
Nick,
You're in a difficult situation, to say the least. iFrames were a safe bet a couple of years ago; however, Google has gotten better and better at discovering content contained in previously safe environments within the code. And they're only going to get better at it over time.
The only truly safe solution for a long term view is to change strategy drastically. Find quality news elsewhere, and have content writers create unique articles built on the core information contained in those. Become your own news site with a unique voice.
The expense is significant given you'll need full-time writers. However, with a couple of entry-level writers right out of college, or just a year or two into the content-writing/journalism path, you've got a relatively low cost of entry. The key is picking really good talent.
I was able to replace an entire team of 12 poorly chosen writers with 3 very good writers, for example.
The other reality is that all of the scraped content needs to go. You can't salvage it or back-date newly written content around it, not in the volume you're dealing with. So you're going to have to earn your rankings all over again, but for real, value-added reasons.
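One mechanical note on taking those pages down, since there will be thousands of URLs: rather than letting them all 404, you can return 410 Gone (a deliberate, permanent removal signal) or 301-redirect a URL to a genuinely relevant page if it has earned links. A minimal sketch of the 410 approach using Express, with hypothetical paths and slugs:

```typescript
import express from "express";

const app = express();

// Hypothetical: slugs of the scraped articles, loaded from a removal list.
const scrapedSlugs = new Set(["gold-rally-continues", "silver-etf-inflows"]);

app.get("/news/:slug", (req, res, next) => {
  if (scrapedSlugs.has(req.params.slug)) {
    // 410 tells crawlers the page was removed on purpose and permanently.
    res.status(410).send("This article has been removed.");
    return;
  }
  next(); // fall through to the normal article handler
});

app.listen(3000);
```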
-
Related Questions
-
Internal Links & Possible Duplicate Content
Hello, I have a website which has kept losing positions since February 6. I have not received any manual actions in Search Console. However, I read the following article a few weeks ago and it looks a lot like my case: https://www.seroundtable.com/google-cut-down-on-similar-content-pages-25223.html I noticed that Google has removed 44 of my website's 182 pages from the index. The removed pages can be considered similar, like those on the website mentioned in the article above. The problem is that there are about 100 pages like these. They describe the cabins of various cruise ships, and each contains one picture and one sentence of at most 10 words. To humans this is not duplicate content, but what about the engine, given that sometimes that little sentence can be the same? And let's say I remove all these pages and present the cabin details dynamically on one page instead of 15, for example, reducing the site from 180 pages to 50 or so; how will this affect SEO with regard to internal links? Thank you for your help.
White Hat / Black Hat SEO | Tz_Seo
-
Regular links may still not pass link juice
Good day: I understand guest articles are a good way to pass link juice, and some authors include a link to their website in the "Author Bio" section of the article. These links are usually regular links. However, I noticed that some of these sites (using WordPress) have SEO plugins with the following setting: Nofollow: Tell search engines not to spider links on this webpage. My question is: if that setting were activated, I would assume the author's website link would still look like a regular link, but some other code would be present on the page (e.g., in the header) that prevents it from being followed, so the guest writer would not receive any link juice. Is there a way to see if this scenario is happening? What code would we look for?
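For reference, that page-wide directive shows up in one of two places: a meta robots tag in the head (`<meta name="robots" content="nofollow">`) or an X-Robots-Tag HTTP response header; the per-link version is a rel="nofollow" attribute on the anchor itself. A rough TypeScript sketch of checking for the page-wide signals (the URL and function name are made up):

```typescript
// Hypothetical sketch: fetch a guest post and look for the signals that
// would neutralize an author-bio link that otherwise looks "regular".
async function checkNofollowSignals(url: string): Promise<void> {
  const res = await fetch(url);
  const html = await res.text();

  // 1. Page-wide nofollow via a meta robots tag in the <head>.
  const metaNofollow =
    /<meta[^>]+name=["']robots["'][^>]*nofollow/i.test(html);

  // 2. The same directive sent as an HTTP response header.
  const headerNofollow =
    (res.headers.get("x-robots-tag") ?? "").toLowerCase().includes("nofollow");

  console.log({ metaNofollow, headerNofollow });
}

checkNofollowSignals("https://example-guest-blog.com/some-post");
```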
White Hat / Black Hat SEO | Audreythenurse
-
Cross Domain Duplicate Content
Hi, we want to create two company websites, each targeted at a specific country: Australia and New Zealand. We have acquired two domains, company.com.au and company.co.nz. We want to do it this way, rather than using hreflang on different versions of the same site, for maximum ranking results in each country (correct?). Since both websites will be in English, some pages are inevitably going to be the same. Are we facing any danger of duplicate content between the two sites, and if we are, is there any solution for that? Thank you for your help!
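For what it's worth, the standard pattern for this situation is reciprocal hreflang annotations on every page of both sites, with each page listing itself and its counterpart on the other domain. A rough sketch of generating those tags (TypeScript; the domains come from the question, the function name is made up):

```typescript
// Hypothetical sketch: build reciprocal hreflang <link> tags for a path,
// to be emitted in the <head> of both the .com.au and the .co.nz page.
function hreflangTags(path: string): string {
  const alternates = [
    { lang: "en-au", origin: "https://company.com.au" },
    { lang: "en-nz", origin: "https://company.co.nz" },
  ];
  return alternates
    .map(a => `<link rel="alternate" hreflang="${a.lang}" href="${a.origin}${path}" />`)
    .join("\n");
}

console.log(hreflangTags("/gold-bars")); // two tags, one per country site
```

The point of the reciprocity is that the two near-identical pages declare each other as regional alternates instead of competing as duplicates.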
White Hat / Black Hat SEO | Tz_Seo
-
Content placement in HTML and display
Does Google penalize a page when content is placed near the top of the HTML source but displayed to users at the bottom of the page? The technique is done with CSS. Thank you in advance for your feedback!
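To illustrate the kind of pattern being described, here is a minimal sketch (browser-side TypeScript; flex ordering is just one way CSS can reorder source content, and all of the markup is made up):

```typescript
// Hypothetical sketch: the text block comes first in the HTML source,
// but CSS flex ordering displays it below the product grid.
document.body.innerHTML = `
  <div style="display: flex; flex-direction: column;">
    <section style="order: 2">Long descriptive copy: first in source, last on screen.</section>
    <section style="order: 1">Product grid: last in source, first on screen.</section>
  </div>`;
```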
White Hat / Black Hat SEO | Aerocasillas
-
20-30% of our ecommerce categories contain no extra content. Could this be a problem?
Hello. About 20-30% of our ecommerce categories have no content beyond the products that are in them. Could this be a problem with Panda? Thanks!
White Hat / Black Hat SEO | BobGW
-
Black Hat SEO Case Study - Private Link Network - How is this still working?
I have been studying my competitors' link building strategies, and one guy (an affiliate) in particular really caught my attention. He has been using a strategy that has worked really well for the past six months or so. How well? He owns about 80% of the search results for highly competitive keywords, in multiple industries, that add up to about 200,000 searches per month in total.

As far as I can tell it's a private link network. Using Ahrefs and Open Site Explorer, I found out that he owns thousands of bought domains, all linking to his sites. Recently, all he's been doing is buying high-PR domains, redesigning the sites, and adding new content to rank for his keywords.

I reported his link-wheel scheme to Google and posted a message on the webmaster forum; no luck there. So I'm wondering: how is he getting away with this? Isn't Google's algorithm sophisticated enough to catch something as obvious as this? Everyone preaches about white hat SEO, but how can honest marketers/SEOs compete with guys like him? Any thoughts would be very helpful. I can include some of the reports I've gathered if anyone is interested in studying this further. Thanks!
White Hat / Black Hat SEO | howardd
-
Copied Content / Copied Website
Hello guys, I was checking my product descriptions and found out that there is a website using my descriptions word for word. They also use our company name, our product images, a link that sends you to my site, and our contact form. I tried to purchase something and the order came through our email, but when I made an inquiry it didn't come through. They also have a sub-folder with my company name, and URLs with my company name; this isn't right, is it? I am confused, and honestly I don't know what to do. We don't take part in any affiliate program or anything like that, and we don't ship outside of Europe. This is a Chinese website. Out of curiosity, I noticed that one of our competitors is on there as well, and it does seem weird. Here is the link: www.everychina.com/company/repsole_limited-hz1405d06.html
White Hat / Black Hat SEO | PremioOscar
-
Is showing pre-loaded content cloaking?
Hi everyone, another quick question. We have a number of different resources available to our users that load dynamically as the user scrolls down the page (like Facebook's Timeline), with the aim of improving page load time. Would it be considered cloaking if we had Googlebot index a version of the page with all of the available content that would load for a user who scrolled to the bottom?
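For context, the loading pattern being described usually looks something like the sketch below, where a sentinel element at the bottom of the list triggers each additional fetch (browser-side TypeScript; the endpoint and element IDs are made up):

```typescript
// Hypothetical sketch of scroll-triggered loading: when the sentinel
// element scrolls into view, fetch and append the next page of items.
let nextPage = 2; // assume page 1 was rendered on the server

const list = document.getElementById("resource-list");
const sentinel = document.getElementById("load-more-sentinel");

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting || !list) return;
  const res = await fetch(`/api/resources?page=${nextPage}`); // hypothetical endpoint
  const items: string[] = await res.json();
  for (const item of items) {
    const li = document.createElement("li");
    li.textContent = item;
    list.appendChild(li);
  }
  nextPage += 1;
});

if (sentinel) observer.observe(sentinel);
```

The usual rule of thumb is that cloaking means showing crawlers content users cannot reach; serving the crawler the same content users would eventually get by scrolling is a different situation from serving it different content.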
White Hat / Black Hat SEO | CuriosityMedia