How to Stop Scraping Content but Still Be a Hub
-
Hello SEOmoz members. I'm relatively new to SEO, so please forgive me if my questions are a little basic.
One of the sites I manage is GoldSilver.com. We sell gold and silver coins and bars, but we also have a very important news aspect to our site.
For about 2-3 years now we have been a major hub as a gold and silver news aggregator. About 1.5 years ago (before we knew much about SEO), we switched from linking to the original news sites to scraping their content and publishing it on our site. The chief reason was that users would click outbound to read an article, see an ad for a competitor, and then buy elsewhere. We were trying to avoid this (a relatively stupid decision, in hindsight).
We have realized that the search engines are penalizing us for having this scraped content on our site, and I don't blame them.
So I'm trying to figure out how to move forward from here. We would like to remain a hub for gold and silver news without being penalized by the search engines, but we also need to sell bullion and would like to avoid losing clients to competitors through ads on the news articles.
One of the solutions we are considering is using an iFrame to display the original URL within our own experience. An example is how trap.it does this (see attached picture). This way we can still control the experience somewhat while remaining a hub.
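For illustration, the kind of markup we have in mind would be something like this (a rough sketch; the URL, class name, and dimensions are placeholders, not our actual implementation):

```html
<!-- Our own header, navigation, and bullion offers stay in the surrounding page -->
<div class="news-article-wrapper">
  <!-- The original publisher's article, framed rather than copied.
       Caveat: some publishers block framing via the X-Frame-Options
       response header, in which case the frame renders empty. -->
  <iframe src="https://example-news-publisher.com/gold-article"
          width="100%" height="800">
  </iframe>
</div>
```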
Thoughts?
Thank you,
nick
-
I honestly can't offer any short-term suggestions; it's a big challenge to know what the best short-term path is. Ultimately, you'll need to remove all the scraped content. If you do that without replacing it, you won't see any gains in the short term, and you may even see some short-term losses, since it's possible your current rankings aren't purely the result of a penalty.
-
Alan,
Thank you for your thoughts. I agree we need to change our strategy and move away from scraped content. Any technical workarounds we try (like iFrames) may work now, but ultimately we would just be delaying the inevitable.
Since that strategy will take a while to implement, what would you recommend for the shorter term?
-
Nick,
You're in a difficult situation, to say the least. iFrames were a safe bet a couple of years ago, but Google has gotten better and better at discovering content contained in previously safe environments within the code. And it's only going to get better at it over time.
The only truly safe long-term solution is to change strategy drastically. Find quality news elsewhere, and have content writers create unique articles built on the core information in those stories. Become your own news site with a unique voice.
The expense is significant given that you'll need full-time writers. However, with a couple of entry-level writers right out of college, or just a year or two into a content writing / journalism career, the cost of entry is relatively low. The key is picking really good talent.
I was able to replace an entire team of 12 poorly chosen writers with 3 very good writers, for example.
The other reality is that all the scraped content needs to go. You can't salvage it or back-date newly written content around it, not at the volume you're dealing with. So you're going to have to earn your rankings all over again, but through real, value-added reasons.
Related Questions
-
Duplicate content site not penalized
Was reviewing a site, www.adspecialtyproductscatalog.com, and noted that even though automated crawls found over 50,000 total issues, including 3,000 pages with duplicate titles and 6,000 with duplicate content, this site still ranks high for primary keywords. The same essay's worth of content is pasted at the bottom of every single page. What gives, Google?
White Hat / Black Hat SEO | KenSchaefer
-
Question regarding subdomains and duplicate content
Hey everyone, I have another question regarding duplicate content. We are planning on launching a new sector in our industry to satisfy a niche. Our main site works as a directory with listings with NAP. The new sector we are launching will take all of the content on the main site and duplicate it on a subdomain. We still want the subdomain to rank organically, but I'm torn between putting a rel=canonical back to the main site or using a self-referencing canonical, which leaves me with duplicates. The other idea is to rewrite the content on each listing so that the menu items stay the same but the listing description is different. Do you think this would be enough differentiating content that it won't be seen as a duplicate? Obviously, making this part of the main site would be the best option, but unfortunately we can't do that. Last question: what are the advantages or disadvantages of using a subdomain?
White Hat / Black Hat SEO | imjonny
-
Somebody took an article from my site and posted it on their own site, but gave it credit back to my site. Is this duplicate content?
Hey guys, This question may sound a bit drunk, but someone copied our article and reposted the exact article on their site. However, the article was credited to our site, and the original author of the article had approved the other site doing this. We created the article first, though. Will this still be regarded as duplicate content? The owner of the other site has told us it isn't because they credited it. Any advice would be awesome. Thanks
White Hat / Black Hat SEO | edward-may
-
No cache still a good link for disavow?
Hi y'all, Two scenarios: 1. I'm on the borderline of disavowing some websites that link to me. If the page's cache is N/A (not available), does that mean I should disavow them? 2. What if the particular page was really good content, and the webmaster just has the worst SEO skills and didn't interlink his old blogs, hence why the page linking to me has no cache? Should I still disavow it? Thanks
White Hat / Black Hat SEO | Shawn124
-
Is Yahoo! Directory still a beneficial SEO tactic?
For obvious reasons, we have submitted our clients to high-authority directories such as Yahoo! Directory and Business.com. However, with all of the algorithm updates lately, we've tried to cut back on the paid directories we submit our clients to. Having said that, my question is: is Yahoo! Directory still a beneficial SEO tactic? Or are paid directories, with the exception of BBB.com, a bad SEO tactic?
White Hat / Black Hat SEO | MountainMedia
-
DIV Attribute containing full DIV content
Hi all, I recently watched the latest Mozinar, "Making Your Site Audits More Actionable", presented by the guys at seogadget. In the Mozinar, one of them said he loves the website www.sportsbikeshop.co.uk and that a lot of money has been spent on it from an SEO point of view (presumably with seogadget), so I decided to look through the source and noticed something I had not seen before, and wondered if anyone can shed any light. On this page (http://www.sportsbikeshop.co.uk/motorcycle_parts/content_cat/852/(2;product_rating;DESC;0-0;all;92)/page_1/max_20) there is a paragraph of text that begins with 'The ever reliable UK weather...', and when you view the source of the containing DIV you will notice a bespoke attribute called "threedots=" with the entire text content of that DIV inside it. Any thoughts as to why they would put that there? I can't see any reason why this would benefit a site in any shape or form. It's invalid markup, for one. Am I missing a trick..? Thoughts would be greatly appreciated. Kris P.S. for those who can't be bothered to visit the site, here is a smaller version of what they have done: This is an introductory paragraph of text for this page.
White Hat / Black Hat SEO | yousayjump
-
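One plausible reading of the question above: the attribute feeds a client-side truncation script, with the full copy stored in the attribute while a shortened, ellipsized version is shown in the DOM (several jQuery-era "three dots" plugins worked this way). A minimal sketch of that idea, with hypothetical names:

```javascript
// Hypothetical sketch of what a "three dots" truncation helper might do:
// keep the full text available (e.g. in an attribute) and render a
// shortened version ending in an ellipsis when it exceeds a limit.
function truncateWithEllipsis(fullText, maxChars) {
  if (fullText.length <= maxChars) return fullText;
  // Reserve three characters for the "..." suffix.
  return fullText.slice(0, maxChars - 3).trimEnd() + "...";
}
```

Whether that explains the SEO rationale is another matter; storing the full text in a non-standard attribute is indeed invalid markup, as the question notes.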
Duplicate Content due to Panda update!
I can see that a lot of you are worrying about this new Panda update just as I am! I have such a headache trying to figure this one out; can any of you help me? I have thousands of pages flagged as "duplicate content", which I just can't for the life of me understand... take these two for example: http://www.eteach.com/Employer.aspx?EmpNo=18753 http://www.eteach.com/Employer.aspx?EmpNo=31241 My campaign crawler is telling me these are duplicate content pages because of the same title (which I can see) and because of the content (which I can't see). Can anyone see how Google is interpreting these two pages as duplicate content?? Stupid Panda!
White Hat / Black Hat SEO | Eteach_Marketing
-
User comments with page content or as a separate page?
With the latest Google updates both cracking down on useless pages and rewarding high-quality content, would it be beneficial to include user-posted comments on the same page as the content, or on a separate page? A separate page with enough comments on it could be worthwhile, especially as extra pages add extra PageRank, but would it be better to include them with the original article/post? Your ideas and suggestions are greatly appreciated.
White Hat / Black Hat SEO | Peter264