How to Stop Scraping Content but Still Be a Hub
-
Hello SEOmoz members. I'm relatively new to SEO, so please forgive me if my questions are a little basic.
One of the sites I manage is GoldSilver.com. We sell gold and silver coins and bars, but we also have a very important news aspect to our site.
For the past 2-3 years we have been a major hub as a gold and silver news aggregator. About 1.5 years ago (before we knew much about SEO), we switched from linking out to the original news sites to scraping their content and publishing it on our own site. The chief reason was that users would click out to read an article, see an ad for a competitor, and then buy elsewhere. We were trying to avoid this (a fairly shortsighted decision in hindsight).
We have since realized that the search engines are penalizing us for hosting this scraped content, and I don't blame them.
So I'm trying to figure out how to move forward from here. We would like to remain a hub for gold and silver news without being penalized by the search engines, but we also need to sell bullion and would like to avoid losing customers to competitors through ads on the news articles.
One solution we are considering is using an iFrame to display the original URL within our own experience. An example is how trap.it does this (see attached picture). This way we can still control the experience somewhat while remaining a hub.
Thoughts?
Thank you,
nick
-
I honestly can't offer any short-term suggestions; it's a big challenge to know what the best short-term path is. Ultimately, you'll need to remove all the scraped content. Do that without replacing it and you won't see any gains in the short term; you may even see some short-term losses, since it's possible you're not being purely penalized.
-
Alan,
Thank you for your thoughts. I agree we need to change our strategy and move away from scraped content. Any technical workarounds we try (like iFrames) may work now, but ultimately we would just be delaying the inevitable.
Since that strategy will take a while to implement, what would you recommend for the shorter term?
-
Nick,
You're in a difficult situation, to say the least. iFrames were a safe bet a couple of years ago; however, Google has gotten better and better at discovering content contained in previously safe environments within the code, and it's only going to improve over time.
The only truly safe long-term solution is to change strategy drastically: find quality news elsewhere, and have content writers create unique articles built on the core information in those stories. Become your own news site with a unique voice.
The expense is significant given you'll need full-time writers. However, with a couple of entry-level writers right out of college, or just a year or two into the content-writing / journalism path, the cost of entry is relatively low. The key is picking really good talent.
I was able to replace an entire team of 12 poorly chosen writers with 3 very good writers, for example.
The other reality is that all the scraped content needs to go. You can't salvage it, or back-date newly written content around it, not at the volume you're dealing with. So you're going to have to earn your rankings all over again, through real, value-added means.
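As a rough way to inventory which pages are mostly copied before pulling them down, one could compare each article against its original source with a shingle-overlap check. This is just a minimal sketch under assumed parameters (word-level 5-gram shingles and a 0.5 Jaccard threshold, both arbitrary choices), not a substitute for a real content audit:

```python
def shingles(text, k=5):
    """Split text into a set of lowercase word k-grams ("shingles")."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 to 1.0)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def looks_scraped(page_text, source_text, threshold=0.5):
    """Flag a page as likely scraped when its shingle overlap with the
    original source meets or exceeds the threshold."""
    return jaccard(shingles(page_text), shingles(source_text)) >= threshold
```

Running this against each aggregated article and its source URL's text would produce a removal list; anything flagged could then be deleted or replaced with an original write-up.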