Is there a way to make Google realize/detect scraper content?
-
Good morning,
Theory states that duplicated content reduces certain keywords' positions in Google. It also says that a site that copies content will be penalized. Furthermore, we have spam report tools and the scraper report to inform against these bad practices.
In my case, the website both sells content to other sites and writes its own content, which is not for sale. However, other sites copy this original content and publish it, and Google does not penalize their positions in the results (neither in organic results nor in Google News), even though they have been reported using Google's tools for that purpose.
Could someone explain this to me? Is there a way to make Google realize/detect these bad practices?
Thanks
-
I've found backlinks on scraper websites pointing to the scraped website I am taking care of.
They appear in CSS references, images, and forms.
What's the point of doing it on their side?
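Those leftover links are usually not deliberate: scrapers often copy the HTML wholesale, so any absolute URLs in stylesheets, image tags, and form actions keep pointing at the original site. A minimal stdlib sketch of how you could scan a suspected copy for such self-references (the domain names here are hypothetical, just for illustration):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class AssetLinkFinder(HTMLParser):
    """Collect every URL referenced via an href, src, or form action attribute."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src", "action") and value:
                self.urls.append(value)

def links_back_to(html, original_domain):
    """Return URLs in the scraped page that still point at the original site."""
    finder = AssetLinkFinder()
    finder.feed(html)
    return [u for u in finder.urls
            if urlparse(u).netloc.endswith(original_domain)]

# A scraper that copied the page wholesale keeps the absolute URLs intact:
scraped = '''
<link rel="stylesheet" href="https://original-site.example/style.css">
<img src="https://original-site.example/logo.png">
<form action="https://original-site.example/search"></form>
<a href="https://scraper.example/other">local link</a>
'''
print(links_back_to(scraped, "original-site.example"))  # three self-references
```

Finding your domain in their CSS, image, and form URLs is a strong sign the copy was automated rather than hand-made.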
-
Stolen content is a big issue today, and recent reports have shown that people who steal your content can knock you out of your search engine position, no matter what your authority, backlink, or social share profiles look like.
This great presentation given by Jon Earnshaw at Brighton SEO last week gives a better idea of how it has affected other websites: http://www.slideshare.net/jonathanearnshaw/is-your-content-working-better-for-someone-else
Google used to have a Scraper Report form where you could report an offending site and get it removed from the SERPs, but they have since removed it.
I found a similar way to report stolen content in this blog post:
http://www.techng.info/removing-your-stolen-content-from-google-search-using-dmca/
Hope this answers your question, even if it comes a bit late after the original post.
-
Hello,
The reporting tools are not particularly useful in this scenario because duplicate content is not, by itself, a penalty-worthy situation. While Panda targets spam-oriented content, duplicate content is treated as more of a null/void situation than as a penalty.
For example, when you place your newly-created original content and it is crawled and indexed, Google attributes your domain with being the origin of said content. If another website showcases this content, it is recognized as duplicate by Google (which has compared it to your indexed version) and given no benefit or penalty. In effect, using duplicate content is merely a neutral practice - it's the spam that Google is really after.
Here's a beginner's report on duplicate content that spells it out quite nicely:
https://moz.com/learn/seo/duplicate-content
As Charles mentioned, copied content is not an automatic ban. If it is within "acceptable limits," there is no detrimental impact on the website. However, if the website is made up purely of content copied from multiple sources, and spams links or stuffs keywords, it will be dealt with accordingly.
In short, this website will not be penalized in the fashion you desire unless they are spamming or keyword stuffing (among other penalty-worthy offences). Your best bet is to beat them out by building up your link profile and continuing to post valuable, original content.
Let me know if there is anything else I can help with.
Rob
-
"Theory states that duplicated content reduces certain keywords' positions in Google."
Wrong. Google might omit duplicate results, or ban sites that practise duplication aggressively, but it doesn't lower rankings based on the number of duplicates. Otherwise Wikipedia, or aggregating websites like car-dealer listings, would be nowhere to be found.
"It also says that a site that copies content will be penalized."
Semi-wrong. It will be penalized if it's spammy and overdoes it.
Watch this video of Matt Cutts on duplicate content - https://www.youtube.com/watch?v=mQZY7EmjbMA
So, my understanding is that there is no 100% reliable way of taking down scrapers, because some of them are actually "good" scrapers. Like Facebook! - the biggest scraper in the world.
So, to beat them in the rankings, make sure that you are an authority in your industry, have an awesome backlink profile, and have all aspects of SEO properly implemented. And yes, sometimes those reporting tools can help.
Related Questions
-
Any way to force a URL out of Google index?
As far as I know, there is no way to truly FORCE a URL to be removed from Google's index. We have a page that is being stubborn. Even after it was 301 redirected to an internal secure page months ago and a noindex tag was placed on it in the backend, it still remains in the Google index. I also submitted a request through the remove outdated content tool https://www.google.com/webmasters/tools/removals and it said the content has been removed. My understanding though is that this only updates the cache to be consistent with the current index. So if it's still in the index, this will not remove it. Just asking for confirmation - is there truly any way to force a URL out of the index? Or to even suggest more strongly that it be removed? It's the first listing in this search https://www.google.com/search?q=hcahranswers&rlz=1C1GGRV_enUS753US755&oq=hcahr&aqs=chrome.0.69i59j69i57j69i60j0l3.1700j0j8&sourceid=chrome&ie=UTF-8
-
How to Avoid Duplicate Content Issues with Google?
We have 1000s of audio book titles at our Web store. Google's Panda devalued our site some time ago because, I believe, of duplicate content. We get our descriptions from the publishers, which means a good deal of our description pages are the same as the publishers' = duplicate content according to Google. Although re-writing each description of the products we offer is a daunting, almost impossible task, I am thinking of re-writing publishers' descriptions using The Best Spinner software, which allows me to replace some of the publishers' words with synonyms. I have re-written one audio book title's description, resulting in 8% unique content from the original in 520 words. I did a CopyScape check and it reported "65 duplicates." CopyScape appears to be reporting duplicates of words and phrases within sentences and paragraphs; I see very little duplicate content of full sentences or paragraphs. Does anyone know whether Google's duplicate content algorithm is the same as or similar to CopyScape's? How much of an audio book's description would I have to change to stay clear of CopyScape's duplicate content algorithm? How much would I have to change to stay clear of Google's?
-
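On the question above: neither Google's nor CopyScape's exact matching algorithm is public, but the textbook technique for phrase-level near-duplicate detection is w-shingling with Jaccard similarity, which is probably what CopyScape's "duplicates of words and phrases" report resembles. A minimal sketch (the descriptions are made up for illustration):

```python
def shingles(text, w=4):
    """Break text into overlapping w-word sequences ('shingles')."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b, w=4):
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, w), shingles(b, w)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

original  = "this audio book tells the story of a young detective in victorian london"
spun      = "this audio book tells the tale of a young sleuth in victorian london"
different = "a completely unrelated description about cooking pasta at home"

# Two synonym swaps already break most 4-word shingles in a short text:
print(round(jaccard(original, spun), 2))       # → 0.11
print(round(jaccard(original, different), 2))  # → 0.0
```

Note how swapping just 2 of 13 words disturbs every shingle that spans them - which cuts the other way too: long untouched runs of publisher text will still match, so word-level spinning that leaves whole phrases intact does little against shingle-style detection.
-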
We will be switching our shopping cart platform from Volusion to Magento and are really cautious/nervous about our rankings and SEO. Any advice from anyone who has migrated stores? These URLs are years old.
Shopping cart platform switch and SEO: what do you suggest? What's the best way to ensure we keep our rankings?
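The usual way to preserve rankings through a platform migration is a one-to-one 301 redirect map from every legacy URL to its new counterpart, audited for gaps before launch. A minimal sketch, with hypothetical old- and new-style paths:

```python
# Hypothetical legacy URLs mapped to their new counterparts (illustrative paths only).
redirect_map = {
    "/ProductDetails.asp?ProductCode=AB123": "/audio-books/ab123",
    "/category-s/100.htm": "/category/fiction",
    "/default.asp": "/",
}

def redirect_for(old_path):
    """Return the 301 target for a legacy URL, or None if it is unmapped."""
    return redirect_map.get(old_path)

# Before launch, check every URL from your old sitemap/analytics for gaps:
legacy_urls = ["/ProductDetails.asp?ProductCode=AB123",
               "/category-s/100.htm",
               "/old-page.htm"]
unmapped = [u for u in legacy_urls if redirect_for(u) is None]
print(unmapped)  # → ['/old-page.htm']  - fix these before switching over
```

Building the legacy URL list from the old sitemap plus top landing pages in analytics, and resolving every unmapped entry before cutover, is what keeps years-old URLs passing their equity to the new store.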
-
How long for Google Webmaster tools to update/reflect link changes
Hi all, does anyone know, or have experience of, how long GWMT takes to update its data? We did some work on our link profile back in October/November but are still seeing old (removed) links showing in GWMT. Thanks in advance.
-
Google Places
If you rank in Google Places, I have noticed that you do not rank on the front page as well. I have a site that ranks on the front page for its keywords; however, because they are (1) in Google Places, they don't show up when someone is localized to that area. They show up in Google Places but not on the front page. If you turn off localization, they are first in the SERPs. How can I get around this? Two separate sites: one for Google+ (Places) and one for the SERPs?
-
Websites with same content
Hi, both my .co.uk and .ie websites have the exact same content, which consists of hundreds of pages. Is this going to cause an issue? I have hreflang on both websites, plus Google Webmaster Tools is picking up that both websites are targeting different countries. Thanks
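One thing worth auditing in this setup: Google's guidance is that hreflang annotations must be reciprocal - each alternate page has to link back, or the annotations are ignored and the duplicate pages compete. A minimal stdlib sketch of that check, using declared annotations rather than live fetches (the URLs are hypothetical):

```python
# Hypothetical hreflang annotations each page declares, keyed by page URL.
declared = {
    "https://example.co.uk/page": {"en-gb": "https://example.co.uk/page",
                                   "en-ie": "https://example.ie/page"},
    "https://example.ie/page":    {"en-gb": "https://example.co.uk/page",
                                   "en-ie": "https://example.ie/page"},
}

def non_reciprocal(declared):
    """hreflang only counts if every alternate links back: return violations."""
    problems = []
    for page, alts in declared.items():
        for lang, alt_url in alts.items():
            back = declared.get(alt_url, {})
            if page not in back.values():
                problems.append((page, lang, alt_url))
    return problems

print(non_reciprocal(declared))  # → []  - both pages link back, so the pair is valid
```

With reciprocal hreflang in place (including a self-referencing entry on each page, as above), identical content on country-targeted domains is generally treated as localization rather than as problematic duplication.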
-
What's the best way to manage content that is shared on two sites and keep both sites in search results?
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?
-
Keyword/Content Consistency
My question is: if you have a keyword that is searched more when it's spelled wrong than when it's spelled right, what do you do? Do you target the misspelled word, or keep the true spelling and say "oh well" to SEO? Also, along the same lines: what if you have a keyword with a hyphen in the middle of it? For instance: "website" and "web-site" (this isn't the actual keyword, just an example). "drupal website" is searched more than "drupal web-site", but "wordpress web-site" is searched more than "wordpress website". Technically "website" is the correct spelling and way to write it, but people put "web-site" (again, not the case in reality - just an example).