Is there a way to make Google realize/detect scraper content?
-
Good morning,
Theory states that duplicated content reduces certain keywords' positions in Google. It also says that a site that copies content will be penalized. Furthermore, we have spam report tools and the scraper report to report these bad practices.
In my case: the website both sells content to other sites and writes and prepares its own content, which is not for sale. However, other sites copy this original content and publish it, and Google does not penalize their positions in the results (neither in organic results nor in Google News), even though they have been reported using Google's tools for that purpose.
Could someone explain this to me? Is there a way to make Google recognize/detect these bad practices?
Thanks
-
I've found backlinks on scraper websites linking back to the scraped website I am looking after.
They appear in CSS references, image sources, and form actions.
What's the point of leaving them in on their side?
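Those backlinks usually aren't deliberate: when a scraper copies markup verbatim, stylesheet links, image sources, and form actions that were absolute URLs keep pointing at the source site. A minimal sketch of how you might scan a copied page for them, using only the Python standard library (`example.com` and the sample HTML below are hypothetical stand-ins for the scraped site):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

ORIGINAL_DOMAIN = "example.com"  # hypothetical: the site being scraped

class BacklinkFinder(HTMLParser):
    """Collect absolute URLs that point back to the original domain."""

    def __init__(self):
        super().__init__()
        self.backlinks = []

    def handle_starttag(self, tag, attrs):
        # Stylesheets (href), images (src) and forms (action) are the
        # attributes most often left pointing at the source site.
        for name, value in attrs:
            if name in ("href", "src", "action") and value:
                if urlparse(value).netloc == ORIGINAL_DOMAIN:
                    self.backlinks.append((tag, value))

# A toy "scraped" page: markup copied verbatim from the original site.
scraped_html = """
<html><head>
<link rel="stylesheet" href="https://example.com/css/main.css">
</head><body>
<img src="https://example.com/img/logo.png">
<form action="https://example.com/search"><input name="q"></form>
<a href="/local-page">internal link</a>
</body></html>
"""

finder = BacklinkFinder()
finder.feed(scraped_html)
for tag, url in finder.backlinks:
    print(tag, url)
```

Note that the relative `/local-page` link is ignored: only absolute URLs whose host matches the original domain count as accidental backlinks.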
-
Stolen content is a big issue today, and recent reports have shown that people who steal content from you can knock you out of your search engine positions, no matter what your authority, backlink, or social share profiles look like.
This great presentation given by Jon Earnshaw at Brighton SEO last week gives a better idea of how it has affected other websites: http://www.slideshare.net/jonathanearnshaw/is-your-content-working-better-for-someone-else
Google used to have a Scraper Report form where you could report the offending site and get it removed from the SERPs, but they have since removed it.
I found a similar way to report stolen content in this blog post:
http://www.techng.info/removing-your-stolen-content-from-google-search-using-dmca/
Hope this answers your question, even if it is a bit delayed from the original post.
-
Hello,
The reporting tools are not particularly useful in this scenario, because duplicate content on its own is not a penalty-worthy situation. While Panda is designed to demote spammy, low-quality content, duplicate content is treated as more of a null/void situation than as a penalty.
For example, when you publish newly created original content and it is crawled and indexed, Google attributes your domain as the origin of that content. If another website showcases the same content, Google recognizes it as a duplicate (by comparing it to your indexed version) and gives it no benefit or penalty. In effect, using duplicate content is merely neutral; it's spam that Google is really after.
Here's a beginner's report on duplicate content that spells it out quite nicely:
https://moz.com/learn/seo/duplicate-content
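To make the "recognized as duplicate" step above concrete, here is a minimal sketch of near-duplicate detection using word shingles and Jaccard similarity. This is a standard textbook technique and only an illustrative stand-in; Google's actual duplicate-detection pipeline is not public, and the sample strings are invented:

```python
def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 to 1.0)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = "duplicate content is filtered rather than penalized by search engines"
scraped = "duplicate content is filtered rather than penalized by search engines today"
unrelated = "build original valuable content and a strong natural link profile"

# A near-verbatim copy scores close to 1.0; unrelated text scores 0.0.
print(jaccard(shingles(original), shingles(scraped)))
print(jaccard(shingles(original), shingles(unrelated)))
```

Once two pages score as near-duplicates, the engine can simply keep whichever version it attributes as the origin and filter the other from results, which matches the "null/void rather than penalty" behavior described above.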
As Charles mentioned, copied content is not an automatic ban sentence. If it stays within "acceptable limits" there is no detrimental impact on the website. However, if a website is made up purely of content copied from multiple sources, and it spams links or stuffs keywords, it will be dealt with accordingly.
In short, this website will not be penalized in the fashion you desire unless it is spamming or keyword stuffing (among other penalty-worthy offences). Your best bet is to beat them by building up your link profile and continuing to post valuable, original content.
Let me know if there is anything else I can help with.
Rob
-
Theory states that duplicated content reduces certain keywords’ position in Google.
Wrong. Google may omit duplicate results from the SERPs, or ban sites that abuse duplication, but it does not lower a page's rankings simply because copies of its content exist elsewhere. Otherwise Wikipedia, or aggregator websites such as car-dealer listings, would be nowhere to be found.
It also says that a web who copy content will be penalized.
Semi-wrong. A site will be penalized only if its copying is spammy and overdone.
Watch this video of Matt Cutts on duplicate content - https://www.youtube.com/watch?v=mQZY7EmjbMA
So, my understanding is that there is no 100% reliable way of taking down scrapers, because some of them are actually "good" scrapers. Like Facebook, the biggest scraper in the world.
So, to beat them in the rankings, make sure you are an authority in your industry, have an excellent backlink profile, and have every aspect of SEO properly implemented. And yes, sometimes those reporting tools can help.
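One small implementation detail that supports the attribution side of this: a self-referencing canonical tag in every page's head. Since many scrapers copy HTML verbatim (as the CSS/image/form backlinks mentioned earlier in this thread show), the copy often still declares your URL as canonical. This is a mitigation rather than a fix, since a determined scraper can strip the tag; `example.com` below is a hypothetical placeholder:

```python
from html import escape

def canonical_tag(page_url):
    """Build a self-referencing canonical <link> for a page's <head>.

    If a scraper copies the HTML verbatim, the copy still declares the
    original URL as canonical, which helps attribution.
    """
    # quote=True also escapes quotes, keeping the attribute well-formed.
    return '<link rel="canonical" href="%s">' % escape(page_url, quote=True)

print(canonical_tag("https://example.com/articles/original-post"))
```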