Is Google able to determine duplicate content every day, or only every few months?
-
A while ago I talked to somebody who worked in MSN's engineering department a couple of years ago. We talked about a recent dip one of our sites took, and we argued it could be caused by the large amount of duplicate content we have on this particular website (over 80% of the site).
Then he said, and I quote: "Google only seems to be able to determine every couple of months, instead of every day, whether content is actually duplicate content." I don't doubt that duplicate content is a ranking factor, but I would like to hear your opinions on the idea that Google can only determine this every couple of months instead of every day.
Have you seen or heard something similar?
-
Sorting out Google's timelines is tricky these days, because they aren't the same for every process and every site. In the early days, the "Google dance" happened about once a month, and that was the whole mess (index, algo updates, etc.). Over time, index updates have gotten a lot faster, and ranking and indexation are more real-time (especially since the "Caffeine" update), but that varies wildly across sites and pages.
I think you also have to separate a couple of different impacts of duplicate content. When it comes to filtering (Google excluding a piece of duplicate content from rankings, but not necessarily penalizing the site), I don't see any evidence that this takes a couple of months. It can take Google days or weeks to re-cache any given page, and to detect a duplicate they would have to re-cache both copies, so that may realistically take a month in some cases. I strongly suspect, though, that the filter itself happens in real time. There's no good way to store a filter for every scenario, and some filters are query-specific. Computationally, some filters almost have to happen on the fly.
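To illustrate why on-the-fly filtering is computationally plausible, here's a rough sketch of near-duplicate detection using word shingles and Jaccard similarity. To be clear, this is my own illustration, not Google's actual method, and the 0.8 threshold is an arbitrary assumption:

```python
def shingles(text, k=5):
    """Break text into overlapping k-word shingles."""
    words = text.lower().split()
    if len(words) < k:
        return set(words)  # too short for full shingles; fall back to single words
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity: shared shingles divided by total distinct shingles."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Arbitrary threshold for this sketch; a real system would tune it and use
# scalable approximations (e.g., MinHash or SimHash) rather than raw shingles.
DUPLICATE_THRESHOLD = 0.8

def is_near_duplicate(page_a_text, page_b_text):
    """Return True if two pages look like near-duplicates of each other."""
    return jaccard(page_a_text, page_b_text) >= DUPLICATE_THRESHOLD

print(is_near_duplicate("the quick brown fox jumps over the lazy dog",
                        "the quick brown fox jumps over the lazy dog today"))  # True
```

Comparing every page against every other page this way would be prohibitive at web scale, which is one more reason to think the filter runs over a small, query-specific candidate set on the fly rather than being stored for every scenario.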
On the other hand, you have updates like Panda, where duplicate content can cause something close to a penalty. Panda data was originally updated outside of the main algorithm, to the best of our knowledge, probably about once a month. In the year-plus since Panda 1.0 rolled out, though, that timeline seems to have accelerated. I don't think it's real-time, but it may be closer to every two weeks (that's speculation, I admit).
So, the short answer is "It's complicated." I don't have any evidence to suggest that filtering duplicates takes Google months (and, actually, I have anecdotal evidence that it can happen much faster). It is possible, though, that it could take weeks or months to see the impact of duplicates on some sites and in some situations.
-
Hi Donnie,
Thanks for your reply, but I was already aware that Google had/has a sandbox; I should have mentioned that in my question. What I'm really looking for is an answer about how, and on what basis, Google is able to determine whether pages are duplicates.
I ask because I've seen dozens of cases where our content was indexed, both when we linked back to the 'original' source and when we didn't.
I also want to make clear, just to be safe, that in all of these cases the duplicate content was published with the agreement of the original sources.
-
In the past, Google had a sandbox period before any page (content) would rank. However, now everything is instant (I just learned this today @seomoz).
If you release something, Google will index it as fast as possible. If that info gets duplicated, Google will only count the first copy indexed. Everyone else loses brownie points unless they trackback/link back to the main article (the first one indexed).