How are they avoiding duplicate content?
-
One of the largest soccer stores in the USA runs a number of whitelabel sites for major partners such as Fox and ESPN. The effect of this, however, is that they are creating duplicate content for their products (and even the overall site structure is very similar). Take a look at:
http://www.worldsoccershop.com/23147.html
http://www.foxsoccershop.com/23147.html
http://www.soccernetstore.com/23147.html
You can see that practically everything is the same, including:
- product URL
- product title
- product description
My question is: why is Google not classing this as duplicate content? Have they coded for it in a certain way, or is there something I'm missing that is helping them achieve rankings for all three sites?
-
The answer is right in your question: "runs a number of whitelabel sites". As mentioned, it is largely down to the original publisher publishing the content first and getting indexed. From there, any time Googlebot stumbles across the same content, it will figure out that it has seen that content before and attribute the ranking to the original. This is something Google themselves covered last year (although more specifically for news at the time).
Duplicate content unfortunately isn't just "not shown" by the search engines (imagine how "clean" the SERPs would be if that were the case!); it is simply ranked lower than the original publisher that Google is aware of. Occasionally you will get the odd page from a different domain that ranks, but that is usually down to it being fresh content. I have seen this myself with my own content being aggregated by a large news site: they might outrank me for a day on one or two pieces, but my original URL comes out on top in the end.
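As an aside, if the partners ever did want to consolidate these duplicates explicitly rather than leave Google to work out the original, the usual mechanism is a cross-domain rel=canonical on the whitelabel copies pointing back at the original product URL (Google announced support for cross-domain canonicals back in late 2009). A purely hypothetical snippet for the Fox copy of that product page, assuming their whitelabel template allows edits to the head, would look something like:

<!-- hypothetical: placed in the <head> of http://www.foxsoccershop.com/23147.html -->
<link rel="canonical" href="http://www.worldsoccershop.com/23147.html" />

That would ask Google to treat the worldsoccershop URL as the preferred version and consolidate the ranking signals there - although whether the partners would want that commercially is another question.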
-
They rank as #1 for the relevant terms. It is very clear Google feels they are the original source of the content, and the other sites are duplicates.
I don't have a crystal ball to see the future, but based on current information, the original source site is not suffering in any manner.
-
Interesting feedback - is worldsoccershop (the original source) likely to suffer any penalties as a result of the whitelabel sites carrying the duplicate content?
-
Hey
I just did a search for a phrase I found on one of their product pages, wrapping this long query in double quotes:
"Large graffiti print on front that illustrates the club's famous players and history. The traditional blue jersey has gold details including team badge, adidas logo and sponsor design"
The results that are returned show worldsoccershop.com in first and second place, so they seem to be the authority on this product description.
I have a client that is setting up a store to take on some rather big boys like notonthehighstreet.com, and in this industry, where there are several established competitors for each product, the big authority stores seem to rank for the generic product descriptions with no real issues.
This is ultimately difficult for the smaller stores: whilst they have fewer resources, pages on my client's site that use these duplicate descriptions are just getting filtered out of the results. We can see this filtering in action with very specific searches like the one above, where we get the 'we have filtered out similar results' message in the search results and, lo and behold, my client's results are among those that are filtered.
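As an aside, if I remember right, the 'omitted results' link in those filtered searches simply re-runs the same query with &filter=0 appended to the search URL, along these lines (hypothetical example):

http://www.google.com/search?q=%22large+graffiti+print+on+front%22&filter=0

That is a quick way to confirm exactly which of your pages Google is folding into the filter.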
So, to answer your original question:
They have not 'coded' anything in a specific way, and there is nothing you are missing as such. They are just an authority site and are 'getting away with it', which, for the smaller players, kind of sucks. That said, only the worldsoccershop pages are returned, so the other sites could well be filtered out.
Still, as I am coaching our client, see this not as a problem but as an opportunity. By creating unique content, we can hopefully leapfrog other, more authoritative sites that are all returning exactly the same product description; whilst I don't expect us to get 1st place, we can work towards the first page and out of that filter.
Duplicate content is a massive problem. On the site we are working on, there is one product description that Copyscape tells us appears on 300 other sites. Google wants to return rich result sets (some shops, some information, some pictures, etc.) and not just 10 copies of the same thing, so dare to be different and give them a reason to display your page.
Hope it helps
Marcus
-
My question is, why is Google not classing this as duplicate content?
Why do you feel this content has not been flagged as duplicate content?
The reasonable search for these pages is Barcelona Soccer Jersey. Only one of the three sites has results for this term in the top 50, and it holds the #1 and #2 positions. If this were not duplicate content, you would expect to find the other two sites listed on the first page of Google results as well.
The perfect search for the page (very long-tail and unrealistic) is Barcelona 11/12 home soccer jersey. For this query, worldsoccershop.com ranks #1 and #3, foxsoccershop ranks #8, which is a big drop considering the content is the same, and soccernetstore.com is not in the top 50 results.
The other two sites have clearly been identified as duplicate content or are otherwise being penalized quite severely.