What constitutes duplicate content?
-
I have a website that lists various events. There is one particular event at a local swimming pool that occurs every few months -- for example, once in December 2011 and again in March 2012. It will probably happen again sometime in the future too. Each event has its own 'event' page, which includes a description of the event and other details.
In the example above, the only thing that changes between the two pages is the date of the event, which is in an H2 tag. SEOmoz Pro is flagging these pages as a duplicate content error.
I could combine these pages, since the vast majority of the content is duplicate, but this will be a lot of work. Any suggestions on a strategy for handling this problem?
-
Hi Atul,
Different languages are NOT seen as duplicate content. If you take the same article and present it in both English and Spanish, the two would be considered unique articles.
-
Hi Ryan,
What about a site in 3 different languages targeting 3 countries with the same content?
Would Google consider that duplicate content?
Thanks
-
You rock!
-
Duplicate content is determined when a significant percentage of the content (i.e. the words on the page) is duplicated on another web page. Sorry if that seems too obvious, but that's the definition. I am not sure what percentage SEOmoz or Google uses as the threshold, but based on your description it sounds like 90%+ of the content is duplicated.
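The percentage comparison described above can be sketched in a few lines of Python. This is an illustration only — the similarity measures and thresholds that SEOmoz and Google actually use are not public, and the sample page text below is made up:

```python
def duplication_ratio(page_a: str, page_b: str) -> float:
    """Fraction of page_a's distinct words that also appear on page_b."""
    words_a = set(page_a.lower().split())
    words_b = set(page_b.lower().split())
    if not words_a:
        return 0.0
    return len(words_a & words_b) / len(words_a)

# Two hypothetical event pages that differ only in the date heading.
december = "Family swim night at the local pool with games prizes and snacks December 2011"
march = "Family swim night at the local pool with games prizes and snacks March 2012"

print(f"{duplication_ratio(december, march):.0%}")  # -> 86%
```

Real crawlers compare rendered page text (often in overlapping word "shingles" rather than single words), but the principle is the same: when nearly everything except the date overlaps, the pages score as duplicates.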
Some options:
1. Add unique content to the page. Work to ensure at least 50% of the content is unique to the page. You can share information specific to the month involved: a Christmas pool party might be cold, and you can talk about the holiday tie-in. You can also share images from the prior year's event, experiences from participants, etc.
2. Move the content to a single page. You can offer a single event page with information about each month the event is held.
3. You can maintain a single active page at a time. After the December event is over you can remove the page, publish the March 2012 page and 301 redirect the December URL to the March page.
4. You can canonicalize the various versions to a single page. If you take this step it is important to keep track of the indexed page version.
5. You can noindex all except one of the pages.
The options above are listed in order of preference. The first option will likely yield the best results. If you are not willing to go with that option, try the second, and so forth.
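For option 3, the redirect is a single server rule. A minimal sketch in Apache .htaccess syntax, using made-up event URLs (nginx and other servers have equivalents):

```apache
# Permanently redirect the expired December event page to the March page,
# passing most of its link equity along with it.
Redirect 301 /events/pool-party-dec-2011 /events/pool-party-mar-2012
```

Once the March event passes, you would update the rule again to point at the next occurrence, so the old URLs always land somewhere current.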
-
Thanks for the reply.
The events come up as separate instances when people are browsing our site. E.g., they browse all upcoming events at a certain swimming pool (rec center) and get a listing with a brief overview. Then they can click through to an event details page — one event details page per event.
E.g., http://www.chatterblock.com/facility/12/crystal-pool-and-fitness-centre-victoria-bc/events/
From a user-experience standpoint, this works great. I asked the question because I don't want to be penalized for having duplicate content.
-
You could use the rel="canonical" tag, but be aware that only the canonical version will then appear in search results.
http://www.seomoz.org/learn-seo/duplicate-content
Are people actually searching for each month separately? If not, this might not be much of a concern.
Could you create a page that lists all the dates, and point the rel="canonical" tag on each event page to it?
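That setup would look something like the snippet below on each per-date event page. The URL path here is invented for illustration; substitute your real all-dates listing page:

```html
<!-- In the <head> of each individual event page: tell search engines that
     the all-dates listing page is the version to index and rank. -->
<link rel="canonical" href="http://www.chatterblock.com/events/pool-party-all-dates/" />
```

The individual pages stay in place for visitors browsing the site, while search engines consolidate ranking signals onto the single listing page.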