What constitutes duplicate content?
-
I have a website that lists various events. There is one particular event at a local swimming pool that occurs every few months -- for example, once in December 2011 and again in March 2012. It will probably happen again sometime in the future too. Each event has its own 'event' page, which includes a description of the event and other details.
In the example above, the only thing that changes between pages is the date of the event, which sits in an H2 tag. SEOmoz Pro is flagging these pages as a duplicate content error.
I could combine these pages, since the vast majority of the content is duplicate, but this will be a lot of work. Any suggestions on a strategy for handling this problem?
-
Hi Atul,
Content in different languages is NOT seen as duplicate content. If you take the same article and present it in both English and Spanish, the two versions would be considered unique articles.
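If you do serve the same content in several languages for different countries, hreflang annotations help Google pick the right version for each user and make the relationship between the pages explicit. A minimal sketch, using hypothetical example URLs (every page should list all of its alternates, including itself):

```html
<!-- Hypothetical URLs: placed in the <head> of each language version. -->
<link rel="alternate" hreflang="en-us" href="https://example.com/en/article/" />
<link rel="alternate" hreflang="es-es" href="https://example.com/es/articulo/" />
<link rel="alternate" hreflang="fr-fr" href="https://example.com/fr/article/" />
<!-- Fallback for users whose language/country doesn't match any version above: -->
<link rel="alternate" hreflang="x-default" href="https://example.com/en/article/" />
```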
-
Hi Ryan,
What about a site in 3 different languages targeting 3 countries with the same content?
Would Google consider that duplicate content?
Thanks
-
You rock!
-
Duplicate content is flagged when a significant percentage of the content (i.e., the words on the page) also appears on another web page. Sorry if that seems too obvious, but that's the definition. I am not sure what exact percentage SEOmoz or Google uses to determine duplicate content, but based on your description it sounds like 90%+ of the content is duplicated.
Some options:
1. Add unique content to the page. Work to ensure at least 50% of the content is unique to the page. You can share information specific to the month involved. A Christmas pool party might be cold and you can talk about the holiday involvement. You can share images from the prior year's event, experiences from participants, etc.
2. Move the content to a single page. You can offer a single event page with information about each month the event is held.
3. You can maintain a single active page at a time. After the December event is over you can remove the page, publish the March 2012 page and 301 redirect the December URL to the March page.
4. You can canonicalize the various versions to a single page. If you take this step, it is important to keep track of which version is the canonical, indexed page.
5. You can noindex all except one of the pages.
I suggest the above options are in order of preference. The first option will likely yield the best results. If you are not willing to go with that option, try the second option, and so forth.
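For option 3, the 301 redirect can be a one-line server rule. A minimal sketch for Apache (in `.htaccess`), assuming hypothetical event URLs (substitute your actual paths):

```apache
# Hypothetical paths: permanently redirect the expired December event URL
# to the current March event page, passing along its link equity.
Redirect 301 /events/pool-party-december-2011 /events/pool-party-march-2012
```

Nginx, IIS, or an application framework would accomplish the same thing with their own redirect syntax; the key point is that the redirect is permanent (301), not temporary (302).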
-
Thanks for the reply.
The events come up as separate instances when people are browsing our site. E.g., they browse all upcoming events at a certain swimming pool (rec center) and get a listing with a brief overview. Then they can click through to an event details page -- one event details page per event.
E.g., http://www.chatterblock.com/facility/12/crystal-pool-and-fitness-centre-victoria-bc/events/
From a user-experience standpoint, this works great. I asked the question because I don't want to be penalized for having duplicate content.
-
You could use the rel="canonical" tag, but be aware that the non-canonical versions will drop out of the index, which affects which of your pages can rank.
http://www.seomoz.org/learn-seo/duplicate-content
Are people actually searching for each month separately? If not, this might not be much of a concern.
Could you create a page that lists all the dates, and point the rel="canonical" tag on each event page to it?
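That suggestion would look something like the following, using hypothetical URLs (the all-dates page is an assumed page you would create):

```html
<!-- On each dated event page, in the <head>, pointing at a hypothetical
     page that lists every occurrence of the event (option 4): -->
<link rel="canonical" href="https://www.example.com/events/pool-party/" />

<!-- Or, if you instead keep only one version indexed (option 5),
     place this on every other version: -->
<meta name="robots" content="noindex, follow" />
```

With `noindex, follow` the page stays out of the index but its links are still crawled, so internal link equity continues to flow.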