How can I make it so that the various iterations (pages) do not come up as duplicate content?
-
Hello,
I wondered if somebody could give me some advice.
The problem: various iterations of the calendar page are coming up as duplicate content.
There is a large calendar on my site for events, and each time the page is viewed it is seen as duplicate content. How can I make it so that the various iterations (pages) do not come up as duplicate content? Regards
-
Anthony
There are a few ways to do this, and it does depend a little on the specifics of how the site is set up, but the best (and easiest) may be to use the URL parameter settings in Webmaster Tools.
Log into Webmaster Tools, go to Configuration -> URL Parameters, and set it to NOT index things beyond the ? (this may be numbers 2 and beyond, or it may be everything).
How is this set up, though? Does the calendar "start" and "end" somewhere, or does it go on infinitely in either direction?
The robots.txt file won't keep the page out of the index.
If you can let us know some more specifics that would be great!
-Dan
-
No, no, no. You don't add these directives to the calendar pages themselves. You place a single file called robots.txt in your site's root folder.
Read more here: http://www.seomoz.org/learn-seo/robotstxt
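For what it's worth, a minimal sketch of such a file (example.com and the path below are placeholders, not taken from the actual site):
User-agent: *
Disallow: /example-directory/
The file lives at the root of the domain, e.g. http://www.example.com/robots.txt, and its rules apply site-wide: the User-agent line says which crawlers the rules cover (* means all of them), and each Disallow line gives a path prefix they should not crawl. That's why you only need the one file rather than adding anything to the individual calendar pages.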
-
Gamer07
Thanks for this, it is very helpful. I'll try it out.
Just to clarify: do I need to add this to every calendar page that comes up as duplicate? If so, is there a quick way of doing it, as there are (I kid you not) over 5,000 calendar pages currently being flagged as duplicates? Obviously I'd prefer not to go through all of those manually.
Going forwards, I take it that I add this to every calendar page. Is there a proactive way of stopping it in the first place, or is it just a case of remedying it (with this code) once it occurs? Ideally I'd like to stop it occurring again in the future rather than having to remedy it again and again.
Many thanks
-
Hi,
When you change the page, check the URL: you will see parameters, such as calendar?week=18.
Open your robots.txt file and add this line:
Disallow: /calendar?
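To spell it out, a sketch of the whole file, assuming the calendar URLs all sit under /calendar and there are no other rules in it yet:
User-agent: *
Disallow: /calendar?
Because robots.txt rules are prefix matches, this blocks any URL beginning with /calendar? (the parameterised iterations such as calendar?week=18) for all crawlers, while /calendar itself stays crawlable. Bear in mind the point made earlier in the thread, though: robots.txt stops crawling, but it won't remove pages that are already in the index, so the Webmaster Tools parameter settings are still worth configuring alongside it.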