Issue: Duplicate Page Content
-
Hello SEO experts,
I'm facing a duplicate page content issue on my website. It's an apartment rental site, and when a client searches for apartment availability it automatically generates URLs that all serve the same content. I've already blocked these URLs in my robots.txt file, but I'm still seeing the same issue. Kindly guide me on what I can do.
Here are some example links.
http://availability.website.com/booking.php?id=17&bid=220
http://availability.website.com/booking.php?id=17&bid=242
http://availability.website.com/booking.php?id=18&bid=214
http://availability.website.com/booking.php?id=18&bid=215
http://availability.website.com/booking.php?id=18&bid=256
http://availability.website.com/details.php?id=17&bid=220
http://availability.website.com/details.php?id=17&bid=242
http://availability.website.com/details.php?id=17&pid=220&bid=220
http://availability.website.com/details.php?id=17&pid=242&bid=242
http://availability.website.com/details.php?id=18&bid=214
http://availability.website.com/details.php?id=18&bid=215
http://availability.website.com/details.php?id=18&bid=256
http://availability.website.com/details.php?id=18&pid=214&bid=214
http://availability.website.com/details.php?id=18&pid=215&bid=215
http://availability.website.com/details.php?id=18&pid=256&bid=256
http://availability.website.com/details.php?id=3&bid=340
http://availability.website.com/details.php?id=3&pid=340&bid=340
http://availability.website.com/details.php?id=4&bid=363
http://availability.website.com/details.php?id=4&pid=363&bid=363
http://availability.website.com/details.php?id=6&bid=367
http://availability.website.com/details.php?id=6&pid=367&bid=367
http://availability.website.com/details.php?id=8&bid=168
http://availability.website.com/details.php?id=8&pid=168&bid=168
Thanks, and waiting for your response.
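For context, the kind of robots.txt blocking described above would look roughly like this; the actual file isn't shown in the thread, so these paths are assumptions:

# Hypothetical robots.txt for availability.website.com -- not the poster's actual file
User-agent: *
Disallow: /booking.php
Disallow: /details.php

Note that robots.txt only controls crawling: URLs that are already indexed, or that crawlers and audit tools reach through existing links, can still be reported as duplicate content, which is part of why the replies below recommend rel=canonical instead.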
-
You should probably set the www.website.com/the-mayflower/ version as the canonical. So the page on the subdomain, and all other copies, would ALL have a rel canonical tag that points to www.website.com/the-mayflower/.
I wouldn't block the subdomain based on what you've said, but complicated issues like this are difficult to fully diagnose and prescribe fixes for without seeing the site.
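As a rough illustration (the www.website.com/the-mayflower/ URL and the property.php?id=1 path both come from the follow-up below; everything else is illustrative), the subdomain page and every other copy of that property would each carry the same tag in the <head>:

<!-- In the <head> of http://availability.website.com/property.php?id=1
     and of every other copy of this property -->
<link rel="canonical" href="http://www.website.com/the-mayflower/" />

Google should then treat the www URL as the preferred version and consolidate signals from the copies onto it.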
-
Thanks for your response. I forgot to mention one more thing in my question: I have the same properties on both the main domain and the subdomain. For example:
Here is one property on the main site:
http://www.website.com/the-mayflower/
and here is the same property on the availability subdomain:
http://availability.website.com/property.php?id=1
Maybe I'm facing the duplicate page content and title issues for this reason. Can I block the subdomain from search engine crawling?
-
You also need to find a way to stop this from happening. Ounce of prevention!
-
Agreed - robots.txt is not the way to go on this. You can also configure URL parameter handling in Google Webmaster Tools (GWT) to help with this.
-
I would make sure to use the rel="canonical" tag to designate which URL Google should consider to be the primary URL, regardless of any parameters appended to it. Here is some additional information -
https://support.google.com/webmasters/answer/139394?hl=en
http://googlewebmastercentral.blogspot.com/2013/04/5-common-mistakes-with-relcanonical.html
I would also recommend not using robots.txt in this case.
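To make that concrete for the parameterised URLs in the question, one approach is to derive the canonical from the property id alone and ignore the pid/bid parameters. This is only a sketch under assumptions — the $canonicalUrls map, the placeholder slug, and where the snippet lives are hypothetical, not the site's actual code:

<?php
// Hypothetical snippet for the <head> section of details.php / booking.php.
// Every parameter variation of the same property (?id=17&bid=220,
// ?id=17&pid=220&bid=220, ...) points at one canonical URL on the main domain.
$canonicalUrls = [
    1  => 'http://www.website.com/the-mayflower/',      // mapping taken from the thread
    17 => 'http://www.website.com/example-property/',   // placeholder slug, not known
];

$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

if (isset($canonicalUrls[$id])) {
    // bid and pid are deliberately ignored so all variations share one canonical
    echo '<link rel="canonical" href="'
        . htmlspecialchars($canonicalUrls[$id], ENT_QUOTES)
        . '" />' . "\n";
}

If the site already stores a main-domain URL or slug for each property in its database, that lookup can replace the hard-coded array.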