Issue: Duplicate Page Content
-
Hello SEO experts,
I'm facing a duplicate page content issue on my website. It's an apartment rental site, and when a client searches for apartment availability, the same content is automatically generated under multiple URLs. I've already blocked these URLs in my robots.txt file, but I'm still facing the same issue. Please advise what I can do.
Here are some example links.
http://availability.website.com/booking.php?id=17&bid=220
http://availability.website.com/booking.php?id=17&bid=242
http://availability.website.com/booking.php?id=18&bid=214
http://availability.website.com/booking.php?id=18&bid=215
http://availability.website.com/booking.php?id=18&bid=256
http://availability.website.com/details.php?id=17&bid=220
http://availability.website.com/details.php?id=17&bid=242
http://availability.website.com/details.php?id=17&pid=220&bid=220
http://availability.website.com/details.php?id=17&pid=242&bid=242
http://availability.website.com/details.php?id=18&bid=214
http://availability.website.com/details.php?id=18&bid=215
http://availability.website.com/details.php?id=18&bid=256
http://availability.website.com/details.php?id=18&pid=214&bid=214
http://availability.website.com/details.php?id=18&pid=215&bid=215
http://availability.website.com/details.php?id=18&pid=256&bid=256
http://availability.website.com/details.php?id=3&bid=340
http://availability.website.com/details.php?id=3&pid=340&bid=340
http://availability.website.com/details.php?id=4&bid=363
http://availability.website.com/details.php?id=4&pid=363&bid=363
http://availability.website.com/details.php?id=6&bid=367
http://availability.website.com/details.php?id=6&pid=367&bid=367
http://availability.website.com/details.php?id=8&bid=168
http://availability.website.com/details.php?id=8&pid=168&bid=168
Thanks, and waiting for your response.
-
You should probably set the www.website.com/the-mayflower/ version as the canonical. So the page on the subdomain, and all other copies, would ALL have a rel canonical tag that points to www.website.com/the-mayflower/.
I wouldn't block the subdomain based on what you've said, but complicated issues like this are difficult to fully diagnose and prescribe fixes for without seeing the site.
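A minimal sketch of what that tag would look like, assuming the page structures described above (the exact file names and the canonical target are taken from the URLs in this thread):

```html
<!-- In the <head> of availability.website.com/property.php?id=1 and of
     every booking.php / details.php variant of the same property -->
<link rel="canonical" href="http://www.website.com/the-mayflower/" />
```

Every duplicate copy carries the same tag, so all of them consolidate to the one www URL.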
-
Thanks for your response. I forgot to mention one more thing in my question: I have the same properties on both the main domain and the subdomain. For example, here is a property on the main site:
http://www.website.com/the-mayflower/
and here is the same property on the availability subdomain:
http://availability.website.com/property.php?id=1
Maybe this is why I'm facing the duplicate page content and title issue. Can I block the subdomain from search engine crawling?
-
You also need to find a way to stop this from happening in the first place. An ounce of prevention!
-
Agreed - robots.txt is not the way to go here. You can also configure URL parameter handling in Google Webmaster Tools (GWT) to help with this.
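On the prevention side, the example URLs suggest the `pid` parameter simply repeats `bid`, so the same page exists under two query strings. A hedged sketch of one way to collapse those variants server-side (the parameter semantics are assumed from the URLs above; the real site is PHP, but the idea is language-independent - compute one canonical URL form and 301-redirect everything else to it):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def canonicalize(url):
    """Collapse redundant query-string variants into one canonical URL.

    Assumes 'pid' is redundant whenever it duplicates 'bid' (as in the
    example URLs in this thread) and that parameter order is not
    significant, so parameters are emitted in a fixed sorted order.
    """
    scheme, netloc, path, query, _ = urlsplit(url)
    params = dict(parse_qsl(query))
    if params.get("pid") == params.get("bid"):
        params.pop("pid", None)  # drop the duplicated parameter
    canonical_query = urlencode(sorted(params.items()))
    return urlunsplit((scheme, netloc, path, canonical_query, ""))
```

If an incoming request's URL differs from `canonicalize(url)`, the server would 301-redirect to the canonical form, so only one URL per page ever gets linked or indexed.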
-
I would make sure to use the rel="canonical" tag to designate which URL Google should consider the primary one, regardless of any parameters appended to it. Here is some additional information:
https://support.google.com/webmasters/answer/139394?hl=en
http://googlewebmastercentral.blogspot.com/2013/04/5-common-mistakes-with-relcanonical.html
I would also recommend not using robots.txt in this case.