Duplicate content issue with dynamically generated URLs
-
Hi,
For those who have followed my previous question, I have a similar one regarding dynamically generated URLs.
From this page http://www.selectcaribbean.com/listing.html the user can make a selection according to various criteria. 6 results are presented and then the user can go to the next page.
I know I should probably rewrite URLs such as this one: http://www.selectcaribbean.com/listing.html?pageNo=1&selType=&selCity=&selPrice=&selBeds=&selTrad=&selMod=&selOcean=
but since all the results presented are generated on the fly for the convenience of the user, I am afraid Google may consider this an attempt to generate more pages, as there are already pages for each individual listing.
What is my solution for this? Nofollow these pages? Block them through robots.txt?
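For what it's worth, the robots.txt option would look something like this (a sketch; it assumes every filtered or paginated variant keeps the query string on listing.html, so blocking the parameterised form leaves the plain page crawlable):

```text
User-agent: *
# Block every query-string variant of the listing page,
# while /listing.html itself stays crawlable
Disallow: /listing.html?
```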
-
When I looked at your site, changing the criteria changed the listings on the page, so each page was unique. However, I'm guessing 100% of the listings can be accessed by just clicking through the pages of results without changing the criteria?
If you decide the best approach is to block the different versions of the search results pages, I would consider using the canonical link element (rel="canonical") to specify the canonical (main) version of the page.
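As a sketch of that suggestion, each filtered or paginated variant would carry a canonical link pointing at the main listing page (the href here is taken from the question; the rest is illustrative):

```html
<!-- In the <head> of e.g. /listing.html?pageNo=2&selCity=... -->
<link rel="canonical" href="http://www.selectcaribbean.com/listing.html" />
```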
Related Questions
-
Indexing Issues
One of the main pages on my site, http://www.waikoloavacationrentals.com/kolea-rentals/condos, has been hard to get Google to index correctly, or at all. It is one of the top pages on my site and should appear in my sitelinks in Google, but it is not even showing up in searches.
The only red-flag issue is the number of outgoing links, but that is the way the page is supposed to be; I would assume most real estate listing pages are very similar. When you look at traffic, time on page, inbound links, etc., it is one of the top pages on my site in all those categories. Any input would be greatly appreciated.
On-Page Optimization | RobDalton
-
Dynamic URL Parameters + Woocommerce create 404 errors
Hi Guys,
On-Page Optimization | jeeyer
Our latest Moz crawl shows a lot of 404 errors for pages that create dynamic links for users (I guess they are dynamic links; I'm not sure). Situation: on a page that shows products from brand X, users can use the pagination icons at the bottom, or click View: 24/48/All.
When a user clicks 48, the end of the link will be /?show_products=48
I think there were some pages that could show 48 products but no longer exist (because products are sold out, for example), and that's why they return 404s and Moz reports them. How do I deal with these 404 errors? I can't set a 301 redirect because the target depends on how many products are shown (it changes every time).
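One alternative to per-URL 301s (a sketch, assuming an Apache host; WooCommerce setups vary) is a single rewrite rule that sends any stale ?show_products= variant back to the clean category URL, which always exists:

```apache
RewriteEngine On
# If the query string contains show_products=...
RewriteCond %{QUERY_STRING} (^|&)show_products= [NC]
# ...301 to the same path with the query string stripped (the trailing ?)
RewriteRule ^(.*)$ /$1? [R=301,L]
```

Note this would also strip the parameter from currently valid views, so it only makes sense if those views are being dropped; otherwise serving a 410 for the genuinely gone pages may be cleaner than redirecting.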
Should I just ignore these kinds of 404 errors? Or what is the best way to handle this situation?
-
How to remove duplicate content issues for thin pages (containing "Oops, no results found")
In this scenario we have multiple different URLs, but each page renders the same content (containing the "Oops" message), which is why the duplicate content issue arises. As soon as content for these URLs is available, the duplicate issue will go away. So we want to remove the duplicate issue, not the pages or their URLs.
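One way to do exactly that (drop the duplicates from the index while keeping the URLs live) is to have the template emit a robots meta tag only when the page is in its empty "Oops" state; a sketch:

```html
<!-- Emitted only while the page has no real content yet -->
<meta name="robots" content="noindex, follow" />
```

Once real content becomes available, the template stops emitting the tag and the page can be indexed normally.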
On-Page Optimization | surabhi6
-
New Client Wants to Keep Duplicate Content Targeting Different Cities
We've got a new client whose website has about 300 pages that are identical except for the cities being targeted. Thus far the website has not been affected by the Penguin or Panda updates, and the client wants to keep the pages because they bring in a lot of traffic for those cities. We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
On-Page Optimization | waqid
-
Duplicate Content Issues with Forum
Hi Everyone, I just signed up last night and received the crawl stats for my site (ShapeFit.com). Since April of 2011, my site has been severely impacted by Google's Panda and Penguin algorithm updates, and we have lost about 80% of our traffic during that time. I have been trying to follow the guidelines provided by Google to fix the issues and help recover, but nothing seems to be working. The majority of my time has been invested in trying to add content to "thin" pages on the site and filing DMCA notices for copyright infringement issues. Since this work has not produced any noticeable recovery, I decided to focus my attention on removing bad backlinks, and this is how I found SEOmoz.
My question is about duplicate content. The crawl diagnostics showed 6,000 errors for duplicate page content and the same for duplicate page titles. After reviewing the details, it looks like almost every page is from the forum (shapefit.com/forum). What's the best way to resolve these issues? Should I completely block the "forum" folder from being indexed by Google, or is there something I can do within the forum software to fix this (I use phpBB)? I really appreciate any feedback that would help fix these issues so the site can hopefully start recovering from Panda/Penguin.
Thank you, Kris
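If blocking the forum folder turns out to be the right call, the robots.txt entry itself is simple (a sketch; note this stops crawling, but does not by itself remove URLs already in the index):

```text
User-agent: *
# Stop crawling of the phpBB forum section
Disallow: /forum/
```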
On-Page Optimization | shapefit
-
Does schema.org assist with duplicate content concerns
The issue of duplicate content has been well documented, and there are lots of articles suggesting noindexing archive pages on WordPress-powered sites. Schema.org allows us to mark up our content, including marking up a component's URL. So my question, simply: is noindexing archive (category/tag) pages still relevant when considering duplicate content? These pages are in essence a list of articles, each of which can be marked up as an article or blog posting, with the URL of the main article and all the other cool stuff the schema gives us. Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, thereby removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances; I'm just interested in whether or not the search engines can handle this appropriately.
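For illustration, marking up one entry in such an archive listing with its canonical URL might look like this (a hypothetical JSON-LD sketch; the headline and URLs are placeholders, not taken from any real site):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Example article title",
  "url": "https://example.com/example-article/",
  "mainEntityOfPage": "https://example.com/example-article/"
}
</script>
```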
On-Page Optimization | MarkCA
-
Prevent indexing of dynamic content
Hi folks! I discovered a bit of an issue with a client's site. Primarily, the site consists of static HTML pages; however, within one page (a car photo gallery), a line of PHP code dynamically generates 100 or so pages comprising the photo gallery, all with the same page title and meta description. The photo gallery script resides in the /gallery folder, which I attempted to block via robots.txt, to no avail. My next step will be to include a meta tag within the head section of the HTML page, but I am wondering if this will stop the bots dead in their tracks, or will they still be able to pick up on the pages generated by the call to the PHP script residing a bit further down on the page? Dino
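The head-section tag being described is presumably a robots meta tag, something like this (a sketch):

```html
<!-- In the <head> of each generated gallery page -->
<meta name="robots" content="noindex" />
```

One caveat worth noting: crawlers have to be able to fetch a page in order to see this tag, so a robots.txt block on /gallery would actually prevent the noindex from ever being read.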
On-Page Optimization | SCW
Duplicate content issues with products page 1,2,3 and so on
Hi, we have this products page; for example, a landing page:
On-Page Optimization | Essentia
http://www.redwrappings.com.au/australian-made/gift-ideas and then we have the links to pages 2, 3, 4 and so on:
http://www.redwrappings.com.au/products.php?c=australian-made&p=2
http://www.redwrappings.com.au/products.php?c=australian-made&p=3
In SEOmoz, they are recognized as duplicate page content.
What would be the best way to solve this problem? One easy way I can think of is to nominate the first landing page as the "master" page (http://www.redwrappings.com.au/australian-made/gift-ideas), and add canonical links on pages 2, 3 and so on. Any other suggestions? Thanks 🙂
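The canonical approach suggested above would put something like this in the head of each paginated URL (a sketch using the URLs from the question):

```html
<!-- In the <head> of /products.php?c=australian-made&p=2 -->
<link rel="canonical"
      href="http://www.redwrappings.com.au/australian-made/gift-ideas" />
```

One trade-off to weigh: canonicalising pages 2+ to page 1 tells search engines to ignore the deeper pages, so products that only appear there may be crawled and indexed less.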