Boat broker - issues with duplicate content and indexing search results
-
Hello,
I have read a lot about optimising product pages and not indexing search results or category pages, since ideally a person should be directed straight to a product page.
I am interested in how best to approach a site that lists second-hand products for sale - essentially a marketplace of second-hand goods (in my case, www.boatshed.com - international boat brokers).
For example, we currently have 5 Colvic Sailer 26 boats for sale across the world - that is 5 boats of the same make and model but with differing years, locations, sellers and prices.
My concern is with search results and 'category' pages. Unlike on typical e-commerce sites, when someone searches for 'Colvic Sailer 26 for sale' I want them to land on a search results style page, as it is more useful for them to see a list of boats than one random boat that Google decides is most important (or possibly one it can match by location).
Currently we have 3 different URL types to show search results style pages (i.e. paginated lists of boats that include name, image and short description):
manufacturer URLs, e.g. http://www.boatshed.com/colvic-manufacturer-145.html
category URLs, e.g. barges: http://www.boatshed.com/barges-category-55.html
and normal search results, e.g. dosearch.php?form_boattype_textbox=&....
I have noindexed the search results pages, but our category and manufacturer URLs do show up in search results, and ultimately these are the pages I want people to land on. However, I am getting duplicate content warnings in Moz: most boats sit in several categories, and every boat will come up on one manufacturer page and one manufacturer-and-model page.
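For context, a simplified sketch of what I mean in markup terms. The noindex is what we currently apply to the search results pages; the self-referencing canonical is shown only as one possible way of marking up the category and manufacturer landing pages, not necessarily what is live on the site today:

```html
<!-- On internal search result pages (dosearch.php?...): keep them out of the index,
     but let crawlers follow the links through to the boats and the category pages -->
<meta name="robots" content="noindex, follow">

<!-- On a manufacturer or category landing page we do want to rank: a self-referencing
     canonical signals that this listing page is its own preferred version -->
<link rel="canonical" href="http://www.boatshed.com/colvic-manufacturer-145.html">
```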
Both sets of URLs are, in my opinion, needed; lots of users search for exact makes/models, and lots of users just search for the type of boat, e.g. 'barge for sale', so both sets of landing pages are useful.
Any suggestions or thoughts greatly appreciated
Thanks
Ben
-
I have run into this same problem in multiple industries.
John Mueller at Google has answered my questions on this subject multiple times, saying that internal duplicate content is NOT an issue, and especially if you can justify it being there, a manual review would be unlikely to flag it as a problem. Google will make a judgement call on which page is best to serve based on the user's query.
In your case you are doing what should be done: the least helpful page here is the actual product page, and in many cases in the past I have noindexed product pages like these.
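If you did decide to noindex the individual listings, a minimal sketch of what that can look like on a boat detail page (the page itself is hypothetical and used only for illustration):

```html
<!-- Hypothetical individual boat detail page, e.g. one of the five Colvic Sailer 26
     listings: keep it out of the index, but let crawlers follow its links so the
     manufacturer and category pages remain the landing pages that rank -->
<meta name="robots" content="noindex, follow">

<!-- The manufacturer and category pages themselves need no directive at all:
     "index, follow" is the default when no robots meta tag is present -->
```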
This same problem arises in many industries: real estate, office space, used cars, etc.
Related Questions
-
Duplicate content errors
I have multiple duplicate content errors in my crawl diagnostics. The problem, though, is that I already took care of these with the canonical tag, but Moz keeps saying there is a problem. For example, this page http://www.letspump.dk/produkter/56-aminosyre/ has a canonical tag, but Moz still says it has an error. Why is that?
On-Page Optimization | toejklemme0
-
Duplicate Content - But it isn't!
Hi All, I have a site that releases alerts for particular problems/events/happenings. Due to legal stuff we keep the majority of the content the same on each of these event pages. The URLs are all different, but it keeps coming back as duplicate content. I don't think the canonical tag is right for this, e.g. http://www.holidaytravelwatch.com/alerts/call-to-arms/egypt/coral-sea-waterworld-resort-sharm-el-sheikh-egypt-holiday-complaints-july-2014 http://www.holidaytravelwatch.com/alerts/call-to-arms/egypt/hotel-concorde-el-salam-sharm-el-sheikh-egypt-holiday-complaints-may-2014
On-Page Optimization | Astute-Media0
-
Google is showing product rating of 1-star in search results when average rating is 3.7 - 4 stars
When searching for the brand name "SteriPEN", the #3 listing on the SERP is for one of SteriPEN's "Adventurer Opti" products at REI.com. On the SERP for this search, the REI listing for the Adventurer Opti product shows the product as 1-star, based on a product review from 2010. What we don't understand is that the product has a history of 3.7 to 4.2 ratings on most websites. Why would a product with so many reviews and an established history have 1 review driving a 1-star rating from such a prominent retailer? Makes no sense. Any suggestions as to whom we might be able to contact for help with this are greatly appreciated.
On-Page Optimization | ReachMaineAgency0
-
Exponentially Increasing Duplicate Content On Blogs
Most of the clients that I pick up are either new to SEO best practices or have worked with sketchy SEO providers in the past who did little more than build spammy links. Most of them have deployed little if any on-site SEO best practice, and early on I spend a lot of time fixing canonical and duplicate content issues, à la 301 redirects. Using SEOmoz, however, I see a lot of duplicate content issues with the blogs that live on the sites I work on. With every new blog article we publish, more duplicate content builds up. I feel like duplicate content on blogs grows exponentially, because every time you write a blog article it exists not only at the article URL but also on the blog homepage, a category page, maybe a tag page, and an author page. I have a two-part question: Is duplicate content like this a problem for a blog -- and for the website that the blog lives on? Are search engines able to parse out that this isn't really duplicate content? If it is a problem, how would you go about solving it? Thanks in advance!
On-Page Optimization | RCNOnlineMarketing0
-
The crawl diagnosis indicated that my domain www.mydomain.com is duplicate with www.mydomain.com/index.php. How can I correct this issue?
How can I fix this issue when crawl diagnostics indicate that my www.mydomain.com is a duplicate of www.mydomain.com/index.php? That is supposed to be the same page and not a duplicate, right?
On-Page Optimization | jsevilla0
-
Is rel=canonical used only for duplicate content
Can rel=canonical be used to tell the search engines which page is "preferred" when there are similar pages? For instance, I have an internal page that Google is showing on the first page of the SERPs that I would prefer the home page be ranked for. Both the home page and the internal page have been optimized for the same keyword. What is interesting is that the internal page has very few backlinks compared to the home page, but Google seems to favor it since the keyword is in the URL. I am afraid a 301 will drop us from the first page of the SERPs.
On-Page Optimization | surveygizmo0
-
Duplicate page title and content
Hello, I have an ecommerce store where we offer many similar products, and the main difference may be the color or memory storage. For this reason my main problem appears to be duplicate page titles and content. What is the best way to correct this issue? I can't make them entirely different either. I always include this particular difference in the title or description. I guess it is not enough? Any way to fix it? Thanks!
On-Page Optimization | tolyadem10
-
Duplicate content issue with dynamically generated url
Hi, for those who have followed my previous question, I have a similar one regarding dynamically generated URLs. From this page http://www.selectcaribbean.com/listing.html the user can make a selection according to various criteria. 6 results are presented and then the user can go to the next page. I know I should probably rewrite URLs such as these: http://www.selectcaribbean.com/listing.html?pageNo=1&selType=&selCity=&selPrice=&selBeds=&selTrad=&selMod=&selOcean= but since all the results presented are basically generated on the fly for the convenience of the user, I am afraid Google may consider this an attempt to generate more pages, as there are already pages for each individual listing. What is my solution for this? Nofollow these pages? Block them through robots.txt?
On-Page Optimization | multilang0