Noindex, rel=canonical, or no worries?
-
Hello, SEO pros,
We need your help with a case ↓
Introduction:
Our website allows individual contractors to create a webpage where they can show what services they offer, write something about themselves, and show their previous projects in pictures. All the professions, with their services assigned accordingly, are already in our system, so users just need to pick a profession and mark all the services they provide, or suggest any we have missed.
We have created unique URLs for all the professions and services. We have an internal search field and use autocomplete to direct users to the right page.
**Example:**
PROFESSION
Carpenter (URL: /carpenters)
SERVICES
Decking (URL: /carpenters/decking)
Kitchens (URL: /carpenters/kitchens)
Flooring and staircases (URL: /carpenters/flooring-and-staircases)
Door trimming (URL: /carpenters/door-trimming)
Lock fitting (URL: /carpenters/lock-fitting)
Problem
We want to be found in Google search for all the services and give searchers a list of all the carpenters in our database who can provide the service they want to find.
We show 15 contractors per page and rank them by recommendations provided by their clients.
Our concern is that our results pages may be marked as duplicates, since some of them show the same list of carpenters. For example, all of the best 15 carpenters offer door trimming and lock fitting, so the same 15 appear on /carpenters, /carpenters/lock-fitting, and /carpenters/door-trimming.
We don't want to be marked as spammers and lose points on domain trust; however, we believe we offer quality content, since we give searchers what they want to find: contractors who offer the services they need.
**Solution?**
- Noindex all service pages to avoid having duplicate content indexed by Google
OR
- Add a rel=canonical tag on service pages pointing to the profession page.
e.g. on the /carpenters/lock-fitting page, set rel=canonical to /carpenters.
OR
- No worries: let Google index all the profession and service pages. The benefit of indexing everything (around 2,500 additional pages targeting different keywords) is greater than tagging service pages with noindex or rel=canonical and losing the opportunity to get more traffic from service-related searches.
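For clarity, here is roughly what the two tagging options would look like in the `<head>` of a service page (the domain is a placeholder; absolute URLs are generally recommended in canonical tags):

```html
<!-- Option 2: on /carpenters/lock-fitting, point the canonical at the profession page -->
<link rel="canonical" href="https://www.example.com/carpenters" />

<!-- Option 1: keep the service page out of the index while still letting its links be followed -->
<meta name="robots" content="noindex, follow" />
```

You would use one or the other on a given page; combining a noindex with a canonical pointing elsewhere sends Google mixed signals.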
We need the solution that would be best for our organic traffic.
Many thanks for your precious time.
-
I would recommend:
If the page content is truly that similar to the others, I'd recommend using the rel=canonical tag on service pages to point to the profession page as the authoritative page.
OR
Add enough unique content to the service pages so that they no longer appear as duplicates. This would involve either having your in-house team develop useful content or requiring your users to enter a paragraph of text that would only be used on those service pages.
OR
As a last resort, you could noindex the service pages to keep duplicate content out of search engines' indexes. If you noindex, you will have more control over which pages you are telling the search engines are most important.
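If you do go the noindex route, it can be set either in the page markup or as an HTTP response header; a minimal sketch of both (server configuration syntax will vary):

```html
<!-- In the <head> of each service page: -->
<meta name="robots" content="noindex, follow" />

<!-- Or, equivalently, sent as an HTTP response header rather than markup
     (shown here as a comment for reference):
     X-Robots-Tag: noindex -->
```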
Scott O.
-
In my opinion, the best option for your organic traffic is to try to keep the service pages. The caveat is that you need to substantially differentiate the content. You should brainstorm options with your team, but some ideas that come to mind are: changing title tags, adding service-specific descriptions to the top of those pages, incorporating unique service-specific video, and adding links to service-specific DIY guides, warnings, or other related educational pre-sales material.
If that becomes too much work for this phase of the project, my next suggestion would be rel=canonical back to the profession page. Ensure your profession page is designed in a way that makes filtering down to the service level the obvious call to action, and you should be fine. This will hinder your ability to target all of those service-level keywords with service-level URLs, but you could still create content around those niches and drive links back to the profession page.
Hope this helps
Related Questions
-
Should I use noindex or robots to remove pages from the Google index?
I have a Magento site and just realized we have about 800 review pages indexed. The /review directory is disallowed in robots.txt, but the pages are still indexed. From my understanding, robots.txt means Google will not crawl the pages, BUT the pages can still be indexed if they are linked from somewhere else. I can add the noindex tag to the review pages, but as long as they're blocked in robots.txt they won't be crawled, so the tag won't be seen. https://www.seroundtable.com/google-do-not-use-noindex-in-robots-txt-20873.html Should I remove the robots.txt rule and add the noindex? Or just add the noindex to what I already have?
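To illustrate the interplay described in this question: a noindex tag only works if crawlers are actually allowed to fetch the page, so the Disallow rule has to come off first. A sketch, using the paths from the question:

```html
<!-- Step 1 - robots.txt: remove the blocking rule so Google can re-crawl the review pages
     and see the noindex tag. Delete this line from robots.txt:
     Disallow: /review/ -->

<!-- Step 2 - in the <head> of each review page: -->
<meta name="robots" content="noindex, follow" />
```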
Intermediate & Advanced SEO | Tylerj0 -
Worried about keyword stuffing penalty re: URLs
I've noticed a potential problem with a multi-location business (this is an example URL, not the actual name of the business). I sense this is OK: carsdepots.com/ashford/cars. But then I noticed they've added cars to the location part of the URL in some instances (they have 6 locations in total and have done this with 5 of them): carsdepots.com/birmingham-cars/cars. So we have cars in there 3 times (that's the maximum number of times in any URL, but it looks a little spammy to me). I am tempted to remove cars from the location names, or flatten the URL structure completely. Your thoughts would be welcome, or perhaps I shouldn't even be worrying?
Intermediate & Advanced SEO | McTaggart0 -
HTTPS in Rel Canonical
Hi, Should I, or do I need to, use HTTPS (note the "S") in my canonical tags? Thanks Andrew
Intermediate & Advanced SEO | Studio330 -
Reinforcing Rel Canonical? (Fixing Duplicate Content)
Hi Mozzers, We're having trouble with duplicate content between two sites, so we're looking to add some oomph to the rel canonical link elements we put on one of our sites pointing towards the other, to help speed up the process and give Google a bigger hint. Would adding a hyperlink on the "copying" website pointing towards the "original" website speed this process up? Would we get in trouble if we added about 80,000 links (1 on each product page) with a link to the matching product on the other site? For example, we could use text like "Buy XY product on Other Brand Name and receive 10% off!"
Intermediate & Advanced SEO | Travis-W0 -
REL prev/next on pages with additional sort parameters
We need a bit of advice on a site we are working on. Currently, the site displays items in the categories in order of date and all of the pages of the category listing are rel prev/next tagged correctly. This is great, and works really well - however we want to include some more sorting options (by popularity, name, file size... etc) into the mix. What's the best way to go about this using the correct tags? Is it better to NOINDEX all of the sorting options and just leave the default by date listings indexed? Also, we cannot canonical the sorted options to their counterparts because the page content would be different. Any ideas? Any help is greatly appreciated. Thanks.
Intermediate & Advanced SEO | Peter2640 -
Recovery Steps For Panda 3.5 (Rel. Apr. 19, 2012)?
I'm asking people who have recovered from Panda to share what criteria they used - especially on sites that are not large-scale ecommerce sites. Blog site hit by Panda 3.5. The blog has approximately 250 posts. Some of the posts are the most thorough on the subject and regained traffic despite a Penguin mauling a few days after the Panda attack. (The site has probably regained 80% of the traffic it lost since Penguin hit, without any link removal or link building, and minimal new content.) Bounce rate is 80% and average time on page is 2:00 min. (Even my most productive pages tend to have very high bounce rates, BUT those pages maintain time on page in the 4 to 12 minute range.) The Panda discussions I've read on these boards seem to focus on e-commerce sites with extremely thin content. I assume that Google views much of my content as "thin" too. But my site seems to need a pruning instead of just combining the blue model, white model, red model, and white model all on one page like most of the ecommerce sites we've discussed. So, I'm asking people who have recovered from Panda to share what criteria they used to decide whether to combine a page, prune a page, etc. After I combine any series of articles into one long post (driving the time on page to nice levels), I plan to prune the remaining pages that have poor time on page and/or bounce rates. Regardless of the analytics, I plan to keep the "thin" pages that are essential for readers to understand the subject matter of the blog. (I'll work on fleshing out the content or producing videos for those pages.) How deep should I prune on the first cut? 5%? 10%? Even more? Should I focus on the pages with the worst bounce rates, the worst time on page, or try some of both? If I post unique and informative video content (hosted on site using Wistia), what should I expect for a range of the decrease in bounce rate? Thanks for reading this long post.
Intermediate & Advanced SEO | JustDucky0 -
To "Rel canon" or not to "Rel canon" that is the question
Looking for some input on an SEO situation that I'm struggling with. I guess you could say it's a usability vs. Google situation. The situation is as follows: on a specific shop (let's say it's selling t-shirts), the products are organized so that each t-shirt has a master and x number of variants (a color). We have a product listing, and in this listing all the different colors (variants) are shown. When you click one of the t-shirts (e.g. blue), you get redirected to the product master, where some code on the page tells the master that it should change the color selectors to the blue color. This information the page gets from a query string in the URL. Now I could let Google index each URL for each color and sort it out that way, except for the fact that the text doesn't change at all. The only thing that changes is the product image, and that is changed with Ajax in such a way that Google, most likely, won't notice that fact, ergo producing "duplicate content" problems. OK! So I could sort this problem with a rel canon, but then we are in a situation where the only thing that tells Google that we are talking about a blue t-shirt is the link to the master from the product listing. We end up in a situation where the master is the only one getting indexed. That's not a problem, except for when people come from Google directly to the product: I have no way of telling what color the customer is looking for and hence won't know what image to serve her. Now I could tell my client that they have to write a unique text for each variant, but with hundreds of thousands of variant combinations this is not realistic or a really good solution. I kinda need a new idea; any input, idea, or brainwave would be very welcome. 🙂
Intermediate & Advanced SEO | ReneReinholdt0 -
Use rel=canonical to save otherwise squandered link juice?
Oftentimes my site has content which I'm not really interested in having included in search engine results. Examples might be a "view cart" or "checkout" page, or old products in the catalog that are no longer available in our system. In the past, I'd blocked those pages from being indexed by using robots.txt or nofollowed links. However, it seems like there is potential link juice being lost by removing these from search engine indexes. What if, instead of keeping these pages out of the index completely, I use rel=canonical to reference the home page (http://www.mydomain.com) of the business? That way, even if the pages I don't care about accumulate a few links around the Internet, I'll be capturing the link juice behind the scenes without impacting the customer experience as they browse our site. Is there any downside to doing this, or am I missing any potential reasons why this wouldn't work as expected?
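A sketch of the tag described here, using the placeholder domain from the question, as it would appear on a cart or retired-product page:

```html
<!-- On /cart or an old product page: hint that the homepage is the preferred URL -->
<link rel="canonical" href="http://www.mydomain.com/" />
```

Worth noting: Google treats rel=canonical as a hint rather than a directive, and tends to ignore it when the two pages aren't near-duplicates, so it may not consolidate link equity as reliably as hoped.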
Intermediate & Advanced SEO | cadenzajon1