How to avoid duplicate content on internal search results page?
-
Hi,
According to Webmaster Tools and Siteliner, our website has an above-average amount of duplicate content.
Most of the flagged pages are search results pages where the search finds only one result. The only differences in that case are the title/description/keywords (TDK), the H1 and the breadcrumbs; the rest of the layout is largely static and similar.
Here is an example for two pages with "duplicate content":
https://soundbetter.com/search/Globo
https://soundbetter.com/search/Volvo
Edit: These are legitimate searches that happen to return the same result. We want users to be able to find audio engineers by their 'credits' (musicians they've worked with), which work like tags, and we want to rank for searches like 'engineers who worked with' a particular artist. Searching for two different artists (credit tags) can return the same single service provider under different URLs (the tag being the search parameter), hence the duplicate content.
I guess every e-commerce/directory website faces this kind of issue.
What is the best practice to avoid duplicate content on search results page?
-
It really depends on your developers and your budget. I do development and SEO, so this is how I would handle it. For searches that return just one result, I would put a check in place to see how many results came back; if only one result is returned, I would set the canonical URL in the head of the search page to the URL of the page being returned as the result.
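For illustration only, a minimal sketch of that check in a PHP search template (the $results variable and the get_result_url() helper are hypothetical placeholders for however your platform exposes the result set):

<?php
// In the <head> of the search template: if the query returned exactly one
// result, canonicalise the search URL to that result's own page.
// $results and get_result_url() are assumed names, not a real API.
if (count($results) === 1) {
    $canonicalUrl = get_result_url($results[0]);
    echo '<link rel="canonical" href="' . htmlspecialchars($canonicalUrl, ENT_QUOTES) . '">';
}
?>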
If more than one result is returned, you can handle that in many different ways. One way would be to create a pseudo category out of the results page. I would use this sparingly and only for popular search terms. But you could have an extension written for your site that gives you some on-page control of the text, the URL, the meta areas, and things like that. I wrote a module for a platform I use a couple of years ago that does something like this: http://blog.dh42.com/search-pages-landing-pages/ You can get the gist of the idea by reading about it there. It is a good way to handle a limited number of search pages so they rank better, but I would not do it for every search result, or you might get a penalty.
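As a rough, hypothetical sketch of that idea (the $curated array, its field names and the $searchTerm variable are made up for illustration, not the API of the module linked above):

<?php
// Map a few popular search terms to hand-written landing copy and keep
// them indexable; ordinary search results fall back to noindex.
// Everything here is illustrative: the keys and fields are assumptions.
$curated = array(
    'recording-studio-london' => array(
        'title'       => 'Recording Studios in London',
        'description' => 'Hand-picked recording studios and engineers in London.',
        'intro'       => 'Browse top-rated recording studios in London.',
    ),
);

$key = strtolower(str_replace(' ', '-', $searchTerm)); // $searchTerm = the user's query
if (isset($curated[$key])) {
    $pageTitle       = $curated[$key]['title'];
    $metaDescription = $curated[$key]['description'];
    $introText       = $curated[$key]['intro'];
    $robotsTag       = 'index, follow';
} else {
    $robotsTag = 'noindex, follow'; // regular search results stay out of the index
}
?>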
-
Sorry, I misread it. I think either one, the robots.txt entry or the on-page tag, is applicable. I think the on-page tag would make them fall out of the index faster, though.
-
"I wouldn't do a no follow however"
I agree. My solution was to use NOINDEX, FOLLOW.
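In the head of the search results page template, that tag looks like this:

<meta name="robots" content="noindex, follow">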
-
Thanks Prestashop for your answer.
Is there another solution other than no-indexing all our search results?
Like many sites (Yelp, TripAdvisor and others), our search results pages help drive traffic. They aggregate the answers to questions that are asked in searches, such as 'recording studios in london'.
https://soundbetter.com/search/Recording Studio - Engineer/London, UK
-
I would add it to the robots.txt file. Depending on how your CMS is set up, you can also grab the search string from the current URL and use its presence to fire a noindex as well. I wouldn't use a nofollow, however; there is nothing bad about following those links, it is just the indexing of the search pages you want to prevent.
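For example, the robots.txt rule could be as simple as:

User-agent: *
Disallow: /search/

and the conditional noindex might look something like this rough PHP sketch (the /search/ path matches the URLs in this thread; how you detect a search URL will depend on your CMS):

<?php
// Emit a noindex (but not nofollow) robots meta tag when the current
// request is a search results URL.
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
if (strpos($path, '/search/') === 0) {
    echo '<meta name="robots" content="noindex, follow">';
}
?>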
-
Hey Prestashop
To add a little more clarity - would you:
a.) add /search/ to robots.txt, like so:
Disallow: /search/
or
b.) add noindex/nofollow at page level, like so:
<meta name="robots" content="noindex, nofollow">
in the search results page template.
I would opt for option b, but it would be interesting to hear your thoughts too, and why.
Thanks,
-
No-index your search results. Most platforms do it by default to eliminate that error.
Related Questions
-
Hybrid page showing in Google search results
Hello Mozzers, We have two pages showing on page 1 of Google for the search term 'inset day sessions'. This URL is the correct page, which we want site visitors to see: http://www.laughology.co.uk/teacher-workshop-s-inset-days/inset-days The other page seems to be a strange hybrid of how the page used to look and the new content we have included. It's a mess and we don't want visitors clicking on this link. There is no menu link to this page on the site, but it is showing as a link in SH404SEF: http://www.laughology.co.uk/schools/teacher-workshop-s-inset-days/ What is the best way to deal with this? Thanks, Ian
Technical SEO | Substance-create
-
150+ Pages of URL Parameters - Mass Duplicate Content Issue?
Hi, we run a large e-commerce site, and while doing some checking through GWT we came across these URL parameters and are now wondering if we have a duplicate content issue. If so, we are wondering what the best way to fix them is: is this a task for GWT or a rel=canonical task? Many of the URLs are driven by the filters on our category pages and come up like this: page04%3Fpage04%3Fpage04%3Fpage04%3F (see the image for more). Does anyone know if these links are duplicate content, and if so, how should we handle them? Richard
Technical SEO | Richard-Kitmondo
-
Need Help On Proper Steps to Take To De-Index Our Search Results Pages
So, I have finally decided to remove our search results pages from Google. This is a big dealio, but our traffic has been consistently declining since 2012 and it's the only thing I can think of. The reason they got indexed is that back in 2012 we put linked tags on our product pages, but they linked to our search results pages, so over time we had hundreds of thousands of search results pages indexed. By tag pages I mean keywords like Kittens, Doggies, Monkeys, Dog-Monkeys, Kitten-Doggies, each of which would be linked to our search results pages, i.e. http://oursite.com/Search.html?text=Kitten-Doggies I really think these indexed pages are causing much of our traffic problems, as there are many more search pages indexed than actual product pages. So, my question is: should I go ahead and remove the links/tags on the product pages first? If I remove those, will Google then not be able to re-crawl all of the search results pages it has indexed? Or, if those links are gone, will it notice and therefore remove the search results pages they were previously pointing to? Should I remove the links/tags from the product pages (or at least cut them down to the top 8 or so) and add noindex, nofollow to all the search results pages at the same time? Or should I first noindex, nofollow ALL the search results pages and leave the tags on the product pages, to give Google a chance to follow those tags back to all of the search results pages so it can reach them and see the noindex, nofollow? Otherwise, will Google not be able to find these pages? Can someone comment on what might be the best, safest, or fastest route? Thanks so much for any help you might offer me!! Craig
Technical SEO | TheCraig
-
SEOMOZ and non-duplicate duplicate content
Hi all, Looking through the lovely SEOmoz report, by far its biggest complaint is perceived duplicate content. It's hard to avoid given the nature of e-commerce sites, which ostensibly list products in a consistent framework. Most advice about duplicate content is about canonicalisation, but that's not really relevant when you have two different products being perceived as the same. The thing is, I might have ignored it, but Google ignores about 40% of our sitemap for, I suspect, the same reason. Basically, I don't want us to appear "spammy". We do go to a lot of effort to photograph and write a little flavour text for each product (in progress). I guess my question is: given over 700 products, why would 300-ish of them be considered duplicates and the remaining ones not? Here is a URL and one of its "duplicates" according to the SEOmoz report: http://www.1010direct.com/DGV-DD1165-970-53/details.aspx http://www.1010direct.com/TDV-019-GOLD-50/details.aspx Thanks for any help, people
Technical SEO | fretts
-
Errors - 7300 - Duplicate Page Content..Help me..
Hi, I just received the crawl report with 7300 duplicate page content errors. The site is built using PHP. The list of errors looks like this: http://xxxxx.com/channels/ http://xxxxx.com/channels/?page=1 http://xxxxxx.com/channels/?page=2 I am not good at coding and am using a readymade script for this website. Could anyone guide me to fix this issue? Thanks.
Technical SEO | vilambara
-
Duplicate Content Error
I am getting duplicate content errors for the URLs of the "tag" or category pages of my blog. These are some of the URLs that SEOmoz is saying are errors, or duplicate pages: http://sacmarketingagency.com/blog/?Tag=Facebook http://sacmarketingagency.com/blog/?Tag=content+marketing http://sacmarketingagency.com/blog/?Tag=inbound+marketing As you can see, they are just the pages that aggregate certain blog posts based on how we tagged them with the appropriate category. Is this really a problem for our SEO, and if so, any suggestions on how to fix it?
Technical SEO | TalkingSheep
-
Search/Search Results Page & Duplicate Content
If you have a page whose only purpose is to allow searches, and the search results can be generated by any keyword entered, should all those search result URLs be noindexed or given a rel=canonical? Thanks.
Technical SEO | cakelady
-
Duplicate Page Title
First I had a problem with duplicate title errors: almost every page I had was duplicated because my website linked to both www.funky-lama.com and funky-lama.com. I changed this by adding code to .htaccess to redirect everything to www.funky-lama.com, but now my website has been crawled again and the errors have actually doubled. All my pages now have duplicate title errors because of pairs of pages like this: www.funky-lama.com/160-confetti-gitaar.html funky-lama.com/160-confetti-gitaar.html www.funky-lama.com/1_present-time funky-lama.com/1_present-time
Technical SEO | funkylama