Search Pages outranking Product Pages
-
Many of the results the search engines show for our site are our own internal search results pages, e.g. Widgets | Search Results.
This has happened over time and wasn't intentional, but in many cases we see our search results pages appearing over our actual product pages in search, which isn't ideal.
Simply blocking indexing of these pages via robots.txt wouldn't be ideal, at least not all at once, since there would be a period when those Search Results pages had dropped out of the index while our product pages were still stuck at the back of the rankings.
Any ideas on a strategy to replace these Search Results pages with the actual products in a way that won't hurt us too badly during the transition? Or a way to make the actual product pages rank above the search results? Currently, it is often the opposite.
Thanks!
Craig
-
Thanks again for the answers!
Yeah, totally getting you on the search-within-search issue. Wish we had known about that a couple of years ago. I did an analytics check, and most of our non-homepage traffic is coming from Search Results pages in the SERPs. According to inurl:, we have about 200,000 indexed SearchResult pages, and based on some data I pulled, they are our highest-traffic non-homepage pages but also the worst converting.
I think 301 re-directs on these would be rather tricky. I mean, if someone does a search on our site, they should get the search results page showing them several options, not be shot directly to a single product which might not be the one they need. It would be rather confusing for our regular customers as well.
But I agree we need to do something here, because conversely, our product pages, while getting the least traffic, are the highest converters.
My only thought is that we would need to:
1. Find a list of all of the indexed Search Result pages, or at least the ones that have been hit over the last year or so. What would be the best way to do that? Screaming Frog? Analytics?
2. Create a script that analyzes these for the keywords used in them and find a suitable item to re-direct to based on the keyword extracted.
3. 301 re-direct them.
4. Change our current search results URLs to include something that isn't present in the original URLs, to separate them from the old pages now being redirected, so that current on-site searchers don't get redirected as well.
5. Set the search results pages to noindex. Is that the best way to handle it? If we used robots.txt, we would be breaking the link flow of the site, wouldn't we? Don't we need the bots to crawl the search pages to reach the product pages, or is the sitemap all that's needed?
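A rough sketch of steps 1-3 above (hypothetical Python, assuming your search URLs carry the query in a `Text=` parameter like the example URL later in this thread, and matching extracted keywords against an exact-match product lookup; a real mapping would need fuzzier matching and manual review of the leftovers):

```python
# Hypothetical sketch: map indexed search-result URLs to product pages by
# extracting the search term from each URL's ?Text= parameter. The URL list
# would come from a Screaming Frog export or server logs; the product
# lookup here is a stand-in dict of keyword -> product page URL.
from urllib.parse import parse_qs, urlparse

def extract_keyword(search_url):
    """Pull the search term out of a SearchResult URL's ?Text= parameter."""
    params = parse_qs(urlparse(search_url).query)
    terms = params.get("Text")
    return terms[0].lower() if terms else None

def build_redirect_map(search_urls, products):
    """Map each search URL to a product URL when its keyword matches;
    unmatched URLs are skipped and left for manual review."""
    redirects = {}
    for url in search_urls:
        keyword = extract_keyword(url)
        if keyword in products:
            redirects[url] = products[keyword]
    return redirects

products = {"monkeys ate soul": "/ProductPage.html?Title=TheMonkeysAteMySoul"}
search_urls = ["http://oursite.com/SearchResult.html?Text=Monkeys+Ate+Soul"]
redirect_map = build_redirect_map(search_urls, products)
```

The resulting map could then feed a bulk 301 rule set on the server (step 3).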
Thanks for the time and answers!
Craig
-
Hello Craig,
I've dealt with this issue on several client sites and typically opt for noindexing the search pages (sometimes even blocking them in robots.txt), as recommended by others here - especially if you can't make any of them static.
In terms of the product pages, it could be helpful to the visitor if they search for "Specific Product A" for you to just go ahead and land them on the "Specific Product A" page, either via a 301 redirect from the search result page, or by serving up the product page in the first place. This would take care of usability as well as your issue with search engines.
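For the redirect option, one way to sketch the mechanics (a hypothetical example assuming Apache with mod_rewrite, using the URL pattern from Craig's example; you'd generate one rule, or a rewrite map, per matched keyword) is:

```apache
# Hypothetical example: 301 one search-results URL to its product page.
# RewriteCond is needed because RewriteRule alone does not match the
# query string; the ? in the substitution replaces the original query.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^Text=Monkeys\+Ate\+Soul$ [NC]
RewriteRule ^SearchResult\.html$ /ProductPage.html?Title=TheMonkeysAteMySoul [R=301,L]
```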
I would not gradually implement something here, as that could be even more confusing to search engines. Do you want the search pages indexed or not?
What I have seen is a temporary blip in traffic (a few weeks at most) followed by improvement in traffic due to an improvement in product page rankings as a result. Every situation is different though, and this assumes good implementation.
Looking at this from Google's perspective, understand that they ARE the search engine, so why would they want to send users to yet another set of search results? Google should know which page on your site to send visitors to. They don't need an intermediary, which is why their guidelines say this:
"Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines."
Good luck and let us know how it turns out!
-
Oh, and also, just to clarify.... Are you saying what we should do is 301:
http://oursite.com/SearchResult.html?Text=Monkeys+Ate+Soul
to, let's say
http://oursite.com/ProductPage.html?Title=TheMonkeysAteMySoul
That would be ok?
Thanks!
-
Thanks Jesse. Sounds like a big undertaking, but something we need to move on. Question... How accurate is "site:yoururl.com inurl:search"? I just did a test, and the number of results that came back is way lower than it should be based on how many of our sitemap URLs show as indexed in Webmaster Tools.
Thanks for taking the time to answer!
Craig
-
I would be careful about allowing search pages to keep getting indexed. You will most likely end up with hundreds if not thousands of low-value pages that may cause you to fall into a Panda algorithm penalty. Simply do a site:yoururl.com inurl:search (or whatever parameter you use) to see how many search results pages you have indexed.
You could find the search pages that are outranking your product pages and 301 them if their traffic is substantial. Otherwise, I would say that by noindexing the search pages, you should reduce the competition for those product pages, and they should start to rank and hopefully convert better.
I've had to do the same for several sites because of a Panda penalty, so I can't speculate on traffic levels.
-
Thanks Zora! Yeah, these are all going to be dynamic, unfortunately, and there are a lot of them - in the hundreds of thousands. So we would need some type of transition strategy. I would be concerned that noindexing them all at once would be quite problematic.
Just curious if anyone else had to transition in this way and was able to do so successfully.
Thanks for the feedback!!
Craig
-
We had the same problem, but decided to embrace it.
I started optimizing and adding content to a few of the search results pages (and made them static, not dynamic) and now they rank fairly well.
However, for dynamic search pages I suggest you noindex them.
Google recommends it, and it's best to follow their recommendations.
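If it helps, the usual mechanics of noindexing a dynamic page (as opposed to blocking it in robots.txt, which would also stop bots from following its links through to your products) is a meta robots tag - a minimal example:

```html
<!-- In the <head> of each dynamic search-results page: keep the page out
     of the index, but still let bots follow its links to product pages. -->
<meta name="robots" content="noindex, follow">
```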