Search Pages outranking Product Pages
-
Many of the results the search engines show for our site are pages from our own internal search, e.g. Widgets | Search Results.
This happened over time and wasn't intentional, but in many cases our search results pages appear above our actual product pages in search, which isn't ideal.
Simply blocking indexing of these pages via robots.txt wouldn't be ideal either, at least not all at once: we would have a period where those search results pages had dropped out while our product pages were still at the back of the rankings.
Any ideas on a strategy to replace these search results pages with the actual product pages in a way that won't hurt us too badly during the transition? Or a way to make the actual product pages rank above the search results? Currently, it is often the opposite.
Thanks!
Craig
-
Thanks again for the answers!
Yeah, totally getting you on the search-within-search issue. Wish we had known about that a couple of years ago. I did an analytics check and most of our non-home-page traffic is coming from search results pages in the SERPs. According to an inurl: check, we have about 200,000 indexed SearchResult pages, and based on some data I pulled, they are our highest-traffic non-home pages, but also the worst converting.
I think 301 redirects on these would be rather tricky. I mean, if someone does a search on our site, they should get the search results page showing them several options, not be shot directly to a single product which might not be the one they need. It would be rather confusing for our regular customers as well.
But I agree we need to do something here, because conversely, our product pages, while getting the least traffic, are the highest converters.
My only thought is that we would need to:
1. Find a list of all of the indexed search results pages, or at least the ones that have been hit over the last year or so. What would be the best way to do that? Screaming Frog? Analytics?
2. Create a script that extracts the keywords used in them and finds a suitable product to redirect to based on the extracted keyword.
3. 301 redirect them.
4. Change our current search results URLs to include something the original pages don't have, separating them from the old pages that are now being redirected, so that current searchers don't get redirected as well.
5. Set the search results pages to noindex. Is that the best way to handle it? If we used robots.txt, wouldn't we be breaking the link flow of the site? Don't we need the bots to crawl the search pages to reach the product pages, or is the sitemap all that is needed?
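Steps 1 through 3 above can be sketched roughly as follows. This assumes the SearchResult.html?Text= URL pattern mentioned elsewhere in the thread, and a hypothetical keyword-to-product lookup; in practice the catalog would come from the product database, and fuzzier matching would likely be needed:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical product catalog: search keyword -> product page URL.
# In practice this would be built from your product feed or database.
PRODUCTS = {
    "monkeys ate soul": "/ProductPage.html?Title=TheMonkeysAteMySoul",
    "blue widget": "/ProductPage.html?Title=BlueWidget",
}

def extract_keyword(search_url):
    """Pull the search term out of a SearchResult URL (Text= parameter assumed)."""
    query = parse_qs(urlparse(search_url).query)
    term = query.get("Text", [""])[0]  # parse_qs already decodes '+' to spaces
    return term.lower().strip()

def build_redirect_map(search_urls):
    """Map each old search URL to its best-matching product page, if any."""
    redirects = {}
    for url in search_urls:
        target = PRODUCTS.get(extract_keyword(url))
        if target:  # only 301 when there is a confident match
            redirects[url] = target
    return redirects
```

The resulting map would then feed the server's 301 rules; URLs with no confident product match would be left alone rather than redirected somewhere misleading.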
Thanks for the time and answers!
Craig
-
Hello Craig,
I've dealt with this issue on several client sites and typically opt for noindexing the search pages (sometimes even blocking them in robots.txt) as recommended by others here, especially if you can't make any of them static.
In terms of the product pages, if a visitor searches for "Specific Product A", it could be helpful to land them directly on the "Specific Product A" page, either via a 301 redirect from the search results page or by serving up the product page in the first place. This would take care of usability as well as your issue with search engines.
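As a sketch of the 301 option, assuming an Apache server and the example URL patterns used elsewhere in this thread, a single-product redirect in .htaccess might look like this (mod_rewrite can't match the query string in RewriteRule, so it goes in a RewriteCond, and the trailing `?` drops the old query string):

```apache
RewriteEngine On
# Send one internal search URL straight to its matching product page.
RewriteCond %{QUERY_STRING} ^Text=Monkeys\+Ate\+Soul$
RewriteRule ^SearchResult\.html$ /ProductPage.html?Title=TheMonkeysAteMySoul? [R=301,L]
```

With hundreds of thousands of URLs you would generate these rules (or a RewriteMap) from a script rather than write them by hand.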
I would not gradually implement something here, as that could be even more confusing to search engines. Do you want the search pages indexed or not?
What I have seen is a temporary blip in traffic (a few weeks at most) followed by improvement in traffic due to an improvement in product page rankings as a result. Every situation is different though, and this assumes good implementation.
Looking at this from Google's perspective, understand that they ARE the search engine, so why would they want to send the user to yet another set of search results? Google should know which page on your site to send visitors to. They don't need an intermediary, which is why their guidelines say this:
"Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines."
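Assuming the SearchResult.html URL pattern from the original question, that crawl block would be a one-liner (robots.txt rules are path-prefix matches, so this covers every query-string variant):

```text
# robots.txt: keep crawlers out of internal search results
User-agent: *
Disallow: /SearchResult.html
```

One caveat: URLs blocked in robots.txt can still linger in the index as URL-only results, and a noindex tag only works if the page can be crawled, so you generally pick one mechanism per URL rather than combining both.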
Good luck and let us know how it turns out!
-
Oh, and also, just to clarify.... Are you saying what we should do is 301:
http://oursite.com/SearchResult.html?Text=Monkeys+Ate+Soul
to, let's say
http://oursite.com/ProductPage.html?Title=TheMonkeysAteMySoul
That would be ok?
Thanks!
-
Thanks Jesse. Sounds like a big undertaking, but something we need to move on. Question... how accurate is "site:yoururl.com inurl:search"? I just ran a test and the number of results that came back is way lower than it should be, based on how our sitemaps are shown to be indexed in Webmaster Tools.
Thanks for taking the time to answer!
Craig
-
I would be careful about allowing search pages to keep being indexed. You will most likely end up with hundreds if not thousands of low-value pages that may cause you to fall into a Panda algorithm penalty. Simply do a site:yoururl.com inurl:search (or whatever parameter you use) to see how many search results pages you have indexed.
You could find the search pages that are outranking your product pages and 301 them if the traffic is substantial. Otherwise, by noindexing the search pages you should reduce the competition for those product pages, and they should start to rank, and hopefully convert, better.
I've had to do the same for several sites because of a Panda penalty, so I can't speculate on traffic levels.
-
Thanks Zora! Yeah, these are all going to be dynamic unfortunately, and there are a lot of them: hundreds of thousands. So we would need some type of transition strategy. I would be concerned that a one-time noindex of everything at once would be quite problematic.
Just curious if anyone else had to transition in this way and was able to do so successfully.
Thanks for the feedback!!
Craig
-
We had the same problem, but decided to embrace it.
I started optimizing and adding content to a few of the search results pages (and made them static, not dynamic) and now they rank fairly well.
However, for dynamic search pages I suggest you noindex them.
Google recommends it, and it's best to follow their recommendations.
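For those dynamic pages, the usual way to apply the noindex is a robots meta tag in each search results template. Note the follow directive, which speaks to the earlier link-flow worry: crawlers can still pass through the page's links to the product pages even though the page itself stays out of the index.

```html
<!-- On each dynamic search results page: keep it out of the index,
     but let crawlers follow its links through to the product pages. -->
<meta name="robots" content="noindex, follow">
```

This only works if the search pages remain crawlable, so it shouldn't be combined with a robots.txt block on the same URLs.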