Search Pages outranking Product Pages
-
A lot of the results appearing in the search engines for our site are pages from our own on-site search, e.g. "Widgets | Search Results".
This has happened over time and wasn't intentional, but in many cases we see our search results pages appearing over our actual product pages in search, which isn't ideal.
Simply blocking indexing of these pages via robots.txt wouldn't be ideal, at least not all at once, as there would be a period where those search results pages had dropped out of the index while our product pages were still at the back of the rankings.
Any ideas on a strategy to replace these search results pages with the actual product pages in a way that won't hurt us too badly during the transition? Or a way to make the product pages rank above the search results? Currently, it is often the opposite.
Thanks!
Craig
-
Thanks again for the answers!
Yeah, totally getting you on the search-within-search issue. Wish we had known about that a couple of years ago. I did an analytics check, and most of our non-home-page traffic is coming from search results pages in the SERPs. According to a site: + inurl: query, we have about 200,000 indexed SearchResult pages, and based on some data I pulled up, they are our highest-traffic non-home pages, but also the least converting.
I think 301 redirects on these would be rather tricky. I mean, if someone does a search on our site, they should get the search results page showing them several options, not be shot directly to a single product which might not be the one they need. It would be rather confusing for our regular customers as well.
But I agree we need to do something here, because conversely, our product pages, while getting the least traffic, are the highest converters.
My only thought is that we would need to:
1. Compile a list of all of the indexed SearchResult pages, or at least the ones that have received traffic over the last year or so. What would be the best way to do that? Screaming Frog? Analytics?
2. Create a script that analyzes each of these for the keyword used in it and finds a suitable product to redirect to based on the extracted keyword.
3. 301 redirect them.
4. Change our current search results URLs to include something that is not in the original URLs, to separate them from the old pages that are now being redirected, so that current searchers don't get redirected as well.
5. Set the search results pages to noindex. Is that the best way to handle that? If we used robots.txt, we would be breaking the link flow of the site, wouldn't we? Don't the bots need to crawl the search pages to reach the product pages, or is the sitemap all that's needed?
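To make step 2 a bit more concrete, here's a rough sketch of the kind of script I have in mind, in Python. The PRODUCT_INDEX, the Text parameter name, and the URLs are placeholder assumptions, not our real data; in practice the index would be built from the product catalog and the URL list from analytics or a crawl export:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical keyword -> product page index. In practice this would be
# built from the product catalog; names and URLs here are made up.
PRODUCT_INDEX = {
    "monkeys ate soul": "/ProductPage.html?Title=TheMonkeysAteMySoul",
    "blue widgets": "/ProductPage.html?Title=BlueWidgets",
}

def redirect_target(search_url):
    """Step 2: pull the search keyword out of an old SearchResult URL
    (assuming the query lives in a Text parameter) and look up the
    best-matching product page, or None if there is no good match."""
    params = parse_qs(urlparse(search_url).query)
    keyword = params.get("Text", [""])[0].strip().lower()
    return PRODUCT_INDEX.get(keyword)

def build_redirect_map(search_urls):
    """Step 3: pair each indexed search URL with its 301 target,
    dropping URLs that have no suitable product to point at."""
    return {url: target
            for url in search_urls
            if (target := redirect_target(url)) is not None}
```

URLs with no match would drop out of the map and could be handled separately (404/410 or a redirect to a category page).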
Thanks for the time and answers!
Craig
-
Hello Craig,
I've dealt with this issue on several client sites and typically opt for noindexing the search pages (sometimes even blocking them in robots.txt), as recommended by others here, especially if you can't make any of them static.
In terms of the product pages, it could be helpful to the visitor if they search for "Specific Product A" for you to just go ahead and land them on the "Specific Product A" page, either via a 301 redirect from the search result page, or by serving up the product page in the first place. This would take care of usability as well as your issue with search engines.
I would not gradually implement something here, as that could be even more confusing to search engines. Do you want the search pages indexed or not?
What I have seen is a temporary blip in traffic (a few weeks at most) followed by improvement in traffic due to an improvement in product page rankings as a result. Every situation is different though, and this assumes good implementation.
Looking at this from Google's perspective, understand that they ARE the search engine so why would they want to send the user to yet another set of search results? Google should know which page on your site to send visitors. They don't need an intermediary, which is why their guidelines say this:
"Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines."
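If you do go the robots.txt route, a minimal sketch would look like this, assuming all of your search result URLs share a single path such as /SearchResult.html (adjust the path to match your actual URLs):

```
User-agent: *
Disallow: /SearchResult.html
```

Keep in mind that robots.txt blocks crawling, not indexing, so URLs that are already indexed can linger in the index for a while after being blocked.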
Good luck and let us know how it turns out!
-
Oh, and also, just to clarify.... Are you saying what we should do is 301:
http://oursite.com/SearchResult.html?Text=Monkeys+Ate+Soul
to, let's say
http://oursite.com/ProductPage.html?Title=TheMonkeysAteMySoul
That would be ok?
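In .htaccess terms, I'm picturing something like the following. This is untested and assumes Apache with mod_rewrite; one wrinkle I've read about is that the RewriteRule pattern never sees the query string, so a RewriteCond is needed to match it:

```apache
RewriteEngine On
# RewriteRule patterns only match the path, so match the
# query string separately with a RewriteCond.
RewriteCond %{QUERY_STRING} (^|&)Text=Monkeys\+Ate\+Soul($|&)
# Because the target has its own query string after the ?,
# the original query string is dropped rather than appended.
RewriteRule ^SearchResult\.html$ /ProductPage.html?Title=TheMonkeysAteMySoul [R=301,L]
```

If we change the live search URL format (step 4 of my plan), the new URLs wouldn't match this pattern, so current searchers wouldn't get caught by the redirect.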
Thanks!
-
Thanks Jesse. Sounds like a big undertaking, but something we need to move on. Question... How accurate is "site:yoururl.com inurl:search"? I just did a test on it, and the number of results that came back is way lower than it should be, based on how our sitemaps show as indexed in Webmaster Tools.
Thanks for taking the time to answer!
Craig
-
I would be careful about allowing search pages to continue to be indexed. You will most likely end up with hundreds if not thousands of low-value pages that may cause you to fall into a Panda algorithm penalty. Simply do a site:yoururl.com inurl:search (or whatever parameter you use) to see how many search results pages you have indexed.
You could find the search pages that are outranking your product pages and 301 them if the traffic is substantial. Otherwise, I would say that by noindexing the search pages, you should reduce the competition for those product pages, and they should start to rank, and hopefully convert, better.
I've had to do the same for several sites because of a Panda penalty, so I can't speculate on traffic levels.
-
Thanks Zora! Yeah, these are all going to be dynamic, unfortunately, and there are a lot of them: in the hundreds of thousands. So we would need some type of transition strategy. I would be concerned that a one-time noindex of everything at once would be quite problematic.
Just curious if anyone else had to transition in this way and was able to do so successfully.
Thanks for the feedback!!
Craig
-
We had the same problem, but decided to embrace it.
I started optimizing and adding content to a few of the search results pages (and made them static, not dynamic) and now they rank fairly well.
However, for dynamic search pages I suggest you noindex them.
Google recommends it, and it's best to follow their recommendations.
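The usual implementation is a robots meta tag in the search results template; a sketch (the tag itself is standard, the exact placement depends on your template):

```html
<!-- In the <head> of the dynamic search results template.
     "follow" lets bots keep following links to product pages
     even though the search page itself stays out of the index. -->
<meta name="robots" content="noindex, follow">
```

For the tag to be seen at all, the pages have to remain crawlable, i.e. not blocked in robots.txt.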