Search Pages outranking Product Pages
-
Many of the results that search engines show for our site are our own internal search results pages, i.e. "Widgets | Search Results".
This happened over time and wasn't intentional, but in many cases our search results pages now rank above our actual product pages in search, which isn't ideal.
Simply blocking indexing of these pages via robots wouldn't be ideal, at least not all at once: there would be a period where those search results pages had dropped out of the index while our product pages were still stuck at the back of the rankings.
Any ideas on a strategy to replace these search results pages with the actual products in a way that won't hurt us too badly during the transition? Or a way to make the actual product pages rank above the search results? Currently it's often the opposite.
Thanks!
Craig
-
Thanks again for the answers!
Yeah, totally getting you on the search-within-search issue. Wish we had known about that a couple of years ago. Did an analytics check, and most of our non-home-page traffic is coming from search results pages in the SERPs. According to an inurl: check, we have about 200,000 indexed SearchResult pages, and based on some data I pulled, they are our highest-traffic non-home pages but also the least converting.
I think 301 redirects on these would be rather tricky. I mean, if someone does a search on our site, they should get the search results page showing them several options, not be shot directly to a single product which might not be the one they need. It would be rather confusing for our regular customers as well.
But I agree we need to do something here, because conversely, our product pages, while getting the least traffic, are the highest converters.
My only thought is that we would need to:
1. Find a list of all of the indexed search result pages, or at least the ones that have received traffic over the last year or so. What would be the best way to do that? Screaming Frog? Analytics?
2. Create a script that extracts the keywords used in each of them and finds a suitable product to redirect to based on the extracted keyword.
3. 301 redirect them.
4. Change our current search results URLs to include something that the original pages don't have, to separate them from the old URLs now being redirected, so that current searchers don't get redirected as well.
5. Set the search results pages to noindex. Is that the best way to handle it? If we used robots.txt, wouldn't we be breaking the link flow of the site? Don't the bots need to crawl the search pages to reach the product pages, or is the sitemap all that's needed?
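Step 2 above could be sketched roughly as follows in Python. This is only an illustration: the `Text=` parameter name is borrowed from the example URLs later in this thread, and the keyword-to-product catalog is entirely hypothetical; a real implementation would match against the actual product database.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical keyword -> product-page mapping; in practice this would be
# built from the real product catalog.
PRODUCTS = {
    "monkeys ate soul": "/ProductPage.html?Title=TheMonkeysAteMySoul",
    "blue widgets": "/ProductPage.html?Title=BlueWidgets",
}

def redirect_target(search_url):
    """Extract the query keyword from an old search-results URL and
    return the product page to 301 it to, or None if nothing matches."""
    qs = parse_qs(urlparse(search_url).query)
    keyword = qs.get("Text", [""])[0].lower()  # parse_qs already decodes '+'
    return PRODUCTS.get(keyword)

# Build a redirect map from a list of indexed search URLs (step 1 output).
indexed = ["http://oursite.com/SearchResult.html?Text=Monkeys+Ate+Soul"]
redirect_map = {u: t for u in indexed if (t := redirect_target(u))}
```

The `redirect_map` output could then feed the server's 301 rules in step 3; URLs with no confident match would be left out and handled some other way (e.g. noindexed) rather than redirected blindly.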
Thanks for the time and answers!
Craig
-
Hello Craig,
I've dealt with this issue on several client sites and typically opt for noindexing the search pages (sometimes even blocking them in robots.txt), as recommended by others here, especially if you can't make any of them static.
In terms of the product pages, it could be helpful to the visitor if they search for "Specific Product A" for you to just go ahead and land them on the "Specific Product A" page, either via a 301 redirect from the search result page, or by serving up the product page in the first place. This would take care of usability as well as your issue with search engines.
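That routing decision could be sketched like this (a simplified illustration; the catalog and URL paths are made up, and a real site would query its product database rather than a dict):

```python
def search_response(query, catalog):
    """Return what the search endpoint should serve: a 301 straight to
    the product page when exactly one product matches the query,
    otherwise the normal results listing."""
    q = query.lower()
    matches = [url for name, url in catalog.items() if q in name.lower()]
    if len(matches) == 1:
        return ("301", matches[0])  # land the visitor on the product itself
    return ("200", matches)         # render the usual search results page

catalog = {
    "Specific Product A": "/products/specific-product-a",
    "Specific Product B": "/products/specific-product-b",
}
```

This keeps multi-match searches behaving exactly as before, which addresses the earlier concern about regular customers being "shot directly" to a single product that might not be the one they need.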
I would not gradually implement something here, as that could be even more confusing to search engines. Do you want the search pages indexed or not?
What I have seen is a temporary blip in traffic (a few weeks at most) followed by improvement in traffic due to an improvement in product page rankings as a result. Every situation is different though, and this assumes good implementation.
Looking at this from Google's perspective: they ARE the search engine, so why would they want to send the user to yet another set of search results? Google should know which page on your site to send visitors to. They don't need an intermediary, which is why their guidelines say this:
"Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines."
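For reference, a robots.txt rule covering the search URLs from this thread might look like the following (assuming all search results live under `SearchResult.html` with query strings). Note the caveat raised earlier in the thread: robots.txt only prevents crawling; to actually remove already-indexed pages, a noindex directive on crawlable pages tends to work better, since a URL blocked in robots.txt can stay in the index.

```
User-agent: *
Disallow: /SearchResult.html
```

A `Disallow` rule matches by URL prefix, so this one covers every query-string variation of that path.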
Good luck and let us know how it turns out!
-
Oh, and also, just to clarify.... Are you saying what we should do is 301:
http://oursite.com/SearchResult.html?Text=Monkeys+Ate+Soul
to, let's say
http://oursite.com/ProductPage.html?Title=TheMonkeysAteMySoul
That would be ok?
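For that specific URL pair, one way to implement the 301 (assuming an Apache server with mod_rewrite; other servers have equivalents) would be a rule that matches on the query string, since `RewriteRule` alone can't see it. This is a sketch for a single mapping; at the scale discussed here you'd generate such rules, or a lookup map, from a script.

```apache
RewriteEngine On
# Match the old search URL by its query string and 301 it to the product page.
# The substitution's own query string replaces the original one.
RewriteCond %{QUERY_STRING} ^Text=Monkeys\+Ate\+Soul$
RewriteRule ^SearchResult\.html$ /ProductPage.html?Title=TheMonkeysAteMySoul [R=301,L]
```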
Thanks!
-
Thanks Jesse. Sounds like a big undertaking, but something we need to move on. Question... How accurate is "site:yoururl.com inurl:search"? I just did a test, and the number of results that came back is way lower than it should be based on how our sitemaps are shown as indexed in Webmaster Tools.
Thanks for taking the time to answer!
Craig
-
I would be careful about allowing search pages to continue to be indexed. You will most likely end up with hundreds if not thousands of low-value pages that may cause you to fall into a Panda algorithm penalty. Simply do a site:yoururl.com inurl:search (or whatever parameter you use) to see how many search results pages you have indexed.
You could find the search pages that are outranking your product pages and 301 them if the traffic is substantial. Otherwise, by noindexing the search pages you should reduce the competition for those product pages, and they should start to rank, and hopefully convert, better.
I've had to do the same for several sites because of a Panda penalty, so I can't speculate on traffic levels.
-
Thanks Zora! Yeah, these are all going to be dynamic unfortunately, and there are a lot of them, in the hundreds of thousands. So we would need some type of transition strategy; I'd be concerned that a one-time noindex of everything at once would be quite problematic.
Just curious if anyone else had to transition in this way and was able to do so successfully.
Thanks for the feedback!!
Craig
-
We had the same problem, but decided to embrace it.
I started optimizing and adding content to a few of the search results pages (and made them static, not dynamic), and now they rank fairly well.
However, for dynamic search pages I suggest you noindex them.
Google recommends it, and it's best to follow their recommendations.
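To answer the earlier question about link flow: applying noindex via a meta robots tag (rather than blocking in robots.txt) keeps the search pages crawlable, so the `follow` directive lets link equity continue to pass through them to the product pages they link to.

```html
<!-- In the <head> of each dynamic search-results page -->
<meta name="robots" content="noindex, follow">
```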