Search Pages outranking Product Pages
-
A lot of the results we see in the search engines for our site are pages from our own on-site search, e.g. "Widgets | Search Results".
This has happened over time and wasn't intentional, but in many cases we see our search results pages appearing over our actual product pages in search, which isn't ideal.
Simply blocking indexing of these pages via robots wouldn't be ideal, at least not all at once, as there would be a period where those search results pages had dropped out of the search engines while our product pages were still buried in the rankings.
Any ideas on a strategy to replace these Search Results with the actual products in a way that won't hurt us too bad during the transition? Or a way to make the actual product pages rank above the search results? Currently, it is often the opposite.
Thanks!
Craig
-
Thanks again for the answers!
Yeah, totally getting you on the search-within-search issue. Wish we had known about that a couple of years ago. I did an analytics check, and most of our non-home-page traffic is coming from the search results pages in the SERPs. According to an inurl: query we have about 200,000 indexed SearchResult pages, and based on some data I pulled, they are our highest-traffic non-home pages but also the lowest-converting.
I think 301 redirects on these would be rather tricky. I mean, if someone does a search on our site, they should get the search results page showing them several options, not be sent directly to a single product that might not be the one they need. It would be rather confusing for our regular customers as well.
But I agree we need to do something here, because conversely, our product pages, while getting the least traffic, are the highest converters.
My only thought is that we would need to:
1. Compile a list of all of the indexed Search Result pages, or at least the ones that have received traffic over the last year or so. What would be the best way to do that? Screaming Frog? Analytics?
2. Create a script that extracts the keyword used in each of these URLs and finds a suitable product to redirect to based on that keyword.
3. 301-redirect them.
4. Change our current search results URLs to include something that isn't present in the original URLs, so the old pages being redirected are kept separate and current searchers don't get redirected as well.
5. Set the search results pages to noindex. Is that the best way to handle it? If we used robots.txt instead, wouldn't we be breaking the link flow of the site? Don't the bots need to crawl the search pages to reach the product pages, or is the sitemap all that's needed?
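Step 2 of the plan above could be sketched roughly like this in Python. Everything here is an assumption to illustrate the idea: that the keyword lives in a `Text` query parameter (as in the example URLs in this thread) and that some keyword-to-product lookup exists:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical keyword -> product page lookup; in practice this would
# be built from your product catalog/database.
PRODUCTS = {
    "monkeys ate soul": "/ProductPage.html?Title=TheMonkeysAteMySoul",
    "blue widget": "/ProductPage.html?Title=BlueWidget",
}

def redirect_target(search_url):
    """Pull the keyword out of an indexed search-results URL and return
    the product page to 301 it to, or None if there's no good match."""
    query = parse_qs(urlparse(search_url).query)
    # parse_qs decodes '+' and %-escapes, so "Monkeys+Ate+Soul"
    # comes back as "Monkeys Ate Soul".
    keyword = query.get("Text", [""])[0].strip().lower()
    return PRODUCTS.get(keyword)

# Step 1 would produce this list (Screaming Frog export, log files, etc.);
# step 3 is then just feeding the resulting map into your 301 rules.
indexed = ["http://oursite.com/SearchResult.html?Text=Monkeys+Ate+Soul"]
redirects = {url: redirect_target(url) for url in indexed if redirect_target(url)}
```

Any URL whose keyword doesn't match a product would fall through to None and could be 301'd to a category page or the home page instead.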
Thanks for the time and answers!
Craig
-
Hello Craig,
I've dealt with this issue on several client sites and typically opt for noindexing the search pages (sometimes even blocking them in robots.txt), as recommended by others here, especially if you can't make any of them static.
In terms of the product pages, it could be helpful to the visitor if they search for "Specific Product A" for you to just go ahead and land them on the "Specific Product A" page, either via a 301 redirect from the search result page, or by serving up the product page in the first place. This would take care of usability as well as your issue with search engines.
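The exact-match idea could look something like this as a rough Python sketch; the function name, product dictionary, and exact-match rule are all hypothetical, just to show the branch between a 301 to the product and a noindexed results page:

```python
def handle_search(keyword, products):
    """If the visitor's query exactly matches one product title, 301 them
    straight to that product page; otherwise serve the normal results
    page with a noindex directive so it stays out of the search engines."""
    key = keyword.strip().lower()
    for title, url in products.items():
        if key == title.lower():
            return ("301 Moved Permanently", {"Location": url}, "")
    # Normal results page, flagged noindex (follow) for crawlers.
    headers = {"X-Robots-Tag": "noindex, follow"}
    body = "<title>%s | Search Results</title>" % keyword
    return ("200 OK", headers, body)
```

The same noindex signal could instead go in a robots meta tag in the page head; the header form just keeps the sketch short.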
I would not gradually implement something here, as that could be even more confusing to search engines. Do you want the search pages indexed or not?
What I have seen is a temporary blip in traffic (a few weeks at most) followed by improvement in traffic due to an improvement in product page rankings as a result. Every situation is different though, and this assumes good implementation.
Looking at this from Google's perspective, understand that they ARE the search engine, so why would they want to send the user to yet another set of search results? Google should already know which page on your site to send visitors to. They don't need an intermediary, which is why their guidelines say this:
"Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines."
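For reference, the robots.txt rule that guideline points at might look like this, assuming the SearchResult.html URL pattern mentioned elsewhere in the thread:

```
User-agent: *
Disallow: /SearchResult.html
```

One caveat: a robots.txt block stops crawling entirely, so bots won't see a noindex tag on those pages or follow their links through to the products, which is the link-flow concern Craig raised.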
Good luck and let us know how it turns out!
-
Oh, and also, just to clarify... are you saying what we should do is 301:
http://oursite.com/SearchResult.html?Text=Monkeys+Ate+Soul
to, let's say
http://oursite.com/ProductPage.html?Title=TheMonkeysAteMySoul
That would be ok?
Thanks!
-
Thanks Jesse. Sounds like a big undertaking, but something we need to move on. Question... how accurate is "site:yoururl.com inurl:search"? I just ran a test, and the number of results that came back is way lower than it should be, based on how many pages from our sitemaps show as indexed in Webmaster Tools.
Thanks for taking the time to answer!
Craig
-
I would be careful about allowing search pages to keep getting indexed. You will most likely end up with hundreds if not thousands of low-value pages that may cause you to fall under a Panda algorithm penalty. Simply do a site:yoururl.com inurl:search (or whatever parameter you use) to see how many search results pages you have indexed.
You could find the search pages that are outranking your product pages and 301 them if the traffic is substantial. Otherwise, by noindexing the search pages you should reduce the competition for those product pages, and they should start to rank, and hopefully convert, better.
I've had to do the same for several sites because of a Panda penalty, so I can't speculate on traffic levels.
-
Thanks Zora! Yeah, these are all going to be dynamic, unfortunately, and there are a lot of them: hundreds of thousands. So we would need some type of transition strategy. I would be concerned that a one-time noindex of everything at once would be quite problematic.
Just curious if anyone else had to transition in this way and was able to do so successfully.
Thanks for the feedback!!
Craig
-
We had the same problem, but decided to embrace it.
I started optimizing and adding content to a few of the search results pages (and made them static, not dynamic), and now they rank fairly well.
However, for dynamic search pages I suggest you noindex them.
Google recommends it, and it's best to follow their recommendations.