Should I noindex the site search page? It is generating 4% of my organic traffic.
-
I've read recommendations to noindex the URL of the site search results.
Analytics shows that the site search URL generates about 4% of my total organic search traffic (but less than 2% of sales). My reasoning is that site search results may create duplicate content issues and may prevent the more relevant product or category pages from ranking instead.
Would you noindex this page or not?
Any thoughts?
-
One other thing to think about: do you have another method for the bots to find and crawl your content?
We block all of our /search result pages in robots.txt. I agree with Everett's post: they are thin content and ripe for duplication issues.
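For reference, blocking a search results directory in robots.txt looks something like this (the /search path is an assumption; adjust it to wherever your search results actually live):

```txt
# robots.txt - prevent crawling of internal search results
User-agent: *
Disallow: /search
```

Note that a Disallow only stops crawling; it won't remove pages that are already indexed.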
We list all content pages in sitemap.xml and have a single paginated "browse content" section. We use rel="next" and rel="prev" to help the bots walk through each page.
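As a sketch, the pagination tags go in the head of each page in the series (the URLs below are placeholders, not our actual paths):

```html
<!-- On page 2 of a paginated "browse content" section -->
<link rel="prev" href="https://www.example.com/browse?page=1">
<link rel="next" href="https://www.example.com/browse?page=3">
```

The first page in the series gets only rel="next", and the last page gets only rel="prev".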
References
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663744
Personally, I think Maile's video is really great and you get to see some of the cool artwork in her house.
http://googlewebmastercentral.blogspot.com/2012/03/video-about-pagination-with-relnext-and.html
Important to note: if you do set up pagination and then add any other filters or sort options to it, nofollow those links and noindex those result pages, since you want only one route through your pagination for Googlebot to travel. Also, make sure each page has a unique title and description; I just append "Page N" to the standard blurb for each page, and that usually takes care of it.
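The "Page N" titles can be generated in whatever templating layer you use; a minimal Python sketch (the function name is illustrative, not from any particular framework):

```python
def paginated_title(base_title: str, page: int) -> str:
    """Append 'Page N' to the base title for every page after the first,
    so each URL in the pagination series gets a unique <title>."""
    if page == 1:
        return base_title
    return f"{base_title} - Page {page}"

print(paginated_title("Browse Content", 1))  # Browse Content
print(paginated_title("Browse Content", 3))  # Browse Content - Page 3
```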
If you close one door on your search pages, you can open another one using pagination!
Cheers!
-
Since numerous search results pages are already in the index, then yes, you want to use the noindex tag instead of a robots.txt disallow. The noindex tag will gradually lead to the pages being removed from the SERPs and the cache.
-
Mike, Everett,
thanks a lot. I will go ahead and noindex. Our navigation path is easy to crawl.
So should I add noindex, nofollow in a meta tag or an X-Robots-Tag? We have thousands of site search pages already in the Google index, so I understand the X-Robots-Tag or meta tag is preferred over robots.txt, right?
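For reference, the meta tag version goes in the head of each search results page (add nofollow to the content attribute if you also want link equity cut off):

```html
<!-- In the <head> of each site search results page -->
<meta name="robots" content="noindex, follow">
```

The equivalent HTTP header, useful when you can't edit the HTML or for non-HTML resources, is `X-Robots-Tag: noindex`. Either way, Google has to be able to crawl the page to see the directive, so don't also block these URLs in robots.txt until they have dropped out of the index.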
-
This was covered by Matt Cutts in a blog post way back in 2007, but the advice is still the same, as Mike has pointed out. Search results can be considered thin content and not particularly useful to users, so you can understand why Google wants to avoid showing search results within its own search results pages. Certainly I block all search results in robots.txt for all our sites.
You may lose 4% of your search traffic in the short term, but in the long term it could mean that you gain far more.
-
The Google Webmaster Guidelines suggest that you "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines."