Should I noindex the site search page? It is generating 4% of my organic traffic.
-
I've read several recommendations to noindex the site search URL.
I checked in analytics and the site search URL generates about 4% of my total organic search traffic (but less than 2% of sales). My reasoning is that site search pages may create duplicate content issues and may prevent the more relevant product or category pages from ranking instead.
Would you noindex this page or not?
Any thoughts?
-
One other thing to think about - do you have another method for the bots to find/crawl your content?
We block all of our /search result pages in robots.txt - I agree with Everett's post: they are thin content and ripe for duplicate content issues.
We list all content pages in sitemap.xml and have a single paginated "browse content" section. We use rel="next" and rel="prev" to help the bots walk through each page.
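For anyone unfamiliar with the markup, the pagination hints go in the `<head>` of each page in the series and look roughly like this (a sketch - the domain and URL pattern are illustrative, not from the thread):

```html
<!-- In the <head> of page 2 of a paginated "browse content" section.
     Page 1 would have only rel="next"; the last page only rel="prev". -->
<link rel="prev" href="https://www.example.com/browse?page=1">
<link rel="next" href="https://www.example.com/browse?page=3">
```

Each page points to its immediate neighbors only, which gives the bots one unambiguous chain to walk through the whole series.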
References
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663744
Personally, I think Maile's video is really great and you get to see some of the cool artwork in her house.
http://googlewebmastercentral.blogspot.com/2012/03/video-about-pagination-with-relnext-and.html
It's important to note that if you do set up pagination and then add any other filters or sort options within it, you should nofollow those links and noindex those result pages - you want only one route through your pagination for Google to travel. Also, make sure each page has a unique title and description; I just add "Page N" to the standard blurb for each page, and that usually takes care of it.
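Put together, the advice above might look like this in markup (a hedged sketch - the URL parameters and title wording are hypothetical examples, not from the thread):

```html
<!-- Within the paginated listing: sort/filter links are nofollowed so the
     bots stick to the one canonical route through the pagination. -->
<a href="/browse?page=2&sort=price" rel="nofollow">Sort by price</a>

<!-- And in the <head> of the sorted/filtered variant itself: noindexed,
     with a unique "Page N" title per the advice above. -->
<meta name="robots" content="noindex">
<title>Browse Content - Page 2 - Sorted by Price</title>
```

The plain, unsorted page 2 would keep its normal indexable head with its own unique "Page 2" title and description.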
If you close one door on your search pages, you can open another one using pagination!
Cheers!
-
Since numerous search results pages are already in the index, then yes, you want to use the noindex tag instead of a robots.txt disallow. A disallow would stop Google from recrawling those pages, so it would never see the noindex and they could linger in the index. The noindex tag will gradually lead to the pages being removed from the SERPs and the cache.
-
Mike, Everett,
Thanks a lot. I'll go ahead and noindex. Our navigation path is easy to crawl.
So should I add noindex, nofollow via a meta tag or the X-Robots-Tag header? We have thousands of site search pages already in the Google index, so I understand the X-Robots-Tag or meta tag is preferred to using robots.txt, right?
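For reference, the meta tag version goes in the `<head>` of each search results page (a sketch; whether you need nofollow as well as noindex depends on your setup):

```html
<!-- In the <head> of every site search results page -->
<meta name="robots" content="noindex, nofollow">
```

The equivalent HTTP response header, useful when you can't easily edit the page template, is `X-Robots-Tag: noindex, nofollow`, typically set server-side for any URL matching your search path.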
-
This was covered by Matt Cutts in a blog post way back in 2007, but the advice is still the same, as Mike has pointed out. Search results pages could be considered thin content and not particularly useful to users, so you can understand why Google wants to avoid showing search results inside search result pages. Certainly I block all search results in robots.txt for all our sites.
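If you go the robots.txt route, it's worth sanity-checking that your rules actually block the search URLs and nothing else. A minimal sketch using Python's standard-library robots.txt parser (the rules and URLs here are illustrative, not from the thread):

```python
# Parse an in-memory robots.txt that disallows /search and confirm
# which URLs a crawler may fetch. The paths below are illustrative.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Search result pages are blocked for all user agents...
print(parser.can_fetch("*", "https://example.com/search?q=widgets"))
# ...while normal category pages remain crawlable.
print(parser.can_fetch("*", "https://example.com/category/widgets"))
```

Running a check like this against your live robots.txt before deploying can save you from accidentally blocking category or product pages that share a URL prefix with search.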
You may lose 4% of your search traffic in the short term, but in the long term it could mean that you gain far more.
-
The Google Webmaster Guidelines suggest you should "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines."
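In robots.txt form, that guideline boils down to something like this (a minimal sketch, assuming site search lives under /search - adjust the path to your own URL pattern):

```
# robots.txt - block crawling of auto-generated search results pages
User-agent: *
Disallow: /search
```

Keep in mind that a disallow only prevents crawling; for the thousands of search pages already indexed, the noindex approach discussed earlier in this thread is what actually gets them removed from the SERPs.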