Should product searches (on-site searches) be noindexed?
-
We have a large new site that is suffering from a sitewide Panda-like penalty. The site has 200k pages indexed by Google: lots of category and sub-category page content, and about 25% of the product pages have hand-written unique content (vs. the other pages, which use copied content).
So it seems our site is being labeled as thin. I'm wondering about using a noindex directive for the internal site search. We have a canonical tag on search results pointing to domain.com/search/ (the client thought that would help), but I'm wondering if we need to just noindex all the product search results.
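For reference, this is roughly what the head of each search results page carries now versus what I'm considering adding (domain.com stands in for our actual domain):

    <!-- current: every search result URL canonicals back to the main search page -->
    <link rel="canonical" href="http://domain.com/search/" />

    <!-- proposed: drop the search result pages from the index -->
    <meta name="robots" content="noindex">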
Thoughts?
-
To me it sounds more like a domain authority issue: a lack of deep links and aged deep links. Gain them as naturally as possible, over a period of time, and diversify your link profile. Look at your competition on pages 1 and 2 — are they also in the 400-total-root-domains range? Again, the number of root domains alone isn't the criterion; the age of the pages, deep links, anchor text, diversity of the links, and the age of those links all contribute.
You can test it, but my guess is that blocking the search result URLs might not do enough on its own. Still, it would be an interesting test, and I'd be curious to know what happens. Keep in mind that concurrent algorithm updates could muddy the results, but you could get a relative idea based on the dates when you block those pages, both via a noindex tag and via a disallow.
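If you do run it, a minimal version of both blocks would look something like this (assuming the search results all sit under a /search/ path, per your canonical — swap in whatever pattern your URLs actually use). Note the date you roll each one out, since the robots.txt disallow stops crawling while the meta tag is what drops pages from the index:

    # robots.txt
    User-agent: *
    Disallow: /search/

    <!-- on the search result page templates -->
    <meta name="robots" content="noindex">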
Let me know. Feel free to touch base via the Q&A or via email about this.
-
The site is 9 months old, at least since we purchased it. Before we bought it, someone else had a one-page site on the domain for several years.
However, I'm looking to do this because our rankings are pretty poor. In fact, almost every keyword is stuck on page 3 or deeper, even with a PageRank of 5 and 400+ root domains. I'm afraid the previous SEO company may have gotten the site in trouble with too many low-quality links.
It's either that, or the site looks too thin and is not getting past Panda: it has 200k pages in the index, of which maybe 2,000 have any real, solid, original content.
-
I know you said "new", but how new is it? Are you also constantly working on your link profile? I have seen "monster" authority sites with thousands of those search pages ranking with no issues. So yes, as Tyler said, it might make sense to do a disallow via robots.txt as well as a noindex tag. Are you getting decent enough rankings on your category pages? Again, it all boils down to the authority of the site/domain.
-
Yes, I disallow all internal site searches. This robots.txt breakdown for my ecommerce platform, Magento, disallows them but allows indexation: http://www.e-commercewebdesign.co.uk/blog/magento-seo/magento-robots-txt-seo.php
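The relevant lines look something like this (a sketch only — /catalogsearch/ is Magento's default internal search path, but check the full file in that post and adjust to your own URL structure):

    # block internal site search results from being crawled
    User-agent: *
    Disallow: /catalogsearch/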
Tyler