Should product searches (on-site searches) be noindexed?
-
We have a large new site that is suffering from a sitewide Panda-like penalty. The site has 200k pages indexed by Google. There is a lot of category and sub-category page content, and only about 25% of the product pages have hand-written unique content (the rest use copied content).
So it seems our site is being labeled as thin. I'm wondering about using noindex directives for the internal site search. We currently have a canonical tag on search results pointing to domain.com/search/ (the client thought that would help), but I'm wondering if we need to just noindex all of the product search results.
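To make the question concrete, here is a minimal sketch of what the noindex option would look like in the head of the internal search result template (the /search/ path is just the example from above; the exact template file depends on the platform):

    <!-- On internal search result pages only: keep them out of the index, but let crawlers follow links to the products they list -->
    <meta name="robots" content="noindex, follow">

The idea with noindex, follow is that the result pages stay out of the index while crawlers can still follow the links through to the product pages.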
Thoughts?
-
To me it sounds more like a domain authority issue: a lack of deep links and of aged deep links. Gain them as naturally as possible over a period of time, and diversify your link profile. Your competitors on pages 1 and 2, are they in the 400-total-root-domains range? Again, root domains alone are not the criterion; the age of the pages, deep links, anchor text, link diversity, and the age of those links all contribute.
You can test it, but I am guessing that blocking the search result URLs might not do enough on its own. Still, it would be an interesting test, and I would be curious to know what happens. Keep in mind that algorithm updates may happen concurrently and could affect the results, but you could get a relative idea based on the dates when you block those pages, both via a noindex tag and via a robots.txt disallow.
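For what it's worth, the disallow side of that test is just a pattern in robots.txt covering the search URLs; a sketch, assuming the results live under /search/ or use a q= parameter (adjust to your actual URL structure):

    User-agent: *
    # Block crawling of internal search result pages
    Disallow: /search/
    Disallow: /*?q=

One practical note if you run both at once: once a URL is disallowed in robots.txt, crawlers can no longer fetch it to see the noindex tag, so it is usually cleaner to let the noindex take effect first and add the disallow afterwards.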
Let me know. Feel free to touch base via the Q&A or via email about this.
-
The site is 9 months old, at least since we purchased it. Someone else had a one-page site on the domain for several years before we bought it.
However, I'm looking to do this because our rankings are pretty poor. In fact, almost every keyword is stuck on page 3 or beyond, even with a PageRank of 5 and 400+ root domains. I'm afraid the previous SEO company may have gotten the site in trouble with too many low-quality links.
It's either that, or the site looks too thin and is not getting past Panda: it has 200k pages in the index, of which maybe 2,000 have any real, solid, original content.
-
I know you said "new", but how new is it? Are you also constantly working on your link profile? I have seen "monster" authority sites with thousands of those search pages in the index ranking with no issues. So yes, as Tyler said, it might make sense to do a disallow via robots.txt as well as a noindex tag. Are you getting decent enough rankings on your category pages? Again, it all boils down to the authority of the site/domain.
-
Yes, I disallow all internal site searches. This robots.txt breakdown for my ecommerce platform, Magento, disallows them while still allowing indexation: http://www.e-commercewebdesign.co.uk/blog/magento-seo/magento-robots-txt-seo.php
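For anyone else on Magento reading along, the search-related lines in that kind of breakdown look roughly like this (a sketch based on Magento's default catalogsearch URL structure; verify the paths against your own install before using it):

    User-agent: *
    # Block crawling of Magento's internal search result pages
    Disallow: /catalogsearch/
    Disallow: /*?q=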
Tyler