Need Help On Proper Steps to Take To De-Index Our Search Results Pages
-
So, I have finally decided to remove our search results pages from Google. This is a big dealio, but our traffic has been declining consistently since 2012, and these pages are the only cause I can think of.
The reason they got indexed: back in 2012 we added keyword tags to our product pages, and each tag linked to one of our search results pages. Over time that left us with hundreds of thousands of search results pages in the index.
By tags I mean:
Keywords: Kittens, Doggies, Monkeys, Dog-Monkeys, Kitten-Doggies
Each of these would link to one of our search results pages, e.g. http://oursite.com/Search.html?text=Kitten-Doggies
I really think these indexed search pages are causing much of our traffic problem, as there are far more search results pages indexed than actual product pages. So my question is: should I go ahead and remove the links/tags on the product pages first? If I remove those, will Google no longer be able to re-crawl the search results pages it has already indexed? Or will it notice that the links are gone and drop the search results pages they were pointing to?
Should I remove the links/tags from the product pages (or at least trim them down to the top 8 or so) and add noindex,nofollow to all the search results pages at the same time?
Or should I first noindex,nofollow ALL the search results pages and leave the tags on the product pages in place, to give Google a chance to follow those tags back to all of the search results pages and see the new directives? Otherwise, will Google not be able to find those pages at all?
Can someone comment on what might be the best, safest, or fastest route?
Thanks so much for any help you might offer me!!
Craig
-
Hi Craig,
In general, the structure looks OK - I'm just wondering how you are going to manage to keep 1 million products within a reasonable number of clicks from the homepage.
rgds
Dirk
-
Sounds good! Thanks again!
C
-
Hi Craig,
Getting quite late here in Belgium (already past midnight) - will get back to you tomorrow (with a fresher mind...)
Dirk
-
This is a big help as I am finalizing the category pages now.
So our site is big, getting close to 1,000,000 products in the store.
Each product can belong to up to 3 sub-cats. Our internal category structure is generally like this:
Widgets->Awesome Widgets->Blue Widgets
or
Widgets->Awesome Widgets->Large Widgets->Large Blue Widgets
So, currently, my structure is like this:
1. Home Page Links To:
Primary Category 1
Primary Category 2
Primary Category 3
Primary Category 4
2. Each Primary Category Page:
1. Links to any sub-categories
2. Has a list of all products in that category, with pagination, linking to their product pages.
3. The Product Page links back to:
1. The Primary Category Page
2. Each of the (up to 3) Sub-Category Pages that product belongs to
3. A small number of related products.
Generally each sub-cat will have thousands, if not tens of thousands, of products.
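(To give a concrete picture - the paths below are just made up for illustration - the links back up the tree from a product page would be a simple breadcrumb-style block like:
<a href="/widgets/">Widgets</a> &gt;
<a href="/widgets/awesome-widgets/">Awesome Widgets</a> &gt;
<a href="/widgets/awesome-widgets/blue-widgets/">Blue Widgets</a>
with the same kind of links repeated for each of the other sub-categories the product belongs to, plus a short list of related products.)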
How does this sound and do you have any advice related to this?
Thanks again!! :):):):):):):):) You get extra smilies for awesome help.
Craig
-
Hi Craig,
A. The logic seems OK, but it doesn't say much about the depth of the site. The questions for me are:
- can one product belong to more than one category?
- are we talking about 100 products or 10,000?
Suppose the worst case:
- each product belongs to only one subcategory & each subcategory belongs to only one category
- you have 500 products in this subcategory
If there is pagination, with 50 products per page the last 50 products will be more than 10 clicks from the homepage (home -> category -> subcategory is already 2-3 clicks, and if the pagination only links to the next page or two, reaching page 10 adds many more).
If there is a 'show all as one page' option, there would be too many links on the page, so you cannot be certain that the ones at the bottom will get followed.
If a product can belong to multiple subcategories or categories, and/or there are fewer products, it's more likely to end up closer to the homepage.
B. No - the product pages would not be removed from the index. However, if there are no links to these pages, they will not be shown in the results (Google wants each part of your content to be reachable by at least one link). No (internal) links = no value is the way Google thinks. The more links, and the fewer clicks from the homepage, the more value a page gets. You should put the new navigation in place as soon as possible - ideally it would have gone live at the same time as the noindexing.
Hope this clarifies,
Dirk
-
I was talking about my search pages specifically - either adding a meta robots noindex,nofollow, or just a noindex. I went ahead and added the nofollow as well.
Good point on Screaming Frog.
Currently, the site is organized like this: HomePage -> Several links to many variations of the Search Page -> Product Pages
The new organization will be:
Home Page -> Various Category Pages -> Various Sub-Category Pages (With products on them and pagination to show all products) -> Possibly Other Sub-Category Pages (With products on them and pagination)
Then on the product pages there will be links back to the primary and secondary category pages.
A. How does that sound?
B. If I have product pages that are already indexed, could noindexing the search pages cause those product pages to be removed? Or, since they are already in the index, are they safe?
Thanks again for taking the time to help and answer!!
Craig
-
Hi Craig,
Not sure where you would put the nofollow:
- the links to the search pages on the articles need to be of type "follow" - if Google is never allowed to follow the links to the search pages, it will take a long time before the bot discovers that all the search pages have become "noindex"
- the links on the search pages themselves - here you can do what you want. As the final goal is to remove the search pages from the index, once they're no longer indexed it becomes irrelevant whether the links on these pages are nofollow or not. I would keep these links as "follow", allowing the bots to easily reach all the search pages, find the links on them that go to the other search pages, and take them out of the index.
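For reference - assuming your search pages are normal HTML pages you can template - the tag would sit in the <head> of every search results page and look something like:
<meta name="robots" content="noindex, follow">
("follow" is the default anyway, so <meta name="robots" content="noindex"> does the same thing - the important part is not adding "nofollow" there, so the bots keep moving through these pages while they drop out of the index.)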
One thing that you should also check, and that I didn't mention before: it is probably a good idea to crawl your site now with Screaming Frog and check the depth of the site (% of articles at 1/2/3... clicks from the homepage). It is possible that if you remove the "search" pages, a larger part of your content moves deeper into the site - this could have a negative impact on the ranking of those articles. If this is the case, you could decide
- to keep some of the search pages (but noindex/follow)
- to increase cross linking between normal articles
- to add some new index pages (again noindex/follow)
(or a mix of these)
rgds,
Dirk
-
Hey Dirk,
I have one more follow-up on this if you don't mind. My SEO auditor said I should both no-index AND no-follow the search results pages.
This concerns me a little, as it may have a negative effect on my product pages - I will have to make sure they can be found another way, which I will do, but that will take time, of course.
Any reason why you suggested just noindex and did not include the nofollow? Do you have any other insight on that?
Thanks!
Craig
-
Thank you my brother...
Very much appreciate the time you took for some thorough answers here....
Very good stuff and VERY much appreciated.
I had a chat with my SEO auditor today, and he suggested noindexing/nofollowing the search pages and then, in about 30 days, removing the product page links.
So, I will likely do that.
Much appreciation to you - Craig
-
I don't think there is an easy route here - you will have to get rid of these indexed search pages in any case. Keeping these low-quality pages will continue to hurt your site.
If you currently don't have the resources for the 'ideal' scenario, I would go for the short pain: cut these pages out now. It will probably cost you traffic in the short term, but at least you'll have a clean base to build on. Keeping the pages is probably better in the short term, but the longer you keep them, the more your site's reputation will be affected, and it puts you in danger with future algorithm updates.
Just my opinion
Dirk
-
Right, I hear you on that, and honestly, the scenario you have posited is the reason I haven't done anything on this yet. I agree that is the ideal way to do it, but I am not sure I can. I just don't have the time or resources, and I agree that the positive effect could take some time...
So, I am curious, what you think the quickest route to a positive effect would be?
C
-
Hi,
There is an alternative solution but it would require more work on your side.
The problem with your current situation is that you create thousands of low-value pages with little added content (which Google doesn't really like: https://www.mattcutts.com/blog/search-results-in-search-results/) and then heavily promote these low-quality pages by pointing hundreds of links at them. The principal message this sends to Google: these low-quality pages are my most important ones.
What you could do is check which search pages are generating traffic (e.g. take the top 100) and create "real" pages for them. Taking the example you gave, http://oursite.com/Search.html?text=Kitten - rather than a generic search page with little added value, you create a real page with some added-value content (yoursite.com/topics/kitten) that links to your most important pages on the subject. As an example of what such a page could look like: http://dogtime.com/dog-breeds/german-shepherd-dog - this page acts as a kind of "home", containing a definition plus links to the most important related articles on the subject. If these kinds of pages already exist on your site, then of course there is no need to create them.
On the related search pages you then put a canonical URL pointing to this page. You also update the links that currently point to the search page so they point to the "real" added-value page. This way you start promoting new, value-added content with minimal risk of losing your current positions, and you remove the old low-value pages from the index. It can take some time, however, before you see a positive effect.
For the search requests where it's not possible to create an added-value version, you point the canonical to the generic search page (or your homepage) and remove all the links to these pages.
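To make this concrete - using the hypothetical URLs from your example - the search page http://oursite.com/Search.html?text=Kitten would carry a tag like this in its <head>:
<link rel="canonical" href="http://oursite.com/topics/kitten">
Keep in mind that the canonical is a hint rather than a directive, so it can take a while before Google consolidates the search URLs onto the topic pages.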
Hope this helps,
Dirk
-
Dirk,
THANKS!!! Thanks for the solid response. I guess my only concern is, we are still getting traffic from these indexed Search pages... and I need to minimize the hit from removing them. Any other more advanced methods I could use? Or.... In that case, would you recommend I do a combination of using the URL removal tool PLUS removing the tags?
I just need to do this as right as possible. I can't afford too much of a hit here (if any). But at the same time, we are losing traffic so fast, and have lost so much already, that I don't have any choice at this point. We have doubled our product pages in the past 3 years and yet have lost about half our traffic.
Thanks again!
Craig
-
Hi,
I would first put a noindex on all your search results pages and leave the tag links on the product pages in place, to allow Google to keep crawling them and "read" the new instructions.
I would also try to block these result pages in robots.txt - it accepts pattern matching (https://support.google.com/webmasters/answer/6062596?hl=en&ref_topic=6061961) - but if you try this, make sure you test it properly to avoid unwanted side effects.
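Assuming all of your search URLs follow the /Search.html?text=... pattern from your example, the robots.txt rule could look something like this (Google supports * wildcards in Disallow rules, and a rule matches any URL that starts with the given path):
User-agent: *
Disallow: /Search.html?
A broader alternative would be Disallow: /*?text= which blocks any URL containing a text= parameter. Note that once a URL is blocked in robots.txt the bot can no longer crawl it and see the meta tags on it, so the block should only go in after Google has had a chance to pick up the noindex on those pages.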
You could also try the URL removal tool - it's quite easy to remove an entire directory with it (https://support.google.com/webmasters/answer/1663419?hl=en) - but you must make sure the pages cannot be crawled again (so do it after the modification of robots.txt). If your search pages sit at the root of your site and not in a separate directory, I'm not sure it's going to work.
Just removing the links to these pages, without any other modification, is not going to help - they will simply remain in the index.
Hope this helps,
Dirk