Need Help On Proper Steps to Take To De-Index Our Search Results Pages
-
So, I have finally decided to remove our Search Results pages from Google. This is a big dealio, but our traffic has consistently been declining since 2012 and it's the only thing I can think of.
The reason they got indexed: back in 2012 we added keyword tags to our product pages, but each tag linked to one of our search results pages. So over time, hundreds of thousands of search results pages got indexed.
By tag pages I mean:
Keywords: Kittens, Doggies, Monkeys, Dog-Monkeys, Kitten-Doggies
Each of these would be linked to our search results pages, i.e. http://oursite.com/Search.html?text=Kitten-Doggies
I really think these indexed pages are causing much of our traffic problems, as there are many more search pages indexed than actual product pages. So my question is: should I go ahead and remove the links/tags on the product pages first? Or, if I remove those, will Google then not be able to re-crawl all of the search results pages it has indexed? Or will it notice that the links are gone and therefore remove the search results pages they were previously pointing to?
Should I remove the links/tags from the product pages (or at least cut them down to the top 8 or so) and add noindex,nofollow to all the search results pages at the same time?
OR, should I first noindex,nofollow ALL the search results pages and leave the tags on the product pages, giving Google a chance to follow those tags back to all of the search results pages so it can see the noindex,nofollow on each of them? Otherwise, will Google not be able to find these pages?
Can someone comment on what might be the best, safest, or fastest route?
Thanks so much for any help you might offer me!!
Craig
-
Hi Craig,
In general the structure looks OK - just wondering how you're going to keep 1 million products within a reasonable number of clicks from the homepage.
rgds
Dirk
-
Sounds good! Thanks again!
C
-
Hi Craig,
Getting quite late here in Belgium (already past midnight) - will get back to you tomorrow (with a fresher mind...)
Dirk
-
This is a big help as I am finalizing the category pages now.
So our site is big, getting close to 1,000,000 products in the store.
Each product can belong to up to 3 sub-cats. Our internal category structure is generally like this:
Widgets->Awesome Widgets->Blue Widgets
or
Widgets->Awesome Widgets->Large Widgets->Large Blue Widgets
So, currently, my structure is like this:
1. Home Page Links To:
Primary Category 1
Primary Category 2
Primary Category 3
Primary Category 4
2. Each Primary Category Page:
1. Links to any sub-categories
2. Has a paginated list of all products in that category, linking to their product pages.
3. Each Product Page links back to:
1. The Primary Category Page
2. Each of the (up to 3) sub-category pages that product belongs to.
3. A small number of related products.
Generally each sub-cat will have thousands if not tens of thousands of products.
How does this sound and do you have any advice related to this?
Thanks again!! :):):):):):):):) You get extra smilies for awesome help.
Craig
-
Hi Craig,
A. The logic seems ok - but doesn't say much about the depth of the site. Questions for me are:
- can one product belong to more than one category?
- are we talking about 100 products or 10,000?
Suppose worst case
- each product belongs to only one subcategory & each subcategory belongs to one category
- you have 500 products in this subcategory
If there is pagination - with 50 products/page, the last 50 products will be more than 10 clicks from the homepage.
If there is a 'show all on one page' option - there would be too many links on the page, so you cannot be certain that the ones at the bottom will get followed.
If a product can belong to more subcategories or categories and/or there are fewer products, it's more likely that it will be closer to the homepage.
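Dirk's worst-case arithmetic can be sketched in a few lines. This is a rough illustration (the function name and the one-next-link-per-page assumption are mine), assuming a product is reachable only through a single paginated subcategory listing linked directly from the homepage:

```python
# Worst-case click depth for a product reachable only via one
# paginated listing: home -> listing page 1 -> "next" links -> product.
def worst_case_depth(products: int, per_page: int) -> int:
    pages = -(-products // per_page)  # ceiling division
    # 1 click: homepage -> listing page 1
    # pages - 1 clicks: "next" links to reach the last listing page
    # 1 click: last listing page -> the product itself
    return 1 + (pages - 1) + 1
```

With 500 products at 50 per page this gives 11 clicks for the last products, which is the ">10 clicks from the homepage" worst case above.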
B. No - the products would not be removed from the index. However, if there are no links to these pages, they will not be shown in the results (Google expects each part of your content to be reachable by at least one link). No (internal) links = no value is the way Google thinks. The more links, and the fewer clicks from the homepage, the more value a page gets. You should put the new navigation in place as soon as possible - ideally it would have been done at the same time.
Hope this clarifies,
Dirk
-
I was talking about my search pages specifically - either adding a meta robots noindex,nofollow OR just a noindex. I just went ahead and added the nofollow.
Good point on Screaming Frog, too.
Currently, the site is organized like this: HomePage -> Several links to many variations of the Search Page -> Product Pages
The new organization will be:
Home Page -> Various Category Pages -> Various Sub-Category Pages (With products on them and pagination to show all products) -> Possibly Other Sub-Category Pages (With products on them and pagination)
Then on the product pages there will be links back to the primary and secondary category pages.
A. How does that sound and
B. So, if I have Product pages that are already indexed could no-indexing the Search pages mean these pages get removed? Or, if they are already in the index, are they safe?
Thanks again for taking the time to help and answer!!
Craig
-
Hi Craig,
Not sure where you would put the nofollow:
-
the links to the search pages on the articles need to remain "follow" - if Google is never allowed to follow the links to the search pages, it will take a long time before the bot discovers that all the search pages have become "noindex"
-
the links on the search pages themselves - here you can do what you want. As the final goal is to remove the search pages from the index, once they're no longer indexed it becomes irrelevant whether the links on these pages are nofollow or not. I would keep these links "follow" - allowing the bots to easily access all the pages, find the links on them that go to the other search pages, and take those out of the index.
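A minimal sketch of the tag being described here - noindex so the page drops out of the index, but not nofollow, so the bot can still travel through the search pages while it cleans them out. The URL in the comment is the thread's own example:

```html
<!-- In the <head> of every search results page,
     e.g. http://oursite.com/Search.html?text=Kitten-Doggies -->
<meta name="robots" content="noindex, follow">
```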
One thing that you should also check, and that I didn't mention before - it is probably a good idea to crawl your site now with Screaming Frog and check the depth of the site (% of articles at 1/2/3... clicks from the homepage). It is possible that if you remove the "search" pages, a larger part of your content moves deeper into the site - this could have a negative impact on the ranking of those articles. If this is the case, you could decide
- to keep some of the search pages (but noindex/follow)
- to increase cross linking between normal articles
- to add some new index pages (again noindex/follow)
(or a mix of these)
rgds,
Dirk
-
Hey Dirk,
I have one more follow-up on this if you don't mind. My SEO auditor said I should both no-index AND no-follow the search results pages.
This concerns me a little, as it may have a negative effect on my product pages - I will have to make sure they can be found another way, which I will do, but that will take time of course.
Any reason why you suggested just noindex and did not include the nofollow? Do you have any other insight on that?
Thanks!
Craig
-
Thank you my brother...
Very much appreciate the time you took for some thorough answers here....
Very good stuff and VERY much appreciated.
I had a chat with my SEO auditor today and he suggested noindex,nofollow on the search pages and then, in about 30 days, removing the product page links.
So, I will likely do that.
Much appreciation to you - Craig
-
I don't think there is an easy route here - you will have to get rid of these indexed search pages in any case. Keeping these low-quality pages will continue to hurt your site.
If you currently don't have the resources for the 'ideal' scenario, I would go for the short pain: cut these pages out now. It will probably cost you traffic in the short term, but at least you have a clean base to build on. Keeping the pages is probably better in the short term, but the longer you keep them, the more your site's reputation will be affected, and it puts you in danger from future algorithm updates.
Just my opinion
Dirk
-
Right, I hear you on that, and honestly, the scenario you have posited is the reason I haven't done anything yet on this. I agree that is the ideal way to do it, but I am not sure I can. I just don't have the time or resources, and I agree that the positive effect could take some time...
So, I am curious, what you think the quickest route to a positive effect would be?
C
-
Hi,
There is an alternative solution but it would require more work on your side.
The problem with your current situation is that you create thousands of low-value pages with little added value (which Google doesn't really like: https://www.mattcutts.com/blog/search-results-in-search-results/) and then heavily promote these low-quality pages by pointing hundreds of links at them. The principal message to Google: these low-quality pages are my most important ones.
What you could do is check which search pages are generating traffic (e.g. take the top 100) and create "real" pages for them. If we take the example you give, http://oursite.com/Search.html?text=Kitten - rather than having a generic search page with little added value, you create a real page with some added-value content (yoursite.com/topics/kitten) with links to your most important pages on the subject. As an example of what such a page could look like: http://dogtime.com/dog-breeds/german-shepherd-dog - this page is a kind of "home", containing a definition plus links to the most important related articles on the subject. If these kinds of pages already exist on your site, then of course there is no need to create them.
On the related search pages you then put a canonical URL pointing to this page. You also update the links that pointed to the search page so they point to the "real" added-value page. This way you start promoting new value-added content with minimal risk of losing your current positions, and you remove the old low-value pages from the index. It can take some time, however, before you see a positive effect.
For the search requests where it's not possible to create a version with added value, you point the canonical to the generic search page (or your homepage) and remove all the links to these pages.
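As a sketch, using the example URLs from this thread (the /topics/kitten path is the suggested stand-in from the reply above, not an existing page), the canonical on the old search page would look like:

```html
<!-- On http://oursite.com/Search.html?text=Kitten, once the
     added-value topic page exists -->
<link rel="canonical" href="http://yoursite.com/topics/kitten">
```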
Hope this helps,
Dirk
-
Dirk,
THANKS!!! Thanks for the solid response. I guess my only concern is that we are still getting traffic from these indexed search pages... and I need to minimize the hit from removing them. Any other more advanced methods I could use? Or, in that case, would you recommend a combination of using the URL removal tool PLUS removing the tags?
I just need to do this as right as possible. I can't afford too much of a hit here (if any). But at the same time, we are losing traffic so fast, and have lost so much, that I don't have any choice at this point. We have doubled our product pages in the past 3 years and yet have lost about half our traffic.
Thanks again!
Craig
-
Hi,
I would first put a noindex on all your search result pages and leave the tags on the pages to allow Google to crawl them & "read" the new instructions.
I would also try to block these result pages in the robots.txt - it accepts pattern-matching ( https://support.google.com/webmasters/answer/6062596?hl=en&ref_topic=6061961) - if you try this make sure that you test it properly to avoid unwanted side effects.
You could also try the URL removal tool - it's quite easy to remove an entire directory with the tool (https://support.google.com/webmasters/answer/1663419?hl=en) - you must make sure, however, that the pages cannot be crawled again (so do it after the modification of the robots.txt). If your search pages live at the root of your site and not in a separate directory, I'm not sure it's going to work.
Just removing the links to these pages without other modification is not going to help - they will just remain in the index.
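A possible robots.txt pattern for the example URLs in this thread - hypothetical, so adjust it to your actual URL structure and test it in the robots.txt tester first. Note the ordering caveat from the steps above: Google must be able to re-crawl the pages to see the noindex, so only add the block after the pages have dropped out of the index (or if you follow up with the URL removal tool):

```text
User-agent: *
# Block all parameterized search results pages,
# e.g. /Search.html?text=Kitten-Doggies
Disallow: /Search.html?text=
```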
Hope this helps,
Dirk