Google Is Indexing My Internal Search Results - What should I do?
-
Hello,
We are using a CMS/e-commerce platform that isn't really built with SEO in mind, and this has led to the following problem: a large number of internal (product search) search result pages, which aren't "search engine friendly" or "user friendly", are being indexed by Google, driving traffic to the site and generating revenue for our client.
We want to remove these pages and stop them from being indexed, replacing them with static category pages - essentially moving the traffic from the search results to the static pages. We feel this is necessary because our current situation is a short-term (accidental) win, and as more of these pages become indexed further down the line we don't want to incur a penalty.
We're hesitant to do a blanket de-indexation of all of the ?search results pages because we would lose traffic and revenue in the short term while trying to improve the rankings of our optimised static pages. The idea is to move our static pages up in Google's index and, once their performance is strong enough, de-index all of the internal search results pages.
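For reference, when that time comes, the de-indexation itself can be a small template change. A minimal sketch, assuming the platform lets us edit the head of the search results template (the ?search= pattern is just an example of how those URLs might be identified):

    <!-- Added only to the internal search results template -->
    <!-- "noindex, follow" drops these pages from Google's index while still letting crawlers follow links to the products they list -->
    <meta name="robots" content="noindex, follow">

Because the tag lives in one template, it can be switched on for the entire set of search URLs at once, when the static category pages are ready to take over.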
Our main focus is to improve user experience and not have customers enter the site through unexpected pages.
All thoughts or recommendations are welcome.
Thanks
-
A couple of things come to mind:
Why don't you want these product search pages to be in the index?
Why is there concern about a penalty?
As to your question:
- Are you signed in to Google when you are searching? Google will show you these types of results in the SERPs, but they are not necessarily shown to customers. If you have Google Desktop installed, it will also show documents from your machine in the SERPs when it finds them relevant to what you are looking for.
- Do the URLs have parameters? If so, you can set those in Google Webmaster Tools (GWT) and tell Google what to do when it encounters them as it crawls the site.
- Where possible, add a canonical tag pointing the pages you don't want in the index to the static pages (see the sketch below).
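A minimal sketch of that last point, assuming a parameter-based search URL and a matching static category page (both URLs below are hypothetical):

    <!-- In the <head> of an internal search results page such as /results?search=office+chairs -->
    <!-- Tells Google which static page should be treated as the preferred version -->
    <link rel="canonical" href="https://www.example.com/office-chairs/">

Bear in mind that rel="canonical" is a hint rather than a directive; Google tends to respect it only when the two pages show substantially the same content, so it works best alongside the parameter settings in GWT mentioned above.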
Related Questions
-
What should I do after a failed validation request (noindex, nofollow error) in the new Google Search Console?
Hi guys, we have the following situation: after an error message in the new Google Search Console for a large number of pages with a noindex, nofollow tag, validation was requested before the problem was fixed (an incredibly poor decision, taken without asking the SEO team for advice). Google started the validation, crawled 9 URLs and changed the status to "Failed". All other URLs are still in "Pending" status. The problem has been fixed for more than 10 days, but apparently Google isn't crawling the pages and none of the URLs is back in the index. We tried pinging several pages and HTML sitemaps, but with no result. Do you think we should request re-validation, or wait longer? Is there something more we could do to speed up the process?
Intermediate & Advanced SEO | ParisChildress
-
Mass Removal Request from Google Index
Hi, I am trying to clean up a news website. When this website was first made, the people who set it up copied over all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. The site has lots of junk, but all of it was in the initial backup, i.e. before 1 June 2012. So, by removing all of the mixed content prior to that date, we can have pure articles starting from 1 June 2012. Therefore:
- My dynamic sitemap now contains only articles with a release date between 1 June 2012 and now.
- Any article with a release date prior to 1 June 2012 returns a custom 404 page with a "noindex" meta tag, instead of the actual content of the article.
The question is how I can remove all this junk from the Google index as fast as possible, given that it is no longer on the site but still appears in Google results. I know that for individual URLs I can request removal from this link:
https://www.google.com/webmasters/tools/removals
The problem is doing this in bulk, as there are tens of thousands of URLs I want to remove. Should I put the articles back in the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong; as far as I know it will cause problems, because search engines will try to access non-existent content that the sitemap declares as existent, and will return errors in Webmaster Tools. Should I submit a deleted-items sitemap using the <expires> tag? I think this is for custom search engines only, and not for the generic Google search engine:
https://developers.google.com/custom-search/docs/indexing#on-demand-indexing
Unfortunately the site doesn't use any kind of "folder" hierarchy in its URLs, but instead ugly GET parameters, so a folder-based removal pattern is impossible, since all articles (removed junk and actual articles alike) are of the form:
http://www.example.com/docid=123456
So, how can I bulk remove all the junk from the Google index relatively fast?
Intermediate & Advanced SEO | ioannisa
-
What exactly is an impression in Google Webmaster Tools search queries with the image filter turned on?
Is it when someone does an image search, or does it count a regular search that has images in it? On an image search, does the picture actually have to be viewed on the screen, or can it be further down in the infinite scroll?
Intermediate & Advanced SEO | EcommerceSite
-
Search Results not Updating (Title, Description, and URL)
Issue: I recently discovered that my site was accessible over both HTTP and HTTPS. The site has used a rel canonical tag pointing to the HTTP version, but Google+ was pointing to HTTPS, and the title, description, and URL shown in the results for the homepage are HTTPS, while other pages are HTTP, etc. Steps taken to resolve: this week I 301'd all non-checkout pages to the HTTP version, switched the Google+ URL to the HTTP version and added a new post with an HTTP link to the homepage, used Webmaster Tools to recrawl and reindex the site, and resubmitted the XML sitemap. No luck... the site is still not updating. Any advice would be greatly appreciated. Thanks all! Site is Here
Intermediate & Advanced SEO | AhlerManagement
-
My website (non-adult) is not appearing in Google search results when I have SafeSearch turned on. How can I fix this?
Hi, I have an issue where my website does not appear in Google search results when the SafeSearch setting is on. If I turn SafeSearch off, my site appears no problem. I'm guessing Google is categorizing my website as adult, which it definitely is not. Has anyone had this issue before, or does anyone know how to resolve it? Any help would be much appreciated. Thanks
Intermediate & Advanced SEO | CupidTeam
-
I have removed over 2,000 pages but Google still says I have 3,000+ pages indexed
Good afternoon, I run an office equipment website called top4office.co.uk. My predecessor decided to make an exact copy of the content on our existing site, top4office.com, and place it on the top4office.co.uk domain, which included over 2k thin pages. Since coming in, I have hired a copywriter who has rewritten all the important content, and I have removed over 2k thin pages. I have set up 301s, blocked the thin pages using robots.txt, and then used Google's removal tool to remove the pages from the index, which was done successfully. But although they were removed and can no longer be found in Google, when I use site:top4office.co.uk I still have over 3k indexed pages (originally I had 3,700). Does anyone have any ideas why this is happening and, more importantly, how I can fix it? Our ranking on this site is woeful in comparison to what it was in 2011. I have a deadline and was wondering how quickly, in your opinion, all these changes will impact my SERP rankings. Look forward to your responses!
Intermediate & Advanced SEO | apogeecorp
-
Does Google Index an Alert Div with a Delayed Hide?
We have a div at the top of a client's page that displays an alert to the user; after 30 seconds it is rendered hidden (see the sketch below). Does Google index this, and does Google take it into account when it ranks the page?
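A minimal sketch of the kind of markup being described, purely for illustration - the id, the alert copy, and the client-side timer are assumptions, not details from the question:

    <!-- Alert rendered in the initial HTML, so it is present in the source that Google crawls -->
    <div id="site-alert">Scheduled maintenance this Sunday, 2-4am.</div>
    <script>
      // Hide the alert after 30 seconds; the text remains in the page source and the DOM, just no longer visible
      setTimeout(function () {
        document.getElementById('site-alert').style.display = 'none';
      }, 30000);
    </script>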
Intermediate & Advanced SEO | WEOMedia
-
Do you think that Google Places has an influence on organic search?
Hi everyone, do you think that Google Places has an influence on organic search? I mean, if your company is on Google Places, shouldn't your ranking be better for that query? If you agree with me, I'll bring you this strange case: my client has a company profile on Google Places, but it is only fourth in the organic results for that query, and his competitor is ranked first with no Google Places profile, no authority, and no company in the city (in fact it is not in the city, but 30 miles away). The query is "poltrona frau brescia": Poltrona Frau is the name of the company and Brescia is the city in which the company is located. Thanks
Intermediate & Advanced SEO | guidoboem