Removing pages from index
-
Hello,
I run an e-commerce website. I just realized that Google has "pagination" pages in the index which should not be there. In fact, I have no idea how they got there. For example, www.mydomain.com/category-name.asp?page=3434532
There are hundreds of these pages in the index. There are no links to these pages on the website, so I am assuming someone is trying to ruin my rankings by linking to pages that do not exist. The page content displays category information with no products. I realize it's a flaw in the design, and I am working on fixing it (301 redirecting non-existent pages). Meanwhile, I am not sure if I should request removal of these pages. If so, what is the best way to request bulk removal?
Also, should I 301, 404 or 410 these pages?
Any help would be appreciated.
Thanks,
Alex
-
Yes, the no-content pages are a big problem. If you have a "view all" option and it covers more than a dozen, fifteen, or maybe 20 products, that listing should be paginated, with full indexing. Maile Ohye even talked about that specific scenario of "view all" being good.
In my experience, all of the no-content pages should ideally be 301 redirected to the most relevant, highest-level category page on your site.
Since there are so many, there's no easy way to get them removed from the index other than implementing the 301s and then being patient while Google recrawls and reconfirms.
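That 301 approach boils down to a range check at request time. Here's a minimal sketch in Python to illustrate the logic (the site itself runs classic ASP, so this is pseudologic to port, not a drop-in fix; `PAGE_SIZE`, the function name, and the URL shape are all assumptions):

```python
# Illustrative sketch of "301 out-of-range pagination to the category page".
# PAGE_SIZE and the URL format are hypothetical, not the asker's actual setup.

PAGE_SIZE = 20

def handle_category_request(category, requested_page, product_count):
    """Return (status, location) for a category pagination request."""
    total_pages = max(1, -(-product_count // PAGE_SIZE))  # ceiling division
    if requested_page < 1 or requested_page > total_pages:
        # Out-of-range page: permanent redirect to the top category page
        return 301, f"/{category}.asp"
    return 200, None  # a real page; render it normally
```

With 95 products at 20 per page there are 5 real pages, so a request for `?page=3434532` gets a 301 back to `/category-name.asp` instead of rendering an empty listing.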
-
Ah - that's definitely better, if you don't go too wide. The 2009-2010 concept of not having too many links was taken too far by too many people. Sites became too flat.
Categories and pagination are best served by having enough categories to cover the highest-level groups, with sub-categories as appropriate, but not to the point where any single sub-category holds only a few products. So if you've got more than a dozen or fifteen products in a category or sub-category, pagination is perfectly valid.
Having more than six, eight, or maybe ten top-level categories is also not good.
-
Alan, I think I misspoke. I meant to say that a categorically structured set of your products would be better to index than a paginated version. For example:
http://www.sunglasses.com/mens/black/productx
as opposed to http://www.sunglasses.com/products?page=233
Is it still considered wise to index both paginated results along with categorized results in this case?
-
Hi Alan,
Thanks for the info. I was going to set my page 2+ to "noindex,follow", however your reply makes sense. I will leave them indexable. I do see some competitors "rel=canonical" pagination to "view all" pages. I think I will keep my pages as is.
However, as my reply to Ryan stated, my issue is still the INDEX.
Google has thousands of "no content" pages indexed. They contain links to other "no content" pages, making my site look thin. This may be the reason we lost so much ranking/traffic with the Panda update.
How do I get these pages removed from the index? And do I return 301, 404 or 410 when Google comes back to reindex them?
Thanks for your help!
Alex
-
Hi Ryan,
I crawled the site and did not find links to these pages; however, it made me realize another HUGE issue. Since the paging is dynamically created, every page has "back" and "forward" links no matter which page you are on. So if page #5000 is displayed, it will link to pages #4999 and #5001. Although my website does not link to pages that do not exist, all it takes is someone linking to my site with "page=10000" and Google indexing that page. From that point on, Google will follow the chain and index all the pages that do not exist.
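The runaway prev/next chain described above can be cut off by validating the requested page first and only emitting links to pages that actually exist. A minimal sketch (the function name and link format are hypothetical):

```python
def pagination_links(current_page, total_pages):
    """Emit prev/next links only for pages that actually exist."""
    if not 1 <= current_page <= total_pages:
        return []  # invalid page requested: emit no pagination links at all
    links = []
    if current_page > 1:
        links.append(f"?page={current_page - 1}")  # "back" link
    if current_page < total_pages:
        links.append(f"?page={current_page + 1}")  # "forward" link
    return links
```

With this guard, a request for page 5000 of a 5-page category produces no links at all, so a crawler that lands on a bogus URL has nowhere further to go.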
Thanks again for getting me a step closer to resolving my problem.
However, the problem is still the INDEX. Google has pages indexed with no content (now I realize it's in the thousands). These pages just contain links to other paging pages that have no content, plus my main menu/categories.
How do I get these pages removed from the index?
Thanks again!
Alex
-
For the record, the link SSCDavis referenced includes Matt Cutts discussing faceted navigation, not pagination. Faceted navigation is different from pagination by leaps and bounds. So he (SSCDavis), with all due respect, is absolutely incorrect in his claim of what Matt said.
Maile Ohye, Senior Support Engineer at Google, definitely recommends allowing pagination to be indexed, if implemented properly. She even discussed this at length this week at SMX Advanced in Seattle. Vanessa Fox, head of Nine by Blue, former Googler, and creator of Google Webmaster Tools, agrees.
And so do I.
When performed properly, pagination (with quality optimization of paginated pages) can lead to dramatic increases in individual products indexed, higher quality visits from people further along in the buying process, and more people finding the site through an exponentially greater number of keyword phrases.
Consider this: with pagination (X products on the initial page, X additional DIFFERENT products on page 2, still more different products on page 3, etc.), by not wanting those pages indexed you're communicating to Google, "Hey, we don't care about these other products enough to include them." That gives Google a false and negative understanding of how many products you have in your catalog. And THAT drives the overall strength of your catalog down.
Now, if, on the other hand, you already show ALL of your products on a top level page that is linked from the main navigation, then sure, pagination should be killed. But only if that's the case.
-
Alex,
I would highly recommend crawling your website and examining the crawl report. If Google is indexing these pages, then it got to them through your site at some point. I would proceed with the mindset that this is a web design issue, not someone trying to ruin your rankings, as you suggested.
The crawl report will show the referrer page which can help troubleshoot the issue. When you have pages generated by a CMS or other software, there can easily be issues like the one you are experiencing. In my experience this is the most likely cause of your issue.
You mentioned there are hundreds of these pages in the index. If you can determine a pattern they match, you can likely 301 all of them with a single rule, sending the user to your main category page or wherever you feel is best.
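Ryan's "single rule" idea amounts to matching the pagination URL pattern and redirecting everything it catches. Here's a rough sketch of that matching in Python (the regex and paths are assumptions based on the example URL in the question; a real deployment would express this as a server rewrite rule and would exempt page numbers that genuinely exist):

```python
import re

# Hypothetical pattern for URLs like /category-name.asp?page=3434532
PAGINATION = re.compile(r"^/(?P<category>[\w-]+)\.asp\?page=\d+$")

def redirect_target(path_and_query):
    """Return (301, target) for pagination URLs, or None to serve normally."""
    m = PAGINATION.match(path_and_query)
    if m:
        # Send the visitor (and Googlebot) to the bare category page
        return 301, f"/{m.group('category')}.asp"
    return None
```

Note this blanket version also catches valid paginated pages; combining it with a range check (as in Alan's answer) keeps the real pages serving 200.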
You can also set up parameter-specific instructions in Google WMT, though I would avoid doing this until after you have reviewed your crawl report. From your Google WMT dashboard: Site Configuration > Settings > Parameter Handling tab > find or add your parameter and adjust the setting as you see fit.
-
Edit: Please see Alan's answer.