Removing pages from index
-
Hello,
I run an e-commerce website. I just realized that Google has "pagination" pages in the index which should not be there. In fact, I have no idea how they got there. For example, www.mydomain.com/category-name.asp?page=3434532
There are hundreds of these pages in the index. There are no links to these pages on the website, so I am assuming someone is trying to ruin my rankings by linking to pages that do not exist. The page content displays category information with no products. I realize that it's a flaw in the design, and I am working on fixing it (301 redirecting nonexistent pages). Meanwhile, I am not sure if I should request removal of these pages. If so, what is the best way to request bulk removal?
Also, should I 301, 404 or 410 these pages?
Any help would be appreciated.
Thanks,
Alex
-
Yes, the no-content page issue is a big problem. If you have a "view all" option and it's more than a dozen, fifteen, or maybe 20 products, that should be paginated, with full indexing. Maile Ohye even talked about that specific scenario of "view all" being good.
In my experience, all of the no-content pages should ideally be 301 redirected to the most relevant, highest-level category page on your site.
Since there are so many, there's no easy way to get them removed from the index other than implementing the 301s and then being patient while Google recrawls and re-confirms.
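To make the 301 advice concrete, here is a minimal sketch (in Python, purely for illustration; the function name, `MAX`-style threshold, and category URL are assumptions, not anything from the actual site) of the decision the server should make when a `?page=N` request comes in:

```python
# Hypothetical sketch: decide how to answer a paginated category request
# so that out-of-range "no content" pages permanently redirect to the
# most relevant category page instead of rendering an empty page.

def pagination_response(requested_page: int, total_pages: int, category_url: str):
    """Return (status_code, redirect_location) for a ?page=N request."""
    if 1 <= requested_page <= total_pages:
        return (200, None)          # a real page: serve it normally
    # page beyond the real catalogue: 301 to the category page itself
    return (301, category_url)

print(pagination_response(3, 120, "/sunglasses.asp"))        # real page
print(pagination_response(3434532, 120, "/sunglasses.asp"))  # junk page
```

The same check works whether you prefer 301 or 410; only the returned status changes.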
-
Ah - that's definitely better, if you don't go too wide. 2009 - 2010's concept of not having too many links went too far with too many people. Sites became too flat.
Categories and pagination are best served by having enough categories to cover the highest-level groups, with sub-categories as appropriate, but not to the point where there are only a few products in any single sub-category. So if you've got more than a dozen or fifteen products in a category or sub-category, pagination is perfectly valid.
Having more than six, eight, or maybe ten categories at most is also not good.
-
Alan, I think I misspoke. I meant to say that a categorically structured set of your products would be better to index than a paginated version. For example :
http://www.sunglasses.com/mens/black/productx
as opposed to http://www.sunglasses.com/products?page=233
Is it still considered wise to index both paginated results along with categorized results in this case?
-
Hi Alan,
Thanks for the info. I was going to set my page 2+ to "noindex,follow"; however, your reply makes sense. I will leave them indexable. I do see some competitors using rel=canonical to point pagination at "view all" pages. I think I will keep my pages as is.
However, as my reply to Ryan stated, my issue is still the INDEX.
Google has thousands of "no content" pages indexed. They contain links to other "no content" pages, making my site look thin. This may be the reason we lost so much ranking/traffic with the Panda update.
How do I get these pages removed from the index? And do I return 301, 404 or 410 when Google comes back to reindex them?
Thanks for your help!
Alex
-
Hi Ryan,
I crawled the site and did not find links to these pages; however, it made me realize another HUGE issue. Since the paging is dynamically created, it has "back" and "forward" links no matter what page you are on. So, if page #5000 is displayed, it will have links to page #4999 and #5001. Although my website does not link to pages that do not exist, all it takes is someone linking to my site with "page=10000" and Google indexing that page. From that point on, Google will index all the paginated pages that do not exist.
Thanks again for getting me a step closer to resolving my problem.
However, the problem is still the INDEX. Google has (I now realize it's in the thousands) pages indexed with no content. These pages just contain links to other paginated pages that have no content, plus my main menu/categories.
How do I get these pages removed from the index?
Thanks again!
Alex
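The runaway prev/next issue described above can be closed off by only emitting links that fall inside the real page range, so a request for page 5000 can never advertise pages 4999 and 5001. A minimal sketch (function and variable names are illustrative, not from the actual site):

```python
# Hypothetical sketch: build prev/next links only when the target page
# actually exists, so out-of-range requests emit no pagination links at all.

def pagination_links(current: int, total_pages: int):
    """Return the (label, page_number) links to render for this page."""
    links = []
    if 1 < current <= total_pages:
        links.append(("prev", current - 1))   # previous page exists
    if 1 <= current < total_pages:
        links.append(("next", current + 1))   # next page exists
    return links

print(pagination_links(5, 120))     # normal page: prev and next
print(pagination_links(5000, 120))  # bogus page: no links emitted
```

Combined with a 301 or 410 response for the out-of-range page itself, this stops the chain of nonexistent pages from ever being crawlable.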
-
For the record, the link that SSCDavis referenced includes Matt Cutts discussing faceted navigation, not pagination. Faceted navigation is different from pagination by leaps and bounds. So he (SSCDavis), with all due respect, is absolutely incorrect in his claim of what Matt said.
Maile Ohye, Senior Support Engineer at Google, definitely recommends allowing pagination to be indexed, if implemented properly. She even discussed this at length this week at SMX Advanced in Seattle. Vanessa Fox, head of Nine by Blue, former Googler, and creator of Google Webmaster Tools, agrees.
And so do I.
When performed properly, pagination (with quality optimization of paginated pages) can lead to dramatic increases in individual products indexed, higher quality visits from people further along in the buying process, and more people finding the site through an exponentially greater number of keyword phrases.
Consider this: with pagination (X products on the initial page, X additional DIFFERENT products on page 2, still more and different products on page 3, etc.), by not wanting those pages indexed you're communicating to Google: "Hey, we don't care about these other products enough to include them." Which means they get a false and negative understanding of how many products you have in your catalog. And THAT drives the overall strength of your catalog down.
Now, if, on the other hand, you already show ALL of your products on a top level page that is linked from the main navigation, then sure, pagination should be killed. But only if that's the case.
-
Alex,
I would highly recommend crawling your website and examining the crawl report. If Google is indexing these pages, then it got to them on your site at some point. I would proceed with the idea in mind that this is a web design issue, not someone trying to ruin your rankings, as you suggested.
The crawl report will show the referrer page which can help troubleshoot the issue. When you have pages generated by a CMS or other software, there can easily be issues like the one you are experiencing. In my experience this is the most likely cause of your issue.
You mentioned there are hundreds of these pages in the index. If you can determine a pattern they match, you may be able to 301 all of them with a single rule, sending the user to your main category page or wherever you feel is best.
You can also set up parameter-specific instructions in Google WMT. I would avoid doing this until after you have reviewed your crawl report. From your Google WMT dashboard: Site Configuration > Settings > Parameter handling tab > find or add your parameter and adjust the setting as you deem fit.
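The "single rule" approach above can be sketched with a regex. This is a hypothetical illustration only: the URL pattern and the `MAX_REAL_PAGE` threshold are assumptions, and the actual rule would live in your server's rewrite configuration rather than in application code.

```python
import re

# Hypothetical sketch: if the junk URLs all follow one pattern
# (e.g. /category-name.asp?page=NNN with an out-of-range page number),
# one rule can compute a 301 target for every one of them.

BAD_PAGE = re.compile(r"^/(?P<category>[\w-]+)\.asp\?page=(?P<page>\d+)$")
MAX_REAL_PAGE = 50  # assumed highest legitimate page number

def redirect_target(path: str):
    """Return the 301 target for a junk pagination URL, or None to serve normally."""
    m = BAD_PAGE.match(path)
    if m and int(m.group("page")) > MAX_REAL_PAGE:
        return "/%s.asp" % m.group("category")
    return None

print(redirect_target("/category-name.asp?page=3434532"))  # junk: redirect
print(redirect_target("/category-name.asp?page=3"))        # real: serve
```

The same match-and-rewrite logic translates directly into an IIS URL Rewrite or Apache mod_rewrite rule.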
-
**Edit:** Please see Alan's answer.