URL Parameter Handling in GWT to Treat Over-indexation - How Aggressive?
-
Hi,
My client recently launched a new site, and their indexed page count went from about 20K up to about 80K, which is severe over-indexation.
I believe this was caused by URL parameters: some category pages now return 700 results for "site:domain.com/category1", and apart from the top result, they are all parameterized URLs that have been indexed.
My question is: how aggressive should I be in blocking these parameters in Google Webmaster Tools? Currently, everything is set to 'Let Googlebot decide'.
-
Hi! Did these answers take care of your question, or do you still have some questions?
-
Hey There
I would use a robots meta noindex on them (except for the top page, of course) and use rel=prev/next to show they are paginated.
I would prefer that to using WMT. Also, the WMT crawl settings will stop the crawling but will not remove the pages from the index. Plus, WMT only handles Google, not other engines like Bing. Not that Bing matters much, but it's always better to have a universal solution.
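For illustration, a minimal sketch of the markup Dan describes, assuming the paginated URLs look like domain.com/category1?page=2 (the page parameter and domain are placeholders):

    <!-- On every paginated page except the first: keep it out of the
         index, but let Googlebot keep following its links -->
    <meta name="robots" content="noindex, follow">

    <!-- In the <head> of page 2, point at the neighboring pages -->
    <link rel="prev" href="http://domain.com/category1?page=1">
    <link rel="next" href="http://domain.com/category1?page=3">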
-Dan
-
Hello Search Guys,
Here is some food for thought taken from: http://www.quora.com/Does-Google-limit-the-number-of-pages-it-indexes-for-a-particular-site
Summary:
"Google says they crawl the web in "roughly decreasing PageRank order" and thus, pages that have not achieved widespread link popularity, particularly on large, deep sites, may not be crawled or indexed."
"Indexation
There is no limit to the number of pages Google may index (meaning available to be served in search results) for a site. But just because your site is crawled doesn't mean it will be indexed.Crawl
The ability, speed and depth for which Google crawls your site and retrieves pages can be dependent on a number of factors: PageRank, XML sitemaps, robots.txt, site architecture, status codes and speed.""For a zero-backlink domain with 80.000+ pages, in conjunction with rel=canonical and an xml-sitemap (You do submit a sitemap, don't you?), after submitting the domain to Google for a crawl, a little less than 10k pages remained in index. A few crawls later this was reduced to a mere 250 (very good job on Google's side).
This leads me to believe the indexation cap for a newer site with low to zero pagerank/authority is around 10k."
Another interesting article: http://searchenginewatch.com/article/2062851/Google-Upping-101K-Page-Index-Limit
Hope this helps. The short answer is to limit crawling to the pages you actually need, as aggressively as possible, so that the unneeded URLs drop out of the index and only the needed ones remain.
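Since the quoted advice leans on rel=canonical and an XML sitemap, here is a minimal sitemap sketch; the URLs are placeholders, and the point is to list only the canonical, parameter-free pages you want indexed:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- Canonical category pages only; no parameterized variants -->
      <url>
        <loc>http://domain.com/category1</loc>
      </url>
      <url>
        <loc>http://domain.com/category2</loc>
      </url>
    </urlset>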
Related Questions
-
Duplicate URLs on eCommerce site caused by parameters
Hi there, We have a client with a large eCommerce site that has about 1,500 duplicate URLs caused by parameters (such as a sort parameter where the list of products is sorted by price, age, etc.). Example: www.example.com/cars/toyota
First duplicate URL: www.example.com/cars/toyota?sort=price-ascending
Second duplicate URL: www.example.com/cars/toyota?sort=price-descending
Third duplicate URL: www.example.com/cars/toyota?sort=age-descending
Originally we advised adding a robots.txt rule to block search engines from crawling the URLs with parameters, but this hasn't been done. My question: if we add the robots.txt rule now and exclude all URLs with filters, how long will it take for Google to disregard the duplicate URLs? We could ask the developers to add canonical tags to all the duplicates, but there are about 1,500 of them... Thanks in advance for any advice!
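For reference, a minimal robots.txt sketch of the blocking approach described above, assuming sort is the only parameter to exclude (Google honors the * wildcard in Disallow rules):

    User-agent: *
    # Block any URL where sort appears as the first or a later parameter
    Disallow: /*?sort=
    Disallow: /*&sort=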
Intermediate & Advanced SEO | | Gabriele_Layoutweb0 -
URL Construction
Working on an old site that currently has category URLs (that rank productively) like this example: LakeNameBoating.com/category/705687/rentals I want to enhance the existing mid-page-one rank for terms related to "Lake Name Boat Rentals." 301ing the old URLs to the new ones, would you construct the new URLs as LakeNameBoating.com/lake-name-boat-rentals or LakeNameBoating.com/boat-rentals? And why? It's all for one particular lake, with "Name" being just an anonymous placeholder. Thanks!
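For illustration, the 301 mentioned above expressed as an Apache .htaccess rule, using the placeholder paths from the question (other servers have equivalent directives):

    # Old category URL -> new keyword-rich URL (placeholder paths)
    Redirect 301 /category/705687/rentals /lake-name-boat-rentals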
Intermediate & Advanced SEO | | 945010 -
Is it a problem if a URL has too many slashes in its address?
The ecommerce platform of the site I am working on generates URLs that contain ID codes for each product category, color variation, style, etc. An example URL for a specific product: www.example.com/women/denim-jeans/py/c/109/np/108/p/3834.html Is it a problem for search engine crawlers if a URL has so many slashes (i.e., directory levels) in its address? Appreciate your feedback.
Intermediate & Advanced SEO | | SEO_Promenade0 -
How do I best handle a minor URL change?
My company is about to complete an upgrade to our website, but part of this will be changing the URLs slightly; mainly, the .aspx suffix will be dropped from the pages we're most worried about. The current URLs will automatically redirect to the new pages. Will this be enough, or will there be an SEO impact? If it helps, the site is www.duracard.com and the product pages are the ones we want to keep ranked. For instance, if someone searches for "plastic gift cards", our page 'https://www.duracard.com/products/plastic-gift-cards.aspx' is #3, and we want to make sure it stays that way once we change it to 'https://www.duracard.com/products/plastic-gift-cards'. Any advice would be greatly appreciated, thank you!
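For what it's worth, a hedged sketch of that redirect as an Apache-style rewrite rule; this is purely illustrative, since an ASP.NET site like this would more likely use the IIS URL Rewrite module with an equivalent rule:

    RewriteEngine On
    # Permanently redirect any .aspx URL to the same path without the suffix
    RewriteRule ^(.*)\.aspx$ /$1 [R=301,L]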
Intermediate & Advanced SEO | | Andrea.G0 -
Using abbreviations in URL - Matching Keyword
We have a website that uses /us/, /ca/, /va/, etc. in the URLs for different U.S. states. How much better is it (or is it better at all) to use /california/ or /virginia/ instead in our URLs to rank for searches that include the name of those states?
Intermediate & Advanced SEO | | Heydarian0 -
Should I change WordPress URLs?
Should I change my WordPress permalinks to include the keyword? For example, at the minute my URL is http://www.musicliveuk.com/home/wedding-singer. Is it better to be http://www.musicliveuk.com/live-bands/wedding-singer? 'home' is not relevant, so surely 'live-bands' would be better? If I change the URLs, won't I lose 'link juice', as external links will all point to URLs that no longer exist? Or will WordPress automatically redirect the old URL to the new one? Finally, if I should change the URLs as described, how do I do it in WordPress? I can only see how to edit the last bit of the URL, not the middle bit.
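WordPress may not redirect a structure change like this reliably on its own, so the usual approach is to add an explicit rule (or use a redirect plugin). A minimal sketch as an Apache .htaccess rule, using the URLs from the question:

    # Send old /home/ permalinks to the new /live-bands/ structure
    RedirectMatch 301 ^/home/(.*)$ /live-bands/$1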
Intermediate & Advanced SEO | | SamCUK0 -
Aggressive Loss in Rankings
I recently launched a website and used local directories, web directories and guest blogging as a means of getting links to the site. Within 60 days the site was ranking on the first page for a highly competitive keyword. It was hovering on the first page for about two weeks before it plummeted in ranking to near 200. I am not finding any crawl errors or duplicate content issues on the site. Could my links have caused this massive decrease in rankings?
Intermediate & Advanced SEO | | mj7750 -
Quick URL structure question
Say you've got 5,000 articles, each from 2-3 generations of taxonomy. For example:
example.com/motherboard/pc/asus39450
example.com/soundcard/pc/hp39
example.com/ethernet/software/freeware/stuffit294
None of the articles were SUPER popular as is, but combined they still bring in a bit of residual traffic. A few thousand visits or so a day. You're switching to a brand new platform. Awesome new structure, taxonomy, etc. The real deal. But the new platform doesn't carry over the old taxonomy, and the articles above, if created today, would file under example.com/hardware/. This is the way it is from here on out. But what to do with the historical files?
1. Keep the original URL structure in the new system. Readers might be confused if they try to reach example.com/motherboard, but at least you retain all SEO weight, and these articles are all older anyway. Who cares? Grab some lunch.
2. Change the URLs to /hardware/ and redirect everything the right way. Lose some rank maybe, but it's a smooth operation, nice and neat. Grab some dinner.
3. Change the URLs to /hardware/, DON'T redirect, and surprise Google with 5K articles about old computer hardware. Magical traffic splurge; go skydiving.
4. Panic, cry into your pillow. Get a job signing receipts at Costco.
Thoughts?
Intermediate & Advanced SEO | | EricPacifico0