URL Parameter Handling in GWT to Treat Overindexation - How Aggressive?
-
Hi,
My client recently launched a new site and their indexed page count went from about 20K up to about 80K - severe overindexation.
I believe this was caused by parameter handling: some category pages now show 700 results for "site:domain.com/category1", and apart from the top result they are all parameterized URLs being indexed.
My question is how active/aggressive should I be in blocking these parameters in Google Webmaster Tools? Currently, everything is set to 'let googlebot decide'.
-
Hi! Did these answers take care of your question, or do you still have some questions?
-
Hey There
I would use a robots meta noindex on them (except for the top page, of course) and use rel="prev"/"next" to show they are paginated.
I would prefer that to using WMT. Also, the WMT crawl settings will stop the crawling, but they won't remove the pages from the index. Plus, WMT only handles Google, not other engines like Bing. Not that Bing matters much, but it's always better to have a universal solution.
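Something like this in the head of each parameter page - a minimal sketch with hypothetical URLs; I'm assuming a "page" parameter, so swap in whatever the site actually uses:

    <head>
      <!-- hypothetical: page 3 of a paginated category -->
      <!-- keep the page out of the index, but let crawlers follow its links -->
      <meta name="robots" content="noindex, follow">
      <!-- declare its place in the paginated series -->
      <link rel="prev" href="http://domain.com/category1?page=2">
      <link rel="next" href="http://domain.com/category1?page=4">
    </head>

The "follow" part matters - it keeps link equity flowing through the series while the duplicates drop out of the index, and meta noindex is honored by Bing and the rest too, which is the universal-solution part.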
-Dan
-
Hello Search Guys,
Here is some food for thought taken from: http://www.quora.com/Does-Google-limit-the-number-of-pages-it-indexes-for-a-particular-site
Summary:
"Google says they crawl the web in "roughly decreasing PageRank order" and thus, pages that have not achieved widespread link popularity, particularly on large, deep sites, may not be crawled or indexed."
"Indexation
There is no limit to the number of pages Google may index (meaning available to be served in search results) for a site. But just because your site is crawled doesn't mean it will be indexed.Crawl
The ability, speed and depth for which Google crawls your site and retrieves pages can be dependent on a number of factors: PageRank, XML sitemaps, robots.txt, site architecture, status codes and speed.""For a zero-backlink domain with 80.000+ pages, in conjunction with rel=canonical and an xml-sitemap (You do submit a sitemap, don't you?), after submitting the domain to Google for a crawl, a little less than 10k pages remained in index. A few crawls later this was reduced to a mere 250 (very good job on Google's side).
This leads me to believe the indexation cap for a newer site with low to zero pagerank/authority is around 10k."
Another interesting article: http://searchenginewatch.com/article/2062851/Google-Upping-101K-Page-Index-Limit
Hope this helps. The easy response is to limit crawling to the most-needed pages as aggressively as possible, removing the unneeded links and leaving only the needed ones.
-
Related Questions
-
How to deal with parameter URLs as primary internal links and not canonicals? Weird situation inside...
So I have a weird situation, and I was hoping someone could help. This is for an ecommerce site.

1. Parameters are used to tie Product Detail Pages (PDPs) to individual categories. This is represented in the breadcrumbs for the page and in the use of a categoryid; one product can thus be included in multiple categories.
2. All of these PDPs have a canonical that does not include the parameter / categoryid.
3. With very few exceptions, the canonical URLs for the PDPs are not linked to. Instead, the parameter URL is used, to tie the PDP to a specific category - primarily for the sake of breadcrumbs, it seems.

One of the big issues we've been having is the canonical URLs not being indexed for a lot of the products. In some instances the canonicals are indexed alongside parameters, or just the parameter URLs are indexed. It's all very... mixed up, I suppose. My theory is that the majority of canonical URLs not being linked to anywhere on the site is forcing Google to put preference on the internal link instead.

My problem? I have no idea what to recommend to the client (who will not change the parameter setup). One of our Technical SEOs recommended we "use cookies instead of parameters to assign breadcrumbs based on how the PDP is accessed". I have no experience with this. So... yeah. Any thoughts? Suggestions? Thanks in advance.
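A minimal sketch of the setup described above, with hypothetical URLs - the internal links carry the categoryid while the canonical strips it:

    <!-- internal link from a category listing, tying the PDP to category 45 -->
    <a href="https://example.com/product-detail?productid=123&categoryid=45">Blue Widget</a>

    <!-- canonical in the head of that PDP, with the categoryid stripped -->
    <link rel="canonical" href="https://example.com/product-detail?productid=123">

Since almost nothing on the site links to the canonical form, the parameterized URL is the only version Google actually discovers, which is consistent with the mixed indexing described.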
Intermediate & Advanced SEO | Alces
-
How to Handle Spammy Top Referring Domains
We keep getting links from the domain lyricswithoutmelody.org - they're currently our top referring domain by number of backlinks. I'm not sure what to do about it... is it hurting us? I know I can disavow them, but I'm afraid that will hurt, since we have 472 total backlinks from the domain. Their trust flow is 9 and citation flow is 11. Another option I was thinking of is to block the domain's IP from seeing our website - would that work? Just trying to figure out the best course of action... or whether no action at all is best. I've attached a screenshot of my top referring domains; the ones outlined in red I don't recognize, and I don't know if they're helping or hurting. Moz Fam HELP!
Intermediate & Advanced SEO | LindsayE
-
Pagination new pages vs parameters
I'm working on a site that currently handles pagination like this:

cars-page?p=1
cars-page?p=2

In Webmaster Tools I can then tell Google that ?p= designates pagination. However, I have a plugin I want to add to fix other SEO issues; among other things, it adds rel="prev"/"next" and modifies the pagination to this:

cars-page-1.html
cars-page-2.html

Notice I lost the parameter here, and now each page is a different URL - pagination is no longer a parameter, so I will no longer be able to specify the pagination parameter in Webmaster Tools. Would this confuse Google, since the pagination is no longer a parameter and there will now be multiple URLs instead of one page with parameters? My gut says this would be bad, as I haven't seen this approach often on ecommerce sites, but I wanted to see what the community thought.
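For reference, a minimal sketch (hypothetical URLs) of the markup the plugin would presumably emit on the new static pages - here, page 2 of the series:

    <!-- in the head of cars-page-2.html -->
    <link rel="prev" href="https://example.com/cars-page-1.html">
    <link rel="next" href="https://example.com/cars-page-3.html">

With the series declared in the markup itself, the ?p= parameter setting in Webmaster Tools becomes unnecessary - rel="prev"/"next" tells Google about the pagination regardless of URL format.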
Intermediate & Advanced SEO | K-WINTER
-
How to handle brand description on product pages?
Hi Mozzers, hope you're doing well. I have a question about content placement. Assume I have 1,000 products from brand A, 1,000 from brand B, and so on. Now, say I want to put a brand-specific 200-word description on each of these product pages. I'd be creating duplicate content across the site by putting the exact same brand description on all of them - brand A's description on the first 1,000 pages, brand B's on the next 1,000, and so on. I'm looking for expert advice on placement here: how can I add brand descriptions to product pages and avoid a duplicate content penalty? Any help?
Intermediate & Advanced SEO | _nitman
-
How best to handle (legitimate) duplicate content?
Hi everyone, appreciate any thoughts on this (bit long, sorry). I am working on 3 sites selling the same thing; the main difference between them is physical location/target market area (think North, South, West as an example). Now, say these 3 sites all sell Blue Widgets, and thus all on-page optimisation has been done for this keyword. These 3 sites are now effectively duplicates of each other - well, the Blue Widgets page is, at least - and whilst there are no 'errors' in Webmaster Tools, I am pretty sure they ought to be ranking better than they are (good PA, DA, mR etc). The sites share the same template/look and feel too, AND are accessed via the same IP - just for good measure 🙂 So - to questions/thoughts:

1. Is it enough to try and get creative with on-page changes to 'de-dupe' them? Kinda tricky with the Blue Widgets example - how many ways can you say that? I could focus on the geographical element a bit more, but I would like to rank well for Blue Widgets generally.
2. I could, I guess, noindex/nofollow the Blue Widgets page on 2 of the sites - seems a bit drastic though (or robots.txt them).
3. I could even link (via internal navigation) sites 2 and 3 to site 1's Blue Widgets page and thus make 2 Blue Widgets pages redundant?
4. Is there anything HTML-coding-wise I could do to pull site 1's content into sites 2 and 3, without cloaking or anything nasty like that?

I think 1 is the first thing to do. Anything else? Many thanks.
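On point 4, one hedged possibility (it consolidates the duplication rather than pulling content in) is a cross-domain rel=canonical from two of the sites to the third - hypothetical domains:

    <!-- in the head of the Blue Widgets page on sites 2 and 3 -->
    <link rel="canonical" href="http://site1.example/blue-widgets">

The trade-off is that sites 2 and 3 effectively give up ranking that page themselves, so this only fits if the logic of option 3 - making two of the pages redundant - is acceptable.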
Intermediate & Advanced SEO | Capote
-
Dynamic URLs Appearing on Google Page 1. Convert to Static URLs or Not?
Hi, I have a client who uses dynamic URLs throughout his site. For SEO purposes, I've advised him to convert dynamic URLs to static URLs whenever possible. However, the client has a few dynamic URLs that are appearing on Google Page 1 for strategically valuable keywords. For these URLs, is it still worth it to 301 them to static URLs? In this case, what are the potential benefits and/or pitfalls?
Intermediate & Advanced SEO | mindflash
-
Automatic redirect to external URLs
Hi, is there a way to create a "bridge page" with an automatic URL redirect (302) without a Google penalty? At the moment, my bridge pages are indexed on Google with the title and description of the redirected page. Thanks in advance. Mauro.
Intermediate & Advanced SEO | raulo79
-
How to fix duplicated URLs
I have an issue with duplicated pages. Should I use the canonical tag, and if so, how? Or should I change the page titles? This is causing my pages to compete with each other in the SERPs. The title 'Paradisus All Inclusive Luxury Resorts - Book your stay at Paradisus Resorts' is used on http://www.paradisus.com/booking-template.php and also on:

http://www.paradisus.com/booking-template.php?codigoHotel=5889
http://www.paradisus.com/booking-template.php?codigoHotel=5891
http://www.paradisus.com/booking-template.php?codigoHotel=5910
http://www.paradisus.com/booking-template.php?codigoHotel=5911
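If the codigoHotel variants are meant to be one bookable page, a hedged sketch of the canonical approach, using the URLs from the question:

    <!-- in the head of each parameterized variant, e.g. ?codigoHotel=5889 -->
    <link rel="canonical" href="http://www.paradisus.com/booking-template.php">

If each codigoHotel actually represents a distinct hotel, a unique title per hotel would be the better fix than canonicalizing them all to the generic template.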
Intermediate & Advanced SEO | Melia