How to resolve - Googlebot found an extremely high number of URLs
-
Hi,
We got this message from Google Webmaster Tools: "Googlebot found an extremely high number of URLs on your site". The sample URLs provided by Google are all either noindexed or carry a canonical tag.
- http://www.myntra.com/nike-stylish-show-caps-sweaters
- http://www.myntra.com/backpacks/f-gear/f-gear-unisex-black-&-purple-calvin-backpack/162453/buy?src=tn&nav_id=541
- http://www.myntra.com/kurtas/alma/alma-women-blue-floral-printed-kurta/85178/buy?nav_id=625
We have also specified the parameters on these URLs, with a representative URL, in Google Webmaster Tools under URL Parameters.
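For context, the noindex and canonical setup described above looks roughly like the sketch below. This is an illustration only; the canonical target is assumed to be the parameter-free version of one of the sample URLs.

```
<!-- On a parameterized URL such as .../85178/buy?nav_id=625, point to the
     clean representative URL (assumed to be the parameter-free version) -->
<link rel="canonical" href="http://www.myntra.com/kurtas/alma/alma-women-blue-floral-printed-kurta/85178/buy" />

<!-- Or, on pages that should stay out of the index entirely -->
<meta name="robots" content="noindex, follow" />
```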
Your comments on how to resolve this issue will be appreciated.
Thank You
Kaushal Thakkar
-
Hi Kaushal,
Thanks for the question.
There are a few ways to deal with this problem, all recommended by Google here. In summary, you can:
- Use parameter handling in Webmaster Tools, as you have done
- Add the nofollow attribute to links pointing at problematic URLs
- Block problematic URLs in robots.txt (see the sketch after this list)
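As a rough illustration of the robots.txt option, something like the following could work. The src and nav_id parameter names are taken from the sample URLs in your question; adjust the patterns to your own URL structure and verify them with the robots.txt testing tool before deploying.

```
# Sketch only: block crawling of parameterized duplicate URLs
User-agent: *
Disallow: /*?src=
Disallow: /*&src=
Disallow: /*?nav_id=
Disallow: /*&nav_id=
```

One caveat worth noting: once a URL is blocked in robots.txt, Googlebot can no longer see the noindex or canonical tags on that page, so it's best to pick one approach per URL pattern rather than stacking all of them.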
There is also a thread in the Google Webmaster forums which may be useful to you.
Overall, it comes down to having a good site architecture and cutting down, removing, or blocking the URLs that you don't care about from a search perspective.
I hope that helps a bit!
Paddy
-
Thank you David. It's been more than 10 months since these parameters were specified in Webmaster Tools. This, along with other measures like noindex and canonical tags, helped us reduce the indexed URL count from 32 million to 1.2 million. As the indexed count dropped, this warning from Google stopped for 4 months. However, we started receiving the message again in February 2014.
Thanks
Kaushal
-
"we have specified the parameters on these URLs as representative URL in Google Webmaster - URL parameters."
How long ago was this done? Since there are so many URL's, it may take a while for them to recrawl and index the representative URL's per your request.
Related Questions
-
Backlink issue and how to resolve it
Hi there! We have a client who has been generating backlinks from external sites over a period of two years, all with the same anchor text and all linking back to the home page. This anchor text is also the main search phrase they wish to rank highly for. In total, roughly 300 domains link to their site, and over 50 of these use that same anchor text; the links were generated through articles and blogs. So roughly 20% of the total links share one anchor text. Over the past 6 months the client has noticed a steady drop in their rankings for this term, and from the backlink analysis we have done, we believe this is what is causing the problem. Does anyone else agree? As for the remedy, do we go in and see if we can change the anchor text, or disavow the links through Google Webmaster Tools? Suggestions? Thanks for your help! P 🙂
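For reference, if disavowal turns out to be the right remedy, the disavow file Google Webmaster Tools accepts is just a plain text list. A minimal sketch, with placeholder domains rather than anything from this client:

```
# Disavow file sketch (placeholders only)
# Lines starting with # are comments
domain:spammy-article-site.example
domain:blog-network.example
https://another-site.example/specific-article.html
```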
-
Googlebot crawling AJAX website does not always use _escaped_fragment_
Hi, I started to investigate the Googlebot crawl log of our website, and it appears that there is no 1:1 correlation between crawls of a URL with _escaped_fragment_ and without it. My expectation was that each time Google crawls a URL, a minute or so later it would crawl the same URL using _escaped_fragment_. For example, in the crawl log for https://my_web_site/some_slug, Googlebot crawled the URL 17 times in July (http://i.imgur.com/sA141O0.jpg) but made only 3 additional crawls using _escaped_fragment_ (http://i.imgur.com/sOQjyPU.jpg). Do you have any idea if this behavior is normal? Thanks, Yohay
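For background, the AJAX crawling scheme this question refers to (deprecated by Google in 2015) maps URLs roughly as follows; example.com is a placeholder.

```
# Hash-bang URLs are fetched by Googlebot with the fragment escaped:
https://example.com/#!/some_slug
  -> https://example.com/?_escaped_fragment_=/some_slug

# Pages opting in via <meta name="fragment" content="!"> are fetched as:
https://example.com/some_slug
  -> https://example.com/some_slug?_escaped_fragment_=
```

Googlebot was never guaranteed to fetch the _escaped_fragment_ version on every crawl of the plain URL, so a mismatch in crawl counts like the one described above was not unusual.
-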
Creating pages as exact-match URLs - good, or an over-optimization indicator?
We all know that exact-match domains are not getting the same results in the SERPs after the algorithm changes Google has been pushing through. Does anyone have experience with, or know whether, the same applies to an exact-match URL path (not domain)? Example keyword: "cars that start with A". Which is the better way to create pages on a non-exact-match domain:
www.sample.com/cars-that-start-with-a/, which targets "cars that start with A", or
www.sample.com/starts-with-a/, which again targets "cars that start with A"?
Keep in mind that you'll add more pages starting the exact same way, since you want to cover every letter of the alphabet. So:
www.sample.com/cars-that-start-with-a/
www.sample.com/cars-that-start-with-b/
www.sample.com/cars-that-start-with-c/
or
www.sample.com/starts-with-a/
www.sample.com/starts-with-b/
www.sample.com/starts-with-c/
Hope someone here in the Moz community can help out. Thanks so much!
-
RD and PA high but still not ranking
We picked up a client who, before us, was using link building as their only SEO strategy. They came to us for on-page SEO and overall guidance. We have done some targeted link building and worked with their link-building company to remove some links; however, after some further digging, I'm wondering if we still have some bad links. My reasoning: all the pages we have worked on get A grades in Moz (our backup check), yet the referring domains (RD) and Page Authority (PA) for many of the pages we have focused on are higher than those of the pages that rank on the first page. Any suggestions?
-
Why is this site performing so well in the SERPs and getting high traffic for no apparent reason?
The site is https://virtualaccountant.ie/
- It's a really small site
- They have only about 7 backlinks
- They don't blog
- They don't have a PPC campaign
- They don't stand out from the crowd in terms of products or services offered
So why are they topping the SERPs for difficult-to-rank accounting keywords such as "accountant" and "online accounts"? What are they doing better than everyone else, or have they discovered a way to cheat Google, and worse still, ME!
-
Google Sitemaps & punishment for bad URLs?
Hoping y'all have some input here. It's a long story, but I'll boil it down: Site X bought the URL of Site Y. 301 redirects were added to direct traffic (and help transfer link juice) from URLs on Site X to relevant URLs on Site Y, but two days before a "change of address" notice was submitted in Google Webmaster Tools, an auto-generating sitemap somehow pulled URLs from Site Y into the sitemap of Site X, so the sitemap contained URLs that were not Site X's own. Is there any documentation suggesting Google would punish Site X for having essentially unrelated URLs in its sitemap, downgrading its organic rankings because it views that mistake as a black-hat (or otherwise evil) tactic? I suspect this because the site continues to rank well organically in Yahoo & Bing, yet is suddenly nonexistent on Google. Thoughts?
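For background, under the sitemaps protocol, URLs that don't match the sitemap's own host are supposed to be ignored rather than penalized (cross-host submission only works when both hosts are verified by the same owner in Webmaster Tools). A minimal sketch with placeholder hosts:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Valid: same host as the sitemap itself -->
  <url><loc>http://www.site-x.example/page-1</loc></url>
  <!-- Cross-host entries like this one are ignored by the protocol
       unless both hosts are verified by the same owner -->
  <url><loc>http://www.site-y.example/page-1</loc></url>
</urlset>
```
-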
Stuffing keywords into URLs
The following site ranks #1 in Google for almost every key phrase in its URL path, for almost every page on the site. Example: themarketinganalysts.com/en/pages/medical-translation-interpretation-pharmaceutical-equipment-specifications-medical-literature-hippa/ The last folder in this URL uses 9 keywords, and I've seen as many as 18 on the same site. Curiously, every page is a "default.html" under one of these folders (so much architecture?). Question: how much does stuffing keywords into URL paths affect ranking? And if it has an effect, will Google eventually ferret it out and penalize it?
-
Hi, I found that one of my competitors has zero backlinks in Google, zero in Yahoo, but about 50,000 in Bing. How is that possible?
I assumed that all search engines would find the backlinks. Besides that, he ranks fairly well, better than I do, with only a single site and only one article of content, while I have a lot of content and sites. I do not understand why he is ranking better in Google while Google apparently does not see any of the 50,000 backlinks that Bing is finding. Thanks, Dan