Googlebot found an extremely high number of URLs on your site
-
I keep getting the "Googlebot found an extremely high number of URLs on your site" message in Google Webmaster Tools (GWMT) for one of the sites I manage.
The full message is below:
Googlebot encountered problems while crawling your site.
Googlebot encountered extremely large numbers of links on your site. This may indicate a problem with your site's URL structure. Googlebot may unnecessarily be crawling a large number of distinct URLs that point to identical or similar content, or crawling parts of your site that are not intended to be crawled by Googlebot. As a result Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all of the content on your site.
I understand the nature of the message - the site uses faceted navigation and is genuinely generating a lot of duplicate pages. However, to stop this from becoming an issue, we do the following:
- Noindex a large number of pages using the on-page robots meta tag.
- Use a canonical tag where appropriate.
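For reference, a minimal sketch of those two tags on a hypothetical faceted URL (the page path and canonical target here are illustrative, not from the site in question):

```html
<!-- Hypothetical faceted page, e.g. /shoes?colour=red&sort=price -->
<head>
  <!-- Keep this variant out of the index (Googlebot may still crawl it) -->
  <meta name="robots" content="noindex">
  <!-- Point duplicate variants at the preferred version of the page -->
  <link rel="canonical" href="http://www.example.com/shoes">
</head>
```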
But we still get the error, and a lot of the example pages Google lists as affected actually carry the noindex tag.
So my question is: how do I address this problem?
I'm thinking that, as it's a crawling issue, the solution might involve the nofollow meta tag.
Any suggestions appreciated.
-
I feel we are missing some information here.
For example, on our site we have added canonical tags to the pages with query parameters. We have also specified representative URLs for these parameters under URL Parameters in Google Webmaster Tools. Even after this, we received the "Googlebot found an extremely high number of URLs on your site" message.
The surprising thing is that these parameters have existed on the site for a long time, and the total URL count is decreasing. Even so, Google has been sending us this message since Feb 2014. It seems there has been some algorithmic change, so some additional conditions not highlighted in this thread need to be taken care of. Not sure what.
-
Although I generally find NOINDEX works better than Google claims, I think @donford is essentially right - you still need to solve some of the architecture issues, or Google will keep attempting to re-crawl.
It's a complex problem, and sometimes a combination of NOINDEX, canonical, 301s, 404s, rel=prev/next, etc. all come into play. You don't usually need a "perfect" solution, but one tool rarely fits all situations these days.
Google has suggested trying parameter handling in GWT. NOINDEX won't prevent crawling (just indexation), but GWT parameter settings can save crawl bandwidth. I've had mixed results with it on large sites, honestly, but it may be worth a try.
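Since NOINDEX doesn't stop crawling, another bandwidth-saving option alongside GWT parameter handling is blocking the facet parameters in robots.txt. A minimal sketch, with hypothetical parameter names - bear in mind that Googlebot never fetches blocked URLs, so any noindex or canonical tags on those pages won't be seen:

```
# Hypothetical robots.txt sketch: stop Googlebot crawling faceted URLs.
# The parameter names (colour, size, sort) are placeholders for your own facets.
User-agent: Googlebot
Disallow: /*?*colour=
Disallow: /*?*size=
Disallow: /*?*sort=
```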
-
I was afraid that this might be the case.
Thanks for the help.
-
Hi Ben,
You are attempting to fix your SEO issue with NOINDEX and canonical tags, but you are not fixing the main issue, which is that the URLs are still there.
NOINDEX will not stop Google from discovering a link, and neither will NOFOLLOW. Google uses every link's information in one form or another, regardless of those tag attributes.
Here is a direct quote from Matt Cutts about NOINDEX:
"Our highest duty has to be to our users, not to an individual webmaster. When a user does a navigational query and we don’t return the right link because of a NOINDEX tag, it hurts the user experience (plus it looks like a Google issue)...."
REF: http://www.mattcutts.com/blog/google-noindex-behavior/
The first solution I would look at is the architecture of the site: see if there is a way to stop the huge number of URLs being generated, and/or consolidate them to a single point. The next step would be to look for commonality among these extra URLs and see whether a 301 redirect could consolidate them.
I think what you're really after is a way to fix this with a tag or a patch, but the best fix is to change the engine that is generating these URLs. You'll need to be more specific about what platform the site runs on (Joomla, WordPress, osCommerce, etc.) to get a more specific answer.
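As a rough sketch of that 301 consolidation, assuming Apache with mod_rewrite and hypothetical facet parameter names, this redirects any parameterised facet URL back to its clean path:

```
# Hypothetical .htaccess sketch: 301 parameterised facet URLs to the clean path.
# colour/size/sort are placeholders for the site's real facet parameters.
RewriteEngine On
# Match any query string containing one of the facet parameters...
RewriteCond %{QUERY_STRING} (^|&)(colour|size|sort)= [NC]
# ...and redirect to the same path; the trailing "?" discards the query string.
RewriteRule ^(.*)$ /$1? [R=301,L]
```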
Hope it helps.