I have 15,000 pages. How do I get Googlebot to crawl all of them?
-
I have 15,000 pages. How do I get Googlebot to crawl all of them? My site is seven years old, but only about 3,500 pages are being crawled.
-
Can you tell us the URL of the site in question? That can help us to help you, because we can look at the site and maybe spot something like an improper robots.txt or site architecture that makes it hard for a robot to crawl.
-
This one is interesting. I work with a currency exchange/transfer site that has 20,000+ pages in English alone. What I did is pretty basic, but it worked. I created one sitemap for all the main pages: service pages, the homepage, and pages that won't change until the next redesign. I created a second XML sitemap file where my first set of money-transfer pairs was grouped country to country. My third and largest XML file lists 16,512 currency combinations, with each combination being its own page. For my English version, 16,058 of them are indexed. The pages are quite similar in both content and function; I have four variables that seem to do the trick. With other language targetings my success ranges from 3,000 to 12,000 indexed pages from this particular sitemap.
I guess it depends on the market you are targeting. If the pages have similar content, it won't hurt to make some alterations to provide custom information where possible.
Hope that helps!
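For anyone wanting to reproduce the grouped-sitemap approach above, splitting a large URL set into several sitemap files tied together by a sitemap index is easy to script. This is a minimal sketch, not the poster's actual setup: the domain, file names, and country pairs are hypothetical, and the 50,000-URL-per-file cap comes from the sitemaps.org protocol.

```python
from xml.sax.saxutils import escape

SITEMAP_URL_LIMIT = 50000  # per-file cap from the sitemaps.org protocol


def chunk(urls, size=SITEMAP_URL_LIMIT):
    """Split a list of URLs into sitemap-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]


def sitemap_xml(urls):
    """Render one <urlset> sitemap for a chunk of URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>")


def sitemap_index_xml(sitemap_urls):
    """Render the sitemap index that ties the individual files together."""
    entries = "\n".join(f"  <sitemap><loc>{escape(u)}</loc></sitemap>"
                        for u in sitemap_urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</sitemapindex>")


# One file per logical group, as described above (hypothetical URLs).
pairs = [f"https://example.com/transfer/{a}-to-{b}"
         for a in ("us", "uk", "de") for b in ("us", "uk", "de") if a != b]
print(sitemap_index_xml(["https://example.com/sitemap-main.xml",
                         "https://example.com/sitemap-pairs.xml"]))
print(sitemap_xml(pairs))
```

The chunking matters once a group grows past 50,000 URLs: the protocol requires splitting into multiple files at that point, all referenced from one index.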
-
I did set up an XML sitemap and submitted it via Google Webmaster Tools. It did not help. Is it because my PR is 2?
-
Google will only index your pages if it deems them "worthy". However, you can certainly give Googlebot some encouragement. A good way to do this is to set up an XML sitemap and submit it via Google Webmaster Tools.
Related Questions
-
My client is using a mobile template for their local pages and the Google search console is reporting thousands of duplicate titles/meta descriptions
So my client has 2,000+ different store locations. Each location has the standard desktop page, and my client opted for a corresponding mobile template for each location. Now Google Search Console is reporting thousands of duplicate titles/meta descriptions. However, this is only because the mobile template and the desktop store pages use the exact same title and meta description tags. Is Google penalizing my client for this? Would it be worth updating the mobile template's title/meta description tags?
Technical SEO | RosemaryB
-
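For the separate-mobile-URLs setup in the question above, Google's documented recommendation is a pair of annotations: the desktop page points at the mobile version with `rel="alternate"`, and the mobile page points back with `rel="canonical"`. With those in place, duplicate title/description reports for desktop/mobile pairs are generally harmless, since the canonical tells Google the two URLs are one logical page. A sketch that emits both tags (the URL patterns are hypothetical, not the client's):

```python
def mobile_annotation_tags(desktop_url: str, mobile_url: str):
    """Return the <head> tags Google recommends for separate mobile URLs.

    The desktop page advertises its mobile counterpart; the mobile page
    declares the desktop URL canonical, collapsing the pair for indexing.
    """
    desktop_tag = (f'<link rel="alternate" '
                   f'media="only screen and (max-width: 640px)" '
                   f'href="{mobile_url}">')
    mobile_tag = f'<link rel="canonical" href="{desktop_url}">'
    return desktop_tag, mobile_tag


# Hypothetical URL pattern for one store location:
d_tag, m_tag = mobile_annotation_tags(
    "https://example.com/stores/springfield",
    "https://m.example.com/stores/springfield")
print(d_tag)
print(m_tag)
```

Generating these programmatically for 2,000+ locations is usually easier than hand-editing templates per store.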
Google still listing pages from old domain after 2 change requests
Good morning. I put forward the following question in December 2014 (https://moz.com/community/q/google-still-listing-old-domain), as pages from our old domain www.fhr-net.co.uk were still indexed in Google. We have submitted two change-of-address requests in WMT, the most recent over six months ago, yet the old pages are still being indexed and we can't see why that would be. Any advice would be appreciated.
Technical SEO | Ham1979
-
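One thing worth ruling out in cases like the question above is whether the old-domain URLs really return a permanent redirect (301/308) rather than a temporary one (302/303), since only a permanent move tells Google to drop the old URL. A minimal stdlib sketch for spot-checking a URL's redirect status (the sample URL is the one from the question and may no longer resolve; the check requires network access):

```python
import http.client
from urllib.parse import urlsplit

PERMANENT = {301, 308}  # codes that signal "moved for good"


def redirect_status(url: str):
    """HEAD a URL without following redirects; return (status, Location)."""
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    try:
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()


def is_permanent(status: int) -> bool:
    """True if the status code signals a permanent move (301/308)."""
    return status in PERMANENT


# Usage (requires network): redirect_status("http://www.fhr-net.co.uk/")
print(is_permanent(301), is_permanent(303))  # True False
```

Running `redirect_status` over a sample of still-indexed old URLs quickly shows whether any of them are slipping through with a temporary code.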
When I type site:jamalon.com to check the number of pages indexed, it gives me a different result from Google Webmaster Tools
When I type site:jamalon.com to check the number of pages indexed, it gives me a different result from Google Webmaster Tools.
Technical SEO | Jamalon
-
Need third-party input: our web host blocked all bots, including Google and me, because they believe SEO is slowing down their server.
I would like some third-party input, partly for my sanity and also for my client. I have a client who runs a large online bookstore. The bookstore runs on Magento, and the developers are apparently also the web host. (They actually run the servers; I don't know whether those are sitting under someone's desk or in an actual data center.) Their server has been slowed down by local and foreign bots, and they are under the impression that my SEO services are sending spam bots to crawl and slow down their site. To fix the problem they disallowed all bots: everything, including Google, Yahoo, and Bing. They also banned my access to the site. My client's organic traffic instantly took a HUGE hit (almost 50% of their traffic is organic, over 50% is organic plus AdWords, and almost everything comes from Google), and their keyword rankings are taking a quick dive as well. Could someone please verify the following as true, to help me illustrate to my client that this is completely unacceptable behavior on the part of the host? I believe: 1) You should never disallow ALL robots from your site as a solution for spam; as a matter of fact, most of the bad bots ignore robots.txt anyway. Robots.txt is a way to limit where Google crawls, which is obviously a legitimate technique when used selectively. 2) On-site SEO work, link building, etc. are not responsible for foreign bots and scrapers putting a heavy load on the server. 3) Their behavior will ultimately lead to a massive loss of rankings (already happening) and a huge loss of traffic (already happening), and since almost half the traffic is organic, the client can expect to lose a large sum of revenue from purchases made by organic visitors once that traffic disappears. Please give your input and thoughts. I really appreciate it!
Technical SEO | JoshuaLindley
-
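Point 1 in the question above is easy to demonstrate with Python's standard-library robots.txt parser: a blanket `Disallow: /` locks out every well-behaved crawler, while a targeted rule set can name abusive crawlers and leave search engines alone. A sketch ("BadBot" is a placeholder name; genuinely abusive bots often ignore robots.txt entirely, so server-side rate limiting or IP blocking is the real fix for load problems):

```python
from urllib.robotparser import RobotFileParser


def parser_for(robots_txt: str) -> RobotFileParser:
    """Parse robots.txt rules supplied as a string."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp


# What the host did: a blanket ban that also excludes Google/Yahoo/Bing.
blanket = parser_for("User-agent: *\nDisallow: /\n")

# A targeted alternative: block only the named crawler, allow everyone else.
targeted = parser_for(
    "User-agent: BadBot\n"
    "Disallow: /\n"
    "\n"
    "User-agent: *\n"
    "Disallow:\n"
)

print(blanket.can_fetch("Googlebot", "/any-page"))   # False: Google locked out
print(targeted.can_fetch("Googlebot", "/any-page"))  # True
print(targeted.can_fetch("BadBot", "/any-page"))     # False
```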
Empty Google cached pages.
My little startup, Voyage, has a tough relationship with Google. I have been reading SEOmoz/Moz for years; I am no pro, but I understand the basics pretty well. I would like to know why all pages on my main domain look empty in Google's cache. Here is one example. Other advice is welcome too. I know a lot of my metas and my markup are bad, but I am working on it!
Technical SEO | vincentgagne
-
41,000 pages indexed two years after it was redirected to a new domain
Hi! Two years ago, we changed the domain elmundodportivo.es to mundodeportivo.com. Apparently everything was OK, but more than two years later there are still 41,000 pages indexed in Google (https://www.google.com/search?q=site%3Aelmundodeportivo.es), even though all the domains have been redirected with a 301 redirect. I detected some problems with redirections that were 303 instead of 301, but we fixed that one month ago. A secondary problem is that the PageRank for elmundodportivo.es is still 7, while mundodeportivo.com is 3. What am I doing wrong? Thank you all, Oriol
Technical SEO | MundoDeportivo
-
Google Published Date - Does Google Lie?
Here's the scenario: I create a page called "ABC" and it gets published and found by Google, let's say on the 13th of April. On the 15th (or 14th) I decide to update the URL, page title, and content, and redirect the old URL to the new URL. Will Google still show this page as being published on the 13th, or will it update the publish date according to the new URL? Greg
Technical SEO | AndreVanKets
-
Magento - Google Webmaster Crawl Errors
Hi guys, I've started my free trial (very impressed) and thought I'd ask a question or two while I can. I've set up the website http://www.worldofbooks.com (a large bookseller in the UK) using Magento. I'm getting a huge number of not-found crawl errors (27,808). I think this is due to URL rewrites; all the errors are in this non-search-friendly format: http://www.worldofbooks.com/search_inventory.php?search_text=&category=&tag=Ure&gift_code=&dd_sort_by=price_desc&dd_records_per_page=40&dd_page_number=1 as opposed to this format: http://www.worldofbooks.com/arts-books/history-of-art-design-styles/the-art-book-by-phaidon.html (the rewritten URL). This doesn't really seem to be affecting our rankings; we targeted 'cheap books' and 'bargain books' heavily, and we're up to 2nd for 'cheap books' and 3rd for 'bargain books'. So my questions are: is this large number of crawl errors cause for concern, or is it something that will work itself out? And if it is cause for concern, could it be affecting our rankings negatively, and what could we do to resolve the issue? Any pointers in the right direction are much appreciated. If you need clarification on any of the points I've raised, just let me know. Benjamin Edwards
Technical SEO | Benj25
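When auditing a crawl-error export like the one described above, it helps to classify URLs mechanically: the bad ones hit `search_inventory.php` with `dd_*` sorting/paging parameters, while the rewritten catalog URLs are clean paths. The detection rule below is a guess inferred from the two example URLs in the question, not a general Magento rule; once confirmed, blocking the pattern in robots.txt (e.g. `Disallow: /search_inventory.php`) would stop new errors from accumulating.

```python
from urllib.parse import urlsplit, parse_qs


def is_unrewritten_search_url(url: str) -> bool:
    """Flag internal-search URLs that should not be crawled.

    Heuristic from the error pattern in the question: the script endpoint
    itself, or any dd_* paging/sorting parameter, marks a non-rewritten URL.
    """
    parts = urlsplit(url)
    if parts.path.endswith("search_inventory.php"):
        return True
    return any(k.startswith("dd_") for k in parse_qs(parts.query))


bad = ("http://www.worldofbooks.com/search_inventory.php?search_text="
       "&category=&tag=Ure&gift_code=&dd_sort_by=price_desc"
       "&dd_records_per_page=40&dd_page_number=1")
good = ("http://www.worldofbooks.com/arts-books/history-of-art-design-styles/"
        "the-art-book-by-phaidon.html")
print(is_unrewritten_search_url(bad), is_unrewritten_search_url(good))  # True False
```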