Google indexing is slowing down?
-
I have up to 20 million unique pages, and so far I've only submitted about 30k of them on my sitemap.
We had a few load-related errors during Google's initial visits, and it thought some pages were duplicates, but we fixed all of that. We haven't had a crawl-related error for two weeks now.
Google appears to be indexing fewer and fewer URLs every time it visits. Any ideas why? I'm not sure how we'll get all our pages indexed if it keeps operating like this... I'd love some help, thanks!
-
Try getting links to your important pages. As for Google's crawling and indexing: I have websites containing 10k URLs where I know half are not relevant for search at all. Google handles this completely automatically, cuts out the most duplicated or non-relevant pages compared to what I offer in my sitemap, and puts around 4k pages into actual search.
It's really not an issue. You can't tell me you have 20 million pages of unique (user-)generated content.
-
Share your URL for better responses.
Have you submitted a complete XML sitemap through Google Search Console? And are all the pages free of a noindex meta tag?
20 million unique pages can only be created in some automated fashion based on a much smaller amount of original content, so Google may perceive these pages to be spammy and not worth indexing.
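If you want to rule out a stray noindex at that scale, one approach is to scan a sample of pages for a robots meta directive. A minimal sketch using only the Python standard library (the sample page is invented, and fetching the HTML itself is left out):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.directives.append((d.get("content") or "").lower())

def is_noindexed(html: str) -> bool:
    """True if the page carries a robots meta tag containing 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in c for c in parser.directives)

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))  # True
```

Running this over a few hundred random URLs from the sitemap is cheap, and a single templating bug can noindex an entire page type.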
-
Ryan,
G doesn't crawl/index every URL it encounters. Its decision to do so is based on its perceived value of the page and of the page the link is on. You can increase that value/PR/PA by sculpting your internal links, revising the silo-ing of your content, and/or acquiring backlinks to pages deeper within your site.
If you're a brand-new site with 20M pages, what reason have you given Google to crawl all those pages? Just submitting them or listing them in a sitemap isn't enough of a reason. I mean, 20M pages! Google's like, damn dude, what's up with all that?
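To make the "perceived value" point concrete: PageRank-style link equity only flows where internal links point, so pages listed only in a sitemap accumulate almost nothing. A toy power-iteration sketch (the site graph is invented):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: links maps page -> list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:
                # dangling page: spread its rank evenly across the site
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

site = {
    "home": ["category"],
    "category": ["product-a", "product-b"],
    "product-a": ["home"],
    "product-b": ["home"],
    "orphan-deep-page": [],  # in the sitemap, but nothing links to it
}
scores = pagerank(site)
```

In this graph the orphaned page ends up with a fraction of the home page's score, which is roughly the situation of 20M pages known to Google only through a sitemap.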
Related Questions
-
Why does Google not remove my page?
Hi everyone, last week I added a noindex tag to my page, but the site still appears in organic search. What else can I do to get it removed from Google?
Technical SEO | Jorge_HDI
-
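One common culprit worth checking: if robots.txt blocks the page, Googlebot can never fetch it to see the noindex tag, so the URL can linger in the index. A quick offline check with Python's standard library (the URL and rules below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for the affected site
robots_lines = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(robots_lines)

url = "https://example.com/private/page-to-remove.html"
if not parser.can_fetch("*", url):
    # Blocked pages can't deliver their noindex directive to crawlers
    print("Blocked by robots.txt; Googlebot will never see the noindex tag:", url)
```

If the page is crawlable, the remaining options are mostly waiting for a recrawl or requesting removal through Search Console's removals tool.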
Index bloating issue
Hello! In the last month, I noticed a huge spike in the number of pages indexed on my site, which I think is hurting my SEO quality. While I only have about 90 pages in my sitemap, the number of pages indexed jumped to 446, with about 536 pages being blocked by robots.txt. At first we thought this might be due to duplicate product pages showing up in different categories on the site, so we added rules to our robots.txt file to keep those pages from being indexed, but the number has not gone down. I've tried to consult with our hosting vendor, but no one seems concerned or has any idea why there was such a big jump in the last month. Any insights or pointers would be greatly appreciated, so that I can fix/improve my SEO as quickly as possible. Thanks!
Technical SEO | Saison
-
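Worth noting: robots.txt stops future crawling but does not remove URLs that are already indexed, which may be why the count hasn't dropped; a rel=canonical (or noindex) on the duplicate pages is usually more effective. A sketch of collapsing duplicate category paths to a single canonical product URL (the URL pattern is invented):

```python
from urllib.parse import urlparse

def canonical_product_url(url: str) -> str:
    """Map /<any-category>/product/<slug> to one canonical /product/<slug> URL."""
    parts = urlparse(url)
    segments = [s for s in parts.path.split("/") if s]
    if "product" in segments:
        slug = segments[segments.index("product") + 1]
        return f"{parts.scheme}://{parts.netloc}/product/{slug}"
    return url  # not a product page; leave it alone

dupes = [
    "https://example.com/gifts/product/red-mug",
    "https://example.com/kitchen/product/red-mug",
]
print({canonical_product_url(u) for u in dupes})  # one canonical URL for both
```

Emitting that canonical URL in a rel=canonical tag on every variant tells Google which copy to keep.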
Getting Google to index a large PDF file
Hello! We have a 100+ MB PDF with multiple pages that we want Google to fully index on our server/website. First of all, is it even possible for Google to index a PDF file of this size? It's been up on our server for a few days, and my colleague did a Googlebot fetch via Webmaster Tools, but it still hasn't been indexed. My theories as to why this may not be working: A) We have no actual links to the PDF anywhere on our website. B) The PDF is approximately 130 MB and very slow to load; I added some compression, but that only got it down to 105 MB. Any tips or suggestions on getting this thing indexed in Google would be appreciated. Thanks!
Technical SEO | BBEXNinja
-
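On theory A: with no links pointing at the PDF, Google has little reason to discover or prioritize it. Besides linking to it from a relevant page, you can list the PDF's URL directly in your XML sitemap. A sketch using Python's standard library (the URL is hypothetical):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def sitemap_for(urls):
    """Build a minimal sitemap <urlset> containing the given URLs (PDFs included)."""
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for u in urls:
        url_el = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url_el, f"{{{NS}}}loc").text = u
    return ET.tostring(urlset, encoding="unicode")

xml = sitemap_for(["https://example.com/downloads/big-report.pdf"])
print(xml)
```

Sitemap entries aren't limited to HTML pages, so this at least gives Google an explicit discovery path while you work on the file size.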
How to fix google index filled with redundant parameters
Hi all, this follows on from a previous question (http://moz.com/community/q/how-to-fix-google-index-after-fixing-site-infected-with-malware) that on further investigation has become a much broader problem. I think this is an issue that may plague many sites following upgrades from CMS systems.
First, a little history. A new customer wanted to improve their site's ranking and SEO. We discovered the site was running an old version of Joomla and had been hacked. URLs such as http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate redirected users to other sites, and the site was ranking for "buy adobe" and "buy microsoft". There was no notification in Webmaster Tools that the site had been hacked. So an upgrade to a later version of Joomla was required, and we implemented SEF URLs at the same time. This fixed the hacking problem; we now had SEF URLs, fixed a lot of duplicate content, and added new titles and descriptions.
The problem is that after a couple of months things aren't really improving. The site is still ranking for adobe and microsoft and a lot of other rubbish, and old URLs like http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate are still sending visitors, but to the home page, as are a lot of the other redundant parameterized URLs. I think it is default behavior for a lot of CMS systems to ignore parameters they don't recognise, so http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate displays the home page and returns a 200 response code.
My theory is that Google isn't removing these pages from the index because it gets a 200 response code from the old URLs, and is possibly penalizing the site for duplicate content (which doesn't show up in Moz because there aren't any links on the site to these URLs). The index in Webmaster Tools shows over 1,000 URLs indexed when there are only around 300 actual URLs, and it shows thousands of URLs for each parameter type, most of which aren't used.
So my question is how to fix this. I don't think 404s or similar are the answer, because there are so many that trying to find each combination of parameters would be impossible. Webmaster Tools advises not to make changes to parameters, but even so, I don't think resetting or editing them individually is going to remove them, only change how Google indexes them (if anyone knows different, please let me know). I'd appreciate any assistance, and also any comments or discussion on this matter. Regards, Ian
Technical SEO | iragless
-
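Hand-listing every parameter combination for 404s is indeed impossible, but a generic rule isn't: the application can return 404/410 for any request carrying parameters the site doesn't actually use, instead of silently serving the home page with a 200. A sketch of the decision logic (the parameter whitelist is invented):

```python
from urllib.parse import urlparse, parse_qs

KNOWN_PARAMS = {"id", "page", "lang"}  # parameters the site genuinely uses

def status_for(url: str) -> int:
    """Return 410 Gone for URLs carrying unknown query parameters, else 200."""
    params = parse_qs(urlparse(url).query)
    unknown = set(params) - KNOWN_PARAMS
    return 410 if unknown else 200

print(status_for("https://example.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate"))  # 410
print(status_for("https://example.com/index.php?id=5"))  # 200
```

Once Google starts receiving 410s for those junk URLs, it has a concrete signal to drop them from the index, no matter how many combinations exist.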
Google Published Date - Does Google Lie?
Here's the scenario. I create a page called "ABC" and it gets published and found by Google, let's say on the 13th of April. On the 15th (or the 14th) I decide to update the URL, page title, and content (redirecting the old URL to the new URL as well). Will Google still show this page as being published on the 13th? Or would it update the publish date according to the new URL? Greg
Technical SEO | AndreVanKets
-
Update index date
If I update the content of a page without changing its original URL and Google crawls the new version, will the index date (that appears in the SERP) change to the latest update? If so, how many changes do I need to make for it to count as an update? Thanks
Technical SEO | fabrico23
-
Sitemaps for Google
In Google Webmaster Central, if a URL in your sitemap is reported as 404 (Not Found), I'm assuming Google will automatically clean it up and that the next time we generate a sitemap, it won't include the 404 URL. Is this true? Do we need to comb through our sitemap files and remove the 404 pages Google finds, or will they "automagically" be cleaned up by Google's next crawl of our site?
Technical SEO | Prospector-Plastics
-
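As far as I know, Google doesn't edit your sitemap for you; the file is yours to regenerate without the dead URLs, and the 404 reports persist until you do. A sketch of the filtering step (statuses here come from a pre-built dict; real code would issue HEAD requests per URL):

```python
def live_urls(statuses: dict) -> list:
    """Keep only URLs that responded 200; drop 404s before regenerating the sitemap."""
    return sorted(u for u, code in statuses.items() if code == 200)

# Hypothetical URL -> HTTP status results from a crawl of the current sitemap
statuses = {
    "https://example.com/": 200,
    "https://example.com/old-page": 404,
    "https://example.com/products": 200,
}
print(live_urls(statuses))  # the two live URLs; old-page is dropped
```

Feeding only the surviving list into the sitemap generator keeps Search Console's error reports from refilling on every crawl.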
Mobile Google Not Indexing Mobile Website
Google currently does not index our mobile website; it has the WWW website in its index. When a user on a mobile phone clicks a mobile search result for WWW, we redirect them to our mobile website. This is posing problems for us, as our mobile website has only a fraction of the pages/sections of our WWW site. For example, mobile search results show that we have a "careers" section, but that section doesn't exist on the mobile website, so the user gets a 404. How do we get mobile Google to index our mobile website instead of our WWW site?
Technical SEO | RBA
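The standard pattern for a separate m-dot site is bidirectional annotation: each desktop page carries a rel="alternate" link to its mobile counterpart, each mobile page carries a rel="canonical" back to the desktop URL, and mobile users are redirected only for pages that actually exist on the mobile site (everything else should stay on the desktop URL rather than 404). A sketch that generates the two tags (URLs are hypothetical):

```python
def bidirectional_tags(desktop_url: str, mobile_url: str):
    """Return the annotation tags for a desktop/mobile URL pair."""
    desktop_tag = (
        '<link rel="alternate" '
        'media="only screen and (max-width: 640px)" '
        f'href="{mobile_url}">'
    )
    mobile_tag = f'<link rel="canonical" href="{desktop_url}">'
    return desktop_tag, mobile_tag

d, m = bidirectional_tags("https://www.example.com/careers",
                          "https://m.example.com/careers")
print(d)
print(m)
```

With the pairing declared, Google can serve the mobile URL to mobile searchers while keeping ranking signals consolidated on the desktop URL.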