How come Google indexes 2,500 pages one day, only 150 the next, then 2,000 again, etc.?
-
This is about a big affiliate website belonging to a customer of ours, running on datafeeds...
Downsides of datafeeds:
- Duplicate content (product descriptions)
- A very large number of (thin) product pages (sometimes better to noindex, I know, but this customer doesn't want to do that; see the sketch after this list)
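For reference, in case the customer ever reconsiders, here is roughly how thin product pages could be kept out of the index with an X-Robots-Tag response header. This is just a minimal sketch; the Flask app and route are hypothetical stand-ins for the real site:

```python
# Minimal sketch with a hypothetical Flask route: send an X-Robots-Tag
# header so thin product pages stay out of the index but still pass links.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/product/<product_id>")
def product(product_id):
    resp = make_response("...thin product page markup...")
    resp.headers["X-Robots-Tag"] = "noindex, follow"
    return resp

if __name__ == "__main__":
    app.run()
```

The same effect can be had with a `<meta name="robots" content="noindex, follow">` tag in each page's head; the header variant just avoids touching the page templates.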
-
Hi Dana,
Thanks for your detailed explanation, I appreciate it. Of course I understand that site speed is a factor for crawling (and ranking) and that the Google bots only want to spend a certain amount of time on a website. It's more that, when the servers perform almost identically every day and page loads are therefore roughly equal too, what else could it be?
I agree with your two points to consider, but I'm the type of guy who always wants to know why something is happening.
@Nakul: Thanks for your response!
The pages that drop in and out of the index are mostly product pages, so the point about frequent updates could well be a factor. The website is pretty young, so its authority is not yet built up the way it should be for a big site. That could also play a role, since the more authority a site has, the more time Google will spend indexing it, right? Anyway, many thanks to both of you for your answers!
Regards, Wesley
-
I agree with everything Nakul has said. Just to piggyback on that with additional information, try to think about it this way. Remember when someone gave you $1.00 when you were little and said "Don't spend it all in one place?" Well, someone at Google must have grown up with the same grandparents I did.
Okay, now, the analogy-free explanation:
Google has a "crawl budget" every day. Every day that budget is allocated across millions of different sites. Now, by "sites" I mean "pages." Some pages change really frequently (e.g. the Yahoo News homepage). Some pages hardly ever change (e.g. an archived blog post). Some pages have very high PR and others, not so much. Also, some pages load extremely fast (consuming less of Google's bandwidth when the page is crawled), which leaves more of Google's resources available to crawl more pages. Google likes that, and so should we all, because people with fast sites make it possible for everyone to get crawled more often (in essence, making them very considerate, well-behaved members of the Internet community).
So, based on all of this, Google is going to apportion a part of its crawl budget to your site on any given day. Some days, it may have more room in its budget for you than others. Part of this might be affected by how fast pages load from your site on any given day. A ton of parameters can come into play here, including whether the pages crawled that day are heavier, or whether your servers are performing faster on one day than on another.
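To make the budget idea concrete, here is a toy model (my own illustration, not Google's actual algorithm): a fixed daily crawl-time budget simply buys fewer page fetches on days when your pages respond more slowly, which alone can produce swings like the 2,500 / 150 / 2,000 in the original question.

```python
# Toy model only -- not Google's real algorithm. A fixed daily crawl-time
# budget buys fewer page fetches on days when pages respond more slowly.
DAILY_CRAWL_BUDGET_SECONDS = 60.0  # hypothetical time allotted to your site

def pages_crawled(avg_page_load_seconds):
    """Pages Googlebot can fetch before the day's budget runs out."""
    return int(DAILY_CRAWL_BUDGET_SECONDS / avg_page_load_seconds)

for day, load in [("Mon", 0.024), ("Tue", 0.400), ("Wed", 0.030)]:
    print(day, pages_crawled(load))  # -> Mon 2500, Tue 150, Wed 2000
```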
I'd say the two things to be really concerned with after considering all of these things are:
- Is Google indexing all of the pages you want indexed?
- Is Google's cache date of your important pages recent enough (e.g. 3 weeks or less)?
If the answer is "no" to either one of those, then it's time to do some investigation to find out if there are technical issues or penalties that have been put in place that are hurting Google's ability or desire (not the right word to use about a bot, but I'm using it anyway) to crawl your pages.
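If you want to spot-check cache dates on a handful of important pages, here is a rough sketch. It assumes Google's public cache URL format (webcache.googleusercontent.com), which is unofficial, fragile, and may be rate-limited or blocked, so treat it as a convenience rather than a reliable tool:

```python
# Rough sketch, not an official API: fetch Google's cached copy of a URL
# and pull the "as it appeared on" date out of the cache banner.
import re
import urllib.request

def google_cache_date(url):
    cache_url = "http://webcache.googleusercontent.com/search?q=cache:" + url
    req = urllib.request.Request(cache_url,
                                 headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req).read().decode("utf-8", errors="ignore")
    match = re.search(r"as it appeared on (.*?)\.", html)
    return match.group(1) if match else None  # None if blocked or not cached

print(google_cache_date("www.example.com/"))
```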
Does that help?
-
Domain authority / PageRank is what Google looks at to decide how deep and how frequently it will crawl a particular website. They also typically look at how frequently the content is being updated.
Think about it from Google's perspective. Why should they index 2,500 pages of that website every day? What's changing? Does the site have enough domain authority to warrant that kind of indexing?
In my opinion, this is not a concern. Just submit XML sitemaps and see what percentage of your submitted pages are indexed.
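As a starting point, a minimal sitemap generator could look like the sketch below (the product URLs are placeholders; the urlset/url/loc structure follows the sitemaps.org protocol):

```python
# Minimal sketch: build a sitemap.xml for your product URLs, submit it in
# Webmaster Tools, then compare submitted vs. indexed counts over time.
import xml.etree.ElementTree as ET

def build_sitemap(urls, path="sitemap.xml"):
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Hypothetical product URLs for illustration:
build_sitemap(["http://www.example.com/products/1",
               "http://www.example.com/products/2"])
```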
Related Questions
-
Google Processing but Not Indexing XML Sitemap
Like it says above, Google is processing but not indexing our latest XML sitemap. I noticed this Monday afternoon - the Indexed status was still Pending - and didn't think anything of it. But when it still said Pending on Tuesday, it seemed strange. I deleted and resubmitted our XML sitemap on Tuesday. It now shows that it was processed on Tuesday, but the Indexed status is still Pending. I've never seen this much of a lag, hence the concern. Our site IS indexed in Google - it shows up in a site:xxxx.com search with the same number of pages as it always has. The only thing I can see that could have triggered this is that on Sunday the site failed verification with Google, but we quickly fixed that and re-verified via WMT Monday morning. Anyone know what's going on?
-
Best way to permanently remove URLs from the Google index?
We have several subdomains we use for testing applications. Even if we block them with robots.txt, these subdomains still appear to get indexed (though they show as blocked by robots.txt). I've claimed these subdomains and requested permanent removal, but it appears that after a certain time period (6 months?) Google will re-index them (and mark them as blocked by robots.txt). What is the best way to permanently remove these from the index? We can't use a login to block them, because our clients want to be able to view these applications without needing to log in. What is the next best solution?
-
How to remove a wrongly crawled domain from the Google index
Hello, I'm running a WordPress multisite. When I create a new site for a client, we do the preparation using the multisite domain address (ex: cameleor.cobea.be). To keep the site protected we use the "multisite privacy" plugin, which allows us to restrict the site to admins only. When the site is ready, we use a domain mapping plugin to redirect the client's domain to the multisite (ex: cameleor.com). Unfortunately, we recently switched our domain mapping plugin for another one, and 2 sites got crawled by Google on their multisite address as well. So now when you type "cameleor" in Google you get the 2 domains in the SERPs (see here http://screencast.com/t/0wzdrYSR). It's been 2 weeks or so since we fixed the plugin issue, and now cameleor.cobea.be is redirected to the correct address, cameleor.com. My question: how can I get rid of those wrong URLs? I can't remove them in Google Webmaster Tools, as they belong to another domain (cf. cameleor.cobea.be, for which I can't get authenticated), and I wonder if they will ever get removed from the index, as they still redirect to something (no error in the eyes of Google). Does anybody have an idea or a solution for me, please? Thank you very much for your help. Regards, Jean-Louis
-
How many inner links on one page?
I have seen Matt Cutts' video about links per page and know that too many links "may" harm the flow of link juice. But what should e-commerce sites do? We have category pages with more than a few thousand products in each of them. So does linking to each of them dilute the PR flow? We could use pagination, but doesn't that hurt the user experience when someone needs to go 10 links deep to reach a product? And won't Google's robots update the information less frequently because it sits in the deepest part of our site? Our goal now is to make all our products appear the way Facebook's scrolling feed does. We know that Google doesn't use Ajax to see more links, so robots and any users without JavaScript would still see the paginated results. Is this a good way to present all the products and links?
-
Should I index tag pages?
Should I exclude the tag pages? Or should I go ahead and keep them indexed? Is there a general opinion on this topic?
-
Should I Allow Blog Tag Pages to be Indexed?
I have a wordpress blog with settings currently set so that Google does not index tag pages. Is this a best practice that avoids duplicate content or am I hurting the site by taking eligible pages out of the index?
-
Should you stop indexing of short lived pages?
On my site there will be a lot of pages with a short lifespan of about a week, as they are items on sale. Should I nofollow the links, meaning the site has a few hundred pages, or allow indexing and have thousands, but then have lots of links to pages that no longer exist? If I allowed indexing, I would of course make sure the expired links don't error out and instead send visitors to a similarly relevant page. But which approach is best for me with the search engines? I would like to have the option of loads of links and pages with loads of content, but not if it is detrimental. Thanks
-
Pages un-indexed on my site
My current website www.energyacuity.com has had most of its pages indexed for more than a year. However, I tried to check the cache of a few of the pages, and it looks like the only one now indexed by Google is the homepage. Any thoughts on why this is happening?