Does page speed affect what pages are in the index?
-
We have around 1.3m total pages, Google currently crawls on average 87k a day, and our average page load is 1.7 seconds. Out of those 1.3m pages (1.2m being "spun up"), Google has only indexed around 368k, and our SEO person is telling us that if we speed up the pages, Google will crawl them more and thus index more of them.
I personally don't believe this. At 87k pages a day, Google could have crawled our entire site in about two weeks, so they should have all of our pages in their database by now. I think the pages are not indexed because they are poorly generated, and it has nothing to do with page speed. Am I correct? Would speeding up the pages make Google crawl them faster and thus get more pages indexed?
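As a rough sanity check on that, here is the arithmetic from the numbers above (a minimal sketch; the 87k/day figure is an average and Googlebot re-crawls pages it already knows, so real coverage of new URLs will be slower than this suggests):

```python
# Back-of-the-envelope crawl-budget check using the figures quoted above.
total_pages = 1_300_000    # total pages on the site
crawled_per_day = 87_000   # average pages Google crawls per day
indexed_pages = 368_000    # pages currently reported as indexed

print(f"Days for one full crawl pass: {total_pages / crawled_per_day:.1f}")  # ~14.9 days
print(f"Share of pages indexed:       {indexed_pages / total_pages:.1%}")    # ~28.3%
```

So even allowing for re-crawls, Googlebot has had time to fetch every URL at least once; the gap between what has been crawled and what has been indexed points at something other than speed.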
-
An SEO who thinks adding thousands of useless pages will do a website good? Get rid of them, or (preferably) get them re-educated!
-
I can't say for certain that it's down to the Panda update, because I'm not 100% sure, but from what you're saying about the spun content, and given what the Panda update is all about, it's likely to be.
Although the update was in July, that doesn't mean you'd be hit straight away; it's only been a month from the update to you losing results in the index, and it just so happens the update was designed to combat duplicate and spun content.
Have your load times decreased?
-
I thought Panda was in July; this drop appears to have occurred around mid-August.
-
It's the content.
Google launched an update to its algorithm this year, called the Panda update, which basically hammered duplicate/spun-content websites.
If you Google 'Google Panda update' and have a little read, you'll find loads of ammo to throw back.
-
Yes, we have 1.2m pages with content generated from spintext-like algorithms. I'm not in charge of our SEO strategy, I'm just the one who has to develop it, but when I hear them blaming load times (my problem) instead of content (their problem), it really makes me question how well they're doing. I've been trying to tell our "expert" that load times are not the issue, but he keeps coming back to us with that instead of changes to the content.
-
Well, I just checked our Webmaster Tools, and on average 1-2 seconds is a fast load time, so I'm 99% sure you're correct that it's not load times.
When you say 'spun up', do you mean you have 1.2m pages which are basically spun content? If so, that's most likely the problem.
-
I'm pretty sure they had indexed about double that at one point, and then the number of pages in their index was cut in half one day. Again, our SEO guy told us this was normal and that we need to speed up the pages and release more pages.
-
It could be the structure.
You might find Google is struggling to find the pages that you want crawled.
If those pages are five clicks away from the homepage, Google will need to follow all of those links to find them.
So you could have: homepage → category → sub-category → pagination page 9 → the page you want found.
Just a thought!
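If you want to see how deep the un-indexed pages actually sit, here's a minimal sketch of a breadth-first crawl that records how many clicks each internal URL is from the homepage (the start URL, depth limit, and page cap are placeholders, and it assumes the requests and beautifulsoup4 packages are installed):

```python
# Minimal click-depth checker: breadth-first crawl from the homepage,
# recording how many clicks it takes to reach each internal URL.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # placeholder homepage
MAX_DEPTH = 6                            # stop following links beyond this depth
MAX_PAGES = 500                          # safety cap for a quick sample

def click_depths(start_url, max_depth=MAX_DEPTH, max_pages=MAX_PAGES):
    site = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        depth = depths[url]
        if depth >= max_depth:
            continue
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            # Only follow internal links we haven't already seen.
            if urlparse(link).netloc == site and link not in depths:
                depths[link] = depth + 1
                queue.append(link)
    return depths

if __name__ == "__main__":
    for url, depth in sorted(click_depths(START_URL).items(), key=lambda x: x[1]):
        print(depth, url)
```

Pages that only turn up at depth 5-6 in a sample like this, or not at all, are the ones Googlebot has to work hardest to reach.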
-
With such fast load speeds there is no way you're running into trouble on that front. It's far more likely that it's a quality issue, especially if you believe there are a number of poorly generated pages.
Are there any discrepancies between the number of pages you're seeing on Google and Bing via the site:domain.com query, and the number of pages in the index as shown in Webmaster Tools? It's always possible that some other form of indexing issue is at play.
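One way to spot-check for that kind of issue: pull a sample of URLs from the sitemap and flag anything that returns a non-200 status, carries a noindex directive, or canonicals to a different URL - all common reasons a crawled page never makes it into the index. A minimal sketch (the sitemap location and sample size are placeholders; it assumes a flat sitemap rather than a sitemap index, and that the requests and beautifulsoup4 packages are installed):

```python
# Spot check for other indexing issues: sample URLs from the sitemap and flag
# non-200 responses, noindex directives, and canonicals pointing elsewhere.
import xml.etree.ElementTree as ET

import requests
from bs4 import BeautifulSoup

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
SAMPLE_SIZE = 50

def sitemap_urls(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    # <loc> elements hold the URLs whatever the sitemap namespace is.
    return [el.text.strip() for el in root.iter() if el.tag.endswith("loc") and el.text]

def check(url):
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    canonical = soup.find("link", attrs={"rel": "canonical"})
    noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    if robots_meta:
        noindex = noindex or "noindex" in robots_meta.get("content", "").lower()
    return {
        "url": url,
        "status": resp.status_code,
        "noindex": noindex,
        # Rough comparison: flags any canonical that isn't exactly the fetched URL.
        "canonical_elsewhere": bool(canonical) and canonical.get("href") != url,
    }

if __name__ == "__main__":
    for url in sitemap_urls(SITEMAP_URL)[:SAMPLE_SIZE]:
        result = check(url)
        if result["status"] != 200 or result["noindex"] or result["canonical_elsewhere"]:
            print(result)
```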
Related Questions
-
Site Not Being Indexed
Hey everyone - I have a site that is being treated strangely by Google (at least strangely to me). The site has 24 pages in the sitemap, submitted to WMT over 30 days ago. I've manually triggered Google to crawl the homepage and all connecting links, and submitted a couple individually. Google has parked the indexing at 14 of the 24 pages. None of the unindexed URLs have noindex or nofollow tags on them - they are clearly and easily linked to from other places on the site. The site is a brand-new domain, has no manual penalty history, and in my research has no reason to be considered spammy. It's 100% unique, handwritten content. I cannot figure out why Google isn't indexing these pages. Has anyone encountered this before? Know any solutions? Thanks in advance.
Technical SEO | CRO_first
-
Pages removed from Google index?
Hi all, I had around 2,300 pages in the Google index until a week ago. The index dropped a load of them and left me with 152 submitted, 152 indexed. I have just re-submitted my sitemap and will wait to see what happens. Any idea why it has done this? I have seen a drop in my rankings since. Thanks
Technical SEO | TomLondon
-
2 links on home page to each category page ..... is page rank being watered down?
I am working on a site that has a home page containing 2 links to each category page. One of the links is a text link and one is an image link. I think I'm right in thinking that Google will only pay attention to the anchor text/alt text of the first link that it spiders, with the anchor text/alt text of the second being ignored. This is not my question, however. My question is about the PageRank that is passed to each category page... Because of the double links on the home page, my reckoning is that PR is being divided up twice as many times as necessary. Am I also right in thinking that if Google ignores the 2nd identical link on a page, only one lot of this divided-up PR will be passed to each category page rather than 2 lots... hence horribly watering down the 'link juice' that is being passed to each category page? Please help me win this argument with a developer and improve the ranking potential of the category pages on the site 🙂
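To put numbers on the model in the question above (a simplified even-split view of PageRank, ignoring the damping factor): with N category pages and 2 links to each, the homepage has 2N outgoing links, so each individual link carries PR_home / 2N. If Google counts both links, each category still receives 2 × PR_home / 2N = PR_home / N, exactly what it would get with single links; but if the duplicate link is ignored, each category receives only PR_home / 2N, i.e. half. That halving is the scenario being asked about.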
Technical SEO | QubaSEO
-
How to Find All the Pages Indexed by Google?
I'm planning on moving my online store, http://www.filtrationmontreal.com/, to a new platform, http://www.corecommerce.com/. To reduce the SEO impact, I want to 301 redirect all the pages indexed by Google to the new pages I will create on the new platform. I will keep the same domain name, but all the URLs will be customized on the new platform for better SEO. Also, is there a way or tool to create a CSV file from those indexed pages? Can Webmaster Tools help? You can read my question about this subject here: http://www.seomoz.org/q/impacts-on-moving-online-store-to-new-platform Thank you, BigBlaze
Technical SEO | BigBlaze205
-
Should I delete a page or remove links on a penalized page?
Hello all, if I have an internal page that has low-quality links pointing to it, or a penalty, can I just remove the page and start over versus trying to remove the links? Over time, wouldn't this page disappear along with the penalty on it? Kind of like pruning a tree: cutting off the junk limbs so others can grow stronger, or to start fresh new ones. Example: www.domain.com Penalized internal page (say this page is penalized due to keyword stuffing, and has low-quality links pointing to it like blog comments or profiles): www.domain.com/penalized-internal-page.com Would it be effective to just delete this page (www.domain.com/penalized-internal-page.com) and start over with a new page? New internal page: www.domain.com/new-internal-page.com I would of course lose any good links pointing to that page, but it might be easier than trying to remove old backlinks. Thoughts? Thanks! Pete
Technical SEO | Juratovic
-
Should I allow index of category / tag pages on Wordpress?
Quite simply, is it best to allow indexing of category/tag pages on a WordPress blog, or to noindex them? My thought is that Google will/might see them as duplicate content. Thanks, K
Technical SEO | SEOKeith
-
Can JavaScript affect Google's index/ranking?
We changed our website template about a month ago, and since then we have experienced a huge drop in rankings, especially with our home page. We kept the same URL structure on the entire website, pretty much the same content, and the same on-page SEO. We kind of knew we would have a rank drop, but not that huge. We used to rank with the homepage at the top of the second page, and now we have lost about 20-25 positions. What we changed is the homepage structure: it is more user-friendly, with much more organized information, and we also have a slider presenting our main services. 80% of our content on the homepage is included inside the slideshow and 3 tabs, but all these elements are JavaScript. The content is unique and SEO-optimized, but when I disable JavaScript, it becomes completely unavailable. Could this be the reason for the huge rank drop? I used Webmaster Tools' Fetch as Googlebot tool, and it looks like Google reads perfectly what's inside the JavaScript slideshow, so I wasn't worried until now, when I found this on SEOmoz: "Try to avoid ... using javascript ... since the search engines will ... not indexed them ..." One more weird thing: although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is from the old website. All main URLs are the same; we removed some old ones that we don't need anymore, so we kept all the inbound links. The 301 redirects are properly set. But still, we have a huge rank drop. Also (not sure if this is important or not), the robots.txt file is disallowing some folders like images, modules, templates... (Joomla components). We still have some HTML errors and warnings, but way fewer than we had with the old website. Any advice would be much appreciated, thank you!
Technical SEO | echo1
-
How should 301 redirects affect Page Authority?
We recently set up 301 redirects on one of our sites so that the site redirects from the www version to the non-www version for all pages. We want to quantify what we expect to see as results. From what the experts say, we'd expect that the Page Authority of the canonical version (non-www) will be higher than either of the two separate ones was previously. For instance, if this page - www.website.com/information/ - had a PA of 57 and this one - website.com/information/ - had a PA of 53, then some time after the 301 redirects from www to non-www have been put into place, we should see the non-www version of that page move up to a PA of about 57. Is our thinking correct? How long does it normally take to see a PA update take place in a scenario like this? Thanks, Richard
Technical SEO | LDS-SEO