Does page speed affect what pages are in the index?
-
We have around 1.3m total pages, Google currently crawls on average 87k a day, and our average page load time is 1.7 seconds. Of those 1.3m pages (1.2m being "spun up"), Google has only indexed around 368k, and our SEO person is telling us that if we speed up the pages, Google will crawl them more and thus index more of them.
I personally don't believe this. At 87k pages a day, Google could have crawled our entire site in about two weeks, so they should have all of our pages in their DB by now. I think the pages are not indexed because they are poorly generated, and it has nothing to do with page speed. Am I correct? Would speeding up the pages make Google crawl them faster and thus get more pages indexed?
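Quick sanity check on that reasoning, using only the figures quoted above (a minimal back-of-the-envelope sketch, no new data):

```python
# Back-of-the-envelope check of the crawl figures quoted in the question.
total_pages = 1_300_000     # ~1.3m total pages on the site
crawled_per_day = 87_000    # average Googlebot crawl rate
indexed_pages = 368_000     # pages currently reported as indexed

full_crawl_days = total_pages / crawled_per_day
index_ratio = indexed_pages / total_pages

print(f"Full crawl cycle: ~{full_crawl_days:.0f} days")  # ~15 days, i.e. ~2 weeks
print(f"Indexed fraction: {index_ratio:.0%}")            # ~28% of all pages
```

So the crawl rate is enough to cover the whole site roughly every two weeks, yet only about 28% of pages are indexed, which points at a selection problem (quality) rather than a discovery problem (speed/crawl budget).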
-
An SEO who thinks adding thousands of useless pages will do a website good? Get rid of them, or (preferably) get them re-educated!
-
I can't say for certain that it's down to the Panda update, because I'm not 100% sure, but given what you're saying about the spun content and what the Panda update is all about, it's likely to be.
Although the update rolled out in July, that doesn't mean you'd be hit straight away. It's only been a month from the update to you losing results in the index, and it just so happens the update was designed to combat duplicate and spun content.
Have your load times decreased?
-
I thought Panda was in July; the drop appears to have occurred around mid-August.
-
It's the content.
Google launched an update to its algorithm this year, called the Panda update, which basically hammered duplicate/spun-content websites.
If you Google 'Google panda update' and have a little read, you'll find loads of ammo to throw back.
-
Yes, we have 1.2m pages with content generated by spintext-like algorithms. I'm not in charge of our SEO strategy; I'm just the one who has to implement it. But when I hear them blaming load times (my problem) instead of content (their problem), it really makes me question how well they're doing. I've been trying to tell our "expert" that load times are not the issue, yet he keeps coming back to us with that instead of changes to the content.
-
Well, I just checked our Webmaster Tools, and an average of 1-2 seconds is a fast load time, so I'm 99% sure you're correct that it's not load times.
When you say 'spun up', do you mean you have 1.2m pages which are basically spun content? If so, that's most likely the problem.
-
I'm pretty sure they had indexed about double that at one point, and then the number of pages appearing in their index was cut in half one day. Again, our SEO guy told us this was normal and that we need to speed up the pages and release more pages.
-
It could be the structure.
You might find Google is struggling to find the pages that you want crawled.
If those pages are 5 clicks away from the homepage, Google will need to follow down all those links as well to find them.
So you could have homepage - category - sub-category - pagination page 9 - page you want found (there's a quick way to measure this sketched below).
Just a thought!
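If you want to test the click-depth theory, a breadth-first walk over your internal link graph will tell you how deep each URL sits. A minimal sketch, assuming you can build the link graph from a crawl of your own site; the `links` dict below is a made-up example mirroring the path above:

```python
from collections import deque

def click_depths(homepage, links):
    """BFS over an internal-link graph; returns each URL's
    minimum click depth from the homepage."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        url = queue.popleft()
        for target in links.get(url, []):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

# Hypothetical graph matching the example path in the post above.
links = {
    "/": ["/category"],
    "/category": ["/category/sub"],
    "/category/sub": ["/category/sub?page=9"],
    "/category/sub?page=9": ["/deep-page"],
}
print(click_depths("/", links))
# {'/': 0, '/category': 1, '/category/sub': 2,
#  '/category/sub?page=9': 3, '/deep-page': 4}
```

Anything that only turns up at depth 4-5+ is a candidate for better internal linking.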
-
With such fast load speeds there is no way you're running into trouble on that front. It's far more likely that it's a quality issue, especially if you believe there are a number of poorly generated pages.
Are there any discrepancies between the number of pages you're seeing on Google and Bing via the site:domain.com query, and the number of pages in the index as shown in Webmaster Tools? It's always possible that some other form of indexing issue is at play.
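One low-tech way to chase down that kind of discrepancy is to diff the URLs you submitted against the URLs you know are indexed. A rough sketch, assuming you've exported both lists to plain-text files with one URL per line; the file names are placeholders:

```python
# Diff submitted URLs against indexed URLs to find what's missing.
# Assumes two plain-text files, one URL per line:
#   sitemap_urls.txt  - what you submitted in your sitemap
#   indexed_urls.txt  - e.g. an export of indexed pages from Webmaster Tools

def load_urls(path):
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

submitted = load_urls("sitemap_urls.txt")
indexed = load_urls("indexed_urls.txt")

missing = submitted - indexed   # submitted but never indexed
strays = indexed - submitted    # indexed but not in the sitemap

print(f"{len(missing)} submitted URLs are not indexed")
print(f"{len(strays)} indexed URLs are not in the sitemap")
```

A large `missing` set concentrated in one section of the site is usually a quality or crawl-path problem; a large `strays` set suggests the sitemap itself is incomplete.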