Does page speed affect what pages are in the index?
-
We have around 1.3m total pages, Google currently crawls on average 87k a day, and our average page load is 1.7 seconds. Out of those 1.3m pages (1.2m being "spun up"), Google has only indexed around 368k, and our SEO person is telling us that if we speed up the pages, Google will crawl them more and thus index more of them.
I personally don't believe this. At 87k pages a day, Google has crawled our entire site in about two weeks, so they should have all of our pages in their DB by now. I think the pages are not indexed because they are poorly generated, and it has nothing to do with page speed. Am I correct? Would speeding up the pages make Google crawl them faster and thus get more pages indexed?
-
An SEO who thinks adding thousands of useless pages will do a website good? Get rid of them, or (preferably) get them re-educated!
-
I can't say for certain that it is down to the Panda update, because I'm not 100% sure, but from what you're saying about the spun content, and given what the Panda update is all about, it's likely to be.
Although the update was in July, that does not mean you'd be hit straight away; it's only been a month from the update to you losing results in the index, and it just so happens the update is designed to combat duplicate and spun content.
Have your load times decreased?
-
I thought Panda was in July, but the drop appears to have occurred around mid-August.
-
It's the content.
Google launched an update to its algorithm called the Panda update, which this year has basically hammered websites with duplicate/spun content.
If you Google 'Google Panda update' and have a little read, you'll find loads of ammo to throw back.
-
Yes, we have 1.2m pages with content generated by spintext-like algorithms. I'm not in charge of our SEO strategy; I'm just the one who has to develop it. But when I hear them blaming load times (my problem) instead of content (their problem), it really makes me question how well they're doing. I've been trying to tell our "expert" that load times are not the issue, yet he keeps coming back to us with that instead of changes to the content.
-
Well, I just checked our Webmaster Tools, and on average 1-2 seconds is a fast load time, so I'm 99% sure you're correct that it's not load times.
When you say 'spun up', do you mean you have 1.2m pages which are basically spun content? If so, that's most likely the problem.
-
I'm pretty sure they indexed about double that at one point, and then the number of pages appearing in their index was cut in half one day. Again, our SEO guy told us this was normal and that we need to speed up the pages and release more pages.
-
It could be the structure.
You might find Google is struggling to find those pages that you want crawled.
If those pages are 5 clicks away from the homepage, Google will need to follow down all of those links to find them.
So you could have homepage - category - sub-category - paging number 9 - page you want found.
Just a thought!
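For what it's worth, one rough way to check this yourself is a small breadth-first crawl from the homepage that records how many clicks away each page is. This is only a sketch, not anything from the site above: the start URL, the page cap and the function name are placeholders, and it assumes the requests and beautifulsoup4 packages plus a site that tolerates being crawled.
# Sketch: measure click depth with a breadth-first crawl from the homepage.
# All names and limits here are illustrative placeholders.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"   # replace with your homepage
MAX_PAGES = 5000                         # safety cap; don't crawl all 1.3m pages this way

def crawl_depths(start_url, max_pages=MAX_PAGES):
    domain = urlparse(start_url).netloc
    depths = {start_url: 0}              # the homepage is 0 clicks from itself
    queue = deque([start_url])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[url] + 1   # one click deeper than the linking page
                queue.append(link)
    return depths

if __name__ == "__main__":
    depths = crawl_depths(START_URL)
    deep_pages = sum(1 for d in depths.values() if d >= 5)
    print(f"Crawled {len(depths)} pages; {deep_pages} are 5+ clicks from the homepage")
Pages that turn up 5+ clicks deep are the ones most likely to be crawled late or skipped.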
-
With such fast load speeds there is no way you're running into trouble on that front. It's far more likely that it's a quality issue, especially if you believe there are a number of poorly generated pages.
Are there any discrepancies between the number of pages you're seeing on Google and Bing via the site:domain.com query, and the number of pages in the index as shown in Webmaster Tools? It's always possible that some other form of indexing issue is at play.
-
Related Questions
-
[Organization schema] Which Facebook page should be put in "sameAs" if our organization has separate Facebook pages for different countries?
We operate in several countries and have this kind of domain structure:
example.com/us
example.com/gb
example.com/au
For our schemas we've planned to add an Organization schema on our top domain and let all pages point to it. This introduces a problem: we have a separate Facebook page for every country. Should we put one Facebook page in the "sameAs" array? Or all of our Facebook pages? Or should we skip it altogether?
Only one Facebook page:
{
  "@type": "Organization",
  "@id": "https://example.com/org/#organization",
  "name": "Org name",
  "url": "https://example.com/org/",
  "sameAs": [
    "https://www.linkedin.com/company/xxx",
    "https://www.facebook.com/xxx_us"
  ],
All Facebook pages:
{
  "@type": "Organization",
  "@id": "https://example.com/org/#organization",
  "name": "Org name",
  "url": "https://example.com/org/",
  "sameAs": [
    "https://www.linkedin.com/company/xxx",
    "https://www.facebook.com/xxx_us",
    "https://www.facebook.com/xxx_gb",
    "https://www.facebook.com/xxx_au"
  ],
Bonus question: this reasoning springs from the thought that we should only have one Organization schema. Or can we have multiple sub-organizations?
-
Is site: a reliable method for getting a full list of indexed pages?
The site:domain.com search seems to show fewer pages than it used to (on both Google and Bing). It doesn't relate to a specific site but to all sites. For example, I will get "page 1 of about 3,000 results", but by the time I've paged through the results it will end and change to "page 24 of 201 results". In that example, if I look in GSC it shows 1,932 indexed. Should I now accept that the number of pages listed in site: is an unreliable metric?
-
404 Errors for Form-Generated Pages - noindex/nofollow or 301 redirect?
Hi there, I wonder if someone can help me out and provide the best solution for a problem with form-generated pages. I have blocked the search results pages from being indexed by using the 'noindex' tag, and I wondered if I should take this approach for the following pages.
I have seen a huge increase in 404 errors since the new site structure went in and forms started being filled in. This is because every time a form is filled in, it generates a new page, which only Google Search Console is reporting as a 404. Whilst some 404s can be explained and resolved, I wondered what is best to prevent Google from crawling pages like this: mydomain.com/webapp/wcs/stores/servlet/TopCategoriesDisplay?langId=-1&storeId=90&catalogId=1008&homePage=Y
1. Implement a 301 redirect using rules, which will mean that all these pages redirect to the homepage. Whilst in theory this will protect any linked-to pages, it does not resolve the issue of why GSC is recording them as 404s in the first place. It could also come across to Google as 100,000+ redirected links, which might look spammy.
2. Place a noindex tag on these pages too, so they will not get picked up, in the same way the search result pages are not being indexed.
3. Block them in robots.txt - this will prevent any 'result' pages being crawled, which will free up the crawl time currently being taken up. However, I'm not entirely sure if the block will be possible; I would need to block anything after the domain/webapp/wcs/stores/servlet/TopCategoriesDisplay?. Hopefully this is possible (a sketch of such a rule is shown below)?
The noindex tag will take time to set up, as it needs to be scheduled in with the development team, but the robots.txt change is a quicker fix as this can be done in GSC. I really appreciate any feedback on this one. Many thanks
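For reference, the kind of robots.txt rule described in option 3 might look like the sketch below. It assumes all of the affected URLs share the /webapp/wcs/stores/servlet/TopCategoriesDisplay path; a Disallow rule matches by URL prefix, so it would also cover anything after the ? in the query string.
User-agent: *
Disallow: /webapp/wcs/stores/servlet/TopCategoriesDisplay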
-
When creating parent and child pages, should keywords be repeated in the URL and page title?
We are in the direct mail advertising business: PrintLabelAndMail.com
Example:
Parent:
Postcard Direct Mail
Children:
Postcard Mailings
Postcard Design
Postcard Samples
Postcard Pricing
Postcard Advantages
Should "postcard" be repeated in the URL and page title? And in this example, should each of the 5 children link back directly to the parent, or would it be better to "daisy chain" them, using each as the parent for the next?
-
41,000 pages indexed two years after the site was redirected to a new domain
Hi! Two years ago, we changed the domain elmundodportivo.es to mundodeportivo.com. Apparently everything was OK, but more than two years later there are still 41,000 pages indexed in Google (https://www.google.com/search?q=site%3Aelmundodeportivo.es), even though all the domains have been redirected with a 301 redirect. I detected some problems with redirections that were 303 instead of 301, but we fixed that one month ago. A secondary problem is that the PageRank for elmundodportivo.es is still 7, while mundodeportivo.com is 3. What am I doing wrong? Thank you all, Oriol
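One quick way to confirm the old URLs now answer with a 301 rather than a 303 is to request the headers of a known old URL from the command line. The path below is a placeholder, not a real article:
curl -sI "http://elmundodeportivo.es/an-old-article-path" | grep -i -E "^(HTTP|Location)"
# expect the status line to read "301 Moved Permanently" and the Location header to point at the new domain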
-
Why is the Page Authority of my product pages so low?
My domain authority is 35 (homepage Page Authority = 45) and my website has been up for years: www.rainchainsdirect.com
Most random pages on my site (like this one) have a Page Authority of around 20. However, as a whole, my individual product pages score exceptionally low. Like these:
http://www.rainchainsdirect.com/products/copper-channel-link-rain-chain (Page Authority = 1)
http://www.rainchainsdirect.com/collections/todays-deals/products/contempo-chain (Page Authority = 1)
I was thinking that, whatever the reason they have such low authority, it may explain why these pages rank lower in Google for specific searches using my exact product name (in other words, other sites that are piggybacking off my unique products are ranking higher in a search for my product's name than the original product page on my own site).
In any event, I'm trying to get some perspective on why these pages remain stuck with non-existent Page Authority. Can anyone help shed some light on why, and what can be done about it? Thanks!
-
Duplicates on the page
Hello SEOmoz, I have one big question about one project. We have a page http://eb5info.com/eb5-attorneys and a lot of other similar pages, and we got a big list of errors and warnings saying that we have duplicate pages. But in reality not all of them are the same; they have small differences. For example, you select "State" in the left sidebar and you see a list on the right. The list in the right panel changes depending on what you select on the left, yet in the report the pages are marked as duplicates. Maybe you can give some advice on how to improve the quality of the pages and make the SEO better? Thanks, Igor
-
Search Result Page, Index or Not?
I believe Google doesn't want to index and show other sites' search result pages in its SERPs.
So instead of adding a "noindex, follow" tag, I have changed the URL of my search result page like this:
Original:
http://www.mysite.com/kb-search.aspx?=travelguide&type=wiki&s=3
To:
http://www.mysite.com/travelguide/attraction-guide.html
The search result page contains the titles of the articles, a short description (300 chars.) and a link to each article. Does this help? Or should I add the noindex, follow tag? Help please?
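For reference, if the noindex, follow route is taken instead, it is the standard robots meta tag placed in the head of each search result page:
<meta name="robots" content="noindex, follow">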