Google dropping pages from SERPs
-
The website for my London-based plumbing company has thousands of specifically tailored pages for the various services we provide across all the areas of London, approximately 6,000 pages in total.
When Google has all these pages indexed we tend to get a fair bit of traffic, as they cater pretty well for long-tail searches. However, every once in a while Google will drop the vast majority of our indexed pages from the SERPs for a few days or weeks at a time. At the moment, for example, Google is only indexing 613 pages, whereas last week it was back at the normal ~6,000.
Why does this happen? We obviously lose a lot of organic traffic when these pages aren't displayed, so what are we doing wrong?
Website: www.pgs-plumbers.co.uk
-
I am fairly sure that PageRank is now updated constantly; it may well be that they only adjust the visible PageRank every three months, though.
I can only say this next bit speculatively because I don't have the data (I would love to see the periods and numbers for the site above), but the cap itself might be a somewhat fluid idea: pages may be retained in the index as they are discovered, then get pared back to their "hard cap" periodically.
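If you wanted to start capturing those periods and numbers yourself, something rough like the Python sketch below would do it. To be clear, this is just an illustration rather than anything the site already has: the sitemap URL and file name are placeholders I've assumed, and it expects you to type in the indexed count you read out of Webmaster Tools each day. All it does is count the URLs in your sitemap.xml and log both figures with a date, so after a few months you can chart exactly when the drops happen and how long they last.

```python
import csv
import datetime
import sys
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder values - swap in your own sitemap URL and log location.
SITEMAP_URL = "http://www.pgs-plumbers.co.uk/sitemap.xml"  # assumed path
LOG_FILE = "index_counts.csv"


def count_sitemap_urls(sitemap_url):
    """Count the <url> entries in a standard sitemap.xml."""
    with urllib.request.urlopen(sitemap_url) as response:
        tree = ET.parse(response)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return len(tree.getroot().findall("sm:url", ns))


def log_counts(indexed_count):
    """Append today's sitemap URL count and the indexed count you read
    out of Webmaster Tools, so the fluctuation can be charted later."""
    row = [datetime.date.today().isoformat(),
           count_sitemap_urls(SITEMAP_URL),
           indexed_count]
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow(row)


if __name__ == "__main__":
    # Usage: python log_index_count.py 613
    log_counts(int(sys.argv[1]))
```

Run it once a day (a cron job is the obvious way) and you end up with a CSV you can drop straight into a spreadsheet.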
-
Yeah, good shout. I was thinking about that as well, and maybe something to do with switching or upgrading data centers, but probably not the latter to this extent.
-
Seems unlikely he would lose 90% of his indexed pages and then regain them soon after if it were a PR thing, don't you think? Aren't PR updates generally quarterly?
-
I think it could be down to an indexation cap. I know PageRank is not the force it once was, but it does play a role in the number of pages Google retains in its index (if they regularly cull sites back to their cap, that might explain the fluctuation, or maybe you fluctuate between two thresholds).
Matt Cutts made some comments to that effect, which Rand broke down into bite-size chunks on the SEOmoz blog:
http://www.seomoz.org/blog/an-illustrated-guide-to-matt-cutts-comments-on-crawling-indexation
Your PageRank is low, and without trawling through your site I can't say whether you have duplicate content, but this could well be it. The fix is the same as for an awful lot of SEO problems: more, better links.
-
I also think it's very possible that his previous experiences of rankings dropping and pages getting de-indexed coincided with key algo updates as well: Mayday, etc.
-
Yup, David's point above is the prime suspect for a lot of changes just now. However, you say this has happened before?
Are you doing anything else at the times when you've noticed you get dropped for a few days/weeks? Roughly how often does it happen?
EDIT - Just quickly put your site into OSE and you don't have a hugely diverse link profile. Plus, the sort of sites you're getting links from may have been affected by the Farmer update, which has had a knock-on effect on your site (and similar updates may have done in the past). Not conclusive, but something to think about as well.
-
The answer could easily be the Google Farmer update. If you are not familiar with this, read the latest post by Rand: http://www.seomoz.org/blog/googles-farmer-update-analysis-of-winners-vs-losers
That will help you understand the latest algorithm update and how you can do something about it.
It sounds like you dropped off due to the algo update, which leads me to believe that Google thought your content was not of high enough quality.
I would read Rand's post, evaluate your website, and then have the content analyzed to see whether you need to hire a really good writer to build high-quality content that is valuable to users. If you were building mediocre content to target thousands of keywords, specifically long-tail ones, you need to revamp that strategy and focus on amazing content first!
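One quick-and-dirty way to sanity-check how templated those thousands of area/service pages are is to compare a small sample of them against each other and see how much text they actually share. This is purely an illustrative sketch (it isn't from Rand's post, and the sample URLs below are made up): it fetches a few pages, strips the tags, and scores each pair with a crude standard-library similarity measure. If pairs routinely come out 90%+ similar, that's the kind of thin, near-duplicate content you'd want a writer to rework.

```python
import difflib
import itertools
import re
import urllib.request

# Hypothetical sample URLs - swap in a handful of your own area/service pages.
SAMPLE_PAGES = [
    "http://www.pgs-plumbers.co.uk/boiler-repair-camden",
    "http://www.pgs-plumbers.co.uk/boiler-repair-islington",
    "http://www.pgs-plumbers.co.uk/blocked-drains-camden",
]


def page_text(url):
    """Fetch a page and crudely strip scripts, styles, and tags, leaving visible text."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    html = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", text).strip()


def similarity(a, b):
    """Rough 0-1 similarity score between two blocks of text."""
    return difflib.SequenceMatcher(None, a, b).ratio()


if __name__ == "__main__":
    texts = {url: page_text(url) for url in SAMPLE_PAGES}
    for u1, u2 in itertools.combinations(SAMPLE_PAGES, 2):
        print(f"{similarity(texts[u1], texts[u2]):.0%} similar: {u1} vs {u2}")
```

Obviously with ~6,000 pages you'd only run this on a sample, and you'd still want a human read of whatever it flags, but it gives you a number to work from.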
Hope that helps.