Google dropping pages from SERPs
-
The website for my London-based plumbing company has thousands of specifically tailored pages for the various services we provide across all the areas of London: approximately 6,000 pages in total.
When Google has all these pages indexed we tend to get a fair bit of traffic, as they cater pretty well for long-tail searches. However, every once in a while Google will drop the vast majority of our indexed pages from SERPs for a few days or weeks at a time - for example, at the moment Google is only indexing 613, whereas last week it was back at the normal ~6,000.
Why does this happen? We of course lose a lot of organic traffic when these pages aren't displayed - what are we doing wrong?
Website: www.pgs-plumbers.co.uk
-
I am fairly sure that PageRank is now constantly updated; it may well be that they only adjust the visible PageRank every three months, though.
I can only say this next bit speculatively because I don't have the data (I would love to see the periods and numbers for the site above), but the cap itself might be a somewhat fluid idea: pages may be retained in the index as they are discovered, but get pared back to their "hard cap" periodically.
-
Yeah, good shout. I was thinking about that as well, and maybe something to do with switching or upgrading data centers, but probably not the latter to this extent.
-
It seems unlikely he would lose 90% of his indexed pages and regain them soon after if it were a PR thing, don't you think? Aren't PR updates generally quarterly?
-
I think it could be down to an indexation cap. I know PageRank is not the force it once was, but it does play a role in the number of pages Google retains in its index (if they regularly cull sites back to their cap, that might explain the fluctuation, or maybe you fluctuate between two thresholds).
Matt Cutts made some comments to that effect, which Rand broke down into bite-sized chunks on the SEOmoz blog:
http://www.seomoz.org/blog/an-illustrated-guide-to-matt-cutts-comments-on-crawling-indexation
Your PageRank is low, and without trawling through your site I don't know about duplicate content, but this could well be it. The fix is the same as for an awful lot of SEO problems: more, better links.
-
I also think it's very possible that his previous experiences with rankings dropping and pages getting de-indexed coincided with key algo updates as well: Mayday, etc.
-
Yup, David's point above is the prime suspect for a lot of changes just now; however, you say this has happened before?
Are you doing anything else at the times when you've noticed you get dropped for a few days/weeks? Roughly how often does it happen?
EDIT - Just quickly put your site into OSE, and you don't have a hugely diverse link profile. Plus, the sort of sites you're getting links from may have been affected by the Farmer update, which has had a knock-on effect on your site (and similar updates may have done in the past). Not conclusive, but something to think about as well.
-
The answer could easily be the Google Farmer update. If you are not familiar with this, read the latest post by Rand: http://www.seomoz.org/blog/googles-farmer-update-analysis-of-winners-vs-losers
That will help you understand the latest algorithm update and how you can do something about it.
It sounds like you dropped off due to the algo update, which leads me to believe that Google thought your content was not of high enough quality.
I would read Rand's post, evaluate your website, and then have the content analyzed to see whether you need to hire a really good writer to build high-quality content that is valuable to users. If you were building mediocre content to target thousands of keywords, especially long-tail ones, you need to revamp that strategy and focus on amazing content first!
Hope that helps.
Related Questions
-
Google Search Console Showing 404 errors for product pages not in sitemap?
We have some products whose URLs changed over the past several months. Google is showing these as having 404 errors even though they are not in the sitemap (the sitemap shows the correct NEW URL). Is this expected? Will these errors eventually go away / stop being monitored by Google?
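If the old URLs are still being requested, the usual remedy is a permanent redirect from each retired URL to its replacement, so the 404 reports dry up and any remaining link equity is passed along. A minimal .htaccess sketch, assuming Apache with mod_rewrite; the product paths here are invented for illustration, since the question doesn't give the real ones:

# Hypothetical paths: permanently redirect a retired product URL to the
# product's new URL so Googlebot stops logging it as a 404.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteRule ^products/old-widget/?$ /products/new-widget [R=301,L]
</IfModule>

Left alone, the 404s are generally harmless to the new URLs and Google does eventually stop re-checking them, but that can take months; the redirects just stop the noise sooner.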
Technical SEO | woshea
-
My WP website got attacked by malware & now site:www.example.ca shows about 43,000 indexed pages in Google.
Hi all, my WordPress website got attacked by malware last week, and it has badly affected my indexed pages in Google. A typical site:example.ca search shows about 130 indexed pages; now it shows about 43,000. I had my server company's tech support scan my site and clean the malware yesterday, but it still shows the same number of indexed pages in Google. Has anybody experienced such a situation, and how did you fix it? Looking for help. Thanks. FILE HIT LIST:
{YARA}Spam_PHP_WPVCD_ContentInjection : /home/example/public_html/wp-includes/wp-tmp.php
{YARA}Backdoor_PHP_WPVCD_Deployer : /home/example/public_html/wp-includes/wp-vcd.php
{YARA}Backdoor_PHP_WPVCD_Deployer : /home/example/public_html/wp-content/themes/oceanwp.zip
{YARA}webshell_webshell_cnseay02_1 : /home/example2/public_html/content.php
{YARA}eval_post : /home/example2/public_html/wp-includes/63292236.php
{YARA}webshell_webshell_cnseay02_1 : /home/example3/public_html/content.php
{YARA}eval_post : /home/example4/public_html/wp-admin/28855846.php
{HEX}php.generic.malware.442 : /home/example5/public_html/wp-22.php
{HEX}php.generic.cav7.421 : /home/example5/public_html/SEUN.php
{HEX}php.generic.malware.442 : /home/example5/public_html/Webhook.php
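Cleaning the files stops new spam, but the injected URLs stay indexed until Google re-crawls them and sees an error. Serving 410 Gone for the known malware scripts is commonly suggested to speed that up. A minimal .htaccess sketch, assuming Apache and reusing two paths from the scan output above:

# Return 410 Gone (the [G] flag) for the injected malware scripts so
# Google drops them from the index faster than a plain 404 typically would.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteRule ^wp-includes/wp-tmp\.php$ - [G,L]
  RewriteRule ^wp-includes/wp-vcd\.php$ - [G,L]
</IfModule>

The thousands of doorway URLs the malware generated would need their own pattern, which isn't visible from the scan output, so they're left out of this sketch.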
Technical SEO | Chophel
-
Can You Use More Than One Google Local Rich Snippet on a Single Site / on a Single Page?
I am currently working on a website for a business that has multiple office locations. As I am trying to target all four locations, I was wondering if it is okay to have more than one Local Rich Snippet on a single page. (For example, they list all four locations and addresses within their footer, and I was wondering if I could mark these up as local rich snippets.) What about having more than one on a single website? For example, if a company has multiple offices located in several different cities and has set up individual contact pages for those cities, can each page have its own Local Rich Snippet? Will Google look at these multiple local rich snippets as spamming, or will it recognize the multiple locations and count them towards local SEO?
Technical SEO | webdesignbarrie
-
Seeing a different landing page for my main keyword in Google search results
I have a website, http://www.bannerbuzz.com, and I am promoting the home page for the keyword "vinyl banners". Currently, Google is showing my website's review page in the results for "vinyl banners", but I want it to display my home page instead of the review page. The result changes frequently: sometimes I see the home page, and sometimes it shows the review page, as in the attached image. Can you please help me solve this, i.e. how can I make my home page rank stably for my main keywords? OtOXxiE.png
Technical SEO | | CommercePundit0 -
Duplicate pages in Google index despite canonical tag and URL Parameter in GWMT
Good morning Moz... This is a weird one. It seems to be a "bug" with Google, honest...
We migrated our site www.three-clearance.co.uk to a Drupal platform over the new year. The old site used URL-based tracking for heat map purposes, so for instance www.three-clearance.co.uk/apple-phones.html could be reached via www.three-clearance.co.uk/apple-phones.html?ref=menu or www.three-clearance.co.uk/apple-phones.html?ref=sidebar and so on. GWMT was told of the ref parameter, and the canonical meta tag was used to indicate our preference. As expected, we encountered no duplicate content issues and everything was good.
This is the chain of events:
1. Site migrated to the new platform following best practice, as far as I can attest to. The only known issue was that the verification for both Google Analytics (meta tag) and GWMT (HTML file) didn't transfer as expected, so between relaunch on the 22nd Dec and the fix on 2nd Jan we have no GA data, and presumably there was a period where GWMT became unverified.
2. URL structure and URIs were maintained 100% (which may be a problem, now).
3. Yesterday I discovered 200-ish 'duplicate meta titles' and 'duplicate meta descriptions' in GWMT. Uh oh, thought I. Expand the report out and the duplicates are in fact ?ref= versions of the same root URL. Double uh oh, thought I.
4. Run, not walk, to Google and do some fu: http://is.gd/yJ3U24 (9 versions of the same page in the index, the only variation being the ?ref= URI).
5. Checked Bing, and it has indexed each root URL once, as it should.
Situation now: the site no longer uses the ?ref= parameter, although of course there still exist some external backlinks that use it. This was intentional and happened when we migrated. I 'reset' the URL parameter in GWMT yesterday, given that there's no "delete" option. The "URLs monitored" count went from 900 to 0, but today it is at over 1,000 (another wtf moment). I also resubmitted the XML sitemap and fetched 5 'hub' pages as Google, including the homepage and the HTML site-map page.
The ?ref= URLs in the index have the disadvantage of actually working, given that we transferred the URL structure, and of course the webserver just ignores the nonsense arguments and serves the page. So I assume Google assumes the pages still exist, and won't drop them from the index but will instead apply a dupe content penalty. Or maybe call us a spam farm. Who knows.
Options that occurred to me (other than maybe making our canonical tags bold or locating a Google bug submission form 😄) include:
A) robots.txt-ing *?ref=*, but to me this says "you can't see these pages", not "these pages don't exist", so isn't correct.
B) Hand-removing the URLs from the index through a page removal request per indexed URL.
C) Applying a 301 to each indexed URL (hello Bing dirty sitemap penalty) - a blanket version of this is sketched below.
D) Posting on SEOmoz because I genuinely can't understand this.
Even if the gap in verification caused GWMT to forget that we had set ?ref= as a URL parameter, the parameter was no longer in use, because the verification only went missing when we relaunched the site without this tracking. Google is seemingly 100% ignoring our canonical tags as well as the GWMT URL setting - I have no idea why and can't think of the best way to correct the situation. Do you? 🙂
Edited To Add: As of this morning the "edit/reset" buttons have disappeared from the GWMT URL Parameters page, along with the option to add a new one. There are no messages explaining why, and of course the Google help page doesn't mention disappearing buttons (it doesn't even explain what 'reset' does, or why there's no 'remove' option).
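For what it's worth, a blanket server-side version of option C is possible without touching each URL individually: 301 any request still carrying the ref= parameter back to its clean URL. A minimal sketch, assuming Apache with mod_rewrite; it strips the whole query string, which should be safe here since ref= was the only parameter in use:

# 301 any request whose query string contains a ref= parameter to the
# same path with the query string removed (the trailing "?" in the
# substitution is what strips it).
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{QUERY_STRING} (^|&)ref= [NC]
  RewriteRule ^(.*)$ /$1? [R=301,L]
</IfModule>

Unlike the robots.txt idea in option A, this tells Google the ?ref= addresses genuinely aren't pages in their own right, and it consolidates any external backlinks that still use them.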
Technical SEO | Tinhat
-
How do I 301 redirect a number of pages to one page
I want to redirect all pages in /folder_A and /folder_B to /folder_A/index.php. Can I just write one or two lines of code in .htaccess to do that?
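It can be done in a handful of lines with mod_rewrite. A minimal sketch, assuming Apache and taking /folder_A and /folder_B literally as the poster's placeholder names; the condition is there to stop the destination page redirecting to itself:

<IfModule mod_rewrite.c>
  RewriteEngine On
  # Don't redirect the destination itself, or every request would loop.
  RewriteCond %{REQUEST_URI} !^/folder_A/index\.php$
  # 301 everything under /folder_A/ or /folder_B/ to the one index page.
  RewriteRule ^(folder_A|folder_B)/ /folder_A/index.php [R=301,L]
</IfModule>

One caveat: funnelling many unrelated pages into a single URL can be treated by Google as a batch of soft 404s, so this works best when the content genuinely lives on at the destination page.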
Technical SEO | Heydarian
-
Why is our page not visible in Google rankings? www.loseweight.com
We are using WordPress as the platform. Entering the URL gets you into the site, but it seems to be non-existent to the public... No comments at all; the site seems to be "invisible"?
Technical SEO | gewi
-
Severe Drops in Google UK rankings
Hi Guys, I'm Chris - SEO noob & appointed website man @ www.customdesignedcable.co.uk. I signed up to SEOmoz at the end of February, and I have slowly but surely been trying to implement the on-page optimisation suggestions with my selected keywords against the appropriate pages. Since I did this, our pages are still ranking, but nowhere near where they were before. Out of 70 keywords, we've gone from at least 60+ on the first page down to 20+, and the number in the top three has dropped from 55 to 33.
Obviously I'm no SEO guru, far far from it, but I have adhered to all of the changes that were suggested, and the result is the exact opposite of what I was hoping for and expecting. I'm after some advice on how to use SEOmoz to target my keywords, primarily at Google, to get these page rankings back. Should I just upload backup copies of the site? How would you suggest I move forward to try and achieve at least similar rankings to what I had previously? Any help is much appreciated, as most of our enquiries come from the internet and it has almost completely dried up these last couple of months. Thanks guys! Chris.
P.S. Would redesigning the site's look, layout and feel with the same content negatively affect SEO? Also, as a completely separate question, how come sites with content that is a lot worse than ours rank higher for the same keywords? Thanks again.
Technical SEO | Chris_CDC