Organic listings disappeared and I don't know why!
-
Brief history:
I am MD of a medium-sized health organisation in the UK. We have one of the leading websites in the world for our industry. We were hit by a Google algorithm update last year (Penguin or Panda, I can't remember, though I don't think it's relevant here) and our daily visits went down from around 10,000 to around 5,000 in two separate hits over a couple of months. There was then a steady decrease to about 3,000-4,000 visits a day, until we totally updated the design of the site and did some good work on the content. We have always been white-hat, and the site has around 3,000 pages with unique content added daily.
So things had really been on the up for the past couple of months: we'd been receiving around 6,000 visits a day in recent weeks (a slow climb over the past few months). Then, on Sunday morning around 10am, pretty much all of our organic listings disappeared, including for our brand name. On Monday morning a few came back, including our brand name and our main, most competitive keyword, for which we had been showing on the third page and to which we returned. Then on Tuesday morning another few of our most competitive keywords showed up, back where they were before. This includes images, which had disappeared from Google Images.
Our PPC and business listings were not really affected at all.
My developer submitted a sitemap through Webmaster Tools on Monday morning, and I'm not sure whether that is why pages started to show up again. In Webmaster Tools the indexed pages now number about a quarter of all the pages on the site; all pages were indexed before. I just don't know what has happened! It doesn't make any sense, because:
1. Google don't seem to have rolled out any algorithm updates on that day;
2. we do not have any messages in Webmaster Tools;
3. a number of our main keywords have re-appeared - why would that happen if we had been hit by a Google update?!
Our organic traffic, which previously made up about 80% of all our visits, has gone down by 80%, and this is drastically affecting the business. If this continues we will likely have to downsize, and I'm not sure what to do.
When I saw the 'indexed pages' count in Webmaster Tools start to increase (around 600 on Monday, around 900 yesterday and around 1,300 this morning), I thought we were on our way back up and the problem might resolve itself, with our listings re-appearing. But the count has since dropped slightly, back down to around 1,100, so the increase has stalled.
Can anybody help?! Do you have any idea what could be causing this? Apparently no changes have been made to robots.txt, and my developer says nothing was changed that could have affected our listings.
ANY ADVICE WOULD BE GREATLY APPRECIATED.
-
Interesting situation...and very frustrating for you, I'm sure.
You mentioned this below:
"I checked 'cached snapshot of page' in Google Toolbar for the pages that weren't being indexed, and it showed up as a 404 error. "
This sounds like some sort of technical error, but some things still don't add up for me. It sounds as though your pages were not resolving for Google. The odd thing, though, is that when Google sees a 404 error they keep retrying for days, weeks or even months before concluding that the pages should be removed from the index.
I don't have an answer for you, but the first place I'd look is your robots.txt file, to make sure it is not blocking Googlebot in some way. I'd also check your server logs, and perhaps ask your host whether the site had any significant downtime.
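If it helps, that robots.txt check is easy to script. Here's a minimal sketch using only Python's standard library - the domain and page paths are placeholders, not your actual URLs:

```python
# Minimal sketch: ask robots.txt whether Googlebot may fetch key pages.
# "example.com" and the paths below are hypothetical placeholders.
from urllib.robotparser import RobotFileParser

site = "https://www.example.com"
pages = ["/", "/services/", "/articles/some-post/"]  # stand-in paths

rp = RobotFileParser()
rp.set_url(site + "/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for path in pages:
    if rp.can_fetch("Googlebot", site + path):
        print(f"{path}: allowed")
    else:
        print(f"{path}: BLOCKED for Googlebot")
```

If anything prints as blocked, fix that before looking at anything else.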
If there was a technical glitch, and the problem is now fixed, then your pages should come back into the index without you doing anything.
I'm pretty certain this isn't a penalty issue though.
-
Thank you. I will look into this, although I don't think the pages are set to noindex/nofollow, because there has been a further development. I checked 'cached snapshot of page' in the Google Toolbar for the pages that weren't being indexed, and it showed up as a 404 error. These are pages that had always been cached before this problem occurred. I then went to 'Submit URL to Google' and submitted a couple of URLs. They instantly showed up in Google's listings in the same spots as before, and their cached snapshots then displayed correctly. I could do that for every page, but:
1. that would be a HUGE job;
2. would it look spammy or suspicious to Google?
3. is there a way of doing it for multiple pages at a time?!
I feel like this problem is very close to being solved, but I just don't quite know how to solve it.
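In case it helps anyone diagnose the same thing: rather than submitting every URL by hand, a short script can walk the XML sitemap and test each page's HTTP status, to find any that genuinely return 404s. A rough sketch, assuming a sitemap at /sitemap.xml and using a placeholder domain:

```python
# Rough sketch: pull every URL from the XML sitemap and report its HTTP
# status, to spot pages really serving 404s. Python 3 standard library only;
# the sitemap URL is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP) as resp:
    tree = ET.parse(resp)

for loc in tree.findall(".//sm:loc", NS):
    url = loc.text.strip()
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as page:
            print(page.status, url)
    except urllib.error.HTTPError as e:
        print(e.code, url)  # 404s and other HTTP errors land here
```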
-
The only time I've seen this type of thing happen - all of the pages on a site dropping out of the index while PPC still works - is when something on the site has been set to noindex/nofollow.
If you had a manual penalty from Google, it would show up in Google Webmaster Tools. Besides, the site would still be indexed, just ranked really, really low. If everything was missing from Google's cache, then the most likely explanation is that the site was accidentally set to noindex/nofollow.
This is a very easy thing to mess up: someone may have hit the wrong button by accident, or updated the robots.txt file.
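To show how small the slip can be, here is a hypothetical two-line robots.txt - this is all it takes to tell every crawler to stay away from the entire site:

```
User-agent: *
Disallow: /
```

A stray "Disallow: /" pushed live during an update is the classic version of this mistake.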
In the past, I had a project manager who messed this up for a client while doing a content update on the site, and it was about a week before anyone noticed. She's no longer here (not due just to that issue). But this is so critical for me and my company that we've put both automated and human checks in place each day:
For our company, we have an automated script that runs through all of our sites (and clients' sites) each day to make sure that each site is set to index/follow, both on the pages and in the robots.txt file. We also check the title tags and make sure that the name servers haven't changed. (A rough sketch of the page-level check is below.)
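I can't share our exact script, but a minimal sketch of the page-level part of that kind of check might look like this in Python - the domain, page list and tag-matching pattern are illustrative assumptions, not our production code:

```python
# Minimal sketch of a daily index/follow check: warn if robots.txt blocks
# Googlebot, or if a page carries a noindex/nofollow meta robots tag.
# Python 3 standard library only; domain, pages and regex are illustrative.
import re
import urllib.request
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"
PAGES = ["/", "/about/", "/services/"]  # hypothetical key pages

META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

rp = RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for path in PAGES:
    url = SITE + path
    if not rp.can_fetch("Googlebot", url):
        print(f"WARNING: robots.txt blocks Googlebot from {url}")

    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    match = META_ROBOTS.search(html)
    if match:
        directives = {v.strip().lower() for v in match.group(1).split(",")}
        if directives & {"noindex", "nofollow"}:
            print(f"WARNING: {url} meta robots is '{match.group(1)}'")

# A real version would run on a schedule and alert by email, not just print.
```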
I also pay someone on my team to run through a 12 step checklist each and every day to make sure that things like the site search are working, contact forms go through properly, and that pages are set to index / follow.
I hope this helps...
Thanks,
-- Jeff