Organic listings disappeared, and I don't know why!
-
Brief history:
I am MD of a medium-sized health organisation in the UK. We have one of the leading websites in the world for our industry. We were hit by a Google algorithm update last year (Penguin or Panda, I can't remember which, but I don't think that's relevant here) and our daily visits went down from around 10,000 to around 5,000 in two separate hits over a couple of months. There was then a steady decrease to about 3,000-4,000 visits a day until we completely updated the design of the site and did some good work on the content. We have always been white hat, and the site has around 3,000 pages with unique content added daily.
So things have really been on the up for the past couple of months. We had been receiving around 6,000 visits a day in recent weeks (a slow incline over the past few months), until Sunday. On Sunday morning around 10am, pretty much all of our organic listings disappeared, including for our brand name. On Monday morning a few came back, including our brand name and our main, most competitive keyword, which returned to the third page where it had been showing before. Then on Tuesday morning another few of our most competitive keywords showed up, back where they were before. This included images, which had disappeared from Google Images.
Our PPC and business listings were not really affected at all.
My developer submitted a sitemap through Webmaster Tools on Monday morning, and I'm not sure if this is the reason pages started to show up again. In Webmaster Tools the indexed pages are about a quarter of all the pages on the site; all pages were indexed before. I just don't know what has happened! It doesn't make any sense because: 1) Google doesn't seem to have rolled out any algorithm updates on that day; 2) we do not have any messages in Webmaster Tools; 3) a number of our main keywords have reappeared. Why would that happen if we had been hit by a Google update?!
Our organic hits, which previously made up about 80% of all our traffic, have gone down by 80%, and this is drastically affecting business. If this continues, it is likely we will have to downsize the business, and I'm not sure what to do.
When I saw that the 'indexed pages' count in Webmaster Tools had started to increase (around 600 on Monday, around 900 yesterday, and around 1,300 this morning), I thought we were on our way back up and that the problem might just resolve itself and our listings would reappear. But our indexed pages have since dropped slightly, back down to around 1,100, so the increase has stalled.
Can anybody help?! Do you have any idea what could be causing this? Apparently no changes have been made to robots.txt, and my developer says that nothing was changed that could have affected our listings.
ANY ADVICE WOULD BE GREATLY APPRECIATED.
-
Interesting situation... and very frustrating for you, I'm sure.
You mentioned this below:
"I checked 'cached snapshot of page' in Google Toolbar for the pages that weren't being indexed, and it showed up as a 404 error. "
This sounds like some sort of technical error, but some things still don't add up for me. It sounds as though your pages were not resolving for Google. The odd thing is that when Google sees a 404 error, it keeps retrying for days, weeks or even months before concluding that the pages should be removed from the index.
I don't have an answer for you, but the first place I'd look is your robots.txt file, to make sure it is not blocking Googlebot in some way. I'd also check your server logs, and perhaps check with your host to see if there was any significant downtime for the site.
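If you want to rule that out quickly, this is the kind of accidental block to look for; a sketch, with example.com standing in for your domain:

```
# https://www.example.com/robots.txt
# This pair of lines blocks ALL crawlers, including Googlebot, from the whole site:
User-agent: *
Disallow: /

# By contrast, the harmless default is an empty Disallow, which allows everything:
User-agent: *
Disallow:
```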
If there was a technical glitch, and the problem is now fixed, then your pages should come back into the index without you doing anything.
I'm pretty certain this isn't a penalty issue though.
-
Thank you. I will look into this, although I don't think the pages are set to noindex, because there has been a further development. I checked 'cached snapshot of page' in the Google Toolbar for the pages that weren't being indexed, and it showed up as a 404 error. These are pages that had always been cached before this problem occurred. I then went to 'submit URL to Google' and submitted a couple of URLs. They instantly showed up in Google's listings in the same spots as before, and the cached snapshot then displayed correctly. I could do that for every page, but 1) that would be a HUGE job; 2) would that look spammy or suspicious to Google?; and 3) is there a way of doing it for multiple pages at a time?! I feel like this problem is very close to being solved, but I just don't quite know how to solve it.
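On point 3: the standard way to request recrawling of many pages at once is an XML sitemap submitted through Webmaster Tools (which is what the developer did on Monday), rather than submitting URLs one by one. A minimal sketch, with example.com as a placeholder:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page that should be recrawled -->
  <url>
    <loc>https://www.example.com/an-affected-page/</loc>
  </url>
</urlset>
```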
-
The only time I've seen this type of thing happen (all of the pages on a site drop out of the index, yet PPC still works) is when something on the site has been set to noindex/nofollow.
If you had a manual penalty from Google, it would show up in Google Webmaster Tools. Plus, the site would still be indexed, just ranked really, really low. If everything was missing from Google's cache, then the most likely explanation is that the site was accidentally set to noindex/nofollow.
This is a very easy thing to mess up; it's possible that someone hit the wrong button by accident, or updated the robots.txt file.
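For illustration, the accidental "off switch" usually looks like this; a sketch, not taken from the poster's actual site:

```html
<!-- A meta robots tag like this in a page template drops pages from the index: -->
<meta name="robots" content="noindex, nofollow">

<!-- The normal state is either no meta robots tag at all, or: -->
<meta name="robots" content="index, follow">
```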
In the past, I had a project manager who messed this up for a client while doing a content update on the site, and it was about a week before anyone noticed. She's no longer here (not due solely to that issue). But this is so critical for me and my company that we've put automated and human checks in place every day:
For our company, we have an automated script that runs through all of our sites (and clients' sites) each day to make sure that each site is set to index/follow, both on the pages and in the robots.txt file. We also check the title tag and make sure that the name servers haven't changed.
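A minimal sketch of such a daily check, assuming Python with the requests library; the site list, the regex, and the alerting below are placeholders rather than the actual script described:

```python
import re
import requests

# Hypothetical list of sites to monitor each day
SITES = ["https://www.example.com/"]

# Naive pattern for a meta robots tag; a production check would use an HTML parser
META_ROBOTS = re.compile(
    r'<meta\s+name=["\']robots["\']\s+content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

def check_site(url):
    problems = []

    # 1) Fetch the homepage and look for an accidental noindex/nofollow
    page = requests.get(url, timeout=10)
    match = META_ROBOTS.search(page.text)
    if match and ("noindex" in match.group(1).lower() or "nofollow" in match.group(1).lower()):
        problems.append(f"{url}: meta robots is '{match.group(1)}'")

    # 2) Fetch robots.txt and look for a blanket Disallow
    robots = requests.get(url.rstrip("/") + "/robots.txt", timeout=10)
    for line in robots.text.splitlines():
        if line.split("#")[0].strip().lower().replace(" ", "") == "disallow:/":
            problems.append(f"{url}: robots.txt contains 'Disallow: /'")

    # 3) Check for a noindex set at the server level via the X-Robots-Tag header
    if "noindex" in page.headers.get("X-Robots-Tag", "").lower():
        problems.append(f"{url}: X-Robots-Tag header contains noindex")

    return problems

if __name__ == "__main__":
    for site in SITES:
        for problem in check_site(site):
            print("ALERT:", problem)  # swap for an email/Slack alert in a real setup
```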
I also pay someone on my team to run through a 12-step checklist every day to make sure that things like the site search are working, contact forms go through properly, and pages are set to index/follow.
I hope this helps...
Thanks,
-- Jeff
Related Questions
-
Moved company 'Help Center' from Zendesk to Intercom, got lots of 404 errors. What now?
Howdy folks, excited to be part of the Moz community after lurking for years! I'm a few weeks into my new job (Digital Marketing at Rewind), and about 10 days ago the product team moved our Help Center from Zendesk to Intercom. Apparently the import went smoothly, but it's caused one problem I'm not really sure how to go about solving:
https://help.rewind.io/hc/en-us/articles/*** is where all our articles used to sit
https://help.rewind.io/*** is where all our articles now are
So, for example, the following article has moved as such:
https://help.rewind.io/hc/en-us/articles/115001902152-Can-I-fast-forward-my-store-after-a-rewind-
https://help.rewind.io/general-faqs-and-billing/frequently-asked-questions/can-i-fast-forward-my-store-after-a-rewind
This has created a bunch of broken URLs in places like our Shopify/BigCommerce app listings, in our email drips, and in external resources. I've played whack-a-mole cleaning many of these up, but the old URLs are still indexed by Google: we're up to 475 crawl errors in Search Console over the past week, all of which are 404s. I reached out to Intercom about this to see if they had something in place to help, but they just said my "best option is tracking down old links and setting up 301 redirects for those particular addresses". Browsing the Zendesk forums turned up some relevant-ish results, with the leading recommendation being to configure JavaScript redirects in the Zendesk document head (thread 1, thread 2, thread 3) of individual articles. I'm comfortable setting up 301 redirects on our website, but I'm in a bit over my head trying to determine how I could do this with content that's hosted externally and sitting on a subdomain. I have access to our Zendesk admin, so I can go in and edit things there, but I don't have experience with JavaScript redirects and have read that they might not be great for such large-scale redirection. Hopefully this is enough context for someone to provide guidance on how you think I should go about fixing things (or if there's even anything for me to do), but please let me know if there's more info I can provide. Thanks!
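A minimal sketch of the kind of JavaScript redirect those Zendesk threads describe, placed in the Zendesk document head; the mapping reuses the example URLs from the question, and note that a client-side redirect like this is not a true 301:

```html
<script>
  // Hypothetical old-path -> new-URL mapping; extend with one entry per article
  var redirects = {
    "/hc/en-us/articles/115001902152-Can-I-fast-forward-my-store-after-a-rewind-":
      "https://help.rewind.io/general-faqs-and-billing/frequently-asked-questions/can-i-fast-forward-my-store-after-a-rewind"
  };
  var target = redirects[window.location.pathname];
  if (target) {
    window.location.replace(target); // sends the visitor on, but carries no 301 signal
  }
</script>
```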
Intermediate & Advanced SEO | | henrycabrown1 -
Syndicated content with meta robots 'noindex, nofollow': safe?
Hello, I manage, with a dedicated team, the development of a big news portal with thousands of unique articles. To expand our audience, we syndicate content to a number of partner websites. They can publish some of our articles, as long as (1) they put a rel=canonical in their duplicated article pointing to our original article, OR (2) they put a meta robots 'noindex, follow' in their duplicated article plus a dofollow link to our original article. A new prospective partner wants to follow a different path: republish the articles with a meta robots 'noindex, nofollow' in each duplicated article, plus a dofollow link to our original article. This is because he doesn't want to pass PageRank/link authority to our website (as it is not explicitly included in the contract). In terms of visibility we'd have some advantages with this partnership (even without link authority to our site), so I would accept. My question is: considering that the partner website is much more authoritative than ours, could this approach damage the ranking of our articles in some way? I know that the duplicated articles published on the partner website wouldn't be indexed (because of the meta robots noindex, nofollow), but Google's crawler could still reach them. And since they have no rel=canonical and the link to our original article wouldn't be followed, I don't know if this may cause confusion about the original source of the articles. In your opinion, is this approach safe from an SEO point of view? Do we have to take some measures to protect our content? Hope I explained myself well; any help would be very appreciated. Thank you,
Fab
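For clarity, the two options the portal already accepts look roughly like this in a partner's copy of an article (the canonical URL is a placeholder):

```html
<!-- Option 1: canonical pointing back to the original article -->
<link rel="canonical" href="https://news-portal.example/original-article">

<!-- Option 2: keep the copy out of the index but let link equity flow -->
<meta name="robots" content="noindex, follow">
<!-- ...plus a normal (dofollow) link to the original in the article body -->
```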
Intermediate & Advanced SEO | | Fabio80
Fab0 -
Nginx rule for redirecting trailing '/'
We have successfully implemented run-of-the-mill 301s from old URLs to new (there were about 3,000 products), as normal, like we do on every other site. However, Search Console has recently started to report a number of 404s for page names with a trailing forward slash after the .html suffix. So, /old-url.html is redirecting (301) to /new-url.html. However, now for some reason /old-url.html/ has popped up in the Search Console crawl report as a 404. Is there a global rule you can write in Nginx to redirect *.html/ to *.html (without the trailing slash), rather than doing them all manually?
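A minimal sketch of the kind of global rule being asked about, placed inside the relevant server block; test it against your own URL patterns before relying on it:

```nginx
# Match any URI ending in ".html/" and 301 it to the same URI without
# the trailing slash ("permanent" makes rewrite issue a 301).
rewrite ^(.+\.html)/$ $1 permanent;
```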
Intermediate & Advanced SEO | | AbsoluteDesign0 -
H1 tag found on page, but tool says it doesn't match keyword
We've run an on-page grader test on our home page www.whichledlight.com with the keyword 'led bulbs'. It comes back saying there is an H1 tag, although the tag apparently doesn't contain 'led bulbs'... which seems a bit odd, because the content of the tag is 'UK's #1 Price Comparison Site for LED Bulbs'. I've used other SEO checkers, and some say we don't even have an H1 tag, or H2, H3 and so on, for any page. Screaming Frog seems to think we have an H1 tag, though, and can also detect the content of the tag. Any ideas? ** Update ** The website is a single-page app (EmberJS), so we use prerender to create snapshots of the pages. We were under the impression that Moz can crawl these prerendered pages fine, so we were a bit baffled as to why it would say we have an H1 tag but think the contents of the tag still don't match our keyword.
Intermediate & Advanced SEO | TrueluxGroup
-
Why isn't my site being indexed by Google?
Our domain was originally pointing to a Squarespace site that went live in March. In June, the site was rebuilt in WordPress and is currently hosted with WPEngine. Oddly, the site is being indexed by Bing and Yahoo, but is not indexed at all in Google, i.e. site:example.com yields nothing. As far as I know, the site has never been indexed by Google, neither before nor after the switch. What gives? A few things to note:
- I am not "discouraging search engines" in WordPress
- Robots.txt is fine; I'm not blocking anything that shouldn't be blocked
- A sitemap has been submitted via Google Webmaster Tools, and I have "fetched as Google" and submitted for indexing, with no errors
- I've entered both the www and non-www versions in WMT and chosen a preferred one
- There are several incoming links to the site, some from popular domains
- The content on the site is pretty standard and crawlable, including several blog posts
- I have linked the account to a Google+ page
Intermediate & Advanced SEO | jtollaMOT
-
Local search vs. Organic Listings
Hi ~ I was interested to see if anyone feels there might be an advantage to keeping a business out of Google's local search listing area, or at least trying to keep it out of the 7-pack display. It seems to me that sites which are not listed in the 7-pack can often rank above the maps/7-pack area in the regular organic listings. Also, is there any way for a homepage to be listed on the first page in both the local search and organic listings? Thanks!
Intermediate & Advanced SEO | hhdentist
-
Best practice for listings with outbound links
My site contains a number of listings for charities that offer various sporting activities people can get involved in to raise money. As part of each listing we provide an outbound link for the user to find out more about the charity and its activities. Currently these listings are blocked in the robots.txt for fear that we may be viewed as a link farm or spam site (as there are hundreds of charities listed on the scrolling page), but these outbound links are genuine, provide a useful resource for the user, and are not paid links. What I'd like to do is make these listings fully crawlable and indexable to increase our search traffic to them, but I'm not sure whether this would have a negative impact on our PageRank, with Google potentially viewing all these outbound links as 'bad' or 'paid' links. Would removing the listing pages from our robots.txt and making all the outbound links nofollow be the way forward to allow us to properly index the listings without being penalised as some kind of link farm or spam site? (N.B. I have no interest in passing link juice to the external charity websites.)
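For reference, a nofollowed outbound link in one of the listings would look like this (the charity URL is a placeholder):

```html
<a href="https://www.example-charity.org/fun-run" rel="nofollow">
  Find out more about this charity's events
</a>
```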
Intermediate & Advanced SEO | | simon_realbuzz0 -
Don't want to lose PageRank; what's the best way to restructure a URL other than a 301 redirect?
We're currently in the process of redesigning a site. What I want to know is: what is the best way to restructure a URL without it losing its value (PageRank), other than a 301 redirect?
Intermediate & Advanced SEO | marig