Site went down and traffic hasn't recovered
-
Very curious situation. We have a network of sites. Sunday night, one (only one) of our sites went down, and since then we've seen a loss in traffic across all our sites! Not only have we lost traffic, we've also lost indexed pages: a complete drop-off from 1.8 million to 1.3 million pages indexed.
Does anyone know why a single site outage would affect the rest of them? And the indexed pages? Very confused.
Thanks,
-
My indexation number went back to normal for two of three sites, but for one of my sites the number still hasn't returned to normal. Do you have any idea why this might be? Do you think it's a bug with Google?
-
WOW, that's the exact date that my index number went down! That's a huge relief, but at the same time I'm still concerned that my traffic went down during that period. Thank you for sharing the video.
-
Hi,
Just a note on the indexation drop: it seems Google has adjusted how they display this figure, so if you're getting your indexation stats from Google Search Console, this may explain it:
http://searchengineland.com/google-adjusted-how-they-show-index-count-estimates-230158
Hope it helps!
-
Thank you, Dmitrii, for your response.
No, our sites were not hit by manual actions; I checked GWT for that. But yes, they do link to each other a lot and provide a very reasonable amount of referral traffic to each other. It's not really a network; it's just a handful of domains with different content, audiences, and pages that happen to be owned by the same company. So no, it's not a network in the spammy sense.
But you do make a good point about duplicate content and rankings. I will check those to see if they could have any effect. There was a Panda refresh around this time, so perhaps that added to our troubles.
-
Thank you for your response, Michael.
No, the site isn't down anymore. It was down for a couple of hours. We are getting traffic again, but not to the level it was at before. I have already checked the robots.txt file, but I will try the Fetch and Render suggestion as well.
Thanks!
-
Hi.
"Network of sites" always makes my spam sensors tingle Please provide more information on the structure. Now, why did that website go down? Server problems? Have you checked the manual actions in GWT for those websites? You sure you're not being hit by some kinda penalization?
Now, the larger total number of indexed pages not necessarily means that it's good or would help your rankings/traffic. As well as deindexing does not mean the opposite. https://www.youtube.com/watch?v=AVOrml7fp2c However, the usual reasons of deindexing are connected/related to spammy techniques or duplicate content. What does your GWT crawling stats say? Was there a spike recently?
How about rankings of your websites? Did they go down as well? If so, then what happened first?
Since it's a "network", do those websites link to each other? Do they link to each other a lot? So much that most of each site's backlink profile consists of links from the others?
Anyway, the easiest explanation that comes to mind (assuming it is a "network of websites") is:
Websites have links to each other -> One of them goes down -> Google sees that links went missing -> lowers the rankings (since backlink profile got worse) -> traffic goes down.
-
Is that site still down? Typically when I've seen sites go down, unless it's for a long time, Google doesn't seem to drop them from the index. I had a client site down all day Saturday and it continued to rank well.
And I don't see a reason why that would affect the other sites, unless a huge percentage of their inbound links were from the site that was down, but even then, it would have to be down for weeks, at least.
I'm inclined to think that the site outage is a red herring, and that there's something else in common between the sites that's causing an issue. Have you done a fetch-and-render as Googlebot for each of the sites in Search Console? Maybe something blocked by robots.txt on all the sites is preventing rendering, and Google is seeing very little content above the fold? <-- bit of a wild guess there, but that's all I've got!
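If you want to spot-check that theory across the whole network before digging deeper, here's a minimal sketch (Python, standard library only) that asks each site's robots.txt whether Googlebot may fetch a few key resources. The domains and paths are placeholders, not anyone's real setup; substitute your own sites and the CSS/JS files your templates actually load. It won't replace Fetch and Render, but it will quickly flag a blanket block:

```python
from urllib.robotparser import RobotFileParser

# Placeholder domains and resources -- substitute your own network's sites
# and the CSS/JS files your pages depend on for rendering.
SITES = ["https://example-site-one.com", "https://example-site-two.com"]
RESOURCES = ["/", "/assets/main.css", "/assets/app.js"]

for site in SITES:
    rp = RobotFileParser(site + "/robots.txt")
    rp.read()  # fetch and parse the live robots.txt
    for path in RESOURCES:
        verdict = "allowed" if rp.can_fetch("Googlebot", site + path) else "BLOCKED"
        print(f"{verdict:8} {site}{path}")
```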
Related Questions
-
Big drop in organic traffic after new site launched
Hi there, I have had a drop of around 40% in site traffic since we migrated our site from Magento to WooCommerce. The products were migrated across and kept the same title tags, meta descriptions, copy, etc. I set up 301s on the top 100 landing pages and submitted a new sitemap using Google Webmaster Tools. It looked like the traffic was coming back to where it was, but the gap has widened again. Can anyone advise me on what I may have missed, or how to go about diagnosing and fixing the problem?
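One thing worth ruling out before anything else: confirm that every old Magento URL still returns a single, clean 301 to the right WooCommerce URL, since 302s and multi-hop redirect chains are worth flagging. Below is a minimal sketch using the Python `requests` library; the URL pairs are placeholders standing in for your real landing-page map:

```python
import requests

# Placeholder old -> new URL pairs; replace with your migrated landing pages.
REDIRECT_MAP = {
    "https://example.com/old-product.html": "https://example.com/shop/new-product/",
}

for old_url, expected in REDIRECT_MAP.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "(no redirect)")
    verdict = "OK" if resp.status_code == 301 and location == expected else "CHECK"
    print(f"{verdict:5} {old_url}: {resp.status_code} -> {location}")
```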
Technical SEO | JonesBros
-
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using Google Maps creator and embedded it on our site. However, when I ran the fetch and render through Search Console, it said it was blocked by our robots.txt file. I read in the Search Console Help section that: "For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot." I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it), does Google automatically block their maps from their own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:
User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
Allow: /maps/api/js/ElevationService.GetElevationForLine
Allow: /maps/api/js/GeocodeService.Search
Allow: /maps/api/js/KmlOverlayService.GetFeature
Allow: /maps/api/js/KmlOverlayService.GetOverlays
Allow: /maps/api/js/LayersService.GetFeature
Disallow: /
Any assistance would be greatly appreciated. Thanks, Ruben
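One way to sanity-check those rules without touching anything is to feed them to Python's built-in robots.txt parser and test the exact resource URLs Search Console flagged. A small sketch follows; the test URLs are placeholders, and note that Python applies rules in file order rather than Google's longest-match rule, though for this file the outcome is the same because the Allow lines precede the blanket Disallow:

```python
from urllib.robotparser import RobotFileParser

# The rules quoted above (remaining Allow lines omitted for brevity).
rules = """User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/GeocodeService.Search
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Placeholder resource URLs -- substitute the ones from your render report.
tests = [
    "https://maps.googleapis.com/maps/api/js?key=abc",  # covered by an Allow
    "https://maps.googleapis.com/maps/embed/v1/place",  # falls through to Disallow: /
]
for url in tests:
    print("allowed" if rp.can_fetch("Googlebot", url) else "blocked", "->", url)
```

That final Disallow: / is the catch-all: anything the embed loads that isn't covered by one of the Allow lines will be reported as blocked, and since that robots.txt lives on Google's domain rather than yours, it isn't something you can change from your end.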
Technical SEO | KempRugeLawGroup
-
Why can't I rank for my brand name?
We are soon to launch a new company in New Zealand called Zing. I have been tasked with the challenge of ranking as highly as possible for anything to do with Zing before launch in February. Zing is in the financial industry, so my colleagues thought it would be a good idea to make a small blog (very small, with literally one post) that reviewed other financial lenders. This site stayed online for a couple of months before it was replaced. The official website is still yet to launch, so as an in-between, I asked that we make a splash page with a small competition on it (see here at zing.co.nz). I would have preferred there were more keywords on the website, but this was not achieved. I am still pushing for this and am hoping to get a few pages on there in the near future. Instead of getting the keywords on the splash page, I was given permission to start a subdomain (blog.zing.co.nz). This contains many more common search terms, and although it's not quite doing the job I would like, the rankings for Zing have started to increase. At the moment, we are ranking number 1 for a few brand-related keywords such as zing loans. This is why I feel something is wrong: we rank number 1 for over 10 similar terms, yet we DO NOT EVEN APPEAR on the search engines at all for Zing. Have we been penalized? Do you have any suggestions at all? Do you think we could have been penalized for the first average blog? Maybe I messed up the swap-over? Any help would be hugely appreciated!
Technical SEO | Startupfactory
-
'No Follow' and 'Do Follow' links when using WordPress plugins
Hi all, I hope someone can help me out with the following question regarding 'nofollow' and 'dofollow' links in combination with WordPress plugins. Some plugins that deal with links (i.e. link-masking or SEO plugins) give you the option to mark links as 'nofollow'. Can someone speak from experience as to whether this actually works? It's really quite stupid, but it only occurred to me when using the Firefox add-on 'NoDoFollow' (as well as looking at the SEOmoz link profile, of course) that 95% of my links are actually marked as FOLLOW, while the opposite should be the case. For example, I mark about 90% of outgoing links as nofollow within a link-masking plugin. Well, why would WordPress plugins give you the option to mark links as nofollow in the first place when they do in fact appear as follow to search engines and SEOmoz? Is this a WordPress thing or whatnot? Maybe they are in fact nofollow, and the information supplied by SEO tools comes from basic HTML structure analysis. I don't know... This really got me worried. Hope someone can shed some light. All the best and many thanks for your answers!
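One way to settle it is to audit the HTML your server actually sends, rather than trusting the plugin's settings screen or a browser add-on. Here's a minimal sketch using `requests` and BeautifulSoup; the page URL is a placeholder. It lists every outbound link and whether a nofollow is genuinely present in the markup:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

PAGE = "https://example.com/sample-post/"  # placeholder: a post with masked links
own_host = urlparse(PAGE).netloc

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a", href=True):
    host = urlparse(a["href"]).netloc
    if host and host != own_host:  # outbound links only
        rel = a.get("rel") or []   # BeautifulSoup returns rel as a list
        verdict = "nofollow" if "nofollow" in rel else "FOLLOW"
        print(f"{verdict:8} {a['href']}")
```

If the markup really does carry rel="nofollow", the add-on's report is the suspect; if it doesn't, the plugin is probably only applying the attribute somewhere crawlers never see it, for example via JavaScript after the page loads.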
Technical SEO | Hermski
-
Web page is showing up on Google but its cached version won't load, so is it indexed?
Hey everyone, I created a new page on a WordPress website; it was live for a few hours until I changed my mind and switched it back to a draft. Just out of curiosity, I did a site:www.example.com/Example search on Google to see if it had been indexed, and apparently it had, but when I click on 'Cached' to see exactly when it was indexed, it shows me an error. So does this mean it is indexed or not?
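A quick way to see what crawlers get from that URL right now is to check the status code, the X-Robots-Tag header, and any meta robots tag in one pass. A small sketch with a placeholder URL; a WordPress draft typically returns a 404 to logged-out visitors:

```python
import re
import requests

URL = "https://www.example.com/Example"  # placeholder for the drafted page

resp = requests.get(URL, timeout=10)
print("HTTP status: ", resp.status_code)
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "(none)"))

# Look for a meta robots tag in whatever HTML came back.
m = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', resp.text, re.I)
print("Meta robots: ", m.group(0) if m else "(none)")
```

A URL that was indexed briefly and now 404s will usually drop out on its own after recrawling, and a cache error often just means Google never stored a cached copy before the page was reverted.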
Technical SEO | conversiontactics
-
Are these 'not found' errors a concern?
Our webmaster report is showing thousands of 'not found' errors for links that show up in JavaScript code. Is this something we should be concerned about, especially since there are so many?
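If you want to see whether the flagged URLs are real, fetchable paths or just string fragments Googlebot assembled out of your JavaScript, here's a rough sketch that pulls quoted paths from a script file and checks their status codes. It's regex-based, so expect false positives, and the script URL and host are placeholders:

```python
import re
import requests

JS_URL = "https://example.com/static/app.js"  # placeholder script to scan
HOST = "https://example.com"                  # placeholder host for the paths

js = requests.get(JS_URL, timeout=10).text

# Naive pattern: quoted strings that look like absolute paths.
candidates = {m[1] for m in re.findall(r"""(["'])(/[A-Za-z0-9_\-./]+)\1""", js)}

for path in sorted(candidates):
    resp = requests.head(HOST + path, allow_redirects=True, timeout=10)
    print(resp.status_code, path)
```

Google has generally said that crawl errors for URLs scraped out of script strings are safe to ignore; they don't hurt the indexing of the rest of the site.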
Technical SEO | nicole.healthline
-
Switching Site to a Domain Name that's in Use
I'm comfortable with the steps of moving a site to a new domain name as recommended by Google. However, in this case, the domain name I'm asked to move to is not really "new" ... meaning it's currently hosting a website and has been for a long time. So my question is, do I do this in steps and take the old website down first in order to "free up" the domain name in the eyes of search engines to avoid large numbers of 404s, and then (in step 2) switch to the "new" domain in a few months? Thanks.
Technical SEO | R2iSEO
-
Do or don't: forward a parked domain to a live website?
Hi all, I'm new to SEO and excited to see the launch of this forum. I've searched for an answer to this question but haven't been able to find one. I "attended" two webinars recently regarding SEO. The above subject was raised in each one, and the speakers gave polar opposite recommendations. So I'm completely at a loss as to what to do with some domains that are related to a domain used on a live website whose SEO I'm working to improve. The scenario: a live website at (fictitious) www.digital-slr-camera-company.com. I also have two related domain names which are parked with the registrar: www.dslr.com and www.digitalslr.com. The question: is there any SEO benefit to be gained by pointing the two parked domains to the website at www.digital-slr-camera-company.com? If so, what method of "pointing" should be used? Thanks to any and all input.
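Whichever way you decide, it's worth verifying what the registrar's "forwarding" actually emits once it's turned on, since many registrars default to a 302 or a masked frame rather than the 301 you'd want for consolidating signals. A minimal sketch, reusing the fictitious domains from the question:

```python
import requests

# The fictitious parked domains from the question above.
PARKED = ["http://dslr.com", "http://digitalslr.com"]

for domain in PARKED:
    resp = requests.get(domain, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "(none)")
    if resp.status_code in (301, 308):
        print(f"{domain}: permanent redirect -> {target} (what you want)")
    elif resp.status_code in (302, 303, 307):
        print(f"{domain}: TEMPORARY redirect -> {target} (ask for a 301 instead)")
    else:
        print(f"{domain}: HTTP {resp.status_code} -- possibly a parking page or masked frame")
```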
Technical SEO | Technical_Contact