Site went down and traffic hasn't recovered
-
Very curious situation. We have a network of sites. Sunday night one (only one) of our sites went down, and since then we've seen a loss of traffic across all our sites! Not only that, we also saw a loss of indexed pages: a complete drop-off from 1.8 million to 1.3 million pages indexed.
Does anyone know why one site outage would affect the rest of them? And the indexed pages? Very confused.
Thanks,
-
My indexation number went back to normal for two of my three sites. But for one of my sites, the number still hasn't returned to normal. Do you have any idea why this might be? Do you think it's a bug with Google?
-
WOW that's the exact date that my index number went down! That is a huge relief, but at the same time I'm still concerned that my traffic went down during that time. Thank you for sharing the video.
-
Hi,
Just a note on the indexation drop, it seems Google has adjusted how they display this figure, so if you're referring to Google Search Console for your indexation stats, this may explain it:
http://searchengineland.com/google-adjusted-how-they-show-index-count-estimates-230158
Hope it helps!
-
Thank you Dmitrii for your response.
No, our sites were not hit by manual actions, I checked out GWT for that. But yes, they do link to each other a lot and provide a very reasonable amount of referral traffic to each other. It's not necessarily a network, it's just a handful of domains that have different content, audiences and pages that happen to be owned by the same company. So no, it's not a network in the spammy sense.
But you do make a good point about the duplicate content and rankings. I will check those to see whether they could have had any effect. There was a Panda refresh around this time, so perhaps that added to our troubles.
-
Thank you for your response, Michael.
No, the site isn't down anymore. It was down for a couple of hours. We are getting traffic again, but not to the level it was at before. I have already checked the robots.txt file, but I will try the Fetch and Render suggestion as well.
Thanks!
-
Hi.
"Network of sites" always makes my spam sensors tingle. Please provide more information about the structure. Now, why did that website go down? Server problems? Have you checked the manual actions in GWT for those websites? Are you sure you're not being hit by some kind of penalization?
Now, a larger total number of indexed pages doesn't necessarily mean that it's good or that it will help your rankings/traffic, and deindexing doesn't necessarily mean the opposite. https://www.youtube.com/watch?v=AVOrml7fp2c However, the usual reasons for deindexing are related to spammy techniques or duplicate content. What do your GWT crawl stats say? Was there a spike recently?
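If you don't want to wait on the GWT crawl-stats chart, you can get a rough read straight from your server access logs by counting Googlebot hits per day. A minimal sketch — the log lines, dates, and IPs below are made up for illustration, and you'd adjust the regex to your own log format:

```python
import re
from collections import Counter

# Hypothetical combined-format access log lines (replace with your real log file).
sample_log = [
    '66.249.66.1 - - [04/Oct/2015:10:00:01 +0000] "GET / HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [04/Oct/2015:10:05:09 +0000] "GET /page HTTP/1.1" 200 987 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.7 - - [04/Oct/2015:10:06:00 +0000] "GET / HTTP/1.1" 200 1234 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [05/Oct/2015:02:11:45 +0000] "GET / HTTP/1.1" 503 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
]

# Tally Googlebot requests per day -- a spike or a sudden drop here is the
# kind of thing the GWT crawl-stats chart would show.
hits = Counter()
for line in sample_log:
    if "Googlebot" not in line:
        continue
    m = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)  # pull the dd/Mon/yyyy date
    if m:
        hits[m.group(1)] += 1

for day, count in sorted(hits.items()):
    print(day, count)
```

Note this only counts requests claiming to be Googlebot; anyone can fake that user-agent, so for anything beyond a rough sanity check you'd verify the crawler by reverse DNS.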
How about rankings of your websites? Did they go down as well? If so, then what happened first?
Since it's a "network", do those websites link to each other? Do they link to each other a lot? So much that most of each site's backlink profile consists of links from the others?
Anyway, the easiest explanation that comes to mind (assuming it's a "network of websites") is:
Websites have links to each other -> One of them goes down -> Google sees that links went missing -> lowers the rankings (since backlink profile got worse) -> traffic goes down.
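That chain can be illustrated with a toy "link equity" tally. To be clear, this is not how Google actually weighs links — it's just a sketch, with made-up site names, of why losing one heavily interlinked site degrades the surviving sites' backlink profiles:

```python
def link_equity(links, target):
    """Toy metric: each live page splits one unit of 'equity' evenly
    across its outbound links; sum what flows into `target`."""
    total = 0.0
    for page, outs in links.items():
        if target in outs:
            total += 1.0 / len(outs)
    return total

# Hypothetical network: site_a and site_c both link to site_b,
# and an unrelated outside page links to site_a.
full_graph = {
    "site_a": ["site_b", "site_c"],
    "site_c": ["site_b"],
    "outside": ["site_a"],
}

# site_c goes down: its pages (and the links on them) disappear.
down_graph = {
    "site_a": ["site_b", "site_c"],  # site_a's links are unchanged
    "outside": ["site_a"],
}

print(link_equity(full_graph, "site_b"))  # 1.5 (half a unit from site_a, a full unit from site_c)
print(link_equity(down_graph, "site_b"))  # 0.5 (site_c's contribution is gone)
```

So in this toy model, site_b's inbound equity drops from 1.5 to 0.5 the moment site_c vanishes from the index — the same shape as the "backlink profile got worse -> traffic goes down" chain above.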
-
Is that site still down? Typically when I've seen sites go down, unless it's for a long time, Google doesn't seem to drop them from the index. I had a client site down all day Saturday and it continued to rank well.
And I don't see a reason why that would affect the other sites, unless a huge percentage of their inbound links were from the site that was down--but even then, it would have to be down for weeks, at least.
I'm inclined to think that the site outage is a red herring, and that there's something else in common between the sites that's causing an issue. Have you done a fetch-and-render as Googlebot for each of the sites in Search Console? Maybe something is blocked by robots.txt in all the sites that's preventing rendering, and Google is seeing very little content above the fold? <-- bit of a wild guess there...but that's all I've got!
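One quick way to sanity-check the robots.txt theory before running Fetch and Render on every site is Python's stdlib robots.txt parser. An offline sketch — the robots.txt contents and example.com URLs are made up; paste in your own file and the CSS/JS URLs your pages actually load:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for illustration: this one blocks /js/,
# which would stop Googlebot from rendering script-dependent content.
robots_txt = """\
User-agent: *
Disallow: /js/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check not just the pages themselves but the assets Googlebot needs to
# render them -- a blocked stylesheet or script can leave Google seeing
# very little content above the fold.
for url in [
    "https://www.example.com/",
    "https://www.example.com/css/main.css",
    "https://www.example.com/js/app.js",
]:
    status = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(url, status)
```

To check a live site instead, you could call `rp.set_url("https://yoursite.com/robots.txt")` followed by `rp.read()` rather than parsing a pasted string.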