Site went down and traffic hasn't recovered
-
Very curious situation. We have a network of sites. Sunday night, one (only one) of our sites went down, and since then we've seen a loss in traffic across all of our sites! Not only have we seen a loss of traffic, we also saw a loss of indexed pages: a complete drop-off from 1.8 million to 1.3 million pages indexed.
Does anyone know why one site outage would affect the rest of them? And the indexed pages? Very confused.
Thanks,
-
My indexation number went back to normal for two of my three sites, but for the remaining site the number still hasn't returned to normal.
Do you have any idea as to why this might be? Do you think it's a bug with Google?
-
WOW, that's the exact date that my index number went down! That is a huge relief, but at the same time I'm still concerned that my traffic dropped during that period. Thank you for sharing the video.
-
Hi,
Just a note on the indexation drop: it seems Google has adjusted how they display this figure, so if you're relying on Google Search Console for your indexation stats, this may explain it:
http://searchengineland.com/google-adjusted-how-they-show-index-count-estimates-230158
Hope it helps!
-
Thank you Dmitrii for your response.
No, our sites were not hit by manual actions; I checked GWT for that. But yes, they do link to each other a lot and provide a very reasonable amount of referral traffic to each other. It's not really a network; it's just a handful of domains with different content, audiences, and pages that happen to be owned by the same company. So no, it's not a network in the spammy sense.
But you do make a good point about the duplicate content and rankings. I will check those to see if they could have had any effect. There was a Panda refresh around this time, so perhaps that added to our troubles.
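If it helps anyone else checking the same thing, a cheap first pass at spotting obvious duplicate content across domains is to compare page titles. This is just a rough sketch, assuming the third-party `requests` and `beautifulsoup4` libraries; the URLs are placeholders for your own key pages:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URLs: substitute key pages from each of your domains
PAGES = [
    "http://www.site-a.example.com/",
    "http://www.site-b.example.com/",
]

titles = {}
for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    # Identical or near-identical titles across domains are a cheap first
    # signal of duplicate content worth investigating further
    titles.setdefault(title, []).append(url)

for title, urls in titles.items():
    if len(urls) > 1:
        print(f"Shared title {title!r}: {urls}")
```

Identical titles aren't proof of duplicate content, but any collisions are a good place to start a manual review.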
-
Thank you for your response, Michael.
No, the site isn't down anymore; it was only down for a couple of hours. We are getting traffic again, but not at the level it was before. I have already checked the robots.txt file, but I will try the Fetch and Render suggestion as well.
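In the meantime, a quick way to approximate what Googlebot receives (before running the official Fetch and Render) is to request the page with a Googlebot user-agent string and compare it to a normal browser request. A minimal sketch, assuming the third-party `requests` library and a placeholder URL:

```python
import requests

URL = "https://www.example.com/"  # placeholder: substitute the affected site's URL

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for label, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    # A mismatch in status code or response size between the two requests
    # suggests the server treats Googlebot differently (bot blocking, cloaking)
    print(f"{label}: status={resp.status_code}, bytes={len(resp.content)}")
```

This won't replicate rendering, but it catches the blunt cases where the server blocks or redirects crawlers outright.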
Thanks!
-
Hi.
"Network of sites" always makes my spam sensors tingle
Please provide more information on the structure. Now, why did that website go down? Server problems? Have you checked the manual actions in GWT for those websites? Are you sure you haven't been hit by some kind of penalty?
Now, a larger total number of indexed pages doesn't necessarily mean that it's good or that it would help your rankings/traffic, and deindexing doesn't necessarily mean the opposite: https://www.youtube.com/watch?v=AVOrml7fp2c However, the usual reasons for deindexing are related to spammy techniques or duplicate content. What do your GWT crawl stats say? Was there a spike recently?
How about the rankings of your websites? Did they go down as well? If so, which happened first?
Since it's a "network", do those websites link to each other? Do they link to each other a lot? So much that most of each site's backlink profile consists of links from the other sites?
Anyway, the easiest explanation that comes to mind (assuming it really is a "network of websites") is:
Websites have links to each other -> One of them goes down -> Google sees that the links went missing -> lowers the rankings (since the backlink profile got worse) -> traffic goes down.
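If you want to gauge how interdependent the sites actually are, you could roughly count the cross-links between them. A hedged sketch (homepages only; a real audit would crawl deeper), assuming `requests` and `beautifulsoup4` are installed and using placeholder domains:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

# Placeholder domains: substitute the actual sites in the network
DOMAINS = ["www.site-a.example.com", "www.site-b.example.com", "www.site-c.example.com"]

def count_cross_links(domain):
    """Count homepage links that point at the other domains in the network."""
    resp = requests.get(f"http://{domain}/", timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    hosts = [urlparse(a["href"]).netloc for a in soup.find_all("a", href=True)]
    return sum(1 for host in hosts if host in DOMAINS and host != domain)

for domain in DOMAINS:
    print(domain, "->", count_cross_links(domain), "links to sibling sites")
```

If a big share of each site's links comes from its siblings, one site dropping out would plausibly ripple through the others' backlink profiles.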
-
Is that site still down? Typically when I've seen sites go down, unless it's for a long time, Google doesn't seem to drop them from the index. I had a client site down all day Saturday and it continued to rank well.
And I don't see a reason why that would affect the other sites, unless a huge percentage of their inbound links were from the site that was down -- but even then, it would have to be down for weeks, at least.
I'm inclined to think that the site outage is a red herring, and that there's something else in common between the sites that's causing an issue. Have you done a fetch-and-render as Googlebot for each of the sites in Search Console? Maybe something is blocked by robots.txt on all the sites that's preventing rendering, and Google is seeing very little content above the fold? <-- bit of a wild guess there...but that's all I've got!
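One way to test that robots.txt theory across all the sites at once is Python's built-in urllib.robotparser. A rough sketch, where the site list and the CSS/JS paths are placeholders for your own render-critical resources:

```python
from urllib.robotparser import RobotFileParser

# Placeholders: substitute your own domains and the CSS/JS files
# the pages need in order to render properly
SITES = ["http://www.site-a.example.com", "http://www.site-b.example.com"]
RESOURCES = ["/", "/css/main.css", "/js/app.js"]

for site in SITES:
    parser = RobotFileParser(site + "/robots.txt")
    parser.read()  # fetches and parses the live robots.txt
    for path in RESOURCES:
        allowed = parser.can_fetch("Googlebot", site + path)
        # CSS/JS blocked for Googlebot can leave the rendered page nearly empty
        print(f"{site}{path}: {'allowed' if allowed else 'BLOCKED'} for Googlebot")
```

If the same stylesheet or script path shows up as BLOCKED on every site (shared templates often share robots.txt rules), that would explain a problem hitting all of them at once.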