Server 500: website deindexed?
-
Hi mozzers,
Since August 22nd, a site (not a site I manage) has been returning a 500 server error, and all of its pages have been deindexed.
This is obviously a server issue, but why did the site get deindexed? Is it because the error has been going on for a while?
The pages I checked load correctly, so I am a bit confused here! His Webmaster Tools account shows 1,500 server errors!
Can someone tell me what is going on and how to fix it?
Thanks
-
It's hard to diagnose the problem without seeing it. Do the Fetch as Google results show any errors, or report anything other than "Success"?
The report just shows you what Google is seeing, so look for anything missing from the page. Usually, if Google can fetch the page, it can see your site. Double-check that a noindex hasn't been added to the meta tags somewhere.
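If Fetch as Google looks clean, you can also check for indexing blockers directly. Here's a rough Python sketch (the class and function names are my own, for illustration): it fetches a page and reports the HTTP status, any X-Robots-Tag header, and the content of any robots meta tags.

```python
import urllib.request
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> or
    <meta name="googlebot"> tags found in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() in ("robots", "googlebot"):
                self.directives.append(d.get("content", ""))

def indexing_blockers(url):
    """Return (status code, X-Robots-Tag header, robots meta directives)
    for a URL, the three things that most often block indexing."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req) as resp:
        status = resp.status
        header = resp.headers.get("X-Robots-Tag", "")
        body = resp.read().decode("utf-8", errors="replace")
    finder = RobotsMetaFinder()
    finder.feed(body)
    return status, header, finder.directives
```

If anything returned contains "noindex" (in the meta tags or in the X-Robots-Tag header), Google will drop the page even though it loads fine in a browser.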
-
Hi Mozzers,
Thank you for your help. Unfortunately, I can't share the details. The company hosting the site wasn't helpful at all. The owner of the site sent me his Fetch as Google results, but I'm not sure how to interpret them.
Can one of you tell me how to determine the issue by reading the Fetch as Google results? Are there any other steps I can take to identify the issue?
Thanks!
-
Would you be able to post the domain for us to look at?
If the site has been "down" or inaccessible to Google, it's natural for Google to deindex it. There's a grace period for server issues, but a couple of months will definitely get your pages removed. That doesn't mean every single page is deindexed, but some definitely will be. Bring the site and servers back into good standing and the pages will flow back into Google's results.
-
Hi Tay,
I don't think I understand clearly: you mean that "your" site has been returning a 500 for more than two months?
Then it's normal that Google may have deindexed it. Don't forget that Google's main task is to return the best results to users, so in that sense they treat 404s and 500s the same. Once you've fixed every issue, if you don't see your site rising back to where it was in Google's SERPs, you might consider filing a reconsideration request.
-
Definitely a server issue, and exactly the reason why your pages will have been deindexed! Could you share the URL for the site so that I can take a look?
Unfortunately, a 500 error could be literally anything, and you might still get 500 errors even if the pages appear to be served correctly. It's not always easy to work out what the issue is, as a 500 is a catch-all error and you won't get anything more detailed than that!
I would first recommend talking to whoever owns the server and explaining the issue to them to see if they can narrow it down. Also, run a "Fetch as Googlebot" through your Webmaster Tools account and see if this highlights anything.
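As a rough illustration (a Python sketch, not an official tool), here's one way to see the status code a server returns when the request identifies itself as Googlebot; some servers treat bot traffic differently from browsers, which is one reason a page can look fine to you while Webmaster Tools reports 500s:

```python
import urllib.error
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def status_as_googlebot(url):
    """Return the HTTP status code the server sends to a request
    carrying Googlebot's user-agent string."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses raise HTTPError; the code is what we want
        return err.code
```

A 200 here alongside 500s in Webmaster Tools suggests an intermittent problem (server overload, a flaky backend) rather than a permanent misconfiguration, so the server's logs for Googlebot's visits are the next place to look.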
Hope this helps.
Steve
Related Questions
-
Why did my website's DA drop?
Hello, could you please let me know why my website's DA might have fallen in merely a week? What might be the reason? I also noticed that traffic from Google dropped in the very same week. I will be very thankful for any advice!
Technical SEO | kirupa
-
Suggestions on Website Recovery
Hello Mozzers! I have been tasked with recovering a site, www.active8canada.com, from a partial link penalty that was previously brought to my attention. Upon reviewing the site's backlinks and the reporting in Google Webmaster Tools, I found there was no penalty showing. Could it have expired? We spent the last few months doing link cleanup, as we recognized that there were some bad links that needed to be addressed. After categorizing all of them, we requested removal of all the bad links, targeting commercial anchor text and bringing those numbers back to acceptable levels. Following this, we disavowed the bad links which could not be removed through requests.
We are actively building out additional content for the website, as we recognize that some pages have thin content. We have also earned some links to send positive signals during the cleanup, but have seen no change for better or worse.
My question is: does anyone else see anything we could be missing here? Should I revisit the links again? Some of the links we disavowed are still showing in our backlink reports, but I cross-referenced our disavows with the existing backlink profile to try to get an accurate sense of the remaining links. We never saw a further decline in rankings after the disavow, so I'm led to believe that the links we removed had little, if any, impact. I am a little hesitant to begin earning new links through content and partnership outreach, as I still feel something is off that I can't quite put my finger on. It was previously confirmed that there was a penalty, but without that showing now in Google Webmaster Tools, I'm grasping at any possible angle I may have missed. If anyone has a couple of minutes to spare to shed some light on this situation, it would be greatly appreciated!
Technical SEO | toddmumford
-
2 websites 1 Google account?
Quick question: I have set up my second website purely for seasonal stock, so I'm getting it online early, ready for when the time is right. But I only have a single Webmaster Tools & AdWords account. Would there be any problem with the single account holding the details for both websites, or would it be wise to have separate accounts for each site?
Technical SEO | GarethEJones
-
4 websites on same IP crosslinking in footer
Hello, we have three separate websites and domains, all industry directory websites:
Industrialdomain.com, DA 62, 8 years old
Medicaldomain.com, DA 45, 8 years old
Hospitalitydomain.com, DA 24, 1 year old
These sites are cross-linked site-wide via footer links, and the sites flow a substantial amount of well-converting traffic between each other. How does Google see this? Could we place the links in a better position than the footer? The links should be site-wide, as we receive many deep visits to these sites from organic and PPC sources. Our traffic is down 20% post-Penguin; we are going through the backlinks now and weeding out some iffy links, but the site has never been forum-spammed or anything even close to that. One other thing is that they are linked to with anchor-text links, for example:
Hospitality Equipment
Medical Supplies & Devices
Industrial Directory
It may be best to start off just linking with an image.
Technical SEO | JeremyNathan
-
Mobile website settings - am I doing this right?
Hi, http://www.schicksal.com has a "normal" and a "mobile" version. We are using a browser-detection routine to redirect the visitor to either the default site or the mobile site. The mobile site is here: http://www.schicksal.com/m
The robots.txt contains these lines:
User-agent: *
Allow: /
User-agent: Googlebot
Disallow: /m
Allow: /
User-agent: Googlebot-Mobile
Disallow: /
Allow: /m
Sitemap: http://www.schicksal.com/sitemaps/index
So, the idea is: only allow the Googlebot-Mobile bot to access the mobile site. We also have separate sitemaps for the default and mobile versions. One of the mobile sitemaps is here. My problem: Webmaster Tools says that Google received 898 URLs from the mobile sitemap, but none have been indexed. (Google has indexed 550 from the web sitemap.) I've checked Webmaster Tools - no errors on the sitemap. So, if you search at google.com/m, you get results from the default web page, but not the mobile version. This is not that bad, because you will be redirected to the mobile version. So, my question: is this the normal behaviour? Or is there something wrong with my config? Would it be better to move the mobile site to a subdomain like m.schicksal.com?
Best wishes, Georg.
Technical SEO | GeorgFranz
-
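One way to sanity-check which bots may fetch which paths under a robots.txt like the one above is Python's built-in robotparser. A caveat on this sketch: Python's parser uses the first matching group and the first matching rule, while Google's own parser uses the most specific match, so the groups and rules below are reordered (most specific first) to behave the same under both interpretations:

```python
from urllib import robotparser

# Reordered version of the robots.txt from the question: the most
# specific group (Googlebot-Mobile) and the Allow rules come first,
# so first-match and longest-match parsers agree on the outcome.
ROBOTS_TXT = """\
User-agent: Googlebot-Mobile
Allow: /m
Disallow: /

User-agent: Googlebot
Disallow: /m

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("Googlebot", "/m"))           # False: desktop bot kept out
print(rp.can_fetch("Googlebot-Mobile", "/m"))    # True: mobile bot may crawl /m
print(rp.can_fetch("Googlebot", "/some-page"))   # True: rest of the site is open
```

Note that "Disallow: /m" is a prefix match, so it also blocks any path that merely starts with /m (e.g. /menu); "Disallow: /m/" would be safer if the mobile site lives entirely under that directory.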
NEED HELP ASAP: SERVER ISSUE
Hey guys, some of you may be aware of our story. We have a website about our son, who was born with Down syndrome. Two days ago, a post I wrote went sort of viral, and I woke up this morning to an email from my host saying they had to take my site down as an emergency because of the amount of resources it was using. So now my site is down (noahsdad.com). Any ideas on how to proceed? I really need to get my site back online ASAP. Thank you.
Technical SEO | NoahsDad
-
Website has been penalized?
Hey guys, we have been link building and optimizing our website since the beginning of June 2010. Around August-September 2010, our site appeared on the second page for the keywords we were targeting for around a week. It then dropped off the radar, although we could still see our website at #1 when searching for our company name, domain name, etc. So we figured we had been put into the 'Google sandbox', sort of thing. That was fine; we dealt with that. Then in December 2010, we appeared on the first page for our keywords and maintained first-page rankings, even moving up the top 10, for just over a month.
On January 13th, 2011, we disappeared from Google for all of the keywords we were targeting; we don't even come up in the top pages for a company-name search, although we do come up when searching for our domain name in Google, and we are being cached regularly. Before we dropped out of the rankings in January, we did make some semi-major changes to our site: changing the meta description, changing content around, and adding a disclaimer to our pages with click-tracking parameters (this is when SEOmoz prompted us that our disclaimer pages were duplicate content). So we added the disclaimer URL to our robots.txt so Google couldn't access it, made the disclaimer an onclick link instead of an href, added nofollow to the link, and also told Google to ignore these parameters in Google Webmaster Central.
We have fixed the duplicate-content side of things now, have continued to link build, and have been adding content regularly. Do you think the duplicate content (on over 13,000 pages) could have triggered a loss in rankings? Or do you think it's something else? We index pages' meta descriptions and some subpages' titles and descriptions. We also fixed up HTML errors flagged in Google Webmaster Central and SEOmoz.
The only other reason I think we could have been penalized is having a link-exchange script on our site, where people could add our link to their site and add theirs to ours, but we applied the nofollow attribute to those outbound links. Any information that will help me get our rankings back would be greatly appreciated!
Technical SEO | bigtimeseo