Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Temporarily shut down a site
-
What would be the best way to temporarily shut down a site the right way and not have a negative impact on SEO?
-
I asked the Q&A associates their opinion, and several people also responded that a 503 would be the way to go.
-
It is due to a legal matter, so we need to shut it down.
-
Can you give us some more details about the shutdown (the reasons, why it needs to be so long, etc)? We can help you a bit better if we know more information.
When we switched from SEOmoz.org to moz.com, we were only down for half an hour, if that. If this is about upgrading, is there a testing server that you can use to get the website rebuilt and tested on the testing/staging server before you make it live? We used multiple staging servers to test out the site and did lots of checks so that we had minimal downtime when it came time to move the site.
-
What if it is more than a week?
-
I'm also assuming that you're talking about just a day or two, and not two months. There was a post on Moz last year about this that can also help, in addition to the good info provided by CleverPhD http://moz.com/blog/how-to-handle-downtime-during-site-maintenance
-
Appreciate the positive comment EGOL!
-
That was a great answer. Thanks. I didn't know that.
-
Thank you - please mark my response as Good Answer if it helps.
Cheers!
-
Thank you
-
According to Matt Cutts
"According to Google's Distinguished Engineer Matt Cutts, if your website is down for just a day, such as your host being down or a server transfer, there shouldn't be any negative impact on your search rankings. However, if the downtime is extended, such as for two weeks, it could have an impact on your search rankings, because Google doesn't necessarily want to send the user to a website that it knows has been down - that makes for a poor user experience.
Google does make allowances for websites that sporadically have downtime, so Googlebot will visit again 24 hours later and see if the site is accessible."
That said, what should you show Google?
http://yoast.com/http-503-site-maintenance-seo/
According to Yoast, you should not return a 200 (OK) or 404 (file not found), but a 503 status code on all pages, with a Retry-After header for Google.
The 503 (http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html) tells Google "The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay. If known, the length of the delay MAY be indicated in a Retry-After header. If no Retry-After is given, the client SHOULD handle the response as it would for a 500 response."
The Retry-After header tells Google when to come back. Set it to a generous value that leaves you plenty of time to get everything back up and running.
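To make the advice concrete, here is a minimal sketch of a maintenance response as a Python WSGI app: every request gets a 503 plus a Retry-After header. The one-hour window and the page copy are my own placeholder choices, not anything from the thread - pick whatever is generous for your situation.

```python
RETRY_AFTER_SECONDS = 3600  # placeholder: be generous with this window

def maintenance_app(environ, start_response):
    """Answer every request with 503 + Retry-After during the shutdown."""
    body = b"<h1>Down for maintenance</h1><p>Please check back soon.</p>"
    start_response(
        "503 Service Unavailable",
        [
            ("Content-Type", "text/html; charset=utf-8"),
            ("Retry-After", str(RETRY_AFTER_SECONDS)),
            ("Content-Length", str(len(body))),
        ],
    )
    return [body]
```

You could serve this with any WSGI server (even the stdlib `wsgiref.simple_server`) in place of your normal application while the site is down.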
Another point from Yoast, who links to https://plus.google.com/+PierreFar/posts/Gas8vjZ5fmB - if the robots.txt file returns a 503, Google will stop crawling all your pages until it sees a 200 on your robots.txt again. So it is key that the 503 and Retry-After are set properly on robots.txt as well.
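Before flipping the switch, it's worth verifying the setup. Here is a tiny helper (the function name is my own, not from the thread) that checks whether a status/headers pair follows this advice: a 503 with a Retry-After header, never a 200 or 404 maintenance page - and you'd want it to pass for "/robots.txt" too.

```python
def is_correct_maintenance_response(status: int, headers: dict) -> bool:
    """True when a response matches the thread's advice:
    a 503 status carrying a Retry-After header."""
    # Normalize header names since they are case-insensitive in HTTP.
    normalized = {name.title(): value for name, value in headers.items()}
    return status == 503 and "Retry-After" in normalized
```

You could feed it the status and headers from any HTTP client response for your homepage and for /robots.txt to confirm both behave the same way during the shutdown.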
Cheers!