Do 301s still work after hosting is discontinued?
-
I am in the process of phasing out a website that has been acquired by another company.
Its web pages are being 301 redirected to their counterparts on the website of the company that has acquired them.
How long should I maintain the hosting of the phased-out website? Technically, do 301s still work after the hosting has been discontinued?
Thanks, Caro
-
Thank you, Erica.
-
If you're 301 redirecting site A to site B and you stop paying for (or sell) site A's hosting or domain, then the 301s will cease to work.
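The reason is that a 301 is just an HTTP response sent back by site A's own server, so the old domain's DNS and hosting have to stay live for users and crawlers to ever receive it. A minimal sketch of what such a rule might look like, assuming Apache and placeholder domain names (the real sites aren't named in the question), with a simplified same-path mapping rather than the page-by-page counterparts the question describes:

    # Hedged sketch with placeholder domains: site A's server answers the
    # request and issues the 301, so this only works while site A stays hosted.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.example$ [NC]
    RewriteRule ^/?(.*)$ https://new-site.example/$1 [R=301,L]

Once the hosting (or the domain itself) lapses, requests to the old domain end in DNS or connection errors rather than redirects, which is why people typically keep at least the domain registration plus some minimal redirect hosting until the new URLs have fully taken over in the index.
-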
Related Questions
-
Schema.org markup for breadcrumbs: does it finally work?
Hi, TL;DR: Does https://schema.org/BreadcrumbList work? It's been some time since I last implemented schema.org markup for breadcrumbs. Back then, Google explicitly discouraged the use of schema.org markup for breadcrumbs. In my experience it was pretty hit or miss - sometimes it worked without issues; sometimes it did not work, for no obvious reason. Consequently, I ditched it for the data-vocabulary.org markup, which did not give me any issues. However, I prefer using schema.org, and a new site is currently being designed for a client, so I'd like to use schema.org markup for the breadcrumb - but of course only if it works now. Google has dropped the previous warnings/discouragement and now lists schema.org code at https://developers.google.com/structured-data/breadcrumbs based on the newish https://schema.org/BreadcrumbList. Has anybody here used this markup on a site (preferably more than one) and can confirm whether or not it reliably works and shows the breadcrumb trail / site hierarchy in the SERPs? Thanks for your answers! Nico
Technical SEO | netzkern_AG -
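For the breadcrumb question above, this is roughly the shape of the https://schema.org/BreadcrumbList markup in JSON-LD form; the page names and URLs below are hypothetical placeholders, not taken from any real site, and whether it reliably produces the breadcrumb trail in the SERPs is exactly what the asker wants confirmed:

    <!-- Illustrative sketch only; names and URLs are hypothetical placeholders -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
        { "@type": "ListItem", "position": 2, "name": "Category", "item": "https://www.example.com/category/" },
        { "@type": "ListItem", "position": 3, "name": "Current page" }
      ]
    }
    </script>
-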
Both links with ".html" and without are working , Is that a problem ?
The default format of my URLs ends with ".html", which I know is not a problem in itself. But both the links with ".html" and the ones without are working. Is that a critical problem or not, and how do I solve it?
Technical SEO | Mohamed_Samer -
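Not an answer from the thread, but the commonly used fixes for this kind of duplicate-URL situation are a 301 from one form to the other, or a canonical tag on both versions pointing at whichever URL should be indexed. A minimal sketch of the canonical approach, with a placeholder URL:

    <!-- Hedged sketch: placed in the <head> of both the ".html" and the
         extensionless version, pointing at the one URL you want indexed. -->
    <link rel="canonical" href="https://www.example.com/page.html">
-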
Does using data-href="" work more effectively than href="" rel="nofollow"?
I've been looking at some bigger enterprise sites and noticed some of them used HTML like this: <a data-href="http://www.otherodmain.com/" class="nofollow" rel="nofollow" target="_blank"></a> instead of a regular href="". Does using data-href and some JavaScript help with shaping internal links, rather than just using a strict nofollow?
Technical SEO | JDatSB -
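To make the pattern in the question concrete: the anchor carries the destination in data-href rather than href, and a small script makes it clickable at runtime, so the link is not a plain crawlable href. The script below is a hypothetical sketch, not taken from any of the sites mentioned:

    <!-- Sketch of the pattern described in the question -->
    <a data-href="http://www.otherodmain.com/" class="nofollow" rel="nofollow" target="_blank">External link</a>

    <script>
      // Hypothetical helper: make every data-href anchor navigable on click.
      document.querySelectorAll('a[data-href]').forEach(function (el) {
        el.addEventListener('click', function () {
          window.open(el.getAttribute('data-href'), '_blank');
        });
      });
    </script>
-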
Subdomain hosted on a different server VS Subfolder on main server
We have a website developed in ColdFusion on a server that does not support PHP. We have a blog for the site using WordPress (PHP), hosted on a different server, with a subdomain as the URL (example: blog.website.com). I've heard that search engines treat subdomains as completely different websites from the main domain, so they could actually compete with it for rankings in the search engines - is that correct? I am also under the impression that traffic to the blog will not show as traffic to the main website, because it is hosted on a different server - is that right? If I am correct, I assume the best solution would be to install PHP on our main server and put the blog in a subfolder... or would the subdomain be OK as long as the blog is hosted on the main server? Thanks!
Technical SEO | vermont -
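For reference, the usual way to get a /blog folder without moving WordPress itself is a reverse proxy on the main site's web server. A hedged sketch, assuming Apache with mod_proxy and mod_proxy_http is available in front of the ColdFusion site (which the question doesn't confirm), using the hostname from the question:

    # Hedged sketch: serve the WordPress blog under /blog on the main domain
    # while it physically stays on its own PHP-capable server.
    ProxyPass        /blog/ http://blog.website.com/
    ProxyPassReverse /blog/ http://blog.website.com/

WordPress would also need its site URL settings updated to the /blog address so links and assets resolve correctly.
-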
Disallow: /search/ in robots but soft 404s are still showing in GWT and Google search?
Hi guys, I've already added the following syntax to robots.txt to prevent search engines from crawling the dynamic pages produced by my website's search feature: Disallow: /search/. But soft 404s are still showing in Google Webmaster Tools. Do I need to wait? (It's been almost a week since I added this rule to my robots.txt.) Thanks, JC
Technical SEO | esiow2013 -
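One detail worth flagging for the question above: Disallow stops crawling, but it doesn't remove URLs that are already in the index, and once a page is blocked Google can no longer fetch it to see any change on it. A common alternative is to let the /search/ pages be crawled temporarily and serve a noindex on them, sketched below (hedged, since the right fix depends on how the soft 404s arose):

    <!-- Hedged sketch: placed on the dynamic /search/ pages so that, while
         they remain crawlable, they get dropped from the index rather than
         lingering as soft 404s. -->
    <meta name="robots" content="noindex">
-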
No Google cached snapshot image... 'Text-only version' working.
We are having an issue with Google's cached image snapshots... Here is an example: http://webcache.googleusercontent.com/search?q=cache:IyvADsGi10gJ:shop.deliaonline.com/store/home-and-garden/kitchen/morphy-richards-48781-cooking/ean/5011832030948+&cd=308&hl=en&ct=clnk&gl=uk I wondered if anyone knows or can see the cause of this problem? Thanks
Technical SEO | pekler -
Need specifics about mod_proxy for blog domain and 301s
I am getting the IT staff to move our blog from "blog." to "/blog" using mod_proxy for Apache, but I had a couple of questions about this I was hoping someone here might be able to help with. Is it correct that just setting up mod_proxy will make the blog available at both URLs, the "blog." subdomain and the "/blog" folder? If so, what is the best way to 301 redirect all traffic from "blog." to "/blog"? I assume this could be handled with a blanket 301-style rewrite, but I wanted to get some other opinions before getting with my IT guys to do it. I am technical enough to talk about this, but not to do it myself, so experienced opinions are appreciated. Thanks!
Technical SEO | SL_SEM -
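On the second part of the question: mod_proxy only makes the content answer at /blog; the "blog." hostname keeps working until an explicit redirect is added, and a blanket host-based rewrite is the usual way to do that. A hedged sketch with placeholder hostnames, assuming mod_rewrite is available alongside the mod_proxy setup:

    # Hedged sketch with placeholder hostnames: 301 every request on the
    # blog. subdomain to the same path under /blog on the main host.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
    RewriteRule ^/?(.*)$ https://www.example.com/blog/$1 [R=301,L]
-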
Multiple Domains, Same IP address, redirecting to preferred domain (301) -site is still indexed under wrong domains
Due to acquisitions over time and the merging of many microsites into one major site, we currently have 20+ TLDs pointing to the same IP address as our preferred domain for our consolidated website http://goo.gl/gH33w. They are all set up as 301 redirects on Apache, including both the www and non-www versions. When we launched this consolidated website (April 2010), we accidentally left the settings of our site open to accept any of our domains on the same IP. This was later fixed, but unfortunately Google indexed our site under several of these URLs (ignoring the redirects), using the same content from our main website but swapping out the domain. We added some additional redirects on Apache to redirect the individual pages indexed under the wrong domain to the same page under our main domain http://goo.gl/gH33w. This seemed to help resolve the issue and moved hundreds of pages off the index. However, in December of 2010 we made significant changes to our external DNS for our IP addresses, and since December we have seen pages indexed under these redirecting domains on the rise again. If you do a search query of site:laboratoryid.com you will see a few hundred examples of pages indexed under the wrong domain. When you click on the link, it does redirect to the same page under the preferred domain. So the redirect is working and has been confirmed as a 301. But for some reason Google continues to crawl our site and index it under these incorrect domains. Why is this? Is there a setting we are missing? These domain-level and page-level redirects should be decreasing the pages indexed under the wrong domain, but it appears to be doing the reverse. All of these old domains currently point to our production IP address, where our preferred domain is also pointing. Could this be the issue? None of the pages indexed today are from the old versions of these sites. They only seem to be the new content from the new site, but not under the preferred domain. Any insight would be much appreciated because we have tried many things without success to get this resolved.
Technical SEO | sboelter
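For readers in the same situation, this is roughly the shape of the domain-level rule the question describes; the hostnames below are placeholders, since the real preferred domain sits behind a shortened link. A hedged Apache sketch:

    # Hedged sketch: any request whose Host header is not the preferred
    # hostname gets a 301 to the same path on the preferred domain.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
    RewriteRule ^/?(.*)$ https://www.example.com/$1 [R=301,L]

If rules like this are already in place and returning 301s, as the question says they are, the lingering listings are usually a matter of recrawl time; a consistent rel=canonical on the preferred URLs gives Google an extra signal while the old hostnames drop out of the index.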