Multiple Domains on 1 IP Address
-
We have multiple domains on the same C-block IP address. Our main site is an eCommerce site, and we have separate domains for each of the following: our company blog (and other niche blogs), a forum, an articles site, and a corporate site. They are all on the same server and hosted by the same web-hosting company.
They all have unique content. Speaking strictly from a technical standpoint, could this be hurting us? Can you recommend best practices for multiple domains like these - should they be on separate IP addresses, or is sharing one fine?
Thank you!
-
Sorry, I'm confused about the setup. Hosts routinely run multiple sites off of shared IPs, but each domain name resolves as itself. Users and search bots should never see that redirection at all and shouldn't be crawling the IPs. This isn't an SEO issue so much as a setup issue. Likewise, any rel=canonical tags on each site would be tied to that site's specific domain name.
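For reference, a self-referencing canonical is a one-line tag in each page's head pointing at that page's URL on its own domain; the domain and path below are illustrative, echoing the example elsewhere in this thread:

    <link rel="canonical" href="http://www.domainname.com/product-page">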
-
Hello Peter,
We have three sites hosted on the same server with the same IP address. For SEO reasons (to avoid duplicate content) we need to redirect the IP address to the site - but there are three different sites. If we add rel=canonical tags to the websites, those tags will be duplicated too, because the IP-address versions mirror the domain versions, e.g. www.domainname.com/product-page and 23.34.45.99/product-page. What's the best way to solve these duplicate content issues in this case? Many thanks!
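One common mechanical fix, sketched here only as a hedged example, is a server-level 301 that fires whenever a request arrives with the bare IP in the Host header, so the IP version can never be indexed. This assumes Apache with mod_rewrite; the IP and domain are the ones from the post above, and with name-based virtual hosting the rule would live in whichever site is the server's default:

    # Illustrative .htaccess sketch: send bare-IP requests to the primary domain
    RewriteEngine On
    # Only fire when the Host header is the raw IP
    RewriteCond %{HTTP_HOST} ^23\.34\.45\.99$
    RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]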
-
I think that situation's a bit different - if you aren't interlinking and the sites are very different (your site vs. customer sites), there's no harm in shared hosting. If you share the IP and one site is hit with a severe penalty, there's a small chance of bleedover, but we don't even see that much these days. Now that we're running out of IPv4 addresses, shared IPs are a lot more common (by necessity).
-
I have something similar. I'm with HostGator on a level 5 VPS. It comes with four IP addresses, and I have about 15 sites - some mine, some customer sites - spread out over the addresses. There is very little interlinking between the sites, but I was concerned too. I have read that add-on sites are bad for SEO, but as long as you aren't building crappy sites just to link them to your main site, you should be fine.
-
I think @cgman and @Nakul are both right, to a point. Technically, it's fine. Google doesn't penalize shared IPs (they're fairly common). If you're cross-linking your sites, though, it's very likely Google will devalue those links. That tactic has just been abused too much, and a shared IP is a dead giveaway.
Now, is it worth splitting all these out to gain a little more link-juice? In most cases, probably not. Google knows you own the sites, and may devalue them anyway. Chances are, they've already been devalued a bit. So, I don't think it's worth hours and hours and thousands of dollars to give them all their own homes, in most cases (it is highly situational, though).
The only other potential problem is if one site were penalized - there have been cases where that impacted sites on the same IP, especially cross-linked sites. It's not common, and you may not be at any risk, but it's not unheard of. As @Nakul said, it's a risk calculation.
-
I am presuming all those domains are linking to each other, correct?
Are they regular or nofollow links? It boils down to how much authority you have on your main domain as well as on the other domains. If I were you, I would keep the main e-commerce website on one server and everything else, including the niche blogs, on a different server. It's not just SEO - there are security issues too.
Essentially, to answer your question: it may not be hurting you to have the niche blogs, the user-generated-content forum, the articles site, and the corporate site on the same IP/server, but it would help you a lot more if they were on a different server, possibly on different Class C IPs. You would gain from those links coming from a different server. Keep in mind these links are important to you, and it's good to increase their value by hosting the sites separately - they're links your competition can never get. I would also consider making them nofollow; that's just my preference, as I prefer lower risk. Again, it depends on your e-commerce website's link profile.
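For reference, making a link nofollow is just an attribute on the anchor tag - the URL here is a placeholder:

    <a href="http://www.your-store.com/" rel="nofollow">Visit our store</a>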
-
There is nothing wrong with having multiple sites or blogs on the same C-block IP address. However, if you're trying to use your blogs to link to your products to boost SEO, you might want to consider other link building techniques in addition. Building backlinks from sites on the same IP is okay, but you'll see greater benefit from links on sites hosted on other servers.
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
Whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that - for the home page - GoogleBot continues to access only via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of access via HTTP/2 working
- 301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
- GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but does still have the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so.
Totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to go only through HTTP/1.1, not 2. A possibly related issue - and of course what is causing concern - is that new pages of the site seem to index and perform well in the SERPs ... except the home page. This never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
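For reference, one quick way to check which protocol a server will actually negotiate - independent of what the host reports - is curl from the command line; the URL below is a placeholder:

    # Ask curl to negotiate HTTP/2; the status line shows what was agreed
    curl -I --http2 https://www.example.com/
    # A first line of "HTTP/2 200" confirms the server side is fine;
    # whether GoogleBot crawls over HTTP/2 remains Google's per-site choice.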
Technical SEO | AKCAC1
-
Blocking certain countries via IP address location
We are a US-based company that ships only to the US and Canada. Two issues have arisen recently involving foreign countries (Russia, namely) that caused us to block access to our site from anyone attempting to interact with our store from outside of the US and Canada.
1. The first issue we encountered involved fraudulent orders originating from Russia (using stolen card data) and then shipping to a US-based international shipping aggregator.
2. The second issue was a consistent flow of Russia-based "new customer" entries.
My question to the Moz community is this: are there any unintended consequences, from an SEO perspective, to blocking certain countries from viewing our store?
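For context on the mechanics: country blocking is usually done with a GeoIP lookup at the web server. A rough nginx sketch using the legacy geoip module is below - module availability and the database path vary by install, so treat it as illustrative only. One SEO note: Googlebot crawls mostly from US IPs, so blocking only non-US/CA traffic should leave crawling intact.

    # Sketch only: assumes ngx_http_geoip_module plus a GeoIP country database,
    # with these directives placed in the http {} context
    geoip_country /usr/share/GeoIP/GeoIP.dat;
    map $geoip_country_code $blocked_country {
        default 1;   # block by default...
        US      0;   # ...except the US
        CA      0;   # ...and Canada
    }
    server {
        if ($blocked_country) {
            return 403;   # refuse everyone else
        }
    }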
Technical SEO | MNKid150
-
Robots.txt and Multiple Sitemaps
Hello, I have a hopefully simple question, but I wanted a "second opinion" on what to do in this situation. I am working on a client's robots.txt and we have multiple sitemaps. Using Yoast, I have my sitemap_index.xml and also a sitemap-image.xml. I do submit them to Google and Bing by hand, but wanted them added to the robots.txt for insurance. So my question is: when multiple sitemaps are called out in a robots.txt file, does it matter which one comes first? From my reading it looks like you can have multiple sitemaps called out, but I wasn't sure of the best practice when writing up the file. Example:

    User-agent: *
    Disallow:
    Disallow: /cgi-bin/
    Disallow: /wp-admin/
    Disallow: /wp-content/plugins/
    Sitemap: http://sitename.com/sitemap_index.xml
    Sitemap: http://sitename.com/sitemap-image.xml

Thanks a ton for the feedback, I really appreciate it! :) J
Technical SEO | allstatetransmission0
-
Redirect root domain to www
I've been having issues with my keyword rankings in Moz, and below is what David at Moz asked me to do. Does anyone have a solution to this? I'm not 100% sure what to do. Does it hurt ranking to have a domain at the root or not? Can I 301 redirect a whole site, or do I have to do individual pages? "Your campaign is looking for rankings for the www version of the campaign but the URL resolves as a root domain. This would explain the discrepancy. Since there is no re-direct between the two, you can have brickmarkers.com 301 re-direct to www.site.com which will prevent you from re-creating your campaign to track the root domain. Once the re-direct is in place it will take a while for Google to show the www version in the results in which your campaign rankings will be accurate." Thanks
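For what it's worth, a root-to-www 301 is normally done once at the server level rather than page by page; a hedged .htaccess sketch, with example.com standing in for the real domain:

    # Hypothetical mod_rewrite sketch: send all root-domain requests to www
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]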
Technical SEO | SeaDrive0
-
How should I structure a site with multiple addresses to optimize for local search?
Here's the setup: We have a website, www.laptopmd.com, and we're ranking quite well in our geographic target area. The site is chock-full of local keywords, has the address properly marked up, is HTML5 and schema.org compliant, near the top of the page, etc. It's all working quite well, but we're looking to expand to two more locations, and we're terrified that adding more addresses and playing with our current set-up will wreak havoc with our local search results, which we quite frankly currently rock. My questions:
1) When it comes time to do sub-pages for the new locations, should we strip the location information from the main site and put up local pages for each location in subfolders?
1a) Should we use subdomains instead of subfolders to keep Google from becoming confused?
2) Should we consider simply starting identically branded pages for the individual locations and hope that exact-match location-based URLs will make up for the duplicate-content hit and overcome the difficulty of building a brand from multiple pages?
I've tried to look for examples of businesses that have tried to do what we're doing, but all the advice has been about organic search, which I already have the answer to. I haven't been able to find a good example of a small business with multiple locations AND good rankings for each location. Should this serve as a warning to me?
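However the location URLs end up structured, each location page can carry its own address markup; a minimal schema.org sketch in JSON-LD - every value below is hypothetical - might look like:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "LaptopMD - Second Location",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "New York",
        "addressRegion": "NY",
        "postalCode": "10001"
      },
      "telephone": "+1-555-555-0100"
    }
    </script>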
Technical SEO | LMDNYC0
-
Subdomain and Domain Rankings
I have read here that domain names with keywords might add a boost to your search rank. For instance - using a completely inane example - monkey-fights.com might get a boost compared to mfl.com (Monkey Fighting League) when searching for "monkey fights". There seems to be a hot debate as to how much bonus the first domain might get over the second, but leaving that aside for the moment:
Question 1: Would monkey-fights.mfl.com get the same kind of bonus as a root-domain bonus?
Question 2: If the answer to 1 is yes, would a 301 redirect from the subdomain URL to the root-domain URL retain that bonus?
I was just thinking about how hard it is to get root domains these days that aren't being squatted on, etc., and whether this might be a way to get the same bonus - or maybe subdomains are less bonus-prone and it would be a waste of time. Thanks
Technical SEO | bThere0
-
Block a sub-domain from being indexed
This is a pretty quick and simple (I'm hoping) question. What is the best way to completely block a subdomain from being indexed by all search engines? One item I cannot use is the meta "nofollow" tag. Thanks! - Kyle
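One server-side approach that avoids meta tags entirely is the X-Robots-Tag response header; a hedged Apache sketch, assuming mod_headers is enabled and this lives in the subdomain's vhost or .htaccess:

    # Illustrative only: mark every response from this subdomain as noindex
    Header set X-Robots-Tag "noindex, nofollow"

Note that crawlers have to be able to fetch the pages to see the header, so don't also Disallow the subdomain in robots.txt if you rely on this.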
Technical SEO | kchandler0
-
Using hyphenated sub-domains or non-hyphenated sub-domains? That is the question! Any takers?
For our corporate business-level domain, we are exploring using a hyphenated sub-domain for a project - something like www.go-figure.extreme.com. From a user perspective it seems cluttered to me. The domain length might also be an issue with the new algorithm big G has launched in the recent past. I know from past experience that hyphenated domains usually take longer to index, as they are used more frequently by spammers and can take longer to get out of the supplementary index. Our company site has over 90 million viewers/year, so our brand is well established and traffic isn't an issue. This is for a corporate-level project and I didn't have the answer! Will this work? Anyone have experience testing this? Any thoughts will help! Thanks, Rob
Technical SEO | RobMay0