The benefits of having a dedicated IP
-
Is this true? A claim by SiteGround:
Having a dedicated IP for each website is considered by some experts to be an advantage for search engine optimization. There is a common belief that sites with dedicated IP addresses do better in the search engine results than those on shared IPs. Such sites do not share the risk of being banned for sharing the same IP if another website hosted on the same server gets banned by a search engine.
-
I have 7-8 AdSense blog websites under one hosting account, and now I am planning to create a selling website. My blogs did not have good content and they have been dropping in the rankings (maybe Panda). Do I need to remove those websites from the hosting account? Will they affect my new selling website negatively?
-
In the great Google infrastructure, I'm sure that Google knows what IP address your site is hosted on, and all the ones tied to it. In one of the past Moz blog posts you can see a number of the factors Google looks at to work out what you control. We used dedicated IPs for each client so that, if anything ever happened to one account, it would not affect the others. The IPs are close in range, since they are on the same block, but no two are the same address. This isn't an attempt to gain SEO rank so much as it is to protect the client.
Locally, I have reason to believe it's a different story. For example, we are located in St. Louis and use a local server center in downtown St. Louis. After moving our site from HostGator on a shared IP (Provo, Utah) to that local server center, we saw a drastic (in internet time) improvement in site load time and responsiveness and, believe it or not, a ranking boost... true story, no joke. It wasn't a large boost, but we moved up 2 spots on our main keyword on page one, and 1-2 spots in other places. We didn't make any other changes to the site, other than adding a few blog posts, and this was not around any major algorithm shift or update. We have seen this pattern repeat with other clients as well.
My guess is that Google liked the decreased load times, the local server location (which matched the city on our site, somehow verifying our location further), and the fact that the site was on a dedicated IP address. If we had just changed the site's IP address by itself, I don't think we would have seen any change at all.
"there is really no SEO benefit of having a unique IP for each of your sites unless you're attempting to pass link juice between each, which falls into the greyhat category."
I don't think you would get away with this for very long, or that it would benefit you in any way. Google would see that you host or control these sites through your analytics account or your IP range. Even if you wanted to pull it off with separate analytics accounts, dedicated IPs, etc., I doubt the result would be worth the time.
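If you're curious how your own domains sit in IP space, here is a minimal sketch using only Python's standard library (the domain names are placeholders) that resolves each domain and groups them by /24 network, the old "C block":

```python
# Resolve each domain and group the results by /24 network.
import ipaddress
import socket
from collections import defaultdict

domains = ["client-one.example", "client-two.example", "client-three.example"]
blocks = defaultdict(list)

for domain in domains:
    ip = socket.gethostbyname(domain)
    # strict=False lets us build the /24 network from a host address.
    block = ipaddress.ip_network(f"{ip}/24", strict=False)
    blocks[block].append((domain, ip))

for block, members in blocks.items():
    print(block, members)
```

Domains that land in the same block will show up under the same key, which is exactly the "close in range, but not the same address" situation described above.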
-
Depending on what your site's purpose is, I have to respectfully disagree with the comments above. If your site is selling something, you need an SSL certificate, and I'm reasonably certain you can't have that without a dedicated IP address. All things being equal, e-commerce sites with an SSL certificate will rank higher than sites without one. Plus, there are other non-SEO benefits to a dedicated IP address, and it's inexpensive. To me it's a no-brainer, but I understand why people would disagree.
- Ruben
-
Google bans sites (domain names) rather than IP addresses. However, if you are thinking of moving your site to https, then you would need a dedicated IP address. Yoast has published an interesting article on this, "Moving your website to https / SSL: tips & tricks"; perhaps that's what they are referring to.
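If you want to see which certificate a host actually serves, and which IP it resolves to, here is a minimal sketch using only Python's standard library (the hostname is a placeholder):

```python
# Resolve a hostname and print details of the certificate it serves on 443.
import socket
import ssl

host = "www.example.com"
print("resolves to:", socket.gethostbyname(host))

ctx = ssl.create_default_context()
with socket.create_connection((host, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()
        print("issued to:", dict(pair[0] for pair in cert["subject"]))
        print("valid until:", cert["notAfter"])
```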
-
I agree with Bill: there is really no SEO benefit of having a unique IP for each of your sites unless you're attempting to pass link juice between each, which falls into the greyhat category.
-
Nope, as far as I know. Matt Cutts commented on it here: https://www.youtube.com/watch?v=AsSwqo16C8s
The only time I could see it being useful is if you were doing some black-hat-ish stuff and didn't want multiple related domains on the same C block, but I'm pretty sure Penguin/Panda is catching that sort of thing now.
Related Questions
-
Will using a reverse proxy give me the benefits of the main site's domain authority?
If I am running example.com and have a blog on exampleblog.com, will moving the blog to example.com/blog and using a reverse proxy give the blog the same domain authority as example.com? Thanks
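For anyone wondering what the mechanics look like, here is a minimal reverse-proxy sketch in Python (the hostnames and port are placeholders; in practice this is usually done in nginx, Apache, or a CDN rather than in application code):

```python
# Serve exampleblog.com under example.com/blog by proxying GET requests.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

UPSTREAM = "https://exampleblog.com"  # the blog being folded into /blog

class BlogProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        if not self.path.startswith("/blog"):
            self.send_error(404)
            return
        # Map /blog/post-1 -> https://exampleblog.com/post-1
        upstream_path = self.path[len("/blog"):] or "/"
        req = Request(UPSTREAM + upstream_path, headers={"User-Agent": "blog-proxy"})
        with urlopen(req) as resp:
            body = resp.read()
            self.send_response(resp.status)
            self.send_header("Content-Type", resp.headers.get("Content-Type", "text/html"))
            self.end_headers()
            self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), BlogProxy).serve_forever()
```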
Intermediate & Advanced SEO | El-Bracko
-
Is there any benefit to changing 303 redirects to 301?
A year ago I moved my marketplace website from http to https. I implemented some design changes at the same time, and saw a huge drop in traffic that we have not recovered from. I've been searching for reasons for the organic traffic decline and have noticed that the redirects from http to https URLs are 303 redirects. There's little information available about 303 redirects but most articles say they don't pass link juice. Is it worth changing them to 301 redirects now? Are there risks in making such a change a year later, and is it likely to have any benefits for rankings?
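A quick way to check exactly which status codes the http to https hops are returning is sketched below, assuming Python with the requests package (the URL is a placeholder):

```python
# Print each hop in the redirect chain and its status code, so a 303
# (or a chain of mixed codes) is easy to spot.
import requests

resp = requests.get("http://www.example.com/some-page", allow_redirects=True)
for hop in resp.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("final:", resp.status_code, resp.url)
```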
Intermediate & Advanced SEO | MAdeit
-
What IP Address does Googlebot use to read your site when coming from an external backlink?
Hi All, I'm trying to find more information on what IP address Googlebot would use when arriving to crawl your site from an external backlink. I'm under the impression Googlebot uses international signals to determine the best IP address to use when crawling (US / non-US) and then carries on with that IP when it arrives at your website. E.g. Googlebot finds www.example.co.uk. Due to the ccTLD, it decides to crawl the site with a UK IP address rather than a US one. As it crawls this UK site, it finds a subdirectory backlink to your website and continues to crawl your website with the aforementioned UK IP address. Is this a correct assumption, or does Googlebot consider changing the IP address as it follows a backlink into a new domain? Also, are ccTLDs the main signal for Google switching to an international IP address to crawl, rather than the standard US one? Am I right in saying that hreflang tags don't apply here at all, as their purpose is to be used in SERPs, helping Google determine which page to serve to users based on their IP, etc.? If anyone has any insight this would be great.
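As a side note, if the underlying question is whether a given crawler IP really belongs to Googlebot, the documented way to verify it is a reverse DNS lookup followed by a forward confirmation. A minimal sketch using only the standard library (the IP is just an example):

```python
# Reverse-DNS the IP, check the hostname is under googlebot.com or google.com,
# then resolve that hostname forward and confirm it maps back to the same IP.
import socket

def is_googlebot(ip):
    try:
        host = socket.gethostbyaddr(ip)[0]
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    return ip in socket.gethostbyname_ex(host)[2]

print(is_googlebot("66.249.66.1"))  # example IP
```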
Intermediate & Advanced SEO | MattBassos
-
Benefits/drawbacks of different Schema markup languages (i.e. JSON-LD, Microdata, RDFa)
Just a question (or questions) I have wondered about. What's the difference, besides the actual encoding, between the three? Why have three? Why not just one? It seems to me that Microdata is the easiest, but maybe I am wrong. Is there a reason to use one versus another? I have not found anything explaining this on schema.org, so I suppose this is more of a discussion than a question with one right or wrong answer. I am just curious about the opinions of people in the Moz SEO community. Unless, of course, there is one answer. I'll take that too.
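One practical difference worth illustrating: JSON-LD keeps all of the structured data in a single script block that can be generated programmatically, instead of weaving attributes through the HTML the way Microdata and RDFa do. A minimal sketch (the organization details are placeholders):

```python
# Build a JSON-LD Organization block as a plain dict and emit the script tag.
import json

data = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
}
print('<script type="application/ld+json">')
print(json.dumps(data, indent=2))
print('</script>')
```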
Intermediate & Advanced SEO | Brian_Dowd
-
Why would our server return a 301 status code when Googlebot visits from one IP, but a 200 from a different IP?
I have begun a daily process of analyzing a site's Web server log files and have noticed something that seems odd. There are several IP addresses from which Googlebot crawls that our server returns a 301 status code for every request, consistently, day after day. In nearly all cases, these are not URLs that should 301. When Googlebot visits from other IP addresses, the exact same pages are returned with a 200 status code. Is this normal? If so, why? If not, why not? I am concerned that our server returning an inaccurate status code is interfering with the site being effectively crawled as quickly and as often as it might be if this weren't happening. Thanks guys!
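If you're already going through the log files daily, a small script can make the pattern obvious: for requests whose user agent mentions Googlebot, count the status codes per client IP. A minimal sketch, assuming a combined-log-format access log (the file name is a placeholder):

```python
# Count HTTP status codes per client IP, restricted to Googlebot user agents.
import re
from collections import Counter, defaultdict

# Combined log format: ip ident user [date] "request" status bytes "referer" "agent"
pattern = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"')
by_ip = defaultdict(Counter)

with open("access.log") as log:
    for line in log:
        m = pattern.match(line)
        if m and "Googlebot" in m.group(3):
            by_ip[m.group(1)][m.group(2)] += 1

for ip, counts in sorted(by_ip.items()):
    print(ip, dict(counts))
```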
Intermediate & Advanced SEO | danatanseo
-
Why is my server's IP address showing up in Webmaster Tools?
In the links to my site in Google Webmaster Tools, I am showing over 28,000 links from an IP address. The IP address is the address my server is hosted on. For example, it shows 200.100.100.100/help, almost like there are two copies of my site: one under the domain name and one under the IP address. Is this bad? Or is it just showing up there, with Google knowing that they are the same since the IP and domain are on the same server?
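One quick way to see whether the bare IP really is serving a second copy of the site, rather than redirecting to the canonical domain, is to request the same path both ways. A minimal sketch, assuming Python with the requests package (the IP and domain are placeholders):

```python
# Compare what the bare server IP and the canonical domain return for the
# same path; a 200 from the IP suggests a duplicate copy, while a 301 to
# the domain suggests it is already being handled.
import requests

for base in ("http://203.0.113.10", "http://www.example.com"):
    resp = requests.get(base + "/help", allow_redirects=False, timeout=10)
    print(base, resp.status_code, resp.headers.get("Location", ""))
```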
Intermediate & Advanced SEO | EcommerceSite
-
Malicious site pointed its A record at my IP, and Google indexed it
Hello all, I launched my site on May 1 and, as it turns out, another domain was pointing its A record at my IP. This site is coming up as malicious and, worst of all, it's ranking on keywords for my business objectives with my content and metadata, so I'm losing traffic. I've had the domain host remove the incorrect A record, I've submitted numerous malware reports to Google, and I've attempted to request removal of this site from the index. I've resubmitted my sitemap, but it seems as though this offending domain is still being indexed more thoroughly than my legitimate domain. Can anyone offer any advice? Anything would be greatly appreciated! Best regards, Doug
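One common mitigation for this situation is to have the server refuse requests whose Host header is not your own domain, so a stray A record pointing at your IP can no longer serve your content under someone else's name. A minimal sketch as a WSGI wrapper (the domain names are placeholders, and "app" stands in for whatever WSGI application the site runs):

```python
# Reject requests for any Host other than the site's own domains.
ALLOWED_HOSTS = {"example.com", "www.example.com"}

def host_guard(app):
    def middleware(environ, start_response):
        host = environ.get("HTTP_HOST", "").split(":")[0].lower()
        if host not in ALLOWED_HOSTS:
            start_response("421 Misdirected Request",
                           [("Content-Type", "text/plain")])
            return [b"Unknown host\n"]
        return app(environ, start_response)
    return middleware
```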
Intermediate & Advanced SEO | FranGen
-
IP address guidelines for two sites on the same server that link to each other
Hi guys! I have two websites which link to each other but are on the same server. Both sites have great PR and link juice. I want to know what steps I should take to make Google feel that the two sites are not owned by me. For example, should I get a different IP and a different server for each, or something more? Looking forward to your thoughts and help!
Intermediate & Advanced SEO | HiteshBharucha