The benefits of having a dedicated IP
-
Is this true? A claim by SiteGround:
Having a dedicated IP for each website is considered by some experts to be an advantage for search engine optimization. There is a common belief that sites with dedicated IP addresses do better in the search engine results than those on shared IPs. Such sites also do not run the risk of being banned for sharing an IP if another website hosted on the same server gets banned by a search engine.
-
I have 7-8 AdSense blog websites under one hosting account, and now I am planning to create a selling (e-commerce) website. My blogs did not have good content and they are dropping in the rankings (maybe Panda). Do I need to remove those websites from the hosting account? Would they affect my new selling website negatively?
-
In the great Google infrastructure, I'm sure that Google knows what IP address your site is hosted on, and all the ones tied to it. In one of the past Moz blog posts you can see a number of factors Google looks at to work out what you control. We use dedicated IPs for each client so that, if anything ever happened to one account, it would not affect the others. They are close in IP address range, since they are on the same block, but none share the same number. This isn't an attempt to gain SEO rank so much as it is to protect the client.
Locally, I have reason to believe it's a different story. For example, we are located in St. Louis and use a local server center in downtown St. Louis. After moving our site from HostGator on a shared IP (Provo, Utah) to the local server center, we saw a drastic (in internet time) improvement in site load time and responsiveness, and, believe it or not, a ranking boost... true story, no joke. It wasn't a large boost, but we moved up 2 spots on our main keyword on page one, and 1-2 spots in other places. We didn't make any other changes to the site beyond adding a few blog posts, and this was not around any major algorithm shift or update. We have seen this pattern repeat with other clients as well.
My guess is that Google liked the decreased load times, the local server location (as it matched the city on our site, further verifying our location), and the fact that the site was on a dedicated IP address. If we had just changed the site's IP address by itself, I do not think we would have seen any change in results.
"there is really no SEO benefit of having a unique IP for each of your sites unless you're attempting to pass link juice between each, which falls into the greyhat category."
I don't think you would get away with this for very long, or that it would benefit you in any way. Google would see that you host or control these sites through your analytics account or your IP range. If you wanted to pull it off with separate analytics accounts, dedicated IPs, and so on, I doubt the result would be worth the time.
-
Depending on what your site's purpose is, I have to respectfully disagree with the above comments. If your site is selling something, you need an SSL certificate, and I'm reasonably certain you can't have that without a dedicated IP address. All things being equal, e-commerce sites with an SSL certificate will rank higher than sites without one. Plus, there are other non-SEO benefits to a dedicated IP address, and it's inexpensive. To me it's a no-brainer, but I understand why people would disagree.
- Ruben
-
Google bans sites (domain names) rather than IP addresses. However, if you are thinking of moving your site to https, then you would need a dedicated IP address. Yoast has published an interesting article on this, "Moving your website to https / SSL: tips & tricks"; perhaps that's what they are referring to.
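For anyone making that kind of move, it's worth confirming afterwards that every HTTP URL answers with a single 301 hop to its HTTPS equivalent. Here is a minimal sketch of such a check, assuming Python with the requests package and using example.com as a placeholder domain:

```python
import requests

def check_https_move(domain):
    """Follow the redirect chain from the plain-HTTP URL and print each hop's status code."""
    response = requests.get(f"http://{domain}/", allow_redirects=True, timeout=10)
    for hop in response.history:
        print(f"{hop.status_code}  {hop.url}")
    print(f"{response.status_code}  {response.url}  (final)")

check_https_move("example.com")  # replace with your own domain
```

Ideally you would see exactly one 301 hop ending on the https URL; a chain of several hops, or a 302, is worth cleaning up.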
-
I agree with Bill, there is really no SEO benefit of having a unique IP for each of your sites unless you're attempting to pass link juice between each, which falls into the greyhat category.
-
Nope, as far as I know. Matt Cutts commented on it: https://www.youtube.com/watch?v=AsSwqo16C8s
The only time I could see it being useful is if you were doing some black-hat-ish stuff and didn't want multiple related domains on the same C block, but I'm pretty sure Penguin/Panda is catching that sort of thing now.
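If you're curious whether two of your domains actually share a C block (the same /24 range), it's quick to check. A minimal sketch using only Python's standard library, with placeholder domain names:

```python
import socket

def c_block(domain):
    """Return the first three octets of the domain's resolved IPv4 address (its /24)."""
    ip = socket.gethostbyname(domain)
    return ".".join(ip.split(".")[:3])

domains = ["example.com", "example.org"]  # placeholder domains
blocks = {domain: c_block(domain) for domain in domains}
for domain, block in blocks.items():
    print(f"{domain} -> {block}.x")
if len(set(blocks.values())) == 1:
    print("These domains share a C block.")
```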
Related Questions
-
Will using a reverse proxy give me the benefits of the main sites domain authority?
If I am running example.com and have a blog on exampleblog.com, will moving the blog to example.com/blog and serving it through a reverse proxy give the blog the same domain authority as example.com? Thanks
Intermediate & Advanced SEO | | El-Bracko0 -
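Whatever the authority question turns out to be, mechanically a reverse proxy just fetches the blog's pages from the old host and serves them under the main domain's /blog path. In practice this is usually configured in the web server (nginx, Apache) rather than in application code, but here is a minimal, hypothetical sketch in Python using Flask and requests, with the question's hostnames as placeholders:

```python
from flask import Flask, Response, request
import requests

app = Flask(__name__)
BLOG_ORIGIN = "https://exampleblog.com"  # the separate blog host from the question

@app.route("/blog/", defaults={"path": ""})
@app.route("/blog/<path:path>")
def blog_proxy(path):
    """Fetch the matching page from the blog origin and serve it under example.com/blog."""
    upstream = requests.get(f"{BLOG_ORIGIN}/{path}", params=request.args, timeout=10)
    excluded = {"content-encoding", "transfer-encoding", "connection", "content-length"}
    headers = [(k, v) for k, v in upstream.headers.items() if k.lower() not in excluded]
    return Response(upstream.content, status=upstream.status_code, headers=headers)

if __name__ == "__main__":
    app.run(port=8080)
```

Either way, you would still want canonical tags on the proxied pages pointing at the example.com/blog URLs so only one version gets indexed.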
Is there any benefit to changing 303 redirects to 301?
A year ago I moved my marketplace website from http to https. I implemented some design changes at the same time, and saw a huge drop in traffic that we have not recovered from. I've been searching for reasons for the organic traffic decline and have noticed that the redirects from http to https URLs are 303 redirects. There's little information available about 303 redirects but most articles say they don't pass link juice. Is it worth changing them to 301 redirects now? Are there risks in making such a change a year later, and is it likely to have any benefits for rankings?
Intermediate & Advanced SEO | | MAdeit0 -
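Before changing anything, it's worth confirming exactly what status code each redirect currently returns. A minimal sketch, assuming Python with the requests package and placeholder URLs; disabling redirect-following lets you see the first hop's code directly:

```python
import requests

def first_hop(url):
    """Request a URL without following redirects; return its immediate status code and target."""
    response = requests.get(url, allow_redirects=False, timeout=10)
    return response.status_code, response.headers.get("Location")

# Placeholder URLs; in practice these would come from a sitemap or crawl export.
for url in ["http://example.com/", "http://example.com/category/widgets"]:
    status, target = first_hop(url)
    note = "looks fine" if status == 301 else "worth reviewing"
    print(f"{url} -> {status} {target or ''} ({note})")
```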
What IP Address does Googlebot use to read your site when coming from an external backlink?
Hi All, I'm trying to find more information on what IP address Googlebot would use when it arrives to crawl your site from an external backlink. I'm under the impression Googlebot uses international signals to determine the best IP address to use when crawling (US / non-US) and then carries on with that IP when it arrives at your website. E.g. Googlebot finds www.example.co.uk. Due to the ccTLD, it decides to crawl the site with a UK IP address rather than a US one. As it crawls this UK site, it finds a subdirectory backlink to your website and continues to crawl your website with the aforementioned UK IP address. Is this a correct assumption, or does Googlebot consider switching IP addresses as it follows a backlink into a new domain? Also, are ccTLDs the main signal that might lead Google to switch to an international IP address for crawling, rather than the standard US one? Am I right in saying that hreflang tags don't apply here at all, as their purpose is to be used in SERPs, helping Google determine which page to serve to users based on their location, etc.? If anyone has any insight this would be great.
Intermediate & Advanced SEO | | MattBassos0 -
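The IP addresses themselves are easiest to answer from your own access logs, and Google's documented way to confirm a visitor really is Googlebot is a reverse-then-forward DNS check. A minimal sketch using only Python's standard library; the sample IP is just an illustration and should come from your own logs:

```python
import socket

def is_verified_googlebot(ip):
    """Verify a claimed Googlebot IP: reverse DNS must end in googlebot.com or google.com,
    and the forward lookup of that hostname must resolve back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return socket.gethostbyname(hostname) == ip
    except socket.gaierror:
        return False

print(is_verified_googlebot("66.249.66.1"))  # use an IP taken from your own access logs
```

If the reverse and forward lookups agree, you can trust what the log says about which IP Googlebot actually used on your site.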
SEO impact of 301 redirects based on IP addresses from a specific state
Hello Moz Community! We are facing an issue that may or may not be unique, but need some advice and/or clarification on the best way to address the issue. We recently rebranded and launched a new site under a new domain and things have been progressing well. However, despite all the up front legwork on trademarks and licensing, we have recently encountered a hiccup that forces us to revert to the old URL/branding for one specific state. This may be a temporary issue that lasts a couple of months or it could potentially be in the court system for a couple of years. One potential solution we have discussed is to redirect the new site to the old site based on IP addresses for the state in question. Looking for any guidance on what type of impact this may have on SEO. Also open to any other suggestions or guidance on dealing with this situation. Thanks
Intermediate & Advanced SEO | | VeteransFirstMarketing0 -
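On the mechanics, one way to do this at the application layer is a per-request GeoIP lookup that sends visitors from the affected state back to the old domain. A rough, hypothetical sketch in Python using Flask plus the geoip2 package and a MaxMind GeoLite2 database; the state code and domains are placeholders. Because the situation is temporary and varies by visitor, the sketch uses a 302 rather than a 301:

```python
from flask import Flask, redirect, request
import geoip2.database
import geoip2.errors

app = Flask(__name__)
reader = geoip2.database.Reader("GeoLite2-City.mmdb")  # path to a MaxMind GeoLite2 City database
BLOCKED_STATE = "TX"                                    # placeholder ISO code for the affected state
OLD_SITE = "https://old-brand.example.com"              # placeholder old domain

@app.before_request
def reroute_blocked_state():
    """Send visitors located in the affected state back to the old-brand site."""
    ip = request.headers.get("X-Forwarded-For", request.remote_addr or "").split(",")[0].strip()
    try:
        state = reader.city(ip).subdivisions.most_specific.iso_code
    except (geoip2.errors.AddressNotFoundError, ValueError):
        return None  # unknown or unroutable address: let the request through
    if state == BLOCKED_STATE:
        return redirect(f"{OLD_SITE}{request.path}", code=302)
    return None
```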
Benefits/drawbacks to different Schema markup languages (ie. JSON-LD, Microdata, RDFa)
Just a question (or questions) I have wondered about. What's the difference, besides the actual encoding, between the three? Why have three? Why not just one? It seems to me that Microdata is the easiest, but maybe I am wrong. Is there a reason to use one versus another? I have not found anything explaining this on schema.org, so I suppose this is just a discussion versus getting one right or wrong answer. I am just curious about the opinions of people in the Moz SEO community. Unless of course there is one answer. I'll take that too.
Intermediate & Advanced SEO | | Brian_Dowd1 -
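One practical difference worth noting: Microdata and RDFa are woven into the HTML as attributes on existing tags, while JSON-LD sits in a single script block, which makes it the easiest of the three to generate programmatically. Here is a minimal sketch that builds a simple schema.org Organization snippet as JSON-LD; the organization details are placeholders:

```python
import json

organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Widgets Ltd",            # placeholder values
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    "sameAs": ["https://twitter.com/example"],
}

# The same data, rendered as the script tag you would drop into the page <head>.
print('<script type="application/ld+json">')
print(json.dumps(organization, indent=2))
print("</script>")
```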
Why would our server return a 301 status code when Googlebot visits from one IP, but a 200 from a different IP?
I have begun a daily process of analyzing a site's web server log files and have noticed something that seems odd. There are several IP addresses from which Googlebot crawls for which our server returns a 301 status code on every request, consistently, day after day. In nearly all cases, these are not URLs that should 301. When Googlebot visits from other IP addresses, the exact same pages are returned with a 200 status code. Is this normal? If so, why? If not, why not? I am concerned that our server returning an inaccurate status code is interfering with the site being crawled as effectively and as often as it might be if this weren't happening. Thanks guys!
Intermediate & Advanced SEO | | danatanseo0 -
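Since the log analysis is already a daily process, it may help to tally status codes per claimed Googlebot IP so the misbehaving addresses stand out, and then run those IPs through a reverse/forward DNS check to confirm they really belong to Google. A minimal sketch, assuming a common/combined-format access log; the log path is a placeholder and the regex may need adjusting for your format:

```python
import re
from collections import Counter, defaultdict

# Matches the start of a common/combined-format log line: ip, ident, user, [date], "request", status.
LINE = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" (?P<status>\d{3})')

def statuses_by_googlebot_ip(log_path):
    """Tally response status codes per client IP, limited to lines claiming a Googlebot user agent."""
    tallies = defaultdict(Counter)
    with open(log_path) as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            match = LINE.match(line)
            if match:
                tallies[match.group("ip")][match.group("status")] += 1
    return tallies

for ip, counts in statuses_by_googlebot_ip("access.log").items():  # placeholder path
    print(ip, dict(counts))
```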
Changing Server IP Addresses. Should I be concerned?
Hello Mozzers, Our site has been on a dedicated server for about four years now (no other sites, just ours on the server). For more than one reason, I have made the decision to move it to a much better and faster server than the one we are currently on. My big fear is that Google will lose trust in my site because of the IP change. IPs stay with the server at 1and1; they do not follow the website. So, I have done my due diligence: I copied over all code and databases and tested everything completely to ensure there are no issues when I change the DNS to point to the new server, made sure 1and1 is giving me an IP that has never been used, and I am keeping the old server on until cached DNS records for it expire. Is there anything else I need to do to make sure I do not lose current rankings in Google? I have heard nightmare stories about making these kinds of changes, but at this point for our site there is no turning back; this is a change that must take place. Any pointers and advice would be much appreciated! Thanks!
Intermediate & Advanced SEO | | Robbie82991 -
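Two other things that tend to help: lower the DNS TTL on the A record well before the cutover so resolvers pick up the new IP quickly, then verify what is actually being served once you switch. A minimal sketch of the verification step, assuming Python with the dnspython package and a placeholder domain:

```python
import dns.resolver  # assumes the dnspython package is installed

def report_a_records(domain):
    """Print the current A records and the TTL, which indicates how long
    resolvers may keep serving the old IP after the cutover."""
    answer = dns.resolver.resolve(domain, "A")
    for record in answer:
        print(f"{domain} -> {record.address} (TTL {answer.rrset.ttl}s)")

report_a_records("example.com")  # replace with your own domain
```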
Duplicate internal links on page, any benefit to nofollow
Link spam is naturally a hot topic amongst SEOs, particularly post-Penguin. While digging around forums etc., I watched a video blog from Matt Cutts posted a while ago that suggests Google only pays attention to the first instance of a link on a page. As most websites will have multiple instances of a link (header, footer and body text), is it beneficial to nofollow the additional instances of the link? Also, as the first instance of a link will in most cases be within the header nav, does that make the content link text critical, or can good on-page optimisation be pulled from the title attribute? I would appreciate Mozzers' experiences and thoughts on this. Thanks in advance!
Intermediate & Advanced SEO | | JustinTaylor880
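If the "first link counts" behaviour is what you're working around, a quick audit of which hrefs appear more than once on a template can tell you whether the first instance sits in the header nav or in the body copy. A minimal sketch using only Python's standard library; the URL is a placeholder:

```python
from collections import Counter
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect every href on the page in document order."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def duplicate_links(url):
    parser = LinkCollector()
    parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))
    counts = Counter(parser.hrefs)
    return {href: n for href, n in counts.items() if n > 1}

for href, n in duplicate_links("https://www.example.com/").items():  # placeholder URL
    print(f"{href} appears {n} times; only the first instance may count for anchor text")
```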