Dedicated IP Address on my forum site www.astigtayo.com?
-
Hello and Good Day,
Does having a dedicated IP address for my site affect my search engine ranking?
-
Highland,
I read this and went to do a bit of research on whether or not Google would ban an entire IP address... and I found nothing, either way. Typical Google. But I think you are right, and I may have spread an urban myth, or at least something that became one.
So, I agree: I do not think the entire IP address would be banned to begin with. Thanks for making me think.
Best
-
Mark Anthony
Nijzing is correct that links from the same IP address having less value would be one effect on your site. The other, in the case of a non-dedicated server, is that if one of the other sites hosted on the HostGator server you are on turns out to be a spam site, it can cause the search engines to ignore the entire IP address.
So if you are on the same server as SpammyJoe.com and he gets busted by Google, they will not list anything on that server. So if you are not sure who shares your server, those are the issues that could hurt you.
Best
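As an aside, it's easy to check whether two sites currently resolve to the same IP address (and so might share a server). A minimal sketch in Python's standard library; the domain names in the comment are just hypothetical examples:

```python
import socket

def shares_ip(domain_a: str, domain_b: str) -> bool:
    """Return True if both domains currently resolve to the same IPv4 address."""
    return socket.gethostbyname(domain_a) == socket.gethostbyname(domain_b)

# Hypothetical usage:
# shares_ip("yoursite.com", "spammyjoe.com")
```

Note this only reflects DNS at the moment you run it; sites behind CDNs or load balancers may resolve to different addresses on different lookups.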
-
Some people have issued the warning "Make sure you're not on a banned IP," but so far I've never seen an example where a person jumps onto an IP and finds out that Google banned it. Google bans sites, not IPs, and in every example I've seen, the sites banned on an IP all had some common problem (e.g., the server got hacked and malware was installed).
There has never been any advantage shown to having a dedicated IP vs. being on a virtual host. What Nijzing means in his post is that some people like to build their sites to link to each other and use dedicated IPs to achieve that. As someone who uses virtual hosting, I can say that it really doesn't matter that much. I link my sites together (where it makes sense) and have had no real problems. Then again, I don't rely wholly on those links for my backlinks either. I think the whole "link your sites together by using different IPs" mess is a ton of work for no real gain.
-
Hello Mark,
Some say that the IP address is a factor for ranking, but mainly in the sense that linking between websites on the same IP address doesn't do much. Having your own server with its own IP address is better (link-building wise).
Others say that IP addresses are only used to determine the country your website is located in, and nothing more.
I don't think you can say for sure that it affects your results, but as long as you make sure that links come from other IP addresses, I don't think it will harm your website.
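If you want a rough sense of how diversified a set of linking domains is by IP, you could group them by resolved address. A sketch, assuming you already have a plain list of hostnames (e.g., exported from a backlink tool):

```python
import socket
from collections import defaultdict

def group_by_ip(domains):
    """Map each resolved IPv4 address to the list of domains that share it.

    Domains that fail to resolve are collected under the None key.
    """
    groups = defaultdict(list)
    for domain in domains:
        try:
            ip = socket.gethostbyname(domain)
        except socket.gaierror:
            ip = None  # unresolvable; keep it visible rather than dropping it
        groups[ip].append(domain)
    return dict(groups)
```

Any IP whose list contains many of your linking domains is a cluster worth a second look, per the "links from the same IP count less" point above.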