Dedicated IP address for my forum site www.astigtayo.com?
-
Hello and Good Day,
Does having a dedicated IP address for my site affect my search engine ranking?
-
Highland,
I read this and did a bit of research into whether or not Google would ban an entire IP address... and I found nothing either way. Typical Google. But I think you are right, and I may have spread an urban myth, or at least something that became one.
So, I agree: I do not think the entire IP address would be banned in the first place. Thanks for making me think.
Best
-
Mark Anthony
Nijzing is correct: one effect on your site is that links between sites on the same IP address carry less value. The other, in the case of a non-dedicated server, is that if one of the other sites hosted on the same HostGator server you are on turns out to be a spam site, it can cause the search engines to ignore the entire IP address.
So if you are on the same server as SpammyJoe.com and he gets busted by Google, they will not list anything on that server. So if you are not sure who you share a server with, that is an issue that could hurt you.
Best
-
Some people have issued the warning "Make sure you're not on a banned IP," but so far I've never seen an example where a person lands on an IP and finds out that Google banned that IP. Google bans sites, not IPs, and in every example I've seen, the sites banned on an IP all shared some common problem (e.g., the server got hacked and malware was installed).
There has never been any demonstrated advantage to having a dedicated IP over being on a virtual host. What Nijzing means in his post is that some people like to build networks of sites that link to each other and use dedicated IPs to do it. As someone who uses virtual hosting, I can say that it really doesn't matter that much. I link my sites together (where it makes sense) and have had no real problems. Then again, I don't rely wholly on those links for my backlinks either. I think the whole "link your sites together using different IPs" scheme is a ton of work for no real gain.
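If you're curious whether your site shares an IP with other sites (or with a specific neighbor), a quick DNS lookup will tell you. Here's a minimal sketch in Python; the domain names are placeholders, so substitute your own site and any site you want to compare it against:

```python
import socket

def resolve_ip(domain):
    """Return the IPv4 address a domain currently resolves to."""
    return socket.gethostbyname(domain)

# Placeholder domains -- substitute the sites you want to compare.
sites = ["example.com", "example.org"]

try:
    ips = {site: resolve_ip(site) for site in sites}
    for site, ip in ips.items():
        print(f"{site} -> {ip}")
    if len(set(ips.values())) == 1:
        print("Same IP: likely the same server (or the same CDN/load balancer).")
    else:
        print("Different IPs.")
except socket.gaierror:
    print("DNS lookup failed (no network connection?)")
```

Note that a shared IP doesn't prove a shared server — CDNs and load balancers put many unrelated sites behind the same address — so treat the result as a hint, not a verdict.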
-
Hello Mark,
Some say that the IP address is a ranking factor, but mainly in the sense that links between websites on the same IP address don't count for much. Having your own server with its own IP address is better (link-building wise).
Others say that IP addresses are only used to determine the country your website is located in, and nothing more.
I don't think you can say for sure that it affects your results, but as long as you make sure your links come from other IP addresses, I don't think it will harm your website.