Cloudflare - Should I be concerned about false positives and bad neighbourhood IP problems
-
I am considering using Cloudflare for a couple of my sites.
What is your experience? I researched a bit, and there are three issues I am concerned about:
-
Google may consider a site a bad neighbourhood if other sites on the same DNS/IP are spammy.
Is there any way to prevent this? Has anybody had a problem? -
A DDoS attack on a site on the same DNS/IP could affect our site's stability.
-
Blocking false positives: legitimate users may be forced to answer CAPTCHAs etc. in order to see the page. Another Moz member reported that 1-2% of legitimate visitors were identified as false positives.
Can I effectively prevent this by reducing Cloudflare's basic security level?
Also, did you find that Cloudflare really helped with site uptime? In our case, whenever our server was down even for a few seconds, Cloudflare also showed an error page, and sometimes Cloudflare showed a "could not connect" error even when our server was merely slow to respond while pages on other domains were still loading fine.
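(For reference, lowering the security level can also be done programmatically. Below is a rough sketch against Cloudflare's v4 API as I understand it; the zone ID and token are placeholders, and you should verify the endpoint and allowed values against the current API docs before relying on this.)

```python
import json
import urllib.request

# Allowed values as I understand Cloudflare's v4 API docs; some values
# (e.g. "off") may be plan-restricted, so verify before use.
SECURITY_LEVELS = {"essentially_off", "low", "medium", "high", "under_attack"}

def security_level_request(zone_id: str, api_token: str, level: str) -> urllib.request.Request:
    """Build (but do not send) the PATCH request that changes a zone's security level."""
    if level not in SECURITY_LEVELS:
        raise ValueError(f"unknown security level: {level!r}")
    url = f"https://api.cloudflare.com/client/v4/zones/{zone_id}/settings/security_level"
    body = json.dumps({"value": level}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="PATCH",
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
    )

# Example: drop the zone to "low" to reduce challenges shown to legitimate visitors.
req = security_level_request("YOUR_ZONE_ID", "YOUR_API_TOKEN", "low")
print(req.get_method(), req.full_url)
```

To actually apply it you would pass `req` to `urllib.request.urlopen`; building the request separately just makes the sketch easy to inspect.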
-
-
Thanks Cyrus.
-
You may be interested in this post titled "Cloudflare and SEO": https://blog.cloudflare.com/cloudflare-and-seo/
"We did a couple things. First, we invented a new technology that, when it detects a problem on a site, automatically changes the site's CloudFlare IP addresses to isolate it from other sites. (Think of it like quarantining a sick patient.) Second, we worked directly with the crawl teams at the big search engines to make them aware of how CloudFlare worked. All the search engines had special rules for CDNs like Akamai already in place. CloudFlare worked a bit differently, but fell into the same general category. With the cooperation of these search teams we were able to get CloudFlare's IP ranges listed in a special category within search crawlers. Not only does this keep sites behind them from being clustered to a least performant denominator, or incorrectly geo-tagged based on the DNS resolution IP, it also allows the search engines to crawl at their maximum velocity since CloudFlare can handle the load without overburdening the origin."
-
Thanks Tom.
I will now move one of my main domains over and use their Pro plan. I noticed they have quite a number of settings to address the false positives. Our problem with the Cloudflare error pages may have been a temporary one while they were building the site's cache. Anyway, it is easy to enable/disable the Cloudflare protection, so there is not much risk here. It could save us a lot of potential headache in the future if it works as advertised. -
Hi,
-
I have used CloudFlare for a few sites and never had an issue with this. It is a risk/concern with all shared hosting, but CloudFlare are very proactive about addressing anything impacting their customers, so I would not have a concern on this side of things at all.
-
Again, I wouldn't have concerns here. CloudFlare are very adept at handling large-scale DDoS attacks. Having read some of their post-attack analysis reports, they usually mitigate any impact on customers very quickly. They have loads of customers, and if this sort of thing were an issue I think we'd hear about it fairly often.
-
I can't speak to the % of users that might get falsely identified as a risk and presented a CAPTCHA, but I'd be very surprised if it was as high as 1-2%; I've rarely seen that CAPTCHA screen myself. You should check what CloudFlare have to say on this issue, but I would have no concern here either.
I have never had an issue with CloudFlare impacting SEO performance or the user experience. It has generally performed well for me. The biggest issue I see with it is people hoping it is a "cure-all" that means they don't need to properly address issues affecting the performance of their site. If your database performance is very poor, meaning dynamic pages take a long time to load, then CloudFlare is not the answer (it may help, but you should address the underlying issue).
I am unsure about the issue with CloudFlare failing when your server is slow - I'd imagine CloudFlare support could help you with this - there may be a configuration option somewhere.
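One thing that might help when you raise it with support: Cloudflare's error pages usually carry a 52x status code that says whether the problem was at your origin (down vs. merely slow). A rough triage sketch, with descriptions reflecting my understanding of the documented 52x codes (double-check them against Cloudflare's support pages):

```python
# Cloudflare 52x codes point at origin-side trouble rather than a
# Cloudflare outage; descriptions below are my paraphrase of the docs.
CF_ORIGIN_ERRORS = {
    520: "origin returned an unknown or empty response",
    521: "origin web server is down (refusing connections)",
    522: "connection to the origin timed out",
    523: "origin is unreachable",
    524: "origin accepted the connection but timed out responding",
}

def classify(status: int) -> str:
    """Rough triage: is an error page Cloudflare reporting an origin problem?"""
    if status in CF_ORIGIN_ERRORS:
        return f"origin problem: {CF_ORIGIN_ERRORS[status]}"
    if 500 <= status < 600:
        return "server error (not a Cloudflare origin-specific code)"
    return "no error"

# A slow origin typically surfaces as 522 or 524 rather than a hard 521.
print(classify(522))
```

Logging which of these codes your monitoring sees would tell you whether Cloudflare "could not connect" (522/523) or simply gave up waiting on a slow response (524).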
Overall - my suggestion would be that you go for it.
-