Cloudflare: Should I be concerned about false positives and bad-neighbourhood IP problems?
-
I am considering using Cloudflare for a couple of my sites.
What is your experience? I researched a bit and there are three issues I am concerned about:
-
Google may consider a site part of a bad neighbourhood if other sites on the same DNS/IP are spammy.
Is there any way to prevent this? Has anybody had a problem? -
A DDoS attack on a site on the same DNS could affect our site's stability.
-
Blocking false positives: legitimate users may be forced to answer CAPTCHAs etc. to be able to see the page. Another Moz member reported that 1-2% of legitimate visitors were identified as false positives.
Can I effectively prevent this by reducing Cloudflare's basic security level?
Also, did you find that Cloudflare really helped with site uptime? In our case, whenever our server was down even for a few seconds, Cloudflare also showed an error page. Sometimes Cloudflare showed an error page saying it could not connect even when our server was merely slow to respond and pages on other domains were still loading fine.
-
-
Thanks Cyrus.
-
You may be interested in this post titled "Cloudflare and SEO": https://blog.cloudflare.com/cloudflare-and-seo/
"We did a couple things. First, we invented a new technology that, when it detects a problem on a site, automatically changes the site's CloudFlare IP addresses to isolate it from other sites. (Think of it like quarantining a sick patient.) Second, we worked directly with the crawl teams at the big search engines to make them aware of how CloudFlare worked. All the search engines had special rules for CDNs like Akamai already in place. CloudFlare worked a bit differently, but fell into the same general category. With the cooperation of these search teams we were able to get CloudFlare's IP ranges listed in a special category within search crawlers. Not only does this keep sites behind them from being clustered to a least performant denominator, or incorrectly geo-tagged based on the DNS resolution IP, it also allows the search engines to crawl at their maximum velocity since CloudFlare can handle the load without overburdening the origin."
-
Thanks Tom.
I will now move one of my main domains over and use their Pro plan. I noticed they have quite a number of settings to address false positives. Our problem with Cloudflare error pages may have been temporary, occurring while they were building the site's cache. In any case, it is easy to enable or disable Cloudflare protection, so there is not much risk here. It could save us a lot of potential headaches in the future if it works as advertised. -
Hi,
-
I have used CloudFlare for a few sites and never had an issue with this. It is a risk/concern with all shared hosting, but CloudFlare are very proactive about addressing anything impacting their customers, so I would not have any concern on this side of things.
-
Again, I wouldn't have concerns here. CloudFlare are very adept at handling large-scale DDoS attacks. Having read some of their post-attack analysis reports, they usually mitigate any impact on customers very quickly. They have loads of customers, and if this sort of thing were an issue I think we'd hear about it fairly often.
-
I can't speak to the percentage of users that might get falsely identified as a risk and presented with a CAPTCHA, but I'd be very surprised if it was as high as 1-2%; I've rarely seen that CAPTCHA screen myself. You should check what CloudFlare have to say on this issue, but I would have no concern here either.
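For what it's worth, the security level the original poster mentions can be lowered per zone, either in the CloudFlare dashboard or via their v4 API's `security_level` zone setting. The sketch below assumes that setting and its documented values; `ZONE_ID` and `API_TOKEN` are placeholders for your own credentials, and the actual API call is left commented out so nothing is changed by accident.

```shell
#!/bin/sh
# Sketch: lower a CloudFlare zone's security level to reduce CAPTCHA
# challenges for borderline visitors. ZONE_ID and API_TOKEN are
# placeholders -- substitute your own values before uncommenting the call.
ZONE_ID="your-zone-id"
API_TOKEN="your-api-token"

# CloudFlare accepts these values for the security_level setting:
is_valid_level() {
  case "$1" in
    essentially_off|low|medium|high|under_attack) return 0 ;;
    *) return 1 ;;
  esac
}

LEVEL="low"   # fewer challenges than the default "medium"

if is_valid_level "$LEVEL"; then
  echo "setting security level to $LEVEL"
  # Uncomment to apply the change via the v4 API:
  # curl -X PATCH "https://api.cloudflare.com/client/v4/zones/$ZONE_ID/settings/security_level" \
  #      -H "Authorization: Bearer $API_TOKEN" \
  #      -H "Content-Type: application/json" \
  #      --data '{"value":"'"$LEVEL"'"}'
fi
```

If false positives persist at "low", IP access rules or firewall rules for known-good ranges are the usual next step.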
I have never had an issue with CloudFlare impacting SEO performance or the user experience. It has generally performed well for me. The biggest issue I see with it is people hoping it is a 'cure-all' that means they don't need to properly address issues affecting the performance of their site. If your database performance is very poor, meaning dynamic pages take a long time to load, then CloudFlare is not the answer (it may help, but you should address the underlying issue).
I am unsure about the issue of CloudFlare failing when your server is slow - I'd imagine CloudFlare support could help you with this, as there may be a configuration option somewhere.
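One thing that may help with debugging: when the origin is struggling, CloudFlare serves its own 5xx pages with distinct status codes - 521 (web server is down), 522 (connection timed out), and 524 (origin accepted the connection but timed out responding). A quick sketch for checking which one you're actually seeing (example.com stands in for your own domain):

```shell
#!/bin/sh
# Sketch: classify the HTTP status CloudFlare returns when the origin
# is down or slow. 521/522/524 are CloudFlare-specific origin errors.
classify_status() {
  case "$1" in
    521) echo "origin web server is down" ;;
    522) echo "connection to origin timed out" ;;
    524) echo "origin accepted the connection but was too slow to respond" ;;
    2??|3??) echo "served normally" ;;
    *) echo "other status: $1" ;;
  esac
}

# Example usage (example.com is a placeholder for your own domain):
# status=$(curl -s -o /dev/null -w '%{http_code}' https://example.com/)
# classify_status "$status"
classify_status 522
```

If you mostly see 524, the origin is answering but too slowly, and raising origin performance (or CloudFlare's timeout, where your plan allows it) is the fix; 522 points at connectivity or firewall problems between CloudFlare and the origin.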
Overall, my suggestion would be to go for it.