Blocking certain countries via IP address location
-
We are a US-based company that ships only to the US and Canada. We've recently had two issues arise from foreign countries (namely Russia) that led us to block access to our site from anyone attempting to interact with our store from outside of the US and Canada.
1. The first issue we encountered was fraudulent orders originating from Russia (using stolen card data) and then shipping to a US-based international shipping aggregator.
2. The second issue was a consistent flow of Russia-based "new customer" entries.
My question to the Moz community is this: are there any unintended consequences, from an SEO perspective, to blocking certain countries from viewing our store?
-
Both answers above are correct and great.
From a strategic point of view, formally blocking Russian IPs does not have any SEO effect in your case, because - as a business - you don't even need an SEO strategy for the Russian market.
-
Fully agree with Peter. It's very easy to bypass IP blocking these days; there are some sophisticated systems that can still detect it, but they're mostly beyond the reach of us mere mortals!
If you block a particular country from crawling your website, it is pretty certain you will not rank in that country (which I guess isn't a problem anyway), but I suspect this would have only a very limited (if any) impact on your rankings in other countries.
We have had a similar issue; here are a couple of ideas.
1. When someone places an order, use a secondary method of validation.
2. With the new customer entries/registrations, make sure you have a good captcha; most of this sort of thing tends to come from bots, and a captcha will often fix that problem (see the sketch below).
-
Blocking IPs based on geolocation can be dangerous. But you can use the MaxMind GeoIP database:
https://github.com/maxmind/geoip-api-php
Or you can also implement a GeoIP lookup on "add to cart" or "new user" as an additional check, so when the user is outside of the US/CA you can require them to fill out a captcha or just ignore their requests. Now, from the bot's point of view: if a bot visits with a US IP and with a UK IP (for example), it will see the same pages; it just can't create a new user or add to cart from the UK. The HTML code will be 100% the same. A rough sketch of that check is below.
PS: I forgot... VPNs or proxies are cheap these days. I have a few EC2 instances with everything set up just for my own needs. Bad guys can also use them, so think twice about possible "protection". Note the quotes.
Related Questions
-
Robots.txt blocking Addon Domains
I have this site as my primary domain: http://www.libertyresourcedirectory.com/ I don't want to give spiders access to the site at all, so I tried to do a simple Disallow: / in the robots.txt. As a test I tried to crawl it with Screaming Frog afterwards and it didn't do anything. (Excellent.) However, there's a problem. In GWT, I got an alert that Google couldn't crawl ANY of my sites because of robots.txt issues. Changing the robots.txt on my primary domain changed it for ALL my addon domains. (Ex. http://ethanglover.biz/ ) From a directory point of view, this makes sense; from a spider point of view, it doesn't. As a solution, I changed the robots.txt file back and added a robots meta tag (noindex, nofollow) to the primary domain. But this doesn't seem to be having any effect. As I understand it, the robots.txt takes priority. How can I separate all this out to allow the domains to have different rules? I've tried uploading a separate robots.txt to the addon domain folders, but it's completely ignored. Even going to ethanglover.biz/robots.txt gave me the primary domain's version of the file. (SERIOUSLY! I've tested this 100 times in many ways.) Has anyone experienced this? Am I in the twilight zone? Any known fixes? Thanks. Proof I'm not crazy in attached video: robotstxt_addon_domain.mp4
Technical SEO | eglove
-
HTTP status showing up in Open Site Explorer top pages as blocked by robots.txt file
I am trying to find an answer to this question. It has a lot of URLs on this page with no data when I go into the data source and search for noindex or robots.txt, but the site is visible in the search engines?
Technical SEO | ReSEOlve
-
IP canonicalization
Hi, I need your opinions about IP canonicalization. The site www.peoplemaps.com is on the IP 78.136.30.112. We now redirect that IP to the main page (because of possible duplicate content). But we have more sites on the same IP address. How can that affect their SEO? Before redirecting, when we visited that IP address, the browser showed the main page of www.peoplemaps.com, not any other site. Thanks, Milan. Edit: We have used a 301 redirect.
Technical SEO | MilanB
-
Web server locations for international SEO
We have a site that is currently hosted in the Far East for the Far Eastern market. We are having issues with the hosting company, so we are considering bringing the site back onto our servers in the UK. However, we obviously don't want to damage too much of the uplift we get from local hosting. What is our best approach? Is it OK just to have the site in the UK even though it's aimed at the Far East? Or is the use of a proxy server good? Or should we look for other local hosts? Any help very gratefully received. Iain
Technical SEO | iain
-
An EMD with the top-level domain of another country. Still useful?
Normally, EMDs (with a couple of keywords in them) have an advantage over other domain names (I understand other factors matter too, and I am aware of the EMD update). But what if there is an EMD with a couple of keywords where the top-level domain is of another country? Confused? See an example: shoesinsydney.co.uk. Will it STILL have a natural advantage over other domains (with no keywords in them)?
Technical SEO | Personnel_Concept
-
Would it make sense to add no-follow on certain interlinks?
I run a job board, and some results pages may display pagination links (Page 1, 2, 3 ... last). Should I consider making those page numbers nofollow? What about sign-in and join-us pages? There is nothing of relevance on those pages. I have previously posted a similar question and was advised not to nofollow pages from my own site. However, I see very successful competitors nofollowing a few select interlinks (like the Page 1, 2, 3 scenario). This does not mean their approach is the best, but it makes me question...
Technical SEO | knielsen
-
Redirect everything from a certain URL
I have a new domain (www.newdomain.com) and an old domain (www.olddomain.com). Currently both domains are pointing (via DNS nameservers) at the new site. I want to 301 everything that comes from www.oldsite.com to www.newsite.com. I've used this htaccess code:
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www.newsite.com$
    RewriteRule (.*) http://www.newsite.com/$1 [R=301,L]
This works fine and redirects if someone visits www.olddomain.com, but I want it to cover everything from the old domain, such as www.olddomain.com/archives/article1/ etc., so that if any subpages are visited on the old domain they are redirected to the new domain. Could someone point me in the right direction? Thanks
Technical SEO | EclipseLegal
-
Should I Block Tag, Category, Author Pages
Just finished reviewing the first crawl of my first SEOmoz campaign for a site that I am working on. The site I'm working on uses WordPress as a CMS, and most, if not all, of the warnings and notices have to do with author, category, and tag pages. Should I block these from being indexed? Why or why not?
Technical SEO | Falconberg