Blocking certain countries via IP address location
-
We are a US-based company that ships only to the US and Canada. Two issues have arisen recently involving foreign traffic (Russia in particular) that led us to block access to our store from anyone outside the US and Canada.
1. The first issue we encountered was fraudulent orders originating from Russia (placed with stolen card data) and shipped to a US-based international shipping aggregator.
2. The second issue was a steady stream of Russian-based "new customer" registrations.
My question to the Moz community is this: are there any unintended consequences, from an SEO perspective, to blocking certain countries from viewing our store?
-
Both answers above are correct and well put.
From a strategic point of view, formally blocking Russian IPs has no SEO effect in your case, because as a business you don't need an SEO strategy for the Russian market in the first place.
-
Fully agree with Peter: IP blocking is very easy to bypass these days. There are some sophisticated systems that can still detect evasion, but those are mostly beyond the reach of us mere mortals!
If you block a particular country from crawling your website, it is almost certain you will not rank in that country (which I guess isn't a problem anyway), but I suspect this would have only a very limited impact, if any, on your rankings in other countries.
We have had a similar issue, here are a couple of ideas.
1. When someone places an order, use a secondary method of validation.
2. For the new customer entries/registrations, make sure you have a good CAPTCHA; most of this sort of thing tends to come from bots, and a CAPTCHA will often fix the problem.
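The "secondary method of validation" in step 1 can be as simple as cross-checking the order's IP-derived country against the billing and shipping details and flagging mismatches for manual review. A minimal sketch follows; `geo_country()` is a hard-coded stub standing in for a real GeoIP lookup (e.g. MaxMind's database), and the IPs and thresholds are illustrative assumptions, not anything from the thread.

```python
# Minimal sketch of a secondary order-validation step.
# geo_country() is a stand-in for a real IP-geolocation lookup
# (e.g. MaxMind GeoIP); here it is just a hard-coded stub.

ALLOWED_COUNTRIES = {"US", "CA"}

def geo_country(ip_address: str) -> str:
    """Stub lookup: replace with a real GeoIP query in production."""
    stub_db = {"203.0.113.7": "RU", "198.51.100.4": "US"}
    return stub_db.get(ip_address, "UNKNOWN")

def review_order(order: dict) -> str:
    """Return 'accept' or 'review' for an incoming order."""
    ip_country = geo_country(order["ip"])
    if ip_country not in ALLOWED_COUNTRIES:
        return "review"  # don't auto-reject: travelers and VPN users exist
    if order["billing_country"] != order["shipping_country"]:
        return "review"  # common fraud pattern: mismatched addresses
    return "accept"

print(review_order({"ip": "203.0.113.7",
                    "billing_country": "US",
                    "shipping_country": "US"}))  # flagged for review
```

The key design choice is to route suspicious orders to manual review rather than reject them outright, since geolocation is imprecise and legitimate customers can appear to come from anywhere.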
-
Blocking IPs purely on geolocation can be risky. But you can use the MaxMind GeoIP database:
https://github.com/maxmind/geoip-api-php
Alternatively, you can implement a GeoIP check on the "add to cart" or "new user" actions as an additional safeguard: when a user is outside the US/CA, require them to fill in a CAPTCHA, or simply ignore their request. From a bot's point of view, a visit from a US IP and a visit from a UK IP (for example) will see the same pages; the UK visitor just can't create an account or add items to the cart. The HTML is 100% identical.
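The approach above can be sketched as follows: every visitor gets the same pages, but state-changing actions are gated by a GeoIP check. `lookup_country()` is again a stub (a real deployment would query MaxMind's GeoIP2 database), and the endpoint name and response shape are illustrative assumptions.

```python
# Sketch: serve identical HTML to everyone, but refuse write actions
# (add to cart, register) for visitors outside the served regions.
# lookup_country() is a stub; in production, query a real GeoIP database.

ALLOWED = {"US", "CA"}

def lookup_country(ip: str) -> str:
    """Stub lookup: replace with a real GeoIP query in production."""
    stub_db = {"192.0.2.10": "GB", "198.51.100.4": "US"}
    return stub_db.get(ip, "UNKNOWN")

def handle_add_to_cart(ip: str, item_id: int) -> dict:
    if lookup_country(ip) not in ALLOWED:
        # Page rendering is unaffected; only the write is refused.
        # (You could require a CAPTCHA here instead of refusing outright.)
        return {"ok": False, "reason": "region_not_served"}
    return {"ok": True, "item": item_id}

print(handle_add_to_cart("192.0.2.10", 42))   # refused
print(handle_add_to_cart("198.51.100.4", 42)) # accepted
```

Because crawlers from any country receive the same HTML, this gating only affects form submissions, which is why it should not trigger cloaking concerns the way wholesale IP blocking might.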
PS: I forgot to add that VPNs and proxies are cheap these days. I keep a few EC2 instances set up just for my own needs; the bad guys can use them too, so think twice about how much "protection" this really buys you. Note the quotes.