Block access to site from everywhere but North America
-
I have a site that is being attacked very hard by bots, malware, etc. Most of it seems to be originating from Asia and Eastern Europe, so I want to block access to the site for everybody but people in North America. We do not ship out of the country anyway, so it really does not need to be seen by people around the world.
How can I set this up?
-
I do not see that function in the free version of Cloudflare. I am adding the "challenge" rule to hopefully cut back. My client does not have the money for the paid plans.
My next step is to go disavow those links.
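For readers finding this later: Cloudflare's free plan has since gained a small number of custom firewall/WAF rules, and those can match on country. A sketch of a rule expression that challenges everything outside North America (field names per Cloudflare's rules language; adjust the country list as needed):

```text
# Action: Managed Challenge (or Block)
(ip.geoip.country ne "US" and ip.geoip.country ne "CA" and ip.geoip.country ne "MX")
```

Challenging rather than blocking avoids locking out travelers and VPN users while still stopping most automated traffic.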
-
That is exactly the problem we had. Cloudflare helped with that, as well as blocking the IPs with our old web host.
Also, make sure that you disavow the links from those domains in Webmaster Tools as well.
Good luck
Ken
-
It is people that are duplicating our website, hacking other servers, then uploading our modified, malware-filled site to those servers.
I do not care about fake referral traffic. I just need to totally, 100% block traffic from Russia. I do not want them to even see the site.
-
Hi-
We had a similar situation, which got even worse when someone initiated a DDoS attack on our site from out of the country. Since then we have used cloudflare.com and things have been a lot better.
Good Luck
Ken
-
Then you can follow the last part of my previous answer. Which are the spammers that you are seeing? If it's ghost spam like 4webmasters or free-social-buttons, then the only way to stop them is with filters in Google Analytics.
-
It is drastic, but it's needed. We do not ship anywhere outside of the continent, so there really is no need for traffic from Russia.
I want an easy solution that I can put in the robots.txt file or something.
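A note for readers: robots.txt only asks well-behaved crawlers not to fetch pages; it cannot block visitors or malicious bots. A country block has to happen at the server or CDN level. As a rough sketch, assuming Apache 2.4 with the legacy mod_geoip module and its country database installed:

```apacheconf
# Tag requests whose GeoIP lookup resolves to Russia, then deny them.
GeoIPEnable On
SetEnvIf GEOIP_COUNTRY_CODE RU BlockedCountry
<RequireAll>
    Require all granted
    Require not env BlockedCountry
</RequireAll>
```

Nginx has an equivalent via its geoip modules, and a CDN-level rule (e.g. Cloudflare, as suggested elsewhere in this thread) is usually simpler, since it stops the traffic before it ever reaches the server.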
-
Hi Noah,
Excluding the whole world would be a drastic solution. I know the spam is a bad issue, but you could be missing real traffic even if you are a local website; and even then, you will still get spam coming from the USA, and there is quite a lot of it. I recommend you try another solution: a filter based on your hostnames.
This solution requires a little more time to set up, but it has 3 huge advantages, and you won't have to exclude the whole world except the USA.
- You will stop the spam before it hits you. Adding a filter for a referrer after you see it will stop it, but by the time you apply it, you will already have received hits from that spam.
- You will need only ONE filter to stop all ghost spam, instead of creating various sets of filters.
- Lately, some of the spammers (e.g. free-social-buttons) have been hitting GA accounts with fake direct visits along with the referral. A filter for the referrer won't stop the direct visit; the valid-hostname filter, on the other hand, will stop ALL ghost spam in any form, whether it shows as a referral, keyword, or direct visit.
This is what I've been using on my accounts for the last few months, and I haven't received a single hit of ghost spam. You can find more information on how this filter works, and a detailed guide to setting it up, in this article:
http://www.ohow.co/what-is-referrer-spam-how-stop-it-guide/
If you are not convinced and still want to exclude all countries, you can follow the same guide for the valid hostname in the article; just change the filter field to Country and put United States in the filter pattern.
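For illustration, the valid-hostname filter described above boils down to a single include filter in the GA view settings; `yourdomain\.com` below is a placeholder for the hostnames you actually serve (add any legitimate extras, such as a payment gateway or translate proxy, separated by `|`):

```text
Filter Type:    Custom > Include
Filter Field:   Hostname
Filter Pattern: ^(www\.)?yourdomain\.com$
```

The country-based alternative is the same mechanism with the Filter Field set to Country and the pattern set to United States.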
Hope it helps,
Related Questions
-
What will happen if I manually click on my backlinks placed on other sites using a VPN? Will it increase my linking domains?
Will the number of linking domains for my website increase when I manually click on my website's backlinks through a VPN?
International SEO | calvinkj
What's the best homepage experience for an international site?
Greetings, Mozzers. I have a question for the community, which I would appreciate your input on. If you have a single gTLD that services multiple countries, what do you think is the best homepage UX for the root homepage, and why?
So the example would be: you own www.company.org and target content to Germany, Japan and Australia through the folder structure, e.g. www.company.org/de-de. If someone comes to www.company.org from a region, would you:
- Redirect them based on IP location, so if from Germany they land on www.company.org/de-de
- Let them land on the homepage, which offers location selection
- Let them land on a page with content and offer location selection, e.g. a pop-up or obvious selection box
- Something I've not thought of…
I'd appreciate your input. Thanks
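Whichever UX is chosen, this is also where hreflang's `x-default` value is designed to help: it tells Google which URL to show searchers who match none of the targeted locales, typically the selector homepage. (IP-only redirects are generally discouraged, since Googlebot mostly crawls from US IPs and would never see the other versions.) A sketch using the example domain above — the ja-jp and en-au paths are assumptions for illustration:

```html
<link rel="alternate" hreflang="de-de" href="https://www.company.org/de-de/" />
<link rel="alternate" hreflang="ja-jp" href="https://www.company.org/ja-jp/" />
<link rel="alternate" hreflang="en-au" href="https://www.company.org/en-au/" />
<link rel="alternate" hreflang="x-default" href="https://www.company.org/" />
```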
International SEO | RobertChapman
Correct site internationalization strategy
Hi, I'm working on the internationalization of a large website; the company wants to reach around 100 countries. I read this Google doc in order to design the strategy: https://support.google.com/webmasters/answer/182192?hl=en
The strategy is the following. For each market, I'll define a domain or subdomain with these settings:
- Leave mysitename.com for the biggest market, in which it has been working for years, and define the geographic target in Google Search Console.
- Reserve the ccTLD domains for other markets.
- In the markets where I'm not able to reserve the ccTLD domain, I'll use subdomains of the .com site, for example us.mysitename.com, and define the geographic target for that domain in Google Search Console.
- Each domain will only be in the preferred language of its country (but the user will be able to change the language via cookies). The content will be similar in all markets of the same language; for example, in .co.uk and .us the texts will be the same, but the product selections will be specific to each market.
- Each URL will link to its equivalent in other countries via direct link and also via hreflang. The point of this is that any link relevance one of them gets will be transmitted to all the other sites.
My questions are:
- Do you think there are any possible problems with this strategy?
- Is it possible that I'll have problems with duplicate content? (Like I said before, all domains will be assigned to a specific geographic target.)
- Each site will have around 2,000,000 URLs. Do you think this could generate problems? It's possible that only the primary and other important locations will have URLs with high-quality external links and a decent TrustRank.
Any other consideration or related experience with a similar process would be very much appreciated as well. Sorry for all these questions, but I want to be really sure about this plan, since the company's growth is linked to this internationalization process.
Thanks in advance!
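As a reference for the hreflang part of this strategy: annotations work across domains and subdomains, as long as every version in the set links to all the others and to itself. A sketch for one page, using the domains from the question (the .co.uk domain and the `/some-page` path are assumptions for illustration):

```html
<link rel="alternate" hreflang="en-us" href="https://us.mysitename.com/some-page" />
<link rel="alternate" hreflang="en-gb" href="https://www.mysitename.co.uk/some-page" />
<link rel="alternate" hreflang="x-default" href="https://www.mysitename.com/some-page" />
```

The same set of tags must appear on every URL listed in it; a one-sided annotation is ignored.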
International SEO | robertorg
International Sites and Duplicate Content
Hello, I am working on a project where I have some doubts regarding the structure of international sites and multiple languages. The website is in the fashion industry, and I think this is a common problem for the industry. The website is translated into 5 languages and sells in 21 countries. As you can imagine, this creates a huge number of URLs, so many that with Screaming Frog I can't even complete the crawl.
For example, the UK site is visible in all these versions:
- http://www.MyDomain.com/en/GB/
- http://www.MyDomain.com/it/GB/
- http://www.MyDomain.com/fr/GB/
- http://www.MyDomain.com/de/GB/
- http://www.MyDomain.com/es/GB/
Obviously, for SEO only the first version is important. As another example, the French site is available in 5 languages, and again:
- http://www.MyDomain.com/fr/FR/
- http://www.MyDomain.com/en/FR/
- http://www.MyDomain.com/it/FR/
- http://www.MyDomain.com/de/FR/
- http://www.MyDomain.com/es/FR/
And so on. This is creating 3 issues, mainly:
- Endless crawling, with crawlers not focusing on the most important pages
- Duplication of content
- Wrong geo URLs ranking in Google
I have already implemented hreflang but didn't notice any improvements. Therefore my question is: should I exclude the non-appropriate targeting with robots.txt and "noindex"? Perhaps for the UK leave crawlable just the English version, i.e. http://www.MyDomain.com/en/GB/, for France just the French version, http://www.MyDomain.com/fr/FR/, and so on. What I would like to achieve by doing this is to have the crawlers more focused on the important SEO pages, avoid content duplication, and stop the wrong URLs ranking on local Google. Please comment.
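If the robots.txt route is chosen, one caveat: don't combine it with noindex on the same URLs, because a URL blocked in robots.txt is never crawled, so its noindex tag is never seen; pick one mechanism. A sketch of the exclusion for the two example countries above (extend the pattern for the remaining markets):

```text
# robots.txt at www.MyDomain.com — keep only the matching language crawlable per country
User-agent: *
Disallow: /it/GB/
Disallow: /fr/GB/
Disallow: /de/GB/
Disallow: /es/GB/
Disallow: /en/FR/
Disallow: /it/FR/
Disallow: /de/FR/
Disallow: /es/FR/
```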
International SEO | guidoampollini
How to interlink 16 different language versions of a site?
I remember that Matt Cutts recommended against interlinking many language versions of a site. Considering that Google now also crawls JavaScript links, what is the best way to implement interlinking? I still see otherwise extremely well-optimized large sites interlinking more than 10 different language versions, e.g. zalando.de, but also booking.com (although there on the same domain). Currently we have an expandable CSS dropdown in the footer interlinking 16 different language versions on different TLDs. Would you be concerned? How would you suggest interlinking the domains (for the user, a link would be useful)?
International SEO | lcourse
International Sites - Sitemaps, Robots & Geolocating in WMT
Hi guys, I have a site that has now been launched in the US, having originally been UK-only. To accommodate this, the website has been set up using directories for each country. Example:
- domain.com/en-gb
- domain.com/en-us
As the site was originally set up for the UK, the sitemap, robots file and Webmaster Tools account were added to the main domain. Example:
- domain.com/sitemap.xml
- domain.com/robots.txt
The question is: does this now need changing to make it specific to each country? For example, the sitemap and robots.txt for the UK would move to:
- domain.com/en-gb/sitemap.xml
- domain.com/en-gb/robots.txt
and the US would have its own separate sitemap and robots.txt:
- domain.com/en-us/sitemap.xml
- domain.com/en-us/robots.txt
Also, in order to geolocate this in WMT, would that need to be done for each directory version instead of the main domain? Currently the WMT account for the UK site is verified at www.domain.com; would this need reverifying at domain.com/en-gb? Any help would be appreciated! Thanks!
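One point that answers part of this directly: crawlers only request robots.txt at the root of a host, so domain.com/en-gb/robots.txt would simply be ignored; the file has to stay at domain.com/robots.txt. Per-directory sitemaps are fine (under the sitemaps.org rules, a sitemap may only list URLs at or below its own path), and both can be declared from the root file. A sketch:

```text
# https://domain.com/robots.txt
User-agent: *
Allow: /

Sitemap: https://domain.com/en-gb/sitemap.xml
Sitemap: https://domain.com/en-us/sitemap.xml
```

Subdirectories such as domain.com/en-gb/ can also be verified as separate properties in Search Console and geotargeted individually, alongside the root property.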
International SEO | CarlWint
Ranking in Different Countries - Ecommerce site
My client has a .com ecommerce site with UK-based servers, and he wants to target two other countries (both English-speaking). By the looks of it, he wouldn't want to create separate local TLDs targeting each country. I therefore wanted to suggest adding subdomains or subfolders geo-targeted to each country they want to target; however, I'm worried that this will cause duplicate content issues... What do you think would be the best solution? Any advice would be greatly appreciated! Thank you!
International SEO | ramarketing
How should I make my site better?
I am glad to join SEOmoz. I am from China. SEOmoz is a famous SEO service company; one reason I found it is that an SEO guru named Zac introduced SEOmoz to Chinese SEOs. So I think it would be a good idea if SEOmoz provided SEO tools or services to Chinese SEOs; the market is very big. But China's biggest search engine is www.baidu.com, not Google, and there are some differences between Baidu and Google. My site is www.cn-sen.com. It performs well on Google for the keyword "除湿机" (dehumidifier), but it has some trouble on Baidu. I think the content of the website is the main reason, and the internal linking is not good. Could someone give me some SEO advice to make my site perform better? Thanks very much.
International SEO | tylrr123