Best way to block spambots in htaccess
-
I would like to block spam from the Russian Federation, China, and Ukraine, as well as semalt and buttonsforwebsite. I have come up with the following code; what do you think?
For the countries:
# BLOCK COUNTRY DOMAINS
RewriteCond %{HTTP_REFERER} \.(ru|cn|ua)(/|$) [NC]
RewriteRule .* - [F]
And for buttons-for-website.com and semalt-semalt.com:
# BLOCK REFERERS
RewriteCond %{HTTP_REFERER} (semalt|buttons) [NC]
RewriteRule .* - [F]
Or should it be:
# BLOCK USER AGENTS
RewriteCond %{HTTP_USER_AGENT} (semalt|buttons) [NC]
RewriteRule .* - [F]
Could I add (semalt|buttons|o-o-6-o-o|bestwebsitesawards|humanorightswatch), or is that too many?
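Put together, something like this (untested; assumes mod_rewrite is enabled, and that these services show up in the referrer rather than the user agent, since semalt and buttons-for-website are referrer spammers):
# Block referrer spam in one pass
RewriteEngine On
# Referrers from .ru, .cn or .ua domains
RewriteCond %{HTTP_REFERER} \.(ru|cn|ua)(/|$) [NC,OR]
# Known referrer-spam hosts
RewriteCond %{HTTP_REFERER} (semalt|buttons-for-website|o-o-6-o-o|bestwebsitesawards|humanorightswatch) [NC]
RewriteRule .* - [F]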
-
Hi
I think you're on the right track.
A very good post by Jared Gardner on the Moz Blog recently addressed exactly this question.
Hope it helps,
Don
Related Questions
-
Blocking Standard pages with Robots.txt (t&c's, shipping policy, pricing & privacy policies etc)
Hi, I've just had a best-practice site migration completed for my old e-commerce store into a Shopify environment, and I see in GSC that it's reporting my standard pages as blocked by robots.txt, such as the examples below. Surely I don't want these blocked? Is that likely due to my migrators, or a default setting in Shopify, does anyone know? The pages: t&c's, shipping policy, pricing policy, privacy policy, etc. So in summary: Shall I unblock these? What caused it: Shopify default settings or, more likely, my migration team? All Best, Dan
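For context, Shopify's auto-generated robots.txt ships with disallow rules along these lines (an illustrative excerpt, not the exact file, which varies by store):
User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /account
Disallow: /policies/
Policy pages such as terms and privacy typically live under /policies/, which is why GSC reports them as blocked.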
Reporting & Analytics | Dan-Lawrence
-
Right way to delete and update old blog-posts?
Hi all, Rand Fishkin once tweeted about the success story of the QuickBooks blog: they deleted their old and outdated content to show only high-quality content to their audience. We are planning to implement the same strategy on our blog, which consists of 800+ blog posts. I'm just wondering about the best way to proceed; the plan is below. Please correct me if I'm wrong, and tell me if there are better steps to follow: Get the list of blog posts. Check the traffic of each blog post. If a blog post should stay, update any outdated information in it. If a blog post is no longer needed, should I delete it, noindex it, or redirect it? And what's the best way to measure success? Thanks
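On the delete/noindex/redirect point, a minimal .htaccess sketch of the two usual options (hypothetical URLs; assumes Apache with mod_alias and mod_rewrite):
# 301-redirect a retired post to its closest surviving equivalent
Redirect 301 /blog/old-post /blog/updated-post
# Or return 410 Gone when a post has no replacement at all
RewriteEngine On
RewriteRule ^blog/retired-post/?$ - [G]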
Reporting & Analytics | vtmoz
-
The best way to track internal links in Google Analytics
Hi there, we are a retail business and we have invested in quality editorial content, which sits in our blog at ourwebsite.co.uk/blog/. The blog links to the main site (an online store), and I want to track the 'value' of the blog by how many clicks the blog content generates back to the main store. At the moment we're using this code on the end of every link in the blog: ?utm_source=Blog&utm_medium=Widget&utm_campaign=FromBlog Does this affect SEO, and is there a better way of doing it? Thanks.
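One thing worth knowing: UTM parameters on internal links start a new session in Google Analytics and overwrite the visit's original source. A sketch of the usual alternative, firing an event on click instead (assumes the classic Universal Analytics ga() snippet is loaded; the '.blog-post a' selector is a placeholder for links inside blog content):
// Track blog-to-store clicks as GA events rather than UTM-tagging internal links
document.querySelectorAll('.blog-post a').forEach(function (link) {
  link.addEventListener('click', function () {
    // Category 'Blog', action 'click-to-store', label = destination URL
    ga('send', 'event', 'Blog', 'click-to-store', link.href);
  });
});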
Reporting & Analytics | Bee159
-
Help Blocking Crawlers. Huge Spike in "Direct Visits" with 96% Bounce Rate & Low Pages/Visit.
Hello, I'm hoping one of you search geniuses can help me. We have a successful client who started seeing a HUGE spike in direct visits as reported by Google Analytics. This traffic now represents approximately 70% of all website traffic. These "direct visits" have a bounce rate of 96%+ and only 1-2 pages/visit. This is skewing our analytics in a big way and rendering them pretty much useless. I suspect this is some sort of crawler activity, but we have no access to the server log files to verify this or identify the culprit. The client's site is on a GoDaddy Managed WordPress hosting account. The way I see it, there are a couple of possibilities: 1) Our client's competitors are scraping the site on a regular basis to stay on top of site modifications, keyword emphasis, etc. It seems like whenever we make meaningful changes to the site, one of their competitors does a knock-off a few days later. Hmmm. 2) Our client's competitors have this crawler hitting the site thousands of times a day to raise bounce rates and decrease the average time on site, which could have a negative impact on SEO. Correct me if I'm wrong, but I don't believe Google is going to reward sites with 90% bounce rates, 1-2 pages/visit, and an 18-second average time on site. The bottom line is that we need to identify these bogus "direct visits" and find a way to block them. I've seen several WordPress plugins that claim to help with this, but I certainly don't want to block valid crawlers, especially Google, from accessing the site. If someone out there could please weigh in on this and help us resolve the issue, I'd really appreciate it. Heck, I'll even name my third-born after you. Thanks for your help. Eric
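One low-risk starting point, as a sketch (Apache .htaccess, assumes mod_rewrite; legitimate crawlers such as Googlebot always send a user agent, so this only catches the crudest bots):
# Deny requests that send no User-Agent header at all,
# a common trait of cheap scrapers and analytics-spam bots
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^-?$
RewriteRule .* - [F]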
Reporting & Analytics | EricFish
-
How can I remove parameters from the GSC URL blocking tool?
Hello Mozzers, my client's previous SEO company went ahead and blindly blocked a number of parameters using the GSC URL blocking tool. This has caused Google to stop crawling many pages on my client's website, and I am not sure how to remove these blocked parameters so that the pages can be crawled and reindexed by Google. The crawl setting is set to "Let Googlebot decide", but there has still been a drop in the number of pages being crawled. Can someone please share their experience and help me delete these blocked parameters from GSC's URL blocking tool? Thank you, Mozzers!
Reporting & Analytics | Vsood
-
Does anyone know of a way to do a profile level filter to exclude all traffic if it enters the site via certain landing pages?
The problem I have is that we have several pages that are served to visitors of numerous other domains but are also served to visitors of our site. We end up with inflated Google Analytics numbers because people are viewing these pages from our partners' domains without ever actually entering our site. I've made an advanced segment that serves the purpose, but I'd really like to filter at the profile level so the numbers across the board are more accurate, without having to apply an advanced segment to every report. The advanced segment excludes visits that hit these pages as landing pages but includes visits where people have come from other pages on our domain. I know that you can create profile filters to exclude visits to pages or directories entirely, but is there a way to filter them only if they are landing pages? Any other creative thoughts? Thanks in advance!
Reporting & Analytics | ATIseo
-
Easiest way to get out of Google local results?
An odd one, this, but what's the easiest way to remove a website from the Google local listings? Would removing all the Google map listings do the job? A client of ours has been suffering massively since the Google update in the middle of last month. Previously they would appear no. 1 or no. 2 in the local results, and normally no. 1 or no. 2 in the organic results. However, since the middle of last month, any time they rank on the first page for a local result, their organic result has dropped massively, to at least page 4. If I set my location in Google as somewhere different, say 100 miles away, they rank well in the organic listings (obviously not appearing for local searches). When I change it back to my current location, the organic listing is gone and they are back to ranking in the local results. Since the middle of July, traffic from search engines has dropped about 65%. All the organic rankings remain as strong as ever, just not in the areas where they want to get customers from! The idea is to remove the local listing and get the organic results ranking again, as the CTR on those is much, much higher. On a side note, has anyone else noticed very poor CTR on Google local listings? Thanks
Reporting & Analytics | ccgale
-
Google Analytics Best Practice Setup for Clients
Hi, when setting up new Google Analytics accounts for clients, what is the preferred best practice? At present we have our own company Google account and add new clients to it; the disadvantage is that we can only grant them limited account access, as otherwise they would be able to view all the accounts we created. Plus, we can't link their AdWords to the GA account we created for them. Is it better practice to set the client up with their own Google account and then simply link to their account? Advice would be appreciated, thank you.
Reporting & Analytics | daracreative