Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Unsolved The Moz.com bot is overloading my server
-
How to solve it?
-
Maybe a crawl delay will help.
-
@paulavervo
Hi,
We do! The best way to chat with us is via our contact form or direct email. We also have chat within Moz Pro.
Please contact us via help@moz.com or https://moz.com/help/contact
We will be happy to help.
Cheers,
Kerry.
-
Very nice, brother, I like it. Very good, keep it up!
-
Very nice!
-
Does the Moz team even monitor this forum?
-
If the Moz.com bot is overloading your server, there are several steps you can take to manage and mitigate the issue:
1. Adjust the crawl rate in your robots.txt file. Add a crawl delay for the Moz crawlers using the directives User-agent: rogerbot and User-agent: dotbot, each followed by Crawl-delay: 10, which asks the bot to wait 10 seconds between requests. If that does not suffice, you can temporarily block the bot by disallowing it in your robots.txt file (a sketch of both is shown below).
2. Contact Moz’s support team and explain the issue; they may be able to adjust the crawl rate for your site.
3. Implement server-side rate limiting. On Apache, you can add rules to your .htaccess file that return a 429 Too Many Requests status code to the Moz bots; on Nginx, you can set up rate limiting in your configuration file to cap the number of requests per second from a single IP address (examples of both are sketched below).
4. Monitor your server’s performance and log files to identify crawl patterns or peak times, so you can fine-tune these settings (a quick log check is sketched below).
5. Use a Content Delivery Network (CDN) to distribute the load: it caches content and serves it from multiple locations, reducing the direct impact crawlers have on your origin server.
By taking these steps, you can manage the load from the Moz.com bot and keep your server stable and responsive.
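A minimal robots.txt sketch for step 1. Rogerbot and Dotbot are Moz's crawlers; the 10-second delay and the blanket Disallow are placeholder values to tune, and Crawl-delay support can change over time, so confirm current behaviour in Moz's documentation:

# Ask Moz's crawlers to wait 10 seconds between requests
User-agent: rogerbot
Crawl-delay: 10

User-agent: dotbot
Crawl-delay: 10

# Or, to temporarily block them entirely, replace the lines above with:
# User-agent: rogerbot
# Disallow: /
#
# User-agent: dotbot
# Disallow: /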
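For the Apache option in step 3, a sketch of .htaccess rules that answer Moz's crawlers with a 429 status. It assumes mod_rewrite is enabled and Apache 2.4+ (which accepts non-redirect status codes in the R flag); treat it as a starting point rather than a drop-in rule set:

RewriteEngine On
# Match requests whose User-Agent identifies Moz's crawlers
RewriteCond %{HTTP_USER_AGENT} (rogerbot|dotbot) [NC]
# Answer with 429 Too Many Requests instead of serving the page
RewriteRule .* - [R=429,L]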
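For the Nginx option, a sketch using the standard limit_req module; the zone name, rate, and burst values are illustrative and would need tuning for your traffic:

# In the http block: track clients by IP in a 10 MB zone, allow roughly 1 request per second each
limit_req_zone $binary_remote_addr zone=crawlers:10m rate=1r/s;

server {
    location / {
        # Queue short bursts of up to 5 requests, reject the rest
        limit_req zone=crawlers burst=5 nodelay;
        # Respond with 429 instead of the default 503 when the limit is hit
        limit_req_status 429;
    }
}

Note that this limits every client at that rate, not just the Moz bots; if you only want to slow the crawlers, combine it with a map on the user agent or rely on the robots.txt approach above.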
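For step 4, a quick way to gauge how hard Moz's crawlers are hitting you is to count their requests in your access log; the log path below is a common Apache default and may differ on your server:

# Count access-log lines from Moz's crawlers (case-insensitive match on the user agent)
grep -ciE "rogerbot|dotbot" /var/log/apache2/access.log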
Related Questions
-
Rogerbot directives in robots.txt
I feel like I spend a lot of time setting false positives in my reports to ignore. Can I prevent Rogerbot from crawling pages I don't care about with robots.txt directives? For example, I have some page types with meta noindex, and it reports these to me. Theoretically, I can block Rogerbot from these with a robots.txt directive and not have to deal with false positives.
Reporting & Analytics | awilliams_kingston -
Unsolved Need Moz SEO WordPress Plugin With API
Re: Moz WordPress Plugin? Hi guys, I need some Moz SEO WordPress plugins for my website, working with the Moz API. I've already found the Moz DA-PA Checker plugin, but I need SEO plugins too. Any suggestion will be appreciated.
Moz Pro | mrezair -
How long does it take for Moz Pro to find backlinks and increase my score?
Hi, I'm running a website for tourism & travel called Visit Guide or دليل السفر, and I have many backlinks from many websites (DoFollow & NoFollow). They are older than 10 days and Moz has not discovered or found them yet. How do I push Moz Pro to find my backlinks? Or, in other words, how do I speed up the indexing of my backlinks?
Moz Pro | VisitDotGuide -
Competitor getting External Links from search.aol.com
Recently, I noticed that one of the competitors I track within my Moz campaign received about 12 new inbound links. As a result, their DA jumped about 10 points. I reviewed these new external links and was surprised to see that they are all "search.aol.com/aol/search?query= ..." with link anchor text that is good for the industry we compete in. Can anyone tell me why these are being counted as "Inbound Links"? It just doesn't seem right. Is this some sort of black hat SEO tactic?
Moz Pro | itvisionsinc -
I've been using Moz for just a minute now. I used it to check my website and found quite a number of errors. Unfortunately I use a WordPress website, and even with the tips, I still don't know how to fix the issues.
I've seen quite a number of errors on my website hipmack.co, a WordPress website, and I don't know how to begin clearing the index errors, or any others for that matter. Can you help me please? (attachment: ghg-1.jpg)
Moz Pro | Dogara -
Domain.com and domain.com/index.html duplicate content in reports even with rewrite on
I have a site that was recently hit by the Google penguin update and dropped a page back. When running the site through seomoz tools, I keep getting duplicate content in the reports for domain.com and domain.com/index.html, even though I have a 301 rewrite condition. When I test the site, domain.com/index.html redirects to domain.com for all directories and root. I don't understand how my index page can still get flagged as duplicate content. I also have a redirect from domain.com to www.domain.com. Is there anything else I need to do or add to my htaccess file? Appreciate any clarification on this.
Moz Pro | anthonytjm