Help - we're blocking SEOmoz crawlers
-
We have a fairly stringent blacklist and, by the looks of our crawl reports, we've begun unintentionally blocking the SEOmoz crawler.
Can you guys let me know the user-agent string and anything else I need to enable to make sure your crawlers are whitelisted?
Cheers!
-
Hi Keri,
Still testing, though I see no reason why this shouldn't work, so I will close the QA ticket.
Cheers!
-
Hi! Did this work for you, or would you like our help team to lend a hand?
-
We maintain a blacklist of crawlers (and others) to control server load, so I'm just looking for the user-agent string I can add to the whitelist. This one should do the trick:
Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)
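For anyone implementing this kind of whitelist, the check itself can be a case-insensitive token match on the User-Agent header. A minimal Python sketch - the pattern list and function name are illustrative, not the poster's actual filter:

```python
import re

# Illustrative whitelist: any request whose User-Agent contains one of these
# tokens is allowed through the blacklist. "rogerbot" covers the string above.
WHITELIST_PATTERNS = [
    re.compile(r"rogerbot", re.IGNORECASE),
    re.compile(r"Googlebot"),
]

def is_whitelisted(user_agent: str) -> bool:
    """Return True if the User-Agent matches a whitelisted crawler token."""
    return any(p.search(user_agent) for p in WHITELIST_PATTERNS)

ua = "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
print(is_whitelisted(ua))                 # True -- "rogerBot" matches case-insensitively
print(is_whitelisted("EvilScraper/2.0"))  # False
```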
-
Still way too early for me ;-). I block specific robots rather than excluding all but a few.
I have not tried the following (but think/hope it will work) - this should block all robots but allow SeoMoz and Google:
User-agent: *
Disallow: /

User-agent: rogerbot
Disallow:

User-agent: Google
Disallow:
You would already have something like this in your robots.txt (unless your block occurs on a network/firewall level).
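Gerd's intent - block everyone by default, then allow rogerbot and Google via empty Disallow lines - can be sanity-checked before deploying with Python's built-in robots.txt parser. A sketch (example.com is a placeholder):

```python
from urllib import robotparser

# The rules from the reply above, verbatim: a catch-all block plus two
# per-bot records whose empty Disallow means "allow everything".
rules = """\
User-agent: *
Disallow: /

User-agent: rogerbot
Disallow:

User-agent: Google
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("rogerbot", "http://example.com/page"))      # True
print(rp.can_fetch("SomeOtherBot", "http://example.com/page"))  # False
```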
-
Thanks Gerd, though it looks like your robots.txt example is a disallow rule, when I'm looking to let the crawler through.
I'll give this one a try: Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)
-
I have it as "rogerbot":
<code>User-agent: rogerbot
Disallow: /</code>
Access log: Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)
Related Questions
-
Blocking subdomains with Robots.txt file
We noticed that Google is indexing our pre-production site ibweb.prod.interstatebatteries.com in addition to indexing our main site interstatebatteries.com. Can you all help shed some light on the proper way to no-index our pre-prod site without impacting our live site?
Technical SEO | paulwatley0
-
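A common answer to the pre-production question is an X-Robots-Tag response header served only on the pre-prod host, since robots.txt alone won't de-index pages that are already in. A hypothetical sketch of the idea (the middleware shape is illustrative, not the asker's actual setup; host names come from the question):

```python
# Send a noindex header only on the pre-production host, leaving the live
# site's responses untouched.
PREPROD_HOSTS = {"ibweb.prod.interstatebatteries.com"}

def extra_headers(host: str) -> dict:
    """Extra response headers for a request with the given Host header."""
    if host.lower() in PREPROD_HOSTS:
        # X-Robots-Tag applies to every file type, unlike an HTML meta robots tag
        return {"X-Robots-Tag": "noindex, nofollow"}
    return {}

print(extra_headers("ibweb.prod.interstatebatteries.com"))
print(extra_headers("interstatebatteries.com"))  # {} -- live site unaffected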
Strange URLs indexed
Hi, I got the message "Increase in not found errors" (404 errors) in GWT for one of my websites. I did not change anything, but I now see a lot (~50) of "strange" URLs indexed, such as &ui=2&tf=1&shva=1 and &cat_id=6&tag_id=31&Remark=In%22%3E. Any suggestion on how to fix it? Erwan
Technical SEO | johnny1220
-
Case-sensitive URLs
Hi, really appreciate advice on this one in advance! We had a problem with case-sensitive URLs (e.g. /web-jobs vs /Web-jobs). We added code to convert all URLs into lowercase letters and added a 301 redirection. We are now experiencing problems with duplicate page content: each time a URL contains a capital letter, it is converted and redirected to the lowercase URL. I can convert all URLs into lowercase everywhere, but the problem now is that Google has already indexed the mixed-case URLs, so they may cause a duplicate-content issue. The solution:
1. Remove the 301 redirection added to convert URLs into lowercase.
2. Add a canonical URL tag pointing to the all-lowercase URL, so Google indexes content only from the canonical URL.
But I am a little confused about what will happen to the already-indexed pages with capitals in the URL. Appreciate any advice you can give! Simon
Technical SEO | simmo2350
-
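The lowercase-301 logic Simon describes reduces to a small pure function. A sketch (illustrative, not his actual code):

```python
def lowercase_redirect(path: str):
    """Return (status, path): 301 to the lowercase URL when the path has caps."""
    lowered = path.lower()
    if lowered != path:
        return 301, lowered  # permanent redirect consolidates the mixed-case duplicate
    return 200, path

print(lowercase_redirect("/Web-jobs"))  # (301, '/web-jobs')
print(lowercase_redirect("/web-jobs"))  # (200, '/web-jobs')
```

For what it's worth, the 301 and a canonical tag pointing at the lowercase URL are generally complementary rather than conflicting: as Google recrawls the already-indexed mixed-case URLs, the redirect consolidates them onto the lowercase versions.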
Rel Canonical? Please help again!
Hi, I have been looking at the on-page section and the grading, and I have noticed an error on nearly all of my pages: No More Than One Canonical URL Tag (Moderate fix). <dl> <dt>Number of Canonical tags</dt> <dd>2</dd> <dt>Explanation</dt> <dd>The canonical URL tag is meant to be employed only a single time on an individual URL (much like the title element or meta description). To ensure the search engines properly parse the canonical source, employ only a single version of this tag.</dd> <dt>Recommendation</dt> <dd>Remove all but a single canonical URL tag</dd> </dl> Please, how do I make sure these canonicals are working properly? My rankings are getting worse for long-tail and short-tail keywords. I am not even ranking for the main keyword "Probate" at all now! Our site is probate, we sell probate, we talk about probate, and now we are out of the top 200??? http://www.finalduties.co.uk Kind Regards, Elissa Hayes
Technical SEO | Chris__Chris0
-
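The check behind that error is just counting `<link rel="canonical">` tags per page, so a fix can be verified directly. A sketch with Python's stdlib HTML parser (the sample markup is invented for illustration):

```python
from html.parser import HTMLParser

class CanonicalCounter(HTMLParser):
    """Count <link rel="canonical"> tags -- the condition the report flags."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "link" and dict(attrs).get("rel", "").lower() == "canonical":
            self.count += 1

# Made-up markup exhibiting the duplicate-canonical problem.
html = (
    '<head>'
    '<link rel="canonical" href="http://www.finalduties.co.uk/probate/">'
    '<link rel="canonical" href="http://www.finalduties.co.uk/">'
    '</head>'
)
counter = CanonicalCounter()
counter.feed(html)
print(counter.count)  # 2 -- more than one canonical tag on the page
```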
So I created a site for the purpose of testing SEOMOZ
The site is built in WordPress and only has 1 post and no other pages. Nonetheless, SEOmoz tells me I have several duplicate pages. How do I fix this in WordPress?
Permission Marketing Dentistry | http://permissionmarketingdentistry.com | 2 1 0
Permission Marketing Dentistry | http://permissionmarketingdentistry.com/ | 2 1 0
admin | Permission Marketing Dentistry | http://permissionmarketingdentistry.com/author/admin/ | 3 1 0
Uncategorized | Permission Marketing Dentistry | http://permissionmarketingdentistry.com/category/uncategorized/
Technical SEO | dad7more0
-
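Those entries are the same page reachable at several URLs (with and without the trailing slash, plus the author and category archive pages). In WordPress the usual fixes are canonical tags and/or noindexing archives; purely to illustrate why the crawler counts them as duplicates, here is a stdlib sketch that collapses trailing-slash variants to one key:

```python
from urllib.parse import urlsplit

def canonical_key(url: str) -> str:
    """Collapse trailing-slash variants of a URL to a single comparison key."""
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"
    return f"{parts.scheme}://{parts.netloc}{path}"

a = "http://permissionmarketingdentistry.com"
b = "http://permissionmarketingdentistry.com/"
print(canonical_key(a) == canonical_key(b))  # True -- one page, two URLs
```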
Does redirect of domain alias help rankings?
Yes... It iz I again ;o) Here's one for you savvy techies out there: So, I've got a primary domain which is live, optimized and running smooooth. And then I've got a couple of misspelled domains as well (17, to be exact). Will it have an effect if I 301 those misspelled domains? What's best practice for several domain aliases? Example:
Technical SEO | nosuchagency
Primary domain: bryghusprojektet.dk
Alias domain 1: bryghusprojekt.dk (301 redirects to primary domain)
Alias domain 2: bryghus-projekt.dk (hosting company infopage)
Alias domain 3: bryghus-projekter.dk (not activated)
Regards.
-
SEOMoz is indicating I have 40 pages with duplicate content, yet it doesn't list the URLs of the pages???
When I look at the Errors and Warnings on my Campaign Overview, I have a lot of "duplicate content" errors. When I view the errors/warnings, SEOmoz indicates the number of pages with duplicate content, yet when I go to view them, the subsequent page says no pages were found... Any ideas are greatly welcomed! Thanks, Marty K.
Technical SEO | MartinKlausmeier0
-
How Best to Handle 'Site Jacking' (Unauthorized Use of Someone Else's Dedicated IP Address)
Anyone can point their domain to any IP address they want. I've found at least two domains (same owner, totally unrelated to each other and to us) that are currently pointing to our IP address. The IP address is on our dedicated server (we control the entire physical server) and is exclusive to only that one domain, so it isn't a virtual-hosting misconfiguration issue. This has caused Google to index their two domains with duplicate content from our site (found by searching for site:www.theirdomain.com). Their site does not come up in the first 50 results for any of the keywords we rank for, so Google obviously knows THEY are the duplicate content, not us (our site has been around for 12 years, much longer than theirs). Their registration is private and we have not been able to contact these people.
I'm not sure if this is just a DNS mistake for the two domains or someone doing this intentionally to try to harm our ranking. It has been going on for a while, so it is most likely not a mistake: for two live sites, they would have noticed long ago that they were pointing to the wrong IP. I can think of a variety of actions to take, but I can find no information anywhere on what Google officially recommends in this situation, assuming you can't get a response. Here are my ideas:
a) Approach it as a digital copyright violation and go through the lengthy process of having their site taken down. Pro: eliminates the issue. Con: sort of a pain, and we could be leaving some link juice on the table?
b) Modify .htaccess to 301 redirect any URL not using our domain to our domain. This means Google will see several domains all pointing to the same IP, with every domain except ours 301 redirecting to ours. Not sure if THAT will harm (or help) us? Would we not then receive link juice from any site out there linking to these other domains? Con: Google will see the context of the backlinks, and their link text will not be related at all to our site. In addition, if any of these other domains pointing to our IP have backlinks from 'bad neighborhoods', I assume it could hurt us?
c) Modify .htaccess to return a 404 Not Found or 403 Forbidden error?
I posted in other forums and have gotten suggestions that are all over the map. In many cases the posters don't even understand what I'm talking about, thinking these are just normal backlinks. Argh! So I'm taking this to "The Experts" on SEOMoz.
Technical SEO | jcrist1
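Option (b) amounts to redirecting on the Host header. The decision logic can be sketched in a few lines of Python (hypothetical: the real rule would be a RewriteCond on %{HTTP_HOST} in .htaccess, and www.ourdomain.example is a placeholder since the thread doesn't name the site):

```python
CANONICAL_HOST = "www.ourdomain.example"  # placeholder for the actual domain

def handle_host(host: str, path: str):
    """301 any request whose Host header isn't our canonical domain."""
    if host.lower() != CANONICAL_HOST:
        return 301, {"Location": f"http://{CANONICAL_HOST}{path}"}
    return 200, {}

print(handle_host("www.theirdomain.com", "/page"))   # 301 to our domain
print(handle_host("www.ourdomain.example", "/page"))  # (200, {})
```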