Blocking pages from Moz and Alexa robots
-
Hello,
We want to block all pages in this directory from Moz and Alexa robots - /slabinventory/search/
Here is an example page - https://www.msisurfaces.com/slabinventory/search/granite/giallo-fiesta/los-angeles-slabs/msi/
Let me know if this is a valid disallow for what I'm trying to do.
User-agent: ia_archiver
Disallow: /slabinventory/search/*

User-agent: rogerbot
Disallow: /slabinventory/search/*

Thanks.
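For what it's worth, you can also sanity-check rules like these offline. One caveat: Python's standard-library parser (urllib.robotparser) does plain prefix matching and ignores the * wildcard, so this quick sketch instead translates each Disallow pattern into a regex by hand, following Google's documented wildcard semantics (* matches any run of characters, a trailing $ anchors the end). It's a rough check, not a compliant parser:

```python
import re

def robots_pattern_to_regex(pattern):
    """Translate a Google-style robots.txt path pattern into a regex.

    '*' matches any run of characters; a trailing '$' anchors the match
    at the end of the URL path. Otherwise matching is a prefix match.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore '*' as '.*'
    body = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + body + ("$" if anchored else ""))

def is_disallowed(path, disallow_patterns):
    """True if any Disallow pattern matches the URL path."""
    return any(robots_pattern_to_regex(p).match(path) for p in disallow_patterns)

# The rule proposed above (the same one is used for ia_archiver and rogerbot):
rules = ["/slabinventory/search/*"]

# The example page from the question is blocked:
print(is_disallowed(
    "/slabinventory/search/granite/giallo-fiesta/los-angeles-slabs/msi/", rules))
# ...while pages outside /slabinventory/search/ are not:
print(is_disallowed("/slabinventory/", rules))
```

Keep in mind real crawlers also evaluate Allow lines and longest-match precedence, which this sketch skips.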
-
Hi,
Firstly, yes, that robots.txt is valid and would work for your purpose.
There's a great tool (https://technicalseo.com/tools/robots-txt/) that lets you paste in your proposed robots.txt contents, enter the URL you want to test, and even pick the robot to test against; it then tells you the result.
-
That looks valid to me. It's possible you don't need the "*" at the end of each rule, but I can't see it doing any harm either.
I might go more like:
User-agent: ia_archiver
Disallow: /*/search/User-agent: rogerbot
Disallow: /*/search/^ this would stop all search URLs being indexed, so even if you introduced new search facilities later in other directories - they would 'probably' be caught too (assuming that is your intention, assuming they were still in /search/ subdirs)
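To see how that broader pattern behaves, here's a hypothetical sketch using the same hand-rolled Google-style wildcard matching (the stdlib urllib.robotparser ignores *, and the /tileinventory/ path below is invented purely for illustration):

```python
import re

def matches(pattern, path):
    """Prefix-match a robots.txt path pattern, treating '*' Google-style."""
    regex = "^" + re.escape(pattern).replace(r"\*", ".*")
    return re.match(regex, path) is not None

pattern = "/*/search/"  # the broader Disallow rule suggested above

# Catches the existing search directory:
print(matches(pattern, "/slabinventory/search/granite/giallo-fiesta/"))
# ...and would also catch a future one (invented path):
print(matches(pattern, "/tileinventory/search/marble/"))
# But a top-level /search/ would NOT be caught, because the pattern
# requires a directory segment (the two slashes) before "search":
print(matches(pattern, "/search/marble/"))
```

So if you ever moved search to the site root, you'd need an extra `Disallow: /search/` line as well.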
Don't think what you have done is wrong though.
Always check using Google's robots.txt tester to be safe. Just put your rules into the tester (altering them to apply to all user-agents) and try out some different URL patterns. When it works as you like, update your real robots.txt file (remembering, of course, to restore your rogerbot / ia_archiver user-agent targeting if you don't want the rules to also apply to Google!).
Related Questions
-
Blog Page Titles - Page 1, Page 2 etc.
Hi All, I have a couple of crawl errors coming up in Moz that I am trying to fix. They are duplicate page title issues with my blog area. For example, we have a URL of www.ourwebsite.com/blog/page/1, and as we have quite a few blog posts they get put onto another page, e.g. www.ourwebsite.com/blog/page/2. Both of these URLs have the same heading, title, meta description etc. I was just wondering if this was an actual SEO problem or not, and if there is a way to fix it. I am using Wordpress for reference, but I can't see anywhere to access the settings of these pages. Thanks
Technical SEO | O2C0
-
Page Juice not moving???
Moved URLs from ldnwicklesscandles.com to ldnwicklesscandles.co.uk because I wanted to rank better for the UK where I'm located, and thought the .co.uk may have been giving my competitors the advantage. I use Squarespace 7 (transferred over from SS5); they told me to set the primary domain to .co.uk and I've done it. I've also done a 301 redirect and a change of address in Webmaster Tools, although I'm not sure if all of this is needed? Squarespace seem to think just setting the primary domain is enough. My question is: it's been a couple of weeks, I've resubmitted to Google Webmaster Tools to try to speed things up, and the new URL is appearing in Google, but none of my page juice seems to be transferring yet. How long will it take? I know not all the juice will move over, but my PA/DA is non-existent now and I have no idea if I'm just being impatient or I've done something wrong here. Not a pro, just a small biz owner here, so forgive me if this has been asked before.
Technical SEO | ldnwickless0
-
Pages extensions
Hi guys, We're in the process of moving one of our sites to a newer version of the CMS. The new version doesn't support page extensions (.aspx) but we'll keep them for all existing pages (about 8,000) to avoid redirects. The technical team is wondering about the new pages - does it make any difference if the new pages are without extensions, except for usability? Thanks!
Technical SEO | lgrozeva0
-
Translating Page Titles & Page Descriptions
I am working on a site that will be published in the original English, with localized versions in French, Spanish, Japanese and Chinese. All the versions will use the English information architecture. As part of the process, we will be translating the page titles and page descriptions. Translation quality will be outstanding. The client is a translation company. Each version will get at least four pairs of eyes, including expert translators, editors, QA experts and proofreaders. My question is what special SEO instructions should be issued to translators re: the page titles and page descriptions. (We have to presume the translators know nothing about SEO.) I was thinking of:
- stick to the character counts for titles and descriptions
- make sure the title and description work together
- avoid over-repetition of keywords in page titles (over-optimization peril)
- think of the descriptions as marketing copy
- try to repeat some title phrases in the description (to get the bolding and promote click-through)
That's the micro stuff. The macro stuff: we haven't done extensive keyword research for the other languages. Most of the clients are in the US. The other language versions are more a demo of translation ability than looking for clients elsewhere. Are we missing something big here?
Technical SEO | DanielFreedman0
-
I accidentally blocked Google with Robots.txt. What next?
Last week I uploaded my site and forgot to remove the robots.txt file with this text: User-agent: * Disallow: / I dropped from page 11 on my main keywords to past page 50. I caught it 2-3 days later and have now fixed it. I re-imported my site map with Webmaster Tools and I also did a Fetch as Google through Webmaster Tools. I tweeted out my URL to hopefully get Google to crawl it faster too. Webmaster Tools no longer says that the site is experiencing outages, but when I look at my blocked URLs it still says 249 are blocked. That's actually gone up since I made the fix. In the Google search results, it still no longer has my page title and the description still says "A description for this result is not available because of this site's robots.txt – learn more." How will this affect me long-term? When will I recover my rankings? Is there anything else I can do? Thanks for your input! www.decalsforthewall.com
Technical SEO | Webmaster1230
-
Is it better to delete web pages that I don't want anymore or should I 301 redirect all of the pages I delete to the homepage or another live page?
Is it better for SEO to delete web pages that I don't want anymore or should I 301 redirect all of the pages I delete to the homepage or another live page?
Technical SEO | CustomOnlineMarketing0
-
Page Over-optimized?
I read over this post on the blog tonight: http://www.seomoz.org/blog/lessons-learned-by-an-over-optimizer-14730 & it's got me concerned that I might be having a similar issue on our site? Back in March & April of last year, we ranked fairly well for a number of long tail keywords, here is one in particular 'Mio Drink' for this page: http://www.discountqueens.com/free-mio-drink-from-kraft-facebook-offer The page is still indexed, but appears back on page #3 for the search term. During this time we had made a number of different updates to our site & I can't seem to put an exact finger on what might have caused the problem? Can anyone see any issues that might have caused this to drop? Thanks, BJ
Technical SEO | seointern0
-
Does page speed affect what pages are in the index?
We have around 1.3m total pages. Google currently crawls on average 87k a day and our average page load is 1.7 seconds. Out of those 1.3m pages (1.2m being "spun up"), Google has only indexed around 368k, and our SEO person is telling us that if we speed up the pages they will crawl the pages more and thus will index more of them. I personally don't believe this. At 87k pages a day Google has crawled our entire site in 2 weeks, so they should have all of our pages in their DB by now, and I think they are not indexed because they are poorly generated pages and it has nothing to do with the speed of the pages. Am I correct? Would speeding up the pages make Google crawl them faster and thus get more pages indexed?
Technical SEO | upper2bits0