Unsolved What would the exact text be for robots.txt to stop Moz crawling a subdomain?
-
I need Moz to stop crawling a subdomain of my site, and am just checking what the exact text should be in the file to do this.
I assume it would be:
User-agent: Moz
Disallow: /

But I'm just checking so I can tell the agency who will apply it, and avoid paying for their time if the text is incorrect!
Many thanks.
-
To disallow Moz from crawling a specific subdomain, you would need to add a robots.txt file to the root directory of that subdomain with the following content:
User-agent: rogerbot
Disallow: /

This will disallow Moz's web crawler, Rogerbot, from crawling any page or file within the subdomain. Keep in mind that this will only prevent Moz from crawling the subdomain - other search engines or bots may still be able to access it unless you add specific disallow rules for them as well.
-
@Simon-Plan To be clear: Disallow: / blocks everything, but only on the host serving that robots.txt. That is exactly what you want here, because each subdomain serves its own robots.txt - a Disallow: / placed at the subdomain's root only affects that subdomain, not the main site. The Disallow: /foo/ form is for blocking a subdirectory (example.com/foo/), not a subdomain (foo.example.com). Please see here for a reference to some relevant examples: https://searchfacts.com/robots-txt-allow-disallow-all/
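The subdirectory case can be checked the same way. A sketch, again using Python's standard-library parser with placeholder paths, shows that Disallow: /foo/ only blocks URLs under that path:

```python
from urllib import robotparser

# Blocking a subdirectory (not a subdomain) for all crawlers.
rules = """
User-agent: *
Disallow: /foo/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Pages under /foo/ are blocked; everything else stays crawlable.
print(rp.can_fetch("rogerbot", "https://example.com/foo/page"))  # False
print(rp.can_fetch("rogerbot", "https://example.com/bar/page"))  # True
```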