The use of robots.txt
Thank you, Martijn. That helps indeed.
Hi Daniela,
I can confirm that it won't be a problem if you don't have a robots.txt file, as long as you don't want to block any pages. Personally, I find it more useful to still have a robots.txt file in place that allows search engines to crawl the complete site, but that's just my personal opinion.
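For example, a minimal robots.txt that explicitly allows every crawler to access the complete site could look like this (an empty Disallow value means nothing is blocked):

User-agent: *
Disallow:

A missing file and this empty rule behave the same way for crawlers; the explicit file mainly spares your logs the 404 errors that requests for /robots.txt would otherwise generate.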
Hope this helps!
Related Questions
Is a 301 redirect the only way when using vanity URLs?
We have been using vanity URLs for some of our pages, mostly pages whose original URLs are quite long. The problem is that the vanity URL is what gets displayed in search results when a keyword related to the page is entered. I checked Google Search Console: the vanity URL is indexed and the original URL remains unindexed. What should I do? Is adding a 301 redirect to the vanity URLs a solution? Currently, some of the vanity URLs do not redirect to the original pages, and some of the original pages are not getting traffic. Also, can using a canonical tag help?
Technical SEO | tejasbansode
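For illustration, a permanent redirect from a vanity URL to the original page is a one-line rule on an Apache server; the vanity path, domain, and target below are hypothetical:

# In .htaccess at the site root -- hypothetical vanity path and target
Redirect 301 /deal https://www.example.com/products/very-long-descriptive-product-url

If a vanity URL must keep resolving on its own instead of redirecting, a rel=canonical on it pointing at the original URL is the usual alternative signal for which version Google should index.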
Role of robots.txt and Search Console parameter settings
Hi, wondering if anyone can point me to resources or explain the difference between these two. If a site has URL parameters disallowed in robots.txt, is it redundant to set the Search Console parameter settings to anything other than "Let Googlebot Decide"?
Technical SEO | LivDetrick
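For reference, a robots.txt rule that blocks crawling of any URL containing a given query parameter could look like this; the parameter name is a placeholder:

# Block any URL whose query string contains sessionid -- placeholder name
User-agent: *
Disallow: /*?*sessionid=

The difference matters: robots.txt is a hard crawling block, while the Search Console parameter settings are only hints about which parameter variants Googlebot may skip, so the two are not strictly redundant.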
Robots.txt: How to block a specific file type in several subdirectories?
Hello everyone! I need help setting up a robots.txt file. I'm trying to block all PDF files in particular directories. Google's documentation gives this example for blocking files of a specific type (for example, .gif) across the entire site:

Disallow: /*.gif$

Two questions: Can I use this command to target one particular directory in which I want to block PDF files, and will the following line be recognized by Googlebot?

Disallow: /fileadmin/xxxxxxx/xxx/xxxxxxx/*.pdf$

Then I realized that I would have to write as many lines as there are directories in which I want to block PDF files. Let's say I want to block PDF files in all three of these directories:

/fileadmin/directory1
/fileadmin/directory1/sub1
/fileadmin/directory1/sub1/pdf

Is there a pattern-matching rule I could use to block access to PDF files in all subdirectories, instead of writing the line above once per subdirectory? For example:

Disallow: /fileadmin/directory1*/

Many thanks in advance for any insight you may have.
Technical SEO | LabeliumUSA
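For what it's worth, Google's robots.txt wildcard * matches any sequence of characters, including slashes, so a single rule anchored at the parent directory should cover all of its subdirectories; a sketch based on the directories in the question:

User-agent: *
# Matches /fileadmin/directory1/file.pdf as well as
# /fileadmin/directory1/sub1/pdf/file.pdf
Disallow: /fileadmin/directory1/*.pdf$

The trailing $ anchors the pattern to the end of the URL, so only paths that actually end in .pdf are blocked, and nothing else in those directories is affected.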
Using keywords that contain another keyword for another link
Hi, I have these two links, A1 & A2, and the keywords for them are "pest control" for A1 and "pest control service" for A2. Is Google smart enough to differentiate these two and rank the exact page for each accordingly, or does Google match the "pest control" keyword against the A2 link as well? Please help me with this issue. The same question applies to "termite inspection" & "termite inspections Arizona"! Many thanks, Shervin
Technical SEO | Shervin
Should I use canonicals? Best practice?
Hi there, I've been working on a pretty dated site. The product pages have tabs that separate the product information, e.g., a tab for specifications, a tab for system essentials, and an overview tab that is actually just a copy of the product page. Each tab is actually a link to a completely separate page, so product/main-page is split into product/main-page/specs, product/main-page/resources, etc. Would canonicals be appropriate in this situation? The information isn't necessarily duplicated (except for the overview tabs), but with each tab as a separate page, I would imagine that's diluting the value of the main page. The information all belongs to the main page; shouldn't each tab page be saying "I'm a version of the main page"?
Technical SEO | anneoaks
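For illustration, a canonical link in the head of each tab page would say exactly that; a sketch using the hypothetical paths from the question:

<!-- placed in the <head> of /product/main-page/specs, /product/main-page/resources, etc. -->
<link rel="canonical" href="https://www.example.com/product/main-page" />

The trade-off is that search engines are asked to consolidate ranking signals on the main page, so the individual tab pages give up the chance to rank on their own, which is usually acceptable when they only make sense in the product's context.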
Using 302 redirects for SEO
Hello, I'm in charge of SEO for an information website whose articles are only accessible with a login and password. Most of the natural links we get point to our subscribers' subdomain: subscribers.mywebsite.com/article1. Visitors who follow these links while not logged in are redirected (302) to www.mywebsite.com/article1, which shows an extract of the article and lets them request a free test subscription to read the rest. My goal is to optimize SEO for the www.mywebsite.com/article1 page. Does this page benefit from the links pointing to subscribers.mywebsite.com/article1, or are these links lost in terms of SEO? Thanks for your help, Sylvain
Technical SEO | Syl20
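For context, the mechanical difference between a temporary and a permanent redirect is a single flag; a hypothetical Apache sketch of the setup described, where the cookie check standing in for the login test is an assumption:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^subscribers\.mywebsite\.com$ [NC]
# Visitors without a session cookie are treated as logged out -- assumed cookie name
RewriteCond %{HTTP_COOKIE} !session= [NC]
RewriteRule ^(.*)$ https://www.mywebsite.com/$1 [R=302,L]

Google has long been understood to consolidate signals less reliably through a 302 than through a 301, so if the extract page is the permanent destination for logged-out visitors, switching the flag to R=301 makes that intent explicit.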
Robots.txt for subdomain
Hi there Mozzers! I have a subdomain with duplicate content and I'd like to remove these pages from the mighty Google index. The problem is: the website is built in Drupal and this subdomain does not have its own robots.txt. So I want to ask you how to disallow and noindex this subdomain. Is it possible to add this to the root robots.txt?

User-agent: *
Disallow: /subdomain.root.nl/

User-agent: Googlebot
Noindex: /subdomain.root.nl/

Thank you in advance!
Technical SEO | Partouter
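For what it's worth, a robots.txt only applies to the exact host it is served from, so a path-style rule in the root domain's file would have no effect on the subdomain, and Noindex was never an officially supported robots.txt directive. The subdomain needs its own file answered at subdomain.root.nl/robots.txt; a hypothetical Apache rewrite that serves a separate file there without touching Drupal (the file name is an assumption):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^subdomain\.root\.nl$ [NC]
# Serve a dedicated robots file for this host only -- robots_subdomain.txt is an assumed name
RewriteRule ^robots\.txt$ /robots_subdomain.txt [L]

Inside robots_subdomain.txt, "User-agent: *" followed by "Disallow: /" blocks crawling, while actually removing already-indexed pages is better handled with a noindex meta tag or X-Robots-Tag header on the pages themselves.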
Is anyone using Media Temple?
I'm looking to move 5 of my sites from Hostgator's shared servers to Media Temple's dedicated virtual servers. Anyone have experience with (mt)? I'm planning on adding a few more sites this year and several things they offer are attractive to me:

A (virtually) dedicated environment: faster websites, better user experience, plus I like having some control over my site's resources
Scalability: I can add more resources easily (although not super cheap)
Unique control panels for each site: more control for my tech-savvy clients
Unique IPs for $1 a month: more link juice between my related sites

$50/month is a big jump from my $12/month Hostgator account but I'm thinking it will be worth it. Am I on the right track or is this a fool's errand?
Technical SEO | AaronParrish