Is robots.txt a must-have for a 150-page, well-structured site?
-
Looking at my logs, I see dozens of 404 errors each day from different bots trying to load robots.txt. I have a small site (150 pages) with clean navigation that lets the bots index the whole site (which they are doing). There are no secret areas I don't want the bots to find – the private areas are behind a login, so the bots won't see them anyway.
I have used rel="nofollow" on the internal links that point to my login page.
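(That is, the links look something like <a href="/login" rel="nofollow">Log in</a>, with /login standing in for my actual login URL.)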
Is there any reason to include a generic robots.txt file that contains "User-agent: *"?
I have a minor reason: to stop getting the 404 errors and clean up my error logs so I can find other issues that may exist. But I'm wondering: is not having a robots.txt file the same as having a default blank file (or a 1-line file giving all bots all access)?
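For what it's worth, I gather the conventional "allow everything" version is actually two lines, with an empty Disallow line meaning "nothing is disallowed":

User-agent: *
Disallow: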
-
Thanks, Keri. No, it's a hand-built blog. No CMS.
I think Googlebot is doing a good job of indexing my site. The site is small, and when I search for my content I do find it in Google. I was pretty sure Google worked the way you describe. So it sounds like sitemaps are an optional hint, and perhaps not needed for relatively small sites (a couple hundred pages of well-linked content). Thanks.
-
The phrase "blog entries" makes me ask: are you on a CMS like WordPress, or are the blog entries pages you're creating from scratch?
If you're on WP or another CMS, you'll want a robots.txt so that your admin, plugin, and other system directories aren't indexed. On the plus side, WP (and other CMSs) have plugins that will generate a sitemap.xml file for you as you add pages.
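For a default WordPress install, that usually means something along these lines (the exact paths depend on your setup):

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/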
Google will find pages even if you don't have a sitemap, or forget to add them to it. The sitemap is a way to let Google know what is out there, but it (a) isn't required for Google to index a page and (b) won't force Google to index a page.
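If you do create one, a sitemap.xml is just a list of URLs in a standard XML wrapper – something like this, where the URL and date are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/blog/my-first-post</loc>
    <lastmod>2011-06-15</lastmod>
  </url>
</urlset>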
-
Thanks, Keith. Makes sense.
So how important is an XML sitemap for a 150-page site with clean navigation? As near as I can tell (from the site: command), my whole site is already being indexed by Google. Does a sitemap buy me anything? And what happens if my sitemap is partial – ie if I forget to add new pages to it but do link to them from my other indexed pages, will the new pages still get indexed? I'm a little worried about sitemap maintenance as I add new blog entries and so on...
-
Hi Mike...
I am sure that you are always going to get a range of opinions on this kind of question.
I think that for your site the answer may simply be that having a robots.txt file is a "belt and braces", safe-harbour type of thing – the same goes for, say, the keywords meta tag. Many say these pieces of code are of marginal value, but when you are competing head to head for a #1 listing (ie 35%+ of the clicks), you should use every option and weapon available. Furthermore, if your site is likely to grow significantly, or to eventually have content/files that you want excluded, it's simply tidier to have had the file in place over time.
Also, don't forget that best practice for robots.txt is to also include a pointer to your XML sitemap(s).
Here is an example from one of our sites...
User-agent: *
Disallow: /design_examples.xml
Disallow: /case_studies.xml

User-agent: Googlebot-Image
Disallow: /

Sitemap: http://www.sitetopleveldomain.com/sitemap.xml
In this example, two root files are specifically excluded from all bots, and the site has also blocked the Google Images bot entirely. It was getting a lot of traffic from image searches and then seeing the same copyrighted images turn up on a hundred junk sites – blocking the bot doesn't stop image scraping, but it certainly makes the images harder to find.
In relation to the “or 1-line file giving all bots all access” part of your question...
Some bots (most notably Google) now support an additional field called "Allow:".
As the name suggests, "Allow:" lets you explicitly permit crawling of specific files/folders, typically as exceptions within an otherwise disallowed area. However, this field is not part of the original robots.txt protocol and is not universally supported, so my suggestion would be to test it on your site for a week, as it might confuse some less intelligent crawlers.
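For example (the folder and file names here are hypothetical), the more specific Allow rule carves one file out of an otherwise blocked directory:

User-agent: *
Disallow: /images/
Allow: /images/press-logo.png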
So, in summary, my recommendation is to keep a simple robots.txt file, test whether the Allow: field works for you, and make sure the file points to your XML sitemap – although wearing a belt and braces might not be a good look, at least your pants are unlikely to fall down.
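Putting those pieces together, a minimal "belt and braces" file for a site like yours might be (swap in your own domain):

User-agent: *
Disallow:

Sitemap: http://www.yoursite.com/sitemap.xml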