Two Robots.txt files
-
Hi there,
Can somebody please help me? One of my client's sites has two robots.txt files (please see below). One file blocks a few folders, and the other blocks all search engines completely. Our tech team tells me that for technical reasons they use the second one, which is placed inside the server where search engines are unable to see it.
www.example.co.uk/robots.txt - blocks a few folders
www.example.co.uk/Robots.txt - blocks all search engines
I hope someone can give me the help I need on this one.
Thanks in advance!
Cheers,
Satla
-
Thanks, Riera
-
Hi Satla,
You mentioned that one robots.txt file is placed inside the server and search engines are unable to see it. If search engines can't see a robots.txt file, then what is the use of that file?
AFAIK it must be placed in the root directory, and there is no way to keep two files with the same name in the same folder. So you should have only one robots.txt, and it should be placed in the root directory.
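For illustration, the two files you describe presumably contain something like this (the folder names here are just placeholders):

```text
# www.example.co.uk/robots.txt - blocks a few folders
User-agent: *
Disallow: /private/
Disallow: /tmp/

# www.example.co.uk/Robots.txt - blocks all search engines
User-agent: *
Disallow: /
```

Only whichever one the server actually returns at /robots.txt matters to crawlers.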
Hope this helps.
Thanks
-
Hi Satla,
You're going to need to get rid of that second version ASAP. The official standard for a robots.txt file is an all-lower-case file name, so that's most likely the one bots are seeing. But to err on the side of caution, I'd remove any possibility of a "Disallow: /" and delete that Robots.txt version.
Some servers are case-sensitive, so you could run into issues here as well.
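If you want to verify what a crawler would actually do with a given set of rules before deploying them, Python's standard library can parse them for you (a quick sketch; the folder names and URLs are just examples):

```python
from urllib.robotparser import RobotFileParser

# Rules like the "blocks a few folders" file described above
rules = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A blocked folder is disallowed, everything else stays crawlable
print(parser.can_fetch("*", "http://www.example.co.uk/private/page.html"))  # False
print(parser.can_fetch("*", "http://www.example.co.uk/index.html"))         # True
```

Swapping in a "Disallow: /" rule makes every URL return False, which is exactly the damage the second file would do.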
-
Hi,
I can't find any good reason why there are two files. There should be just one, where you specify everything you'd like done.
If the tech team doesn't want to correct it and leave just one file, maybe it's because they are lazy, or there might be some other issue, such as the whole site blowing up if they delete one file. Here, I leave you two Moz articles about the robots.txt file:
What is Robots.txt? - Moz Learn
Learn About Robots.txt with Interactive Examples - Moz Blog
Take into account that the name of the file must be in lower case. I've never seen it any different, and servers are usually case-sensitive with file names.
Hope it's helpful.
GR
Related Questions
-
Robots.txt Question for E-Commerce Sites
Hi All, I have a couple of e-commerce clients and have a question about URLs. When you perform a search on the website, all URLs contain a question mark, for example: /filter.aspx?search=blackout I'm not sure that I want these indexed. Could I be causing any harm/danger if I add this to the robots.txt file? /*? Any suggestions welcome! Gavin
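If you do go that route, Google and Bing support the `*` wildcard in robots.txt (it is not part of the original robots.txt standard, so other crawlers may ignore it), and the rule would look like this:

```text
User-agent: *
# Block any URL that contains a question mark
Disallow: /*?
```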
On-Page Optimization | IcanAgency
-
Two links to different page with same link label
What will be the impact in Google if I have two links on the same page pointing to different pages, but with the same label?
On-Page Optimization | kjerstibakke
-
.htaccess file uploaded, website won't load
I uploaded the .htaccess file with the below, and now my website won't load at all? Then I deleted the .htaccess file and it still won't load? But then it would load on my phone when I took it down, not on Chrome or Explorer? Then I put it back up and looked again on my phone, wouldn't load on phone. Then deleted the file and it still won't load on my phone? What is going on?
RewriteEngine on
RewriteCond %{HTTP_HOST} !^http://freightetc.com$
RewriteRule ^(.*)$ http://www.freightetc.com/$1 [R=301]
RewriteCond %{THE_REQUEST} ^.*/index.php
RewriteRule ^(.*)index.php$ http://www.freightetc.com/$1 [R=301]
On-Page Optimization | dwebb007
-
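For what it's worth, one likely culprit in the rules above: %{HTTP_HOST} contains only the host name (no "http://"), so the condition `!^http://freightetc.com$` is true for every request, including those already on www.freightetc.com, which makes the redirect fire forever in a loop. A sketch of what was probably intended (an assumption about the goal, so adjust for your setup):

```apache
RewriteEngine On
# HTTP_HOST holds just the host name, so match the bare domain only
RewriteCond %{HTTP_HOST} ^freightetc\.com$ [NC]
RewriteRule ^(.*)$ http://www.freightetc.com/$1 [R=301,L]
```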
Two sites, one with a ccTLD domain, the other with TLD domain, same content
Hi there! I have a site which can be accessed via two different domains: one ccTLD for Spain, www.piensapiensa.es, and one TLD, www.piensapiensa.com. Should I take care of anything regarding SEO? I also have a redirection from www.piensapiensa.com to piensapiensa.com. I have set them up in Webmaster Tools individually, with the same sitemap obviously. Thanks in advance.
On-Page Optimization | juanmiguelcr
-
Login webpage blocked by robots
Hi, the SEOmoz crawl diagnostics show that this page: www.tarifakitesurfcamp.com/wp-login.php is blocked (noindex, nofollow). Is there any problem with that?
On-Page Optimization | juanmiguelcr
-
How to optimize for a product by two names
So let us assume I am selling an item on my website and it comes in a large array of varieties. Let us also assume that this item is commonly referred to by two different names (i.e. Cover & Case, Car & Automobile, Notepad & Notebook). Both of the names used for this product have, for the sake of argument here, the exact same search volume. I want to make sure that I rank for both terms. In my title tags I am currently thinking about the following methodology to help that cause: "GE Motors Super Fast and Awesome Car / Automobile" "Ghostwriter Kids Notebook / Notepad" "Super Soft Pillow Cover / Case" Notice I have a space between the words and the /; my question is whether this is necessary or not. What is Google's policy on how they view that / ? Can I do this and still have Google see it as two different words? "GE Motors Super Fast and Awesome Car/Automobile" "Ghostwriter Kids Notebook/Notepad" "Super Soft Pillow Cover/Case" Apologies if this is a fairly basic question, but I cannot seem to find this information.
On-Page Optimization | DRSearchEngOpt
-
Right way to block google robots from ppc landing pages
What is the right way to completely block SEO robots from my AdWords landing pages? Robots.txt does not work very well for that, as far as I know. Adding noindex, nofollow meta tags, on the other hand, will block the AdWords robot as well, right? Thank you very much, Serge
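One approach worth knowing (per Google's documentation, though verify for your own account): AdsBot-Google ignores `User-agent: *` rules in robots.txt unless it is disallowed by name, so you can keep organic crawlers out of the landing pages without blocking the AdWords robot. A sketch, assuming the pages live under a /landing/ folder (a placeholder):

```text
# Keep organic crawlers out of the PPC landing pages
User-agent: *
Disallow: /landing/
# AdsBot-Google is not listed here, so it still checks the pages
```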
On-Page Optimization | Kotkov
-
How do we handle sitemaps in robots.txt when multiple domains point to same physical location?
we have www.mysite.net, www.mysite.se, www.mysite.fi and so on. all of these domains point to the same physical location on our webserver, and we replace texts given back to client depending on which domain he/she requested. My problem is this: How do i configure sitemaps in robots.txt when robots.txt is used by multiple domains? If I for instance put the rows Sitemap: http://www.mysite.net/sitemapNet.xml
On-Page Optimization | | nordicnetproducts
Sitemap: http://www.mysite.net/sitemapSe.xml in robots.txt, would that result in some cross submission error?0
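One workaround I've seen for this situation (a sketch, assuming Apache with mod_rewrite; the file names are placeholders) is to serve a different robots file per host, so each domain lists only its own sitemap:

```apache
RewriteEngine On
# Each host gets its own robots file with its own Sitemap line
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]
# www.mysite.net falls through to the default robots.txt
```

Each per-host file (e.g. robots-se.txt) would then contain only that domain's own Sitemap line, avoiding the cross-submission question entirely.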