Two Robots.txt files
-
Hi there
Can somebody please help me? One of my client's sites has two robots.txt files (please see below). One file blocks a few folders, and the other blocks all search engines completely. Our tech team tells me that, for technical reasons, they use the second one, which is placed inside the server where search engines are unable to see it.
www.example.co.uk/robots.txt - blocks a few folders
www.example.co.uk/Robots.txt - blocks all search engines
I hope someone can give me the help I need on this one.
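Roughly speaking, the two files look something like this (the folder names here are made up for illustration; I can't share the real ones):

```
# www.example.co.uk/robots.txt  (hypothetical contents - blocks a few folders)
User-agent: *
Disallow: /admin/
Disallow: /checkout/

# www.example.co.uk/Robots.txt  (hypothetical contents - blocks everything)
User-agent: *
Disallow: /
```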
Thanks in advance!
Cheers,
Satla -
Thanks, Riera
-
Hi Satla,
You mentioned that one robots.txt file is placed inside the server where search engines are unable to see it. If search engines can't see a robots.txt file, then what is the use of that file?
AFAIK it must be placed in the root directory, and there is no way to keep two files with the same name there. So you should have only one robots.txt, and it should be placed in the root directory.
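One quick way to confirm which rules crawlers actually pick up is to parse the file at the standard lowercase URL at the site root, which is the only location well-behaved bots request. A minimal sketch using Python's standard library (the domain and test path are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Crawlers only request the lowercase /robots.txt at the site root,
# so this shows exactly which rules they will obey.
parser = RobotFileParser()
parser.set_url("https://www.example.co.uk/robots.txt")  # placeholder domain
parser.read()

# Check whether a given URL is crawlable under those rules.
print(parser.can_fetch("*", "https://www.example.co.uk/some-folder/page.html"))
```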
Hope this helps.
Thanks
-
Hi Satla,
You're going to need to get rid of that second version ASAP. The standard for a robots.txt file is an all-lowercase file name, so the lowercase version is most likely the one bots are seeing. But to err on the side of caution, I'd remove any possibility of a "Disallow: /" being picked up and delete that Robots.txt version.
Some servers are case sensitive, so you could run into issues here as well.
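If you want to see whether the server treats the two names as different files, you can fetch both casings and compare what comes back. A rough sketch (standard library only; the domain is a placeholder):

```python
import urllib.error
import urllib.request

# Fetch both casings and compare; identical responses usually mean a
# case-insensitive server, different responses mean two distinct files.
def fetch(url):
    try:
        with urllib.request.urlopen(url) as response:
            return response.status, response.read()
    except urllib.error.HTTPError as err:
        return err.code, b""

lower = fetch("https://www.example.co.uk/robots.txt")   # placeholder domain
upper = fetch("https://www.example.co.uk/Robots.txt")

print("Same response:", lower == upper)
```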
-
Hi,
I can't find any good reason why there are two files. There should be just one, where you specify everything you'd like done.
If the tech team doesn't want to fix this and leave just one file, maybe it's because they are lazy, or there might be some other issue, such as the whole site breaking if they delete one of the files. Here are two Moz articles about the robots.txt file:
What is Robots.txt? - Moz Learn
Learn About Robots.txt with Interactive Examples - Moz Blog
Take into account that the name of the file must be in lowercase. I've never seen it done differently, and servers are usually case-sensitive with filenames.
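For example, merging everything into a single lowercase file might look roughly like this (the folder names are placeholders, and the "Disallow: /" line is simply dropped):

```
# Single robots.txt at the site root (hypothetical folder names)
User-agent: *
Disallow: /admin/
Disallow: /checkout/
```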
Hope it's helpful.
GR