Robots.txt help
-
Hi Moz Community,
Google is indexing some developer pages from a previous website where I currently work:
ddcblog.dev.examplewebsite.com/categories/sub-categories
I was wondering how to block these in a robots.txt file so they no longer appear on Google. Can I do it under our homepage GWT account, or do I have to set up a separate account for these URLs?
As always, your expertise is greatly appreciated,
-Reed
-
A robots.txt block would allow the OP to go back into GWT and request removal of the dev site from the index. Password-protecting a dev site is usually a pretty good idea, too.
-
Can you not just add an .htaccess password to the directory to keep the dev site up, but keep bots out?
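Something like this minimal sketch, assuming Apache with mod_auth_basic and a password file already created with the htpasswd tool (the path and realm name are placeholders):
# .htaccess in the dev site's document root
AuthType Basic
AuthName "Development site"
AuthUserFile /path/to/.htpasswd
Require valid-user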
-
You'll want a separate GWT account for that subdomain, and you'll also need to put the robots.txt that excludes the subdomain on the subdomain itself.
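For illustration, a robots.txt served at ddcblog.dev.examplewebsite.com/robots.txt that blocks all crawlers from the entire subdomain would look like this (note that robots.txt only stops crawling, so URLs that are already indexed still need the GWT removal request mentioned above):
User-agent: *
Disallow: /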
Related Questions
-
Please help, need some advice?
Can any of you guys please help me? I have alerts on incoming links, and it looks like someone recently did this. It looks malicious, as only our domain is mentioned and most are brand-new posts:
http://testosteroneclinicindenve53950.shotblogs.com/testosterone-clinic-in-denver-fundamentals-explained-6102386
http://claytondmnnp.ampedpages.com/Details-Fiction-and-testosterone-clinic-in-denver-16897309
http://vinylvehiclecarwrap38041.alltdesign.com/a-review-of-vinyl-vehicle-car-wrap-9574042
http://devinxccct.educationalimpactblog.com/1784474/little-known-facts-about-vinyl-vehicle-car-wrap
http://keeganbsftf.ka-blogs.com/7488539/how-vinyl-vehicle-car-wrap-can-save-you-time-stress-and-money
http://andybxoes.thezenweb.com/vinyl-vehicle-car-wrap-Fundamentals-Explained-17581028
http://kylerhfdzu.blogkoo.com/not-known-details-about-vinyl-vehicle-car-wrap-9029141
http://troyytkyn.timeblog.net/7695911/the-greatest-guide-to-vinyl-vehicle-car-wrap
http://waylontyzab.pointblog.net/testosterone-clinic-in-denver-Secrets-16335972
http://testosteroneclinicindenve30516.onesmablog.com/Top-testosterone-clinic-in-denver-Secrets-17252737
http://emiliogkmop.blogofoto.com/7667522/top-guidelines-of-testosterone-clinic-in-denver
http://caidenaczxt.blogs-service.com/7514172/testosterone-clinic-in-denver-fundamentals-explained
http://daltonpyfms.mybjjblog.com/5-simple-statements-about-testosterone-clinic-in-denver-explained-6517932
Should I try to disavow these and submit to Google, or will Google know that our site, which has been up for 5 years, is not doing this? Should I follow any of the advice here: https://tehnoblog.org/google-webmaster-tools-my-website-got-bombed-with-backlinks-what-to-do/
Intermediate & Advanced SEO | BobAnderson
-
The "webmaster" disallowed all ROBOTS to fight spam! Help!!
One of the companies I do work for has a Magento site. I am simply the SEO guy, and they run the website through some developers who guard access to their systems VERY tightly. Using Google Webmaster Tools, I saw that the robots.txt file was blocking ALL robots. I immediately e-mailed out and received a long reply about foreign robots and scrapers slowing down the website. They told me I would have to provide a list of only the good robots to allow in robots.txt. Please correct me if I'm wrong, but isn't robots.txt voluntary? Won't a bad scraper or bot still bog down the site? Shouldn't that be handled in .htaccess or something similar? I'm not new to SEO, but I'm sure some of you who have been around longer have run into something like this and could provide some suggestions or resources I could use to plead my case! If I'm wrong, please help me understand how we can meet both needs: allowing bots to visit the site while keeping the 'bad' ones out. Their claim is that the site is bombarded by tons and tons of bots that have slowed down performance. Thanks in advance for your help!
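For context, blocking misbehaving bots at the server level rather than in robots.txt might look something like this minimal sketch, assuming Apache with mod_rewrite (the bot names are illustrative placeholders, not a recommended blocklist):
RewriteEngine On
# Return 403 Forbidden to any request whose User-Agent matches a known-bad bot
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper|SpamCrawler) [NC]
RewriteRule .* - [F,L]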
Intermediate & Advanced SEO | JoshuaLindley
-
Magento Help - Server Reset
Good morning. After a server reboot, a Magento-based website reset itself back to December 2013. All changes to the site, and orders dating up until yesterday (6/19/14), have disappeared. There are several folders on the root of the server containing files with yesterday's date, but we don't know how to bring everything back and restore it. Have any Magento or server experts out there ever faced this issue, or do you have any ideas or potential solutions? Thanks
Intermediate & Advanced SEO | Prime85
-
Robots.txt issue for international websites
In Google.co.uk, our US-based site (abcd.com) is showing: "A description for this result is not available because of this site's robots.txt." But the UK website (uk.abcd.com) is working properly. We would like the .com result to disappear entirely, if possible. How do we fix it? Thanks in advance.
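One general note, offered here as context rather than a diagnosis: a robots.txt Disallow stops crawling but doesn't remove URLs that are already indexed, which is exactly what produces that "no description available" message. To drop pages from the index entirely, the usual approach is to allow crawling and serve a noindex directive on each page, e.g.:
<meta name="robots" content="noindex">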
Intermediate & Advanced SEO | JinnatUlHasan
-
Is our robots.txt file correct?
Could you please review our robots.txt file and let me know if it is correct: www.faithology.com/robots.txt. Thank you!
Intermediate & Advanced SEO | BMPIRE
-
Googlebot Can't Access My Sites After I Repair My Robots File
Hello Mozzers, A colleague and I have been collectively managing about 12 brands for the past several months, and we have recently received a number of messages in the sites' Webmaster Tools telling us that 'Googlebot was not able to access our site due to some errors with our robots.txt file.' My colleague and I, in turn, created new robots.txt files with the intention of preventing the spider from crawling our 'cgi-bin' directory, as follows:
User-agent: *
Disallow: /cgi-bin/
After creating the robots.txt and manually re-submitting it in Webmaster Tools (and receiving the green checkbox), I received the same message about Googlebot not being able to access the site, the only difference being that this time it was for a different site that I manage. I repeated the process, and everything looked correct aesthetically; however, I continued receiving these messages for each of the other sites I manage on a daily basis for roughly a 10-day period. Do any of you know why I may be receiving this error? Is it not possible for me to block Googlebot from crawling the 'cgi-bin'? Any and all advice/insight is very much welcome. I hope I'm being descriptive enough!
Intermediate & Advanced SEO | NiallSmith
-
Category Pages - Canonical, Robots.txt, Changing Page Attributes
A site has category pages such as www.domain.com/category.html, www.domain.com/category-page2.html, etc. This is producing duplicate meta descriptions (page titles have page numbers in them, so they are not duplicates). Below are the options we've been considering:
a. Keep meta descriptions the same except for adding a page number (this would keep internal juice flowing to products that are listed on subsequent pages). All pages have unique product listings.
b. Use canonical tags on subsequent pages and point them back to the main category page.
c. Robots.txt on subsequent pages.
d. ?
Options b and c will orphan or french fry some of our product pages. Any help on this would be much appreciated. Thank you.
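For illustration, option b would mean placing a tag like this in the head of each subsequent page, e.g. on www.domain.com/category-page2.html:
<link rel="canonical" href="http://www.domain.com/category.html" />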
Intermediate & Advanced SEO | Troyville
-
Help Needed - 301 a .co.uk to a .com Serp Questions
Hey, I really need some help deciding what to do... I have a .co.uk site; it's my oldest and best site in my network and accounts for maybe 30-40% of my income. Although it's a .co.uk site, it actually makes most of its money from USA traffic and targets many terms for the US market. The problem is that, being a .co.uk, it doesn't rank as well in Google.com, and over the last few years Google has definitely widened the gap in a .co.uk's ability to rank in Google.com. Many terms that I used to be #1 for in Google.com, I now rank only in positions 5-10 for, but in Google.co.uk I'm #1, often with a double listing, so I wouldn't put the loss of rankings in Google.com down to just losing rankings naturally. Now many of my key pages are gradually losing rankings in Google.com, which is not good and really frustrating.
Feedback Needed
So my dilemma is: do I risk my best site and 301 it to a .com hosted in the US for, at a guess, a potential 50% increase in revenues and more future potential (if the 301 worked well and got some US rankings back; I'm sure long-tail would increase lots too)? If people with experience 301ing sites to a new domain could let me know how it went, or if you're an SEO who has done this many times, how often on average have SERPs remained stable/unchanged? I'm trying to work out the reward-to-risk ratio: on average, if the transition is seamless 90% of the time, it would seem worth the gamble, but if it's 50%, then I would say it's not worth it.
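For reference, the domain-wide 301 itself might look something like this minimal sketch, assuming Apache with mod_rewrite (the domains are placeholders):
RewriteEngine On
# Permanently redirect every URL on the .co.uk to the same path on the .com
RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]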
Intermediate & Advanced SEO | goody2shoes