Will password protecting my test sub-domain help keep the SEs from indexing it?
-
Hi, all. I'm working in an unfamiliar area here, so I hope someone can tell me if I'm out in left field.
I am building a sub-domain called http://test.mysite.com, so that I can upload a client's still-under-construction site while working on it. When completed, it'll go up on his server, replacing his old site. Obviously, I want to ensure that it doesn't get indexed while it's on my test platform.
A friend suggested that I password-protect it with .htaccess and .htpasswd, since we can never be certain the SEs will obey site directives.
My question is, what do you think would be the best (and hopefully, simplest) way to accomplish this?
I'm no code-monkey, so "simple" is a big plus!
Doc
By the way, the platform will be the WordPress CMS.
-
A different Matt here, but I still agree that you need to password-protect the site. This isn't just protection against crawlers; it keeps out anyone else who might be snooping around, too. Unless your client is okay with their work being released early into the wild, you should password-protect it.
The good news is that many hosting companies have tools that will automagically generate the .htaccess files for you.
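If your host doesn't have such a tool, the hand-rolled version is only a few lines anyway. Here's a minimal sketch of a Basic Auth setup (the file path and username below are placeholders, adjust for your server):

    # .htaccess in the web root of test.mysite.com
    # Forces every visitor, crawlers included, to authenticate
    AuthType Basic
    AuthName "Test Site - Authorized Users Only"
    AuthUserFile /home/youraccount/.htpasswd
    Require valid-user

    # Run once on the server (or via SSH) to create the password file:
    htpasswd -c /home/youraccount/.htpasswd clientlogin

Keep the .htpasswd file outside the web root so it can never be served directly.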
-
Thanks, Darryl-
Password-protecting the site seemed like a good option, although I wasn't aware that Matt had ever stated that. That being the case, it certainly seems like the way to go. Thanks for the input!
-
Also, a good way to go is to layer the following:
- disallow crawling in robots.txt
- insert a meta noindex tag on every page
- block access in .htaccess as well
That said, Matt Cutts has stated that the only 100% sure way is to password-protect the folder.
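For reference, the first two layers are one-liners each; a quick sketch (swap in your own test subdomain):

    # robots.txt at the root of test.mysite.com
    User-agent: *
    Disallow: /

    <!-- in the <head> of every page -->
    <meta name="robots" content="noindex, nofollow">

One wrinkle: if robots.txt blocks crawling, a crawler may never fetch the page and so never see the noindex tag, which is one more reason the password is the only real guarantee. (In WordPress, ticking "Discourage search engines from indexing this site" under Settings > Reading adds the noindex tag for you.)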
-
Thanks for the response, Matt. So you feel that's a sure way? There seem to be differing opinions on whether or not all the SEs will respect that. I had always thought it was a solid way to do it, too, but some of the arguments I'm hearing have me in doubt now.
-
.htaccess password protection is a very simple way to keep crawlers out of the site. If they can't access the pages, they certainly can't index them.
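Agreed, and it's easy to verify once the password is in place. From any machine, request a page with curl and check the status line (using the test subdomain from the original question):

    curl -I http://test.mysite.com/

If the protection is working, you should get a 401 Unauthorized response rather than a 200 OK, and that's exactly what the crawlers will get too.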