One server, two domains - robots.txt allowing one domain but not the other?
-
Hello,
I would like to create a single server with two domains pointing to it. For example: domain1.com -> myserver.com/ and domain2.com -> myserver.com/subfolder. The goal is to create two separate sites on one server.
I would like the second domain (/subfolder) to be fully indexed and SEO friendly, with its robots.txt file allowing search bots to crawl it. The first domain (the server root), however, I would like to keep non-indexed, with its robots.txt file disallowing all bots and indexing.
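In concrete terms (a sketch using the placeholder domain names above), the file served at the root site's document root would block everything:

```txt
# robots.txt served at domain1.com/robots.txt — block all crawlers
User-agent: *
Disallow: /
```

while the subfolder site would serve its own permissive file:

```txt
# robots.txt served at domain2.com/robots.txt — allow all crawlers
User-agent: *
Disallow:
```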
Does anyone have any suggestions for the best way to tackle this one?
Thanks!
-
I am assuming that you are using cPanel on the host. Just add the second domain as an addon domain. Because of the way cPanel sets things up and handles redirects, crawlers will not realize that one site sits in a subfolder of the other; the two will be treated as if they were different sites on different servers. So handle each robots.txt as if the sites were on separate servers, and disregard the fact that one lives in a subfolder of the other.
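The "treat them as two separate servers" model can be sanity-checked locally with Python's standard-library robots.txt parser. This is just a sketch: the domain names are the placeholders from the question, and the rules are the block-all / allow-all pair described above.

```python
from urllib.robotparser import RobotFileParser

# Root-domain site (domain1.com is a placeholder from the question):
# its robots.txt blocks every crawler from every path.
blocked = RobotFileParser()
blocked.parse(["User-agent: *", "Disallow: /"])

# Addon-domain site (domain2.com, served out of the subfolder):
# its robots.txt allows everything (an empty Disallow permits all paths).
allowed = RobotFileParser()
allowed.parse(["User-agent: *", "Disallow:"])

print(blocked.can_fetch("Googlebot", "https://domain1.com/page.html"))  # False
print(allowed.can_fetch("Googlebot", "https://domain2.com/page.html"))  # True
```

Because each domain resolves to its own document root, a crawler visiting each host fetches a different robots.txt and sees only the rules for that site.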
-
Related Questions
-
Value of domain name for domain authority. Please help me figure this out!
I am doing SEO for an appliance repair company. Their company website's domain doesn't have high authority, and I am going to increase that through link earning and content improvement. I think a better domain name might also help. The current URL contains the word "appliance" but doesn't have "repair" in it. I am thinking a new domain that contains both keywords would serve better. Could you please share your thoughts on this? Am I heading in the right direction, or not at all? I know Google penalizes mirror sites, since they are considered duplicate content. I'll upload my content to the new domain and make the old one point to the new URL. I am wondering if a canonical might help? Or would a 301 redirect be a better solution? Any advice would be highly appreciated! Thank you!
Should I block Map pages with robots.txt?
Hello, I have a website that was started in 1999. On the website I have map pages for each of the offices listed on my site, of which there are about 120. Each of the 120 maps is on its own separate HTML page, with no content other than the map. I know all of the offices love having the map pages, so I don't want to remove them. My question is: would these pages with no real content be hurting the rankings of the other pages on our site? Should I therefore block the pages with my robots.txt? Would I also have to remove these pages from Google (in Webmaster Tools?) for blocking by robots.txt to really work? I appreciate your feedback, thanks!
Moving content between two separate domains...
Hello, I am looking for advice regarding moving content from one site to another. We have two websites: Site 1: an e-commerce site, with content weaved in throughout the visitor journey.
Site 2: a blog-style site, used to archive magazine (which we own) articles online. Both sites exist on completely separate domains. Over time, Site 2 has received a lot less attention due to a change in our business objectives. As a result, the site is not as up to date as it could be and we're now starting to think about winding the brand down. However, some of the content (mostly feature pieces, reviews, etc.) on Site 2 is really good and it would be a shame to see such high-quality material disappear into the ether. Ideally, we would like to migrate some of the content on Site 2 to Site 1. The reasons for this are mostly to improve things from a visitor perspective, but also to gain any positive SEO benefit from adding such pieces to our main domain. I've had a look through and a lot of the articles from Site 2 are indexed. Is it going to be a case of selecting the pieces I want and then adding 301s to those pages so they're no longer found/visible before re-publishing them on Site 1? Sorry if this is a bit of a silly question; I just wanted some advice to ensure I go about it the right way. Thanks!
Server Connectivity
Hey there, when we go to our Webmaster Tools there is an orange triangle warning. The issue is that Google's robot cannot access our site. Does anyone know why this could be? Thanks!
Blocked by meta-robots but there is no robots.txt file
OK, I'm a little frustrated here. I've waited a week for the next weekly index to take place after changing the privacy setting in a WordPress website so Google can index it, but I still have the same problem: blocked by meta-robots, noindex, nofollow. But I do not see a robots.txt file anywhere, and the privacy setting in this WordPress site is set to allow search engines to index the site. The website is www.marketalert.ca. What am I missing here? Why can't I get the rest of the website indexed, and is there a faster way to test this rather than waiting another week just to find out it didn't work again?
Robots.txt
Hello Everyone, The problem I'm having is not knowing where to put the robots.txt file on our server. We have our main domain (company.com) with a robots.txt file in the root of the site, but we also have our blog (company.com/blog), where we're trying to disallow certain directories from being crawled for SEO purposes. Would the blog in the sub-directory still need its own robots.txt? Or can I reference the directories I don't want crawled within the blog using the root robots.txt file? Thanks for your insight on this matter.
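For what it's worth, crawlers only ever request robots.txt from the root of a host, so a file at company.com/blog/robots.txt would never be fetched. The root file can reference the blog's paths directly — a sketch, where the directory names are hypothetical:

```txt
# robots.txt at company.com/robots.txt — covers the blog sub-directory too
User-agent: *
Disallow: /blog/wp-admin/
Disallow: /blog/tag/
```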
Duplicate content on the one domain (related to country targeting)
Hi - We have a client who wishes to target several markets from a single TLD using .com/au, .com/us, and .com/sg. Each will use duplicate content with slight variations to cater to the local market (spelling, industry jargon). They seem reluctant to register a separate domain for each target market (which was our suggestion), and I am wondering what SEO penalties would apply for having a majority of duplicate content on the same domain – perhaps using subdomains would be better? Thanks!