Subdomain and root domain effects on SEO
-
I have a domain, let's say mydomain.com, which already hosts my web app. I want to launch a sub-product from my company whose concept is a bit different from my original web app, and I'm planning to host it on mynewapp.mydomain.com. I'm worried that using a subdomain will have an impact on my existing or new web app. Can anyone give me any pointers on this? As much as I'd like to use a directory (mydomain.com/mynewapp), that isn't possible because it would confuse existing users of the new product/web app. I've heard that subdomains are essentially treated as new sites; is this true? If it is, then I'm fine with that, but is it also true that it's harder for a subdomain to reach the top rankings than for a root domain?
-
Why not just buy a new domain if it is a totally separate product?
You are correct: a subdomain is essentially (for SEO purposes, anyway) treated as its own entity.
As long as you build great content, find great link opportunities, and treat it as you would any other domain, you should have no problem getting it ranked.
-
Aditya
No matter what you want to use it for, a subdomain is essentially its own domain, and it does not really affect your original domain. As for the idea that a subdirectory would confuse your visitors but a subdomain wouldn't, I don't see that as true. But I think your real question is: "Is it harder for a subdomain to rank than a subdirectory?" No, but let's be clear about what actually ranks in the SERPs: pages.
So, a subdomain with five pages can rank just as easily as a root domain with five pages (or one, or 100). Given what you want to sell, I would focus on UI/UX and on ranking for the key terms for your apps. Quit worrying about subdomain vs. subdirectory: choose one and rock on.
Best,
Robert
Related Questions
-
Do home pages carry more SEO benefit than other pages?
Hi, I would like to include my keywords in the URL, and they are under 50 characters. Is there anything in the algorithm that tells engines to give more importance to the homepage?
White Hat / Black Hat SEO | alan-shultis
-
Sitewide nav linking from subdomain to main domain
I'm working on a site that was heavily impacted by the September core update. You can see in the attached image the overall downturn in organic traffic in 2019, with a larger hit in September bringing Google organic traffic down by around 50%. There are many concerning incoming links, ranging from 50-100 obviously spammy porn-related websites to just plain unnatural links. There was no effort to purchase any links, so it's unclear how these were created. There are also thousands of incoming external links (most without nofollow, and with the same or similar anchor text) from yellowpages.com. I'm trying to get this fixed with them and have added it to the disavow file in the meantime. I'm also looking at internal links, with a more specific question: if I have a sitewide header on a blog located at blog.domain.com that links to various sections of domain.com without nofollow tags, is this a possible source of the traffic drops and algorithm impact? The header with these links is on every page of the blog on that subdomain. **More generally, any advice as to how to turn this around?** The website is in the travel vertical.
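For reference, the disavow file mentioned above is a plain-text list uploaded through Google Search Console, with one domain or URL per line. A minimal sketch, using hypothetical domains:

```text
# Lines starting with "#" are comments.
# Disavow every link from an entire domain:
domain:spammy-links.example
domain:unnatural-directory.example
# Disavow a single URL:
https://directory.example/listing/12345
```

Note that disavowing tells Google to ignore those links when assessing your site; it does not remove them from link reports in Search Console.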
White Hat / Black Hat SEO | ShawnW
-
Moving Pages Up a Folder to come off root domain
Good morning. I've been doing some competitor research to see why competitors are ranking higher than us, and noticed that one who seems to be doing well has changed their URL structure: rather than www.domain.com/product-category/product-subcategory/product-info-page/, they've removed levels, so they now have www.domain.com/product-subcategory/ and www.domain.com/product-info-page/. Basically, everything comes off the root domain rather than following the traditional structure. Our rankings for the product-subcategory pages, which are probably what most people would search for, have been sitting just below the first page in most instances for a while. I'm interested to know other people's thoughts, and whether this is an approach they've taken with good results.
White Hat / Black Hat SEO | Ham1979
-
Pharma Hack/Grey hat SEO. Cannot get site to rank, tons of incoming bad links
I have been working on a website, trying to get it to show up in the SERPs again. It is being indexed, which is great; it has some errors that I'm fixing now, but for the most part it should be ranking. It doesn't show any penalties, but when I did a backlink search we keep finding cialis, viagra, etc. inbound links. My first thought was a pharma hack, but it's not a WordPress site, and I recently rebuilt it, so whatever bad code could have been there isn't anymore. It doesn't show up in Google for searches like site:www.mysite.com viagra cialis either. So I'm wondering if anyone has any insight or a direction to point me in? I don't understand what would be causing this site to still not rank; the only thing it ranks for is its name. Any suggestions would be very appreciated.
White Hat / Black Hat SEO | WeBuyCars.com
-
Negative SEO? Or?
We had another website attacked by negative SEO, so now I'm getting a little suspicious. The website went from around 26 linking domains to 1,001 links from 311 linking domains in Webmaster Tools. They're all in different languages, and from directories. I asked everyone at the organization and they said they didn't sign up for any services; I trust them, because I know they don't have time to breathe right now, with 7 product launches this month. OSE shows 79 links from 26 linking domains, so the spam links must be gone now, but the website has been pretty much wiped clean from Google.com and is just starting to slowly (very slowly) crawl back 😞 Is there anything else that could be targeting the website with hundreds of links? Anything I can do to protect it? I've disavowed the links, but since they're gone now it probably won't help. Thanks in advance for ideas 🙂
UPDATE: The website is still not recovering in Google.com. It seems to be OK in .ca, but a recent conundrum is that it's been basically wiped clean from Bing and Yahoo rankings. I've emailed Bing, and the team says it is indeed indexed and not penalized (manually, anyway). OSE says the "bad links" are no longer there, but Webmaster Tools still lists them all (I know, it doesn't update that often). My latest strategy is to start building some really strong links to the website with killer content; their products are amazing (TV lift furniture), so it shouldn't be difficult, just time-consuming. I'm also being super-active on their social media platforms to see if this helps boost rankings in the meantime. Any further tips to recover from negative SEO?
(Note: I do not need link removal tools. We have a process that's working just fine.)
White Hat / Black Hat SEO | SmartWebPros
-
Same content, different target area SEO
So, I have a gambling site that I want to target at Australia, Canada, the USA, and England separately, while still keeping .com for worldwide (or not; read further). The website's content basically stays the same for all of them, perhaps with small changes to layout and information order (a different order for the top 10 gambling rooms). My first question: how should I mark up the content for Google and other search engines so that it is not considered "duplicate content"? As mentioned, the content actually will be duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs. What I have thought of so far:
1. A separate Webmaster Tools account for every domain, where we set the geographic target to the specific country.
2. Use hreflang tags to indicate that the content is for GB users ("en-GB"), and likewise for the other domains (more info: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077).
3. Get a country-specific IP address (the physical location of the server is not hugely important, just the IP).
4. It would be great if the IP address for the .co.uk site were from a different C-class than the one for the .com.
Is there anything I am missing here? Question 2: should I target the USA market with .com, or are there other options? (We are not based in the USA, so I believe .us is out of the question.) Thank you for your answers. T
White Hat / Black Hat SEO | SEO_MediaInno
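The hreflang approach in point 2 can be sketched in markup. The domains and path below are illustrative, and every variant page must carry the full set of annotations, including a link to its own URL:

```html
<!-- Placed in the <head> of each variant page (hypothetical domains/paths) -->
<link rel="alternate" hreflang="en-GB" href="https://www.example.co.uk/gambling-rooms/" />
<link rel="alternate" hreflang="en-AU" href="https://www.example.com.au/gambling-rooms/" />
<link rel="alternate" hreflang="en-CA" href="https://www.example.ca/gambling-rooms/" />
<link rel="alternate" hreflang="en-US" href="https://www.example.com/gambling-rooms/" />
<!-- Fallback for users whose language/region matches none of the above -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/gambling-rooms/" />
```

The annotations must be reciprocal: if the .co.uk page lists the .com page as an alternate, the .com page must list the .co.uk page back, or the tags may be ignored.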
Will aggregating external content hurt my domain's SERP performance?
Hi, we operate a website that helps parents find babysitters. As a small add-on, we currently run a small blog on childcare and parenting. We are now thinking of introducing a new category to our blog called "best articles to read today": we would "re-blog" selected articles from other blogs that we believe are relevant for our audience. We have obtained permission from a number of bloggers to fully feature their articles on our blog. Our main aim in doing so is to become a destination site for parents. This obviously creates issues with regard to duplicated content. The question I have is: will including this duplicated content on our domain harm our domain's general SERP performance? And if so, how can this effect be avoided? It isn't important for us that these "featured" articles rank in SERPs, so we could potentially make them "noindex" pages or point the "rel canonical" at the original author. Any thoughts, anyone? Thx! Daan
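The two options mentioned at the end can be sketched in markup; the URL is hypothetical:

```html
<!-- Option A: cross-domain canonical pointing at the original author's article.
     Signals which copy search engines should treat as authoritative. -->
<link rel="canonical" href="https://original-blog.example/article-slug/" />

<!-- Option B: keep the re-blogged page out of the index entirely,
     while still letting crawlers follow its links. -->
<meta name="robots" content="noindex, follow" />
```

Option A is generally preferable here: it credits the original author and consolidates ranking signals, whereas noindex simply removes the page from search results.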
White Hat / Black Hat SEO | daan.loening
-
Contacted by an SEO company..
We have been contacted by a free SEO company, who seem to use your domain and create articles for you on your site, linking to other relevant sites, while other relevant sites link to yours. All they ask is a small link on your homepage. An excerpt of the instructions:
1. Download the attached files into a folder to be uploaded to your server's (public_html) folder.
2. Set 777 permissions on the folder. This allows our script to work on your site.
3. Add the "site-wide" link on your homepage, as well as on the inner pages, the same way we do in the article section.
4. Implement the "site-wide" link following the instructions in the README.txt file.
5. Copy the script into the file, or provide us with temporary FTP access to your server and we will do it for you.
Please note that if you can upload our folder to your site within 48 hours, you will be eligible to receive 20 bonus links for your SEO campaign. Should we tread carefully?
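Step 2 alone should ring alarm bells. A minimal shell sketch of what those permission bits mean (the directory name is illustrative):

```shell
# "Set 777 permission" = rwxrwxrwx: every user and process on the server can
# write into the folder, so any compromised script can plant spam pages there.
mkdir -p demo_upload
chmod 777 demo_upload
stat -c '%a' demo_upload    # prints 777 (world-writable)

# Safer defaults for web-served content: 755 for directories, 644 for files,
# so only the owning account can modify them.
chmod 755 demo_upload
stat -c '%a' demo_upload    # prints 755
```

Combined with the request for a sitewide link and FTP access, this is a classic link-scheme setup, not a legitimate SEO service.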
White Hat / Black Hat SEO | filarinskis