What is the difference between XXX.domain.com and domain.com/XXX?
-
Hi guys, I would like to know which one gives better SEO value.
For example, if I put a link at xxx.domain.com versus domain.com/XXX, which one will give me better SEO value, or is it the same? Assume that domain.com itself has a huge PageRank.
Why do people bother making XXX.domain.com instead?
Hoping for clarification, thanks!
-
Yeah, I couldn't have said it any better. domain.com/xxx will be better for your SEO. I would strongly suggest keeping everything on a single subdomain. If you want to look further into the question, Moz ran some pretty good experiments, covered in this post: http://moz.com/community/q/moz-s-official-stance-on-subdomain-vs-subfolder-does-it-need-updating
-
domain.com/XXX is a subfolder, while XXX.domain.com is a subdomain.
domain.com/xxx is generally better for SEO purposes. People use a subdomain as an alternative to creating a new site.
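If you later decide to consolidate a subdomain into a subfolder, the usual mechanics are a permanent (301) redirect so existing links follow the move. Here is a minimal, hypothetical sketch in Python, assuming a Flask app; the hostname blog.domain.com and the target base URL are made-up placeholders, not a drop-in configuration:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical example: consolidate blog.domain.com/<path>
# into domain.com/blog/<path> with a permanent redirect.
OLD_HOST = "blog.domain.com"
NEW_BASE = "https://domain.com/blog"

@app.before_request
def redirect_subdomain_to_subfolder():
    # Only rewrite requests that arrive on the old subdomain.
    if request.host == OLD_HOST:
        # full_path keeps any query string; strip the bare trailing "?"
        target = NEW_BASE + request.full_path.rstrip("?")
        return redirect(target, code=301)  # 301 = moved permanently
```

The 301 status is the part that matters for SEO: it tells search engines the move is permanent, so existing link equity should follow.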
Related Questions
-
Why is Moz showing a Spam Score for my new domain?
Hi folks, I just registered a new domain (a boring magazine), but I forgot to check the spam score. Recently I checked, and it is showing a spam score of 46% without any backlinks. You can see the domain is only 30 days old so far. I need your recommendations on how I can reduce it, and on what basis Moz is marking it as spam. (Screenshot attached: sp.PNG)
White Hat / Black Hat SEO | ImranZahidAli -
Is buying an exact-match domain and 301-redirecting it worth it?
So there is this exact-match domain that gets about 500 visitors a day. It has a Trust Flow of 17 and a Citation Flow of 23, which is just a little lower than our own website's. The site covers one of our keywords and ranks on the second page of the SERPs. I am not interested in buying and running that website, but rather in liquidating all of its pages with 301s into our existing domain, onto relevant pages. So the 301s would point to relevant pages. The question is, would this strategy be worth it in today's SEO world, given Google's updates?
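For what it's worth, the page-to-page 301s described above usually boil down to an explicit redirect table. A minimal, hypothetical sketch in Python (a Flask app; every URL below is a made-up placeholder, and the homepage fallback is an assumption):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping from old-domain paths to the most
# relevant pages on the existing domain (placeholders only).
REDIRECT_MAP = {
    "/old-review": "https://example.com/products/widget",
    "/old-guide": "https://example.com/guides/widget-basics",
}

@app.route("/")
@app.route("/<path:old_path>")
def legacy_redirect(old_path=""):
    # 301 to the mapped page; fall back to the homepage rather
    # than returning a 404 for unmapped URLs.
    target = REDIRECT_MAP.get("/" + old_path, "https://example.com/")
    return redirect(target, code=301)
```

Mapping each old URL to its closest topical match, with the homepage only as a last-resort fallback, is what keeps the redirects "relevant" in the sense described above.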
White Hat / Black Hat SEO | TVape -
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly Bingbot and AhrefsBot), bots cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance. The problem is that I want:

1. a centrally managed solution for all sites (per-site administration takes too much time), which
2. takes total server load into account instead of only one site's traffic, and
3. controls overall bot traffic instead of controlling traffic for one bot.

IMO user traffic should always be prioritized above bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems.

Now I have come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from the total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other rule I invent), some requests will be answered with a 503 while others get content and a 200 (see the sketch below). The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
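To make the idea concrete, here is a minimal sketch of that logic as Python WSGI middleware, assuming a Unix host; the bot signatures, load threshold, and Retry-After value are illustrative guesses, not tested settings:

```python
import os

# Hypothetical user-agent fragments treated as bots (illustrative only).
BOT_SIGNATURES = ("bingbot", "ahrefsbot", "googlebot")
LOAD_THRESHOLD = 4.0  # assumed 1-minute load-average cutoff


def is_bot(user_agent):
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)


def throttle_middleware(app):
    """WSGI middleware: answer bot requests with a 503 when the
    whole server (not just one site) is under heavy load."""
    def wrapper(environ, start_response):
        load_1min = os.getloadavg()[0]  # total server load, all sites
        agent = environ.get("HTTP_USER_AGENT", "")
        if is_bot(agent) and load_1min > LOAD_THRESHOLD:
            # Tell the bot to come back later instead of letting it
            # slow down responses for human visitors.
            start_response("503 Service Unavailable",
                           [("Retry-After", "120"),
                            ("Content-Type", "text/plain")])
            return [b"Server busy, please retry later."]
        return app(environ, start_response)
    return wrapper
```

Sending Retry-After with the 503 gives well-behaved crawlers an explicit hint about when to return, and user traffic is never touched, which matches the priority described above.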
White Hat / Black Hat SEO | internetwerkNU -
Domain name for SEO
Let's say that my competitor has the domain name and website remotecontrolcar.com, plus .net and .org, and he has great links and good white-hat Google juice. What if I get the MyRemoteControlCar.com, .net, and .org domain names? Would my domain name help me rank against him, as long as I have similar Google juice? Thanks.
White Hat / Black Hat SEO | zsyed -
What is the difference between Positive Impact, No Impact, Negative Impact, and Extremely Negative Impact in terms of a Google update like Panda or Penguin?
White Hat / Black Hat SEO | dotlineseo -
Same content, different target areas: SEO
So OK, I have a gambling site that I want to target separately at Australia, Canada, the USA, and England, while still keeping .com for worldwide (or not; read further). The website's content basically stays the same for all of them, with perhaps just small changes to layout and information order (a different order for the top 10 gambling rooms).

My first question: how should I mark the content for Google and other search engines so that it is not considered "duplicate content"? As I mentioned, the content will actually BE duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs. What I have thought of so far:

1. A separate Webmaster Tools account for every domain -> we will need to set up user targeting to the specific country in each one.
2. Use hreflang tags to indicate that the content is for GB users ("en-GB"), and the same for the other domains (see the sketch below); more info about it: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
3. Get a country-specific IP address (the physical location of the server is not hugely important, just the IP).
4. It would be great if the IP address for .co.uk were from a different C-class than the one for the .com.

Is there anything I am missing here? Question 2: should I target .com at the US market, or are there other options? (We are not based in the USA, so I believe .us is out of the question.) Thank you for your answers.
White Hat / Black Hat SEO | SEO_MediaInno -
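A minimal, hypothetical sketch of the hreflang markup from point 2 of the question above, in Python; the four country domains and the page path are placeholders for the real TLD variants:

```python
# Hypothetical country-domain mapping for the hreflang tags
# described in point 2 above (all domains are placeholders).
ALTERNATES = {
    "en-au": "https://example.com.au/",
    "en-ca": "https://example.ca/",
    "en-us": "https://example.com/",
    "en-gb": "https://example.co.uk/",
    "x-default": "https://example.com/",  # worldwide fallback
}


def hreflang_tags(path):
    """Render the <link rel="alternate"> cluster for one page.
    Each localized version must list ALL versions, itself included."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{base}{path.lstrip("/")}" />'
        for lang, base in ALTERNATES.items()
    )


print(hreflang_tags("/top-10-rooms"))
```

Note that the annotations have to be reciprocal: if the .com page lists the .co.uk alternate, the .co.uk page must list the .com one back, or the pair may be ignored.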
Why does Google recommend schema for local businesses/organizations?
Why does Google recommend schema markup for local businesses and organizations? The reason I ask is that I was in the Structured Data Testing Tool running some businesses and organizations through it, and every time it said this "information will not appear as a rich snippet in search results, because it seems to describe an organization. Google does not currently display organization information in rich snippets". Yet many times, when you search for a restaurant or a related query, it will still show a telephone number, reviews, and a location. Would it be better to list it as a place, since I want its reviews and location to show up? Thanks; I would be interested to hear everyone else's opinions on this.
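For comparison, here is a minimal, hypothetical sketch in Python that emits LocalBusiness-style JSON-LD (all business details are invented placeholders). Restaurant is a subtype of LocalBusiness, which is itself more specific than a bare Organization:

```python
import json

# Hypothetical LocalBusiness data (all values are placeholders).
business = {
    "@context": "https://schema.org",
    "@type": "Restaurant",  # a subtype of LocalBusiness
    "name": "Example Bistro",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "Springfield",
        "addressCountry": "US",
    },
}

# Emit the JSON-LD block that would go in the page's <head>.
print('<script type="application/ld+json">')
print(json.dumps(business, indent=2))
print("</script>")
```

Using a LocalBusiness subtype rather than plain Organization is generally what makes the local fields (telephone, address) meaningful to the markup.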
White Hat / Black Hat SEO | PeterRota -
Possibly a dumb question: 301 from a banned domain to a new domain with NEW content
I was wondering if banned domains pass any PageRank, link love, etc. My domain got banned, and I AM working to get it unbanned, but in the meantime, would buying a new domain and creating NEW content that DOES adhere to Google's quality guidelines help at all? Would this force an 'auto-evaluation' or 're-evaluation' of the site by Google? Or would the new domain simply see ZERO effect from the 301 unless the old domain got back into Google's good graces?
White Hat / Black Hat SEO | ilyaelbert