Unique domains vs. single domain for UGC sites?
-
Working on a client project - a UGC community that has a DTC model as well as a white label model. Is it categorically better to have it all under the same domain? Trying to figure out which is better:
XXX,XXX pages on one site
vs.
A smaller XXX,XXX pages on the primary site, plus XX,XXX pages spread across 10-20 other sites, all pointing to the primary site.
The thinking on the second option was that those domains would likely achieve high DA alongside the primary, and would pass their value on to it.
Thoughts? Any other considerations we should be thinking about?
-
It depends on how the content on the secondary domains is organized. If each secondary domain has its own content theme, it is easier to earn separate links for each of them, which benefits everyone: users can quickly find (or contribute) what they are looking for, each site attracts different, more specific links, and that value is passed through to the primary. If there is no such theme, the secondary domains will all compete with each other for mindshare, user contributions, and individual links, which would make it hard for any of them to achieve a high DA.
-
I have a similar setup, but instead of separate domains I use multiple subdomains with content categorized by theme. It helps in many ways: 1) Easier single sign-on - a user logged in on one site does not need to log in again on another. 2) Each subdomain can attract different types of links. 3) Easier segmentation for advertisers. Not every subdomain achieves the same DA or traffic, but internal linking helps the overall network.
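On the single sign-on point, the practical reason subdomains make this easier is that a session cookie can be scoped to the parent domain and shared by every subdomain. A minimal sketch of the response header involved (the cookie name and example.com are placeholders, not from this thread):

```http
Set-Cookie: session_id=abc123; Domain=.example.com; Path=/; Secure; HttpOnly
```

A cookie scoped to .example.com is sent with requests to community.example.com, shop.example.com, and so on, whereas fully separate domains would need a cross-domain SSO flow (OAuth, SAML or similar) to achieve the same effect.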
-
Related Questions
-
Why do some domains and sub-domains have the same DA, but some others don't?
Hi, I noticed that for some blog providers in my country, which give each blog a sub-domain address, the sub-domain's authority is exactly the same as the main domain's, whereas for other providers every subdomain has its own, lower authority. For example, "ffff.blog.ir" and "blog.ir" both have a domain authority of 60 - and it is worth noting that "ffff.blog.ir" does not even exist! Meanwhile, mihanblog.com and hfilm.mihanblog.com have different page authority.
Intermediate & Advanced SEO | rayatarh545123
Will using a reverse proxy give me the benefits of the main site's domain authority?
If I am running example.com and have a blog on exampleblog.com, will moving the blog to example.com/blog and serving it through a reverse proxy give the blog the same domain authority as example.com? Thanks
Intermediate & Advanced SEO | El-Bracko
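For what it's worth, the proxy part of this is straightforward. A rough Apache sketch, assuming mod_proxy and mod_proxy_http are enabled and using the hostnames from the question as placeholders:

```apache
# In the example.com virtual host: serve the externally hosted blog under /blog/
ProxyPass        /blog/ https://exampleblog.com/
# Rewrite Location/redirect headers in responses so they stay on example.com
ProxyPassReverse /blog/ https://exampleblog.com/
```

Bear in mind the proxy only changes where the HTML is served from; the blog's absolute internal links and canonical tags would still need to point at example.com/blog/... for any consolidation to apply, and whether authority actually transfers is a separate question from the mechanics.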
Community inside the domain or in a separate domain
Hi there, I work for an ecommerce company as an online marketing consultant. They make kitchenware, microware and so on. They are reviewing their overall strategy and as such want to build up a community. Ideally, they would want the community on a separate domain. This domain wouldn't carry the brand's logo, and the community wouldn't promote the brand itself; the brand would post content occasionally and link to the store domain. The reasoning behind this approach is to stay out of the community users' way, plus the fact that the branded traffic acquired doesn't end up buying at the store. I like this approach, but I am concerned because the brand is not big enough to split across two domains and lose all the authority associated with one strong domain. I would definitely have everything under the same domain, store and community; otherwise we would have to acquire traffic for two domains. 1. What do you think of both scenarios, one domain versus two? Which one is better? 2. Do you know any examples of ecommerce companies with successful communities within the store domain? Thanks and regards
Intermediate & Advanced SEO | footd
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys, We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:

1. Vehicle Listings Pages: the page where the user applies various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details Pages: the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo

The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, those results are randomized or sliced/diced in different and unique ways, and they're updated twice per day.

We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.

We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.

Robots.txt Advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.

Robots.txt Disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would lead to 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?)

Noindex Advantages:
- Does prevent vehicle details pages from being indexed
- Allows ALL pages to be crawled (advantage?)

Noindex Disadvantages:
- Difficult to implement (vehicle details pages are served via Ajax, so there is no standalone page in which to place a meta robots tag). The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex header based on querystring variables, similar to this stackoverflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindex pages. I say "force" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it is blocked by robots.txt.

Hash (#) URL Advantages:
- By using hash (#) URLs for the links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with Javascript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindex pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like PageRank sculpting (?)
- Does not require complex Apache stuff

Hash (#) URL Disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?

Initially, we implemented robots.txt - the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.

If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallow, in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO.

My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links structured like this.

Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
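For reference, the X-Robots-Tag route mentioned above needs very little Apache configuration. A minimal sketch, assuming Apache 2.4+ with mod_headers, and assuming a hypothetical vehicle_id query-string parameter identifies vehicle details requests (that parameter name is an illustration, not taken from the plugin):

```apache
# Send a noindex header whenever the (hypothetical) vehicle_id parameter is present
<If "%{QUERY_STRING} =~ /(^|&)vehicle_id=/">
    Header set X-Robots-Tag "noindex"
</If>
```

As the question notes, this only helps if the same URLs are not also disallowed in robots.txt, because Googlebot has to be allowed to fetch the response in order to see the header.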
Merging Sites: Will redirecting the old homepage to an internal page on the new site cause issues?
I've ended up with two sites which have similar (but not duplicate) content and target similar keywords, and rather than trying to maintain both I would like to merge them. The old site is a more traditional niche site that targets a particular set of keywords on its homepage; the new site is more of an authority site with a magazine-type homepage that targets the same set of keywords from an internal page. My question is: should I redirect the old site's homepage to the relevant internal page on the new website, or should I redirect the old site's homepage to the new site's homepage? (The old site's homepage backlinks are a mixture of partial-match keyword anchor text, naked URLs and branded anchor text.) I am in two minds (a & b!): (a) Redirecting to the internal page would be great for ranking, as there are some decent backlinks and the content is similar. (b) But usually when you do a 301 redirect, the old homepage points to the new homepage, some of the old site's links relate to the domain rather than the keyword (e.g. http://www.site.com), and some people will be looking for the site's homepage. What do you think? Your help is much appreciated (and I hope this makes sense...!)
Intermediate & Advanced SEO | lara_dar
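Whichever target you pick, the mechanics are the same. A minimal .htaccess sketch for the old domain, assuming Apache with mod_rewrite; the URLs are placeholders rather than your actual sites:

```apache
RewriteEngine On
# Old homepage -> the topically matching internal page on the new site (option a)
RewriteRule ^$ https://www.new-site.com/keyword-page/ [R=301,L]
# Map each remaining old URL to its closest equivalent rather than bulk-redirecting
# everything to the new homepage, e.g.:
# RewriteRule ^old-guide/(.*)$ https://www.new-site.com/guides/$1 [R=301,L]
```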
Why does a site have no domain authority?
A website was built and launched eight months ago, and its domain authority is 1. When a site has been live for a while and has such a low DA, what's causing it?
Intermediate & Advanced SEO | optimalwebinc
Should I buy a .co domain if my preferred .com and .co.uk domain are taken by other companies?
I'm looking to boost my website's ranking and drive more traffic to it using a keyword-rich domain name. I want my nearest city followed by the keyword "seo" in the domain name, but the .co.uk and .com have already been taken. Should I take the plunge and buy the .co at a higher price? What options do I have? Also, while we're on domains and URLs: is it best to separate keywords in URLs with an underscore (_) or a hyphen (-)? Many thanks for any help with this matter. Alex
Intermediate & Advanced SEO | SeoSheikh
Move blog from subdomain to main domain on ecom site?
I am wondering what my fellow Mozzers think. I'm pretty set on my direction but want any other input to aid in my decision. I have an ecom site with a blog at www.blog.maindomain.com. The blog is fairly new, has no major rankings, and has only about 30 posts. This isn't a super competitive market and blogging won't be a huge part of our content strategy, but I would like to use it for passing juice etc. Would you go through the trouble of moving the blog to www.site.com/blog and redirecting all the old content to the new URLs?
Intermediate & Advanced SEO | PEnterprises
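If you do move it, the redirect itself is small. A minimal sketch, assuming Apache with mod_rewrite on the blog subdomain and that the post slugs stay the same under /blog/ (the hostnames come from the question, everything else is a placeholder):

```apache
RewriteEngine On
# Send every blog URL to the same path under the main domain's /blog/ directory
RewriteCond %{HTTP_HOST} ^(www\.)?blog\.maindomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.site.com/blog/$1 [R=301,L]
```

With only around 30 posts, a one-to-one mapping like this is quick to verify by hand before switching the redirects on.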