Unique domains vs. single domain for UGC sites?
-
Working on a client project - a UGC community that has a DTC model as well as a white label model. Is it categorically better to have them all under the same domain? Trying to figure out which is better:
XXX,XXX pages on one site
vs.
A smaller site of XXX,XXX pages plus XX,XXX pages spread across 10-20 other sites, all pointing to the primary site.
The thinking behind the second option was that those domains would likely achieve high DA, as the primary would, and would pass their value on to the primary.
Thoughts? Any other considerations we should be thinking about?
-
-
It depends on how the content on the secondary domains is organized. If each secondary domain has its own content theme, it is easier to earn separate links for each of them, which benefits everyone: users can quickly find (and contribute) what they are looking for, each domain attracts its own topically specific links, and that value passes to the primary. If there is no such theme, all of those secondary domains will compete with each other for mindshare, user contributions, and individual links, which would make it hard for any of them to achieve a high DA.
-
I have a similar setup, but instead of separate domains I use multiple subdomains with content categorized by theme. It helps in many ways: 1) Easier single sign-on - a user logged in on one subdomain does not need to log in again on another (see the cookie sketch below). 2) Each subdomain can attract different types of links. 3) Easier audience segmentation for advertisers. Not every subdomain achieves the same DA or traffic, but internal linking helps the overall network.
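To illustrate the single sign-on point, here is a minimal sketch of how a shared login typically works across subdomains, assuming a cookie-based session and using Flask purely as an example; the domain name, secret, and route are placeholders, not anything from the setup described above. The idea is to scope the session cookie to the parent domain so every subdomain receives it.

```python
# Minimal sketch: one login shared by blog.example.com, forum.example.com, etc.
# Assumes a cookie-based session; "example.com" and the route are placeholders.
from flask import Flask, session

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"

# Scope the session cookie to the parent domain so it is sent to every subdomain.
# A cookie cannot span separate root domains, which is why subdomains make SSO easier.
app.config["SESSION_COOKIE_DOMAIN"] = ".example.com"

@app.route("/login")
def login():
    session["user"] = "demo-user"  # in practice, set only after verifying credentials
    return "Logged in for all *.example.com subdomains"

if __name__ == "__main__":
    app.run()
```

With separate root domains, the equivalent usually requires a central identity provider (OAuth/SAML), which is considerably more work - which is the point being made about subdomains here.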
-