Unique domains vs. single domain for UGC sites?
-
Working on a client project - a UGC community that has a DTC model as well as a white label model. Is it categorically better to have them all under the same domain? Trying to figure out which is better:
XXX,XXX pages on one site
vs.
A smaller XXX,XXX pages on one site and XX,XXX pages on 10-20 other sites all pointing to the primary site.
The thinking on the second option was that those domains would likely achieve high DA as well as the primary, and would pass their value to the primary.
Thoughts? Any other considerations we should be thinking about?
-
It depends on how the content on the secondary domains is organized. If each secondary domain has a content theme, then it would be easier to get separate links for each of them, and thus it benefits everyone: it helps users quickly find/contribute what they are looking for, helps attract different, specific links, and passes value to the primary. If there is no such theme, then all those secondary domains will compete with each other for mindshare, user contributions, and individual links, which would make it harder for them to achieve high DA.
-
I have a similar setup, but instead of separate domains I have multiple subdomains with content categorized by theme. It helps in many ways: 1) Easier single sign-on - a user logged in on one site does not need to log in again on another site. 2) Can attract different types of links. 3) Easier segmentation for advertisers. Not all subdomains achieve the same DA or traffic, but internal linking helps the overall network.
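The single sign-on point above usually comes down to a session cookie scoped to the parent domain, which every subdomain then shares. A minimal stdlib sketch; the domain, cookie name, and flags here are illustrative stand-ins, not the poster's actual setup:

```python
from http.cookies import SimpleCookie

def sso_cookie(session_id):
    # Hypothetical example: a session cookie scoped to the parent domain
    # (.example.com) is sent to cars.example.com, boats.example.com, etc.,
    # so a user logged in on one subdomain stays logged in on the others.
    cookie = SimpleCookie()
    cookie["session"] = session_id
    cookie["session"]["domain"] = ".example.com"  # leading dot: all subdomains
    cookie["session"]["path"] = "/"
    cookie["session"]["secure"] = True            # HTTPS only
    cookie["session"]["httponly"] = True          # not readable from JS
    return cookie["session"].output(header="Set-Cookie:")
```

Calling `sso_cookie("abc123")` yields a `Set-Cookie:` line with `Domain=.example.com`, which is what makes the cookie visible network-wide.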
-
Related Questions
-
I am temporarily moving a site to a new domain. Which redirect is best?
A client is having their site redeveloped on a new platform in sections, and they are moving the sections that are on the new platform to a temporary subdomain until the entire site is migrated. This is happening over the course of 2-3 months. During this time, is it best for the site to use 302 temporary redirects (URL path not changing), or is it best to 301 to the temp domain, then 301 back to the original once the new platform is completely migrated? Thanks!
Intermediate & Advanced SEO | Matt3120
-
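The two options in that question can be sketched as a pair of redirect maps: 302s to the temp subdomain while migration is in progress (temporary, signals stay on the original URLs), then 301s back once it's done. The hostnames below are hypothetical placeholders:

```python
def plan_redirects(paths, final_host, temp_host):
    """Sketch of the two-phase move described in the question.

    Phase 1 (during migration): 302 from the main host to the temp
    subdomain, URL path unchanged.
    Phase 2 (after migration):  301 from the temp subdomain back to
    the main host.
    """
    during = {f"https://{final_host}{p}": (302, f"https://{temp_host}{p}")
              for p in paths}
    after = {f"https://{temp_host}{p}": (301, f"https://{final_host}{p}")
             for p in paths}
    return during, after
```

For example, `plan_redirects(["/about/"], "www.example.com", "new.example.com")` maps the live URL to a 302 at the temp host during migration, and the temp URL to a 301 back afterwards.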
A single page from site not ranking
Hello, We have a new site launched in March that is ranking well in search for all of its pages except one, and we don't know why. This page is optimised exactly the same way as the others, but still doesn't rank in Google. We have checked robots.txt and looked for nofollow/noindex tags, and we have checked whether it was penalized by Google, but still didn't find anything. Initially we had another site on the topic of this page, but we have redirected it to the new one. In case this old site was at any time in the past penalized by Google, could it be possible that the new page is influenced by this? Also, we have another site that ranks in the first position and targets the same keywords as the page that does not rank. It was the first site we launched, so it is pretty old, but we do not have duplicate content on them. Maybe Google doesn't like the fact that both target the same keywords and chooses to display only the old site? Please help us if you have any ideas or have been through such a thing. Thank you!
Intermediate & Advanced SEO | daniela.pirlogea
-
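Ruling the noindex causes in or out can be scripted rather than eyeballed. A minimal stdlib sketch that checks a page's HTML for a robots meta directive and a (pre-fetched) header dict for `X-Robots-Tag`; note the plain-dict header lookup is a simplification, since real HTTP header names are case-insensitive:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects directives from <meta name="robots" content="..."> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                content = attrs.get("content", "")
                self.directives += [d.strip().lower() for d in content.split(",")]

def is_blocked(html, headers):
    # True if either the meta robots tag or the X-Robots-Tag header
    # carries a noindex directive.
    parser = RobotsMetaParser()
    parser.feed(html)
    header_value = headers.get("X-Robots-Tag", "")
    header_directives = [d.strip().lower() for d in header_value.split(",") if d.strip()]
    return "noindex" in parser.directives or "noindex" in header_directives
```

Running this against the one non-ranking page and a page that does rank would show quickly whether a stray directive is the difference.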
Redirect domain or keep separate domains in each country?
Hi all Hoping this might be something that can be answered given the number of variables 🙂
Intermediate & Advanced SEO | IsaCleanse
My main site is www.isacleanse.com.au (obviously targeted to the Australian market) and I also have www.isacleanse.co.nz targeted to NZ. The main keywords I'm targeting are 'Isagenix' for both, plus Isagenix Australia, Isagenix Perth, Isagenix Sydney (Australian cities) and Isagenix NZ, Isagenix New Zealand, Isagenix Auckland etc. for NZ. The Australian site gets a lot more traffic and the Australian market gets a lot more searches - I also have a section www.isacleanse.com.au/isagenix-new-zealand/ on the Australian site. The question is: am I best off redirecting the .co.nz domain completely to the Australian domain to give it extra SEO juice?
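If both country domains are kept, the alternative worth weighing against a redirect is annotating the equivalent pages with hreflang so Google serves the right country version. A small sketch; the helper function and its input format are illustrative, with the URLs taken from the question:

```python
def hreflang_tags(variants):
    """Render <link rel="alternate" hreflang="..."> tags for a set of
    country/language variants of the same page.

    variants: dict mapping hreflang code -> URL of that variant.
    """
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(variants.items())
    )

tags = hreflang_tags({
    "en-au": "https://www.isacleanse.com.au/",
    "en-nz": "https://www.isacleanse.co.nz/",
})
```

Each variant page would carry the full set of tags (including one pointing at itself), which tells Google the two sites are alternates rather than duplicates.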
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys, We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components: 1. Vehicle Listings Pages: this is the page where the user can use various filters to narrow the vehicle listings to find the vehicle they want.
Intermediate & Advanced SEO | browndoginteractive
2. Vehicle Details Pages: this is the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo

We do want the Vehicle Listings pages (#1) indexed and ranking. These pages have additional content besides the vehicle listings themselves, those results are randomized or sliced/diced in different and unique ways, and they're updated twice per day. We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.

We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.

Robots.txt Advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.

Robots.txt Disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would lead to 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're pagerank sculpting?)

Noindex Advantages:
- Does prevent vehicle details pages from being indexed
- Allows ALL pages to be crawled (advantage?)

Noindex Disadvantages:
- Difficult to implement (vehicle details pages are served using Ajax, so they have no tag). The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex tag based on querystring variables, similar to this stackoverflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it)
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindex pages. I say "force" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if blocked by robots.txt

Hash (#) URL Advantages:
- By using # for links on Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with Javascript, the crawler won't be able to follow/crawl these links. Best of both worlds: crawl budget isn't overtaxed by thousands of noindex pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like pagerank sculpting (?)
- Does not require complex Apache stuff

Hash (#) URL Disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since they can't crawl/follow them?

Initially, we implemented robots.txt--the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're pagerank sculpting or something like that.

If we implement noindex on these pages (and doing so is a difficult task itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal, in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO.

My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these ().

Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
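The querystring-based X-Robots-Tag logic described under "Noindex Disadvantages" doesn't have to live in Apache rewrites; the same rule can be expressed in application code. A hedged sketch, where the `vehicle_id` parameter is a hypothetical stand-in for however the details pages are actually identified:

```python
from urllib.parse import urlparse, parse_qs

def robots_header(url):
    """Decide the X-Robots-Tag header for a request URL.

    Hypothetical rule: Vehicle Details pages are reached via a
    ?vehicle_id=... querystring, and only those get noindexed;
    the Vehicle Listings pages stay fully indexable.
    """
    query = parse_qs(urlparse(url).query)
    if "vehicle_id" in query:
        return {"X-Robots-Tag": "noindex, nofollow"}
    return {}
```

Note this approach only works if the pages are *not* disallowed in robots.txt, for exactly the reason the question raises: a blocked crawler never sees the header.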
Renaming your domain from an existing live domain and SEO implications - Please Help *shudder*
Please see the details below. Site A: http://south-african-holiday.mobi is an existing site and is our best site. It is Joomla 3.1 and runs all our ecommerce. Site B: http://www.southerncircle.com/ is our original site and has the best DA, but it is out of date and pretty clunky - Joomla 1.5 - and all bookings (tour site) are redirected to Site A for processing. Instead of redesigning Site A, I'd like to change the domain name of http://south-african-holiday.mobi -> http://southerncircle.com. So far my reading and research (thanks Moz for an awesome forum!) has provided me with: 1. Do the SEO groundwork, i.e. remove dead links from both sites, delete useless content, and generally tidy up both sites. 2. Map all pages from Site A: http://southerncircle.com -> http://south-africa-holiday/ so that the existing pages that have good rankings will have a home on the new site. 3. When ready, do a small sample 301 redirect from http://southerncircle.com to http://south-africa-holiday.mobi. 4. Arghhhh, now I'm stuck... If I redirect to this site then I lose my http://southerncircle.com domain, which is what I want to keep... I just want the .mobi site to move to the southerncircle.com site. I don't consider myself totally thick, but this is really confusing the *$%# out of me. PLEASE could you give me some insight here. I'm sure it has been done before without completely losing the site's SEO ranking and sending my site into SEO oblivion. If there are any Joomla gurus that have done this I'd love to hear from you as well. Many thanks in advance.
Intermediate & Advanced SEO | SoutherlySwell
-
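Step 3's "small sample 301" can be made concrete as a deterministic sample of the full redirect map: 301 a fraction of pages first, watch rankings and traffic, then extend the map to everything. A sketch using the two domains from the question; the paths, sampling fraction, and seed are illustrative:

```python
import random

def sample_redirect_map(paths, old_host, new_host, fraction=0.1, seed=42):
    """Build a path-preserving 301 map for a deterministic sample of pages.

    Using a fixed seed makes the sample reproducible, so the same subset
    of URLs can be monitored across the trial period.
    """
    rng = random.Random(seed)
    count = max(1, int(len(paths) * fraction))
    sample = rng.sample(paths, count)
    return {f"https://{old_host}{p}": f"https://{new_host}{p}" for p in sample}
```

Once the sampled pages hold their rankings on the new domain, the same function with `fraction=1.0` yields the full migration map.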
1 site on 2 domains (interesting situation, expert advice needed)
Dear all, I have read many posts about having one set of content on 2 different domains and how to combine the two to avoid duplicate content. However, the history of my two domains makes this question really difficult. Domain 1: chillispot.org ( http://www.opensiteexplorer.org/links?site=chillispot.org ). The original site was on this domain, started 9 years ago. At that time I was not the owner of the domain. The site was very popular, with lots of links to it. Then, after 5 years of operation, the site closed. I managed to save the content to: Domain 2: chillispot.info ( http://www.opensiteexplorer.org/links?site=chillispot.info ). The content I put there was basically the same. Many links on external sites were changed to chillispot.info when the change was noticed, but lots of links are still unchanged and pointing to the .org domain. The .info is doing well in search engines (for example for the keyword 'chillispot'). Now I have managed to buy the original chillispot.org domain. As you can see, the domain authority of the .org domain is still higher than the .info one, and it has more valuable links. Question is: what would be the best approach to offer content on both domains without being penalized by Google for duplicated content? Which domain should we keep the content on? The original .org one, which is still the better domain but has not been active for several years, or the .info one, which has hosted the content for several years now and is doing well in search engines? And then, after we decide this, what would be the best approach to send users to the real content? Thanks for the answers!
Intermediate & Advanced SEO | Fudge
-
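Whichever domain wins, the usual mechanism is a site-wide, path-preserving 301 from the other one. A minimal WSGI sketch, assuming for illustration that .org is kept (the choice itself is the open question; the hosts are from the question above):

```python
def redirect_app(environ, start_response):
    # Minimal WSGI app: permanently redirect every path on the losing
    # domain to the same path on the chosen domain, so each old URL's
    # link value flows to its direct equivalent rather than the homepage.
    location = "https://chillispot.org" + environ.get("PATH_INFO", "/")
    start_response("301 Moved Permanently",
                   [("Location", location), ("Content-Length", "0")])
    return [b""]
```

Served on chillispot.info, this maps e.g. `/downloads/` to `https://chillispot.org/downloads/`; in practice the same one-liner is usually done in the web server config rather than application code.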
Site dancing
Hi guys, I have a site whose rankings are dancing. I mean, one day it is at position 20; if I add more backlinks it falls, then rises again. I don't know what is going on. The site is 2 years old, PR 2, authority 35. Why is this happening? Usually when it reappears it ranks higher, but today it disappeared totally from the rankings. Maybe it will return tomorrow? But anyway, why is it dancing? Thanks
Intermediate & Advanced SEO | nyanainc
-
Blog - on the domain or place on separate site, now that Panda ranks for bounce, TOP, depth of visit
Over 10 years ago, we decided to run our blog external to our main website. Contrary to conventional wisdom then, we thought we'd have more control/opportunities for generating external anchor text links, plus the benefit of working in a bona fide blog software environment (WP). As we had hoped, the blog generated a lot of strong inbound links, captured inbound links of its own from other sites, and, I think, helped improve our SERPs and traffic. Once the blog was established, and with the redesign of the website, we capitulated and finally moved the blog onto the main domain. After reading a number of pieces on Panda and the new reality of SEO, it sounds like bounce rate (in particular), time on page, and other GA measures may have a more profound influence on Google rankings now. Given that blogs are notorious for high bounce rates (ours is high), low time on site, and shallow depth of visit, it seems logical that the blog adversely affects the site averages for the main domain. Is it time to reconsider pulling our blog off the main domain to reassert the 'true' GA measures of the main domain? I guess it still comes down to the question: is the advantage of all the inbound links to the blog on the main domain of greater value than moving the blog off-site and reasserting better 'site stats' for Google's Panda algo? Thanks.
Intermediate & Advanced SEO | ahw