Unique domains vs. single domain for UGC sites?
-
Working on a client project - a UGC community that has a DTC model as well as a white label model. Is it categorically better to have them all under the same domain? Trying to figure out which is better:
XXX,XXX pages on one site
vs.
A smaller XXX,XXX pages on one site and XX,XXX pages on 10-20 other sites all pointing to the primary site.
The thinking on the second was that those domains would likely achieve high DA as well as the primary, and would pass their value to the primary.
Thoughts? Any other considerations we should be thinking about?
-
It depends on how the content on the secondary domains is organized. If each secondary domain has a content theme, then it is easier to earn separate links for each of them, and everyone benefits: it helps users quickly find (and contribute) what they're looking for, it attracts different, topic-specific links, and it passes that value to the primary. If there is no such theme, all of those secondary domains will compete with each other for mindshare, user contributions, and individual links, which would make it harder for them to achieve high DA.
-
I have a similar setup, but instead of separate domains I have multiple subdomains, with content categorized by theme. It helps in many ways: 1) easier single sign-on - a user logged in on one site does not need to log in again on another; 2) each subdomain can attract different types of links; 3) easier segmentation for advertisers. Not all subdomains can achieve the same DA or traffic, but internal linking helps the overall network.
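A note on the single sign-on point above, since it is the structural reason subdomains behave differently from separate domains here: browsers share a cookie across subdomains only when it is scoped to the parent domain. A minimal sketch of the response header involved (example.com and the cookie name are placeholders):

```http
Set-Cookie: session_id=abc123; Domain=.example.com; Path=/; Secure; HttpOnly
```

A cookie set this way is sent to blog.example.com, shop.example.com, and so on, whereas two separate registrable domains can never share a cookie directly - which is why the multi-domain option in the original question would need a heavier federated-login setup.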
-
Related Questions
-
So many links from a single site?
This guy is ranking for all the high-volume keywords with low-quality content, and he has 1,600 referring domains (see the attachment). How did he get so many links from a single site? Is he going to be penalized?
Intermediate & Advanced SEO | SIMON-CULL
-
Splitting and moving site to two domains - How to redirect
I have a client who is going to split their retail and wholesale business and rebrand the retail biz. So let's say they are going to move everything from currentdomain.com to either retaildomain.com or wholesaledomain.com. The most important business for them is the retail site, so they want to pass on as much ranking power as they can from currentdomain.com to retaildomain.com. I see two choices here:
1. 301 redirect all of currentdomain.com to retaildomain.com, then redirect any wholesale pages on to wholesaledomain.com. The advantage is that we can use GSC's change of address tool to report the change to Google. The downside is that there is a redirect chain (2 hops) to wholesaledomain.com. Would this confuse Google?
2. 301 redirect page by page from currentdomain.com to the appropriate page on either new site. This means no redirect chains, but it also means that we can't use GSC's change of address tool.
Which would you do and why? And is there another option that I'm missing? I appreciate any insights you can share.
Intermediate & Advanced SEO | rich.owings
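For what it's worth, the page-by-page option (#2) doesn't have to mean thousands of hand-written rules if the URL structures map cleanly. A rough sketch, assuming Apache on currentdomain.com and assuming (purely for illustration) that the wholesale content lives under a /wholesale/ path:

```apache
# .htaccess on currentdomain.com
RewriteEngine On

# Wholesale pages go straight to the wholesale domain - no intermediate hop
RewriteRule ^wholesale/(.*)$ https://www.wholesaledomain.com/$1 [R=301,L]

# Everything else goes straight to the retail domain
RewriteRule ^(.*)$ https://www.retaildomain.com/$1 [R=301,L]
```

Because each request gets exactly one 301, there is no chain to worry about; the tradeoff, as the question notes, is that GSC's change of address tool expects a whole-site move to a single destination.
-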
Ecommerce: A product in multiple categories with a canonical to create a 'cluster' in one primary category vs. a single listing at root level with a dynamic breadcrumb
OK – bear with me on this… I am working on some pretty large ecommerce websites (50,000+ products) where it is appropriate for some individual products to be placed within multiple categories/sub-categories. For example, a Red Polo T-shirt could be placed within:
Men's > T-shirts > Red T-shirts
Men's > T-shirts > Polo T-shirts
Men's > Sale > T-shirts
Etc. We're getting great organic results for our general T-shirt page (for example) by clustering creative content within its structure – "Top 10 tips on wearing a t-shirt" (obviously not, but you get the idea). My instinct tells me to replicate this with products too. So, of all the locations mentioned above, make sure all polo shirts (no matter what colour) have a canonical set within Men's > T-shirts > Polo T-shirts. The presumption is that this will help build the authority of the Polo T-shirts page – which obviously presumes "polo shirts" gets more search volume than "red t-shirts". My presumption that this is the best option, even though it is very difficult to manage with a large inventory, comes from experience: taking the time and being meticulous when it comes to SEO is the only way to achieve success. From an administration point of view, it is a lot easier to have all product URLs at the root level and develop a dynamic breadcrumb trail – so all roads can lead to that one instance of the product. There's no need for canonicals, no need for ecommerce managers to remember which primary category to assign product types to, and keeping everything at root level also means there's no reason to worry about redirects if products move from sub-category to sub-category, etc. What do you think is the best approach? Do 1000s of canonicals and redirects look 'messy' to a search engine over time? Any thoughts and insights greatly received.
Intermediate & Advanced SEO | AbsoluteDesign
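For reference, the 'cluster' approach described in the question comes down to one tag in the head of every duplicate listing; a minimal sketch with placeholder URLs:

```html
<!-- On /mens/sale/t-shirts/red-polo-shirt, /mens/t-shirts/red-t-shirts/red-polo-shirt,
     and any other duplicate paths, point to the chosen primary category URL: -->
<link rel="canonical" href="https://www.example.com/mens/t-shirts/polo-t-shirts/red-polo-shirt" />
```

Worth remembering that search engines treat the canonical as a strong hint rather than a directive, so consistent internal linking to the primary URL still matters.
-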
Combining two existing sites into a single Magento install
Hi, we run an online beauty ecommerce store and recently acquired one of our competitors. Their site also runs on Magento, and they sell 70% the same products as us. We plan to merge the new site into our existing Magento install but keep both sites looking exactly as they do now, with different themes, different product names, product descriptions, product prices, category structures, etc. In theory the customer would have no idea both sites run from the same Magento install; they will look just as they do now. My question is: will Google possibly slap the SERPs of either site because we have combined them onto the same server and same Magento install, even though nothing on either site actually changed on the front end? Both sites already have the same ownership information in the domain WHOIS, and a quick company search would reveal that we legally own both businesses under the same company. So it's not something we are trying to hide; we are open about it, and plan to continue running both sites long term, with each site targeted to a slightly different audience, with 30% different products at different price points. Has anyone done this before? Were there any SEO risks or SERP drops? Would love some advice on this matter before we make the move; the possible blowback is way too massive to do it without firm advice saying the risk is very low. Brad.
Intermediate & Advanced SEO | rec123
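Running two storefronts from one install is a native Magento feature (multiple "websites" sharing one codebase and admin), so the technical side is well-trodden. A rough sketch of the Apache side under Magento 1 conventions - all domains, paths, and website codes below are placeholders:

```apache
# Two domains, one Magento docroot; each loads its own "website" scope
# (own theme, product names, prices, category structure)
<VirtualHost *:80>
    ServerName www.yourstore.com
    DocumentRoot /var/www/magento
    SetEnv MAGE_RUN_CODE "store_one"
    SetEnv MAGE_RUN_TYPE "website"
</VirtualHost>

<VirtualHost *:80>
    ServerName www.acquiredstore.com
    DocumentRoot /var/www/magento
    SetEnv MAGE_RUN_CODE "store_two"
    SetEnv MAGE_RUN_TYPE "website"
</VirtualHost>
```
-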
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi guys, we have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similar to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: this is the page where the user can use various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details Pages: this is the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and ranking. These pages have additional content besides the vehicle listings themselves, those results are randomized or sliced/diced in different and unique ways, and they're updated twice per day. We do not want #2, the Vehicle Details pages, indexed, as these pages appear and disappear all the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query. We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement.
- Conserves crawl budget for large sites.
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow those internal links, thereby minimizing indexation, but this would leave 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).
Noindex advantages:
- Does prevent Vehicle Details pages from being indexed.
- Allows ALL pages to be crawled (advantage?).
Noindex disadvantages:
- Difficult to implement: the Vehicle Details pages are served via Ajax, so they have no head of their own to hold a meta tag. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending noindex based on querystring variables, similar to this Stack Overflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it's blocked by robots.txt.
Hash (#) URL advantages:
- By using hash (#) URLs for the links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl those links. Best of both worlds: crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links that were getting robots.txt-disallowed pages indexed are gone.
- Accomplishes the same thing as nofollowing those links, but without looking like PageRank sculpting (?).
- Does not require complex Apache stuff.
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt - the "sledgehammer solution." We figured we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate Vehicle Details pages, and we wanted it to be as if these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that. If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallow in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of Vehicle Details pages, all of which are noindexed; it could easily get stuck/lost, it seems like a waste of resources, and it feels in some shadowy way bad for SEO. My developers are pushing for the third solution: the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping Vehicle Details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links structured like this. Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
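On the noindex option: the Apache/X-Robots-Tag approach the question refers to can be fairly compact. A sketch, assuming Apache with mod_rewrite and mod_headers, and a hypothetical vehicle_id querystring parameter identifying the details pages:

```apache
RewriteEngine On

# Flag any request whose querystring marks it as a vehicle details page
RewriteCond %{QUERY_STRING} (^|&)vehicle_id= [NC]
RewriteRule ^ - [E=VDP:1]

# Send the noindex signal in the HTTP response header,
# since these Ajax-served pages have no head for a meta tag
Header set X-Robots-Tag "noindex, nofollow" env=VDP
```

As the question itself notes, this only works if the pages stay crawlable: the robots.txt disallow has to come out, or Googlebot never sees the header.
-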
Redirect ruined domain to new domain without passing link juice
A new client has a domain which has been hammered by bad links, updates, etc., and it's basically on its arse because of previous SEO guys. They have various domains for their business (brand.com, brand.co.uk) and want to use a fresh domain and take it from there. Their current domain is brand.com (the ruined one). They're not bothered about the rankings for brand.com, but they want to redirect brand.com to brand.co.uk so that previous clients can find them easily. Would a 302 redirect work for this? I don't want to set up a 301 redirect as I don't want any of the crappy links pointing across. Thanks!
Intermediate & Advanced SEO | jasonwdexter
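The 302 itself is a one-liner on the old domain; a sketch assuming Apache, with the domains as given in the question:

```apache
# Temporary redirect: visitors reach the new site, without the
# explicit "pass everything along" signal of a 301
Redirect 302 / https://www.brand.co.uk/
```

One caveat worth weighing: a 302 left in place for a long time may eventually be treated as permanent by search engines, so this isn't a guaranteed firewall against the old domain's baggage.
-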
Move blog from subdomain to main domain on ecom site?
I am wondering what my fellow Mozzers think. I'm pretty set on my direction but want to get any other input to aid in my decision. I have an ecom site with a blog at www.blog.maindomain.com. The blog is fairly new with no major rankings. There are only about 30 posts. This isn't a super competitive market and blogging won't be a huge part of our content strategy, but I would like to use it for passing juice, etc. Would you go through the trouble of moving the blog to www.site.com/blog and redirecting all the old content to the new URLs?
Intermediate & Advanced SEO | PEnterprises
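If you do move it, the redirect side is mechanical; a sketch assuming Apache on the blog subdomain, with maindomain.com standing in for the real domain:

```apache
RewriteEngine On

# Map every old blog URL to the same path under /blog/ on the main domain
RewriteCond %{HTTP_HOST} ^(www\.)?blog\.maindomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.maindomain.com/blog/$1 [R=301,L]
```
-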
Site Architecture: Cross Linking vs. Siloing
I'm curious to know what other Mozzers think about silos... Can we first all agree that a flat site architecture is best practice? Relevant pages should be grouped together. Shorter, broader and (usually) therefore higher-volume keywords should be towards the top of each category. Navigation should flow from general to specific. Agreed? As Google says on page 10 of their SEO Starter Guide, "you should think about how visitors will go from a general page (your root page) to a page containing more specific content." OK, we all agree so far, right? Great! Enter my question: Bruce Clay (among others) seems to recommend siloing as a best practice, while Richard Baxter (and many others @ SEOmoz) seem to view silos as a problem. Me? I've practiced (relevant) internal cross-linking and have intentionally avoided siloing in almost all cases. What about you? Is there a time and place to use silos? If so, when and where? If not, how do we rectify the seemingly huge differences of opinion between expert folks such as Baxter and Clay?
Intermediate & Advanced SEO | DonnieCooper