Which is better for SEO: one big site or a number of smaller sites?
-
Hello, I am about to create a website with product reviews for a certain niche.
What I want to know: is it better to have one site with all the reviews, like nicheproductsreviews.com, with pages such as
nicheproductsreviews.com/product-one-review.html
and
nicheproductsreviews.com/product-two-review.html
or
buy multiple domains so that the product name is in the domain itself, like product-one-review.com and product-two-review.com?
As far as I understand, the first approach keeps all pages on the same site, consolidating all the link juice there. The second approach, however, lets me have the product name in the main domain URL.
Which way is better for SEO and why?
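To make the two options concrete, here is a minimal sketch of how the URLs would differ under each structure (the domain names and slug format are illustrative assumptions, not a recommendation):

```python
# Sketch: the two URL schemes being compared. Domain names and the
# "<slug>-review" naming convention are assumptions for illustration.

def slugify(name: str) -> str:
    """Lowercase a product name and join its words with hyphens."""
    return "-".join(name.lower().split())

def single_site_url(product: str, domain: str = "nicheproductsreviews.com") -> str:
    """Option 1: every review is a page on one domain (links consolidate)."""
    return f"https://{domain}/{slugify(product)}-review.html"

def per_product_domain_url(product: str) -> str:
    """Option 2: each review gets its own exact-match domain."""
    return f"https://{slugify(product)}-review.com/"

for p in ["Product One", "Product Two"]:
    print(single_site_url(p))
    print(per_product_domain_url(p))
```

Under option 1, every inbound link to any review strengthens the same domain; under option 2, each domain starts from zero and earns links on its own.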
-
It may be that when someone searches for product one, product-one-review.com ranks better than the product one page on the single .com site, all other things (links etc.) being equal.
However, what happens if a visitor wants to buy product one and product two in the same visit and the same transaction? Would they have to jump between two different sites? Would the product one site carry duplicate content about product two that also appears on the product two site?
There are non-SEO considerations to take into account.
I would go for the single .com site, with a page devoted to and fully optimised for each product, and then build links to those product pages.
Hope this helps!
-
You are completely right. Keywords in the domain were the only reason I was thinking of spreading them out. And I too have heard that the importance of "keyword in domain" is being dialed down. The niche I am trying to enter is not that competitive, so I would assume I can get the same advantage by building 10-20 .edu and .gov links.
Unless I am mistaken somewhere, I now plan to go with the "all-in-one" structure.
-
Well, in terms of keywords in the domain: historically, in my experience, that has been a great benefit. However, there has been talk that this signal will be dialed down soon.
Is it your intention to inter-link the sites?
There doesn't seem to be any additional benefit to spreading the reviews across multiple domains, at least not based on the example given.