Would it be a good idea to duplicate a website?
-
Hello,
Here is the situation: let's say we have a website, www.company1.com, which is 1 of the 3 main online stores catering to a specific market. In an attempt to capture a larger market share, we are considering opening a second website, say www.company2.com.
The two websites would have different URLs but offer the same products for sale to the same clientele. With this second website, the theory is that instead of operating 1 of 3 stores, we would now operate 2 of 4.
We see 2 ways of doing this:
- we launch www.company2.com as a copy of www.company1.com.
- we launch www.company2.com as a completely different website.
The problem I see with either of these approaches is duplicate content. I think the duplicate content issue would be even more of a problem with the first approach, where the entire site is essentially a duplicate. With the second approach, I think it can be worked around by having completely different product pages and an overall different website structure.
Do you think either of these approaches could result in penalties by the search engines?
Furthermore, we all know that higher rankings and increased traffic can be achieved through high-quality unique content, a social media presence, ongoing link-building and so on. Now, assuming we have a fixed amount of manpower for these tasks: do you think we have better odds of increasing our overall traffic by splitting that manpower across 2 websites, or by putting it all behind a single one?
Thanks for your help!
-
Hello Travis,
I'm on the same page as you - I just wanted a third party's opinion.
Thank you!
-
First, you're totally right about the possible duplicate content issue. No matter what, don't make a duplicate. If the only interest is crowding SERPs, you're destined for a bad time.
If your manpower is fixed in the short run, you're better off focusing on one property. If the systems administration on one property alone is hard, two will be killer - and that's before you add the content, link-building and social work on top. You'll end up hating your life.
Clients have come to me with dozens of keyword+geo sites, and the results are usually the same. That's when the culling of the chaff begins, and it ends with a single property that performs better than the dozen combined.
Instead, consider another strategy with your existing property. Is your product something that lends itself to the experiential side of things? Is there potential for repeat purchases from one client? Can they upgrade? Consider what you can do for the buyers you already have.
If you can increase LTV (customer lifetime value), you might be able to increase staff in the long run, and that means more hands for more complex and interesting work. That's how I would think about the situation.
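To put rough numbers on that, here's a back-of-the-envelope sketch in Python - every figure is invented purely for illustration, so plug in your own store's numbers:

# All figures are made up for illustration only.
average_order_value = 50.0   # dollars per order
orders_per_year = 4          # how often a typical customer buys
years_retained = 3           # how many years they keep buying

ltv = average_order_value * orders_per_year * years_retained
print(ltv)  # 600.0

# Lifting repeat purchases from 4 to 6 per year moves LTV to 900.0
# without acquiring a single new customer.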
Related Questions
-
How to re-rank an established website with new content
I can't help but feel this is a somewhat untapped resource with a distinct lack of information.
White Hat / Black Hat SEO | ChimplyWebGroup
There is a massive amount of information around on how to rank a new website, or on techniques to increase SEO effectiveness, but ranking a whole new set of pages - or 're-building' a site that may have suffered an algorithmic penalty - is a harder nut to crack in terms of information and resources. To start, I'll describe my situation: SuperTED is an entertainment directory SEO project.
It seems likely we suffered an algorithmic penalty at some point around Penguin 2.0 (May 22nd), as traffic has dropped steadily since then, though not too aggressively. Then, to coincide with the newest Panda update in late September this year (Panda 27, according to Moz), we decided it was time to re-assess our tactics and bring the site in line with Google's guidelines. We've slowly built a natural link profile over the past two years, but it's likely thin content was also an issue. So from the beginning of September to the end of October we took these steps:
- Contacted webmasters to remove links (unfortunately there was some 'paid' link-building before I arrived).
- Disavowed the rest of the unnatural links that we couldn't get removed manually.
- Worked on pagespeed as per Google's guidelines until we received high scores in the majority of speed-testing tools (e.g. WebPageTest).
- Redesigned the entire site with speed, simplicity and accessibility in mind.
- Used .htaccess to rewrite 'fancy' URLs, removing file extensions and simplifying the link structure.
- Completely removed two or three pages that were quite clearly just trying to 'trick' Google - think a large page of links that simply said 'Entertainers in London', 'Entertainers in Scotland', etc. We 404'ed them and asked for URL removal via WMT; we're considering 410'ing them.
- Added new content and pages that seem to follow Google's guidelines as far as I can tell, e.g. a main category page and sub-category pages.
- Started to build new links to our now content-driven pages naturally by asking our members to link to us via their personal profiles. We offered an internal reward system for this, so we've seen a fairly good turnout.
- Worked on many other possible ranking factors: adding Schema data, optimising for mobile devices as best we can, adding a blog and publishing original content on it, expanding our social media reach, custom 404 pages, removing duplicate content, using Moz, and much more.
It's been a fairly exhaustive process, but we were happy to do it to stay within Google's guidelines. Unfortunately, some of the link-wheel pages mentioned above were the only pages driving organic traffic, so once we were rid of them, traffic dropped to not even 10% of what it was previously. Equally, with the changes (via .htaccess) to the link structure and the creation of brand new pages, we've lost many of the pages that previously held Page Authority.
We've 301'ed the pages that were replaced with much better content under a different URL structure - http://www.superted.com/profiles.php/bands-musicians/wedding-bands to simply http://www.superted.com/profiles.php/wedding-bands, for example. So with the loss of the spammy pages and the creation of brand new content-driven pages, we've probably lost up to 75% of the old website, including the pages that were driving any traffic at all (even with potential thin-content algorithmic penalties). Because of the loss of entire pages, the URL changes and the rest discussed above, the site likely looks very new and heavily updated over a short period of time. What I need to work out is a campaign to drive traffic to the 'new' site.
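For what it's worth, a quick way to sanity-check that the old URLs really do reach the new ones in a single permanent hop is a short Python sketch along these lines (using the requests library; the URL pair is just the example above):

import requests

# Old URL -> expected new URL (the example pair from above).
old_url = "http://www.superted.com/profiles.php/bands-musicians/wedding-bands"
new_url = "http://www.superted.com/profiles.php/wedding-bands"

resp = requests.get(old_url, allow_redirects=False, timeout=10)
location = resp.headers.get("Location", "")
# We want a single permanent (301) redirect straight to the new URL,
# not a chain and not a temporary (302) redirect.
print(resp.status_code, location)
print("OK" if resp.status_code == 301 and location == new_url else "CHECK")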
We're naturally building links through our own customer base, so they will likely be seen as quality, natural link-building.
Perhaps the sudden occurrence of a large number of 404s and 'lost' pages is affecting us?
Perhaps we're yet to be re-indexed properly, but it has been almost a month since most of the changes were made, and we'd often be re-indexed 3 or 4 times a week before the changes.
Our events page is the only one left to update with the new design - could this be affecting us? It may potentially look like two sites in one.
Perhaps we need to wait until the next Google 'link' update to feel the benefits of our link audit.
Perhaps simply getting rid of many of the 'spammy' links has done us no favours - I should point out we've never been issued with a manual penalty. Was I perhaps too hasty in following the rules? I would appreciate a professional opinion, or input from anyone who has experience with a similar process. It does seem fairly odd that following the guidelines and general white-hat SEO advice could cripple a domain, especially one with age (the domain has been established for 10+ years) and relatively good domain authority within the industry. Many, many thanks in advance. Ryan.
-
Page not being indexed or crawled and no idea why!
Hi everyone, there are a few pages on our website that aren't being indexed by Google right now, and I'm not quite sure why. A little background: we are an IT training and management training company with locations/classrooms around the US. To improve our search rankings and overall visibility, we made some changes to the on-page content, URL structure, etc.
Let's take our Washington DC location as an example. The old address was http://www2.learningtree.com/htfu/location.aspx?id=uswd44 and the new one is http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training. Not all of the SEO changes are live yet, so just bear with me. My question is really about why the first URL is still being indexed and crawled and showing fine in the search results, while the second one (which we want to show) is not. The changes have been live for around a month now - plenty of time to at least be indexed. In fact, we don't want the first URL to show anymore; we'd like the second URL type to show across the board.
Also, when I search Google for site:http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training I get a message that Google can't read the page because of the robots.txt file. But we have no robots.txt file. I've been told by our web guys that the two pages are exactly the same, and that an order has been put in to 301 redirect all the old links to the new ones. Still, I'm perplexed as to why these pages are not being indexed or crawled - I've even manually submitted them in Webmaster Tools.
So, why is Google still recognising the old URLs, and why are they still showing in the index/search results? And why is Google saying "A description for this result is not available because of this site's robots.txt"?
Thanks in advance! Pedram
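One thing worth checking is what the server actually returns for the robots.txt URL - a clean 404 is harmless, but a server error or a redirect to an odd page can make Google treat the whole site as blocked. A rough Python sketch (using the requests library and the standard-library robots.txt parser; the URLs are the ones from this question):

import requests
from urllib import robotparser

# See what the server really serves for robots.txt.
resp = requests.get("http://www2.learningtree.com/robots.txt", timeout=10)
print(resp.status_code)
print(resp.text[:300])

# If something does come back, test whether Googlebot would be allowed
# to fetch the new URL under those rules.
parser = robotparser.RobotFileParser()
parser.parse(resp.text.splitlines())
print(parser.can_fetch(
    "Googlebot",
    "http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training",
))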
White Hat / Black Hat SEO | CSawatzky
-
Are Meta Keywords Important For Websites?
Hi, I understand that meta titles and descriptions are very important for websites. I would like to know whether meta keywords are important too. I have seen people say that meta keywords are useless and that the tag should be removed from a website to prevent competitors from learning your keywords. Does anyone have anything to share? 🙂
White Hat / Black Hat SEO | chanel27
-
Technorati links: good or bad?
Hi all, After an unnatural link warning, I am about to submit my third reconsideration request, having had my previous two turned down. I have manually removed hundreds of spammy links (thousands if you include sitewide links) and used the disavow tool on hundreds more that I could not get removed manually. The backlinks I have remaining all seem to be genuine. There are quite a few backlinks from Technorati; I thought these were legitimate links, but I'm now wondering whether I should remove/disavow them. Does anybody have any opinions?
White Hat / Black Hat SEO | shauny35
-
Are links from directories good or bad?
I've done a lot of competitive link analysis lately and found that a lot of my competitors' links for a certain keyword come from low-quality directory sites, yet they're outranking my site. This leads me to my question, which may or may not have an answer (I at least hope it fuels a good discussion)... Are links from directory sites good or bad for SEO?
White Hat / Black Hat SEO | TylerReardon
-
Article Re-posting / Duplication
Hi Mozzers! Quick question for you all - this is something I've been unsure of for a while. When a guest post you've written goes live on someone's blog, is it then okay to post the same article to your own blog, and to Squidoo for example? Would the search engines still see it as duplication if I include a link back to the original?
White Hat / Black Hat SEO | Webrevolve
-
Would Headspace Plug-in be a bad idea?
We use the HeadSpace plug-in for some posts because we want certain things to display a particular way on our site (i.e. with a certain title) while keeping the title more descriptive for Google. It used to work really well, but I've now noticed that a lot of posts that used to do really well in search are being flagged for multiple meta descriptions and headers, and I wonder whether that could be harming the site's query stats. Does anyone think that after the Penguin/Panda updates, using HeadSpace might be a negative option?
White Hat / Black Hat SEO | luwhosjack
-
Do backlinks with good anchor text from bad sites help?
Hi, In the Netherlands, SEO competition for terms like 'loans' is fierce. I see a website in this industry that seems to be doing very well based on links with good anchor text from sites that seem quite worthless to me, such as http://www.online-colleges-helper.com/ and http://www.alohapath.com/. My question is: is it worth pursuing this type of link? I assume these must be paid links, or am I wrong? I'd really rather not go down this route, but I don't want to be outranked by someone who is using these types of links... Many thanks in advance for any kind of insight! Annemieke
White Hat / Black Hat SEO | AnnemiekevH