Would it be a good idea to duplicate a website?
-
Hello,
Here is the situation: let's say we have a website, www.company1.com, which is one of the three main online stores catering to a specific market. In an attempt to capture a larger market share, we are considering opening a second website, say www.company2.com.
The two websites would have different URLs but offer the same products to the same clientele. The theory is that instead of operating 1 of 3 stores, we would now operate 2 of 4.
We see 2 ways of doing this:
- we launch www.company2.com as a copy of www.company1.com.
- we launch www.company2.com as a completely different website.
The problem I see with either approach is duplicate content. I think the issue would be even more of a problem with the first approach, where the entire site is essentially a duplicate. With the second approach, I think it can be worked around by writing completely different product pages and using a different overall website structure.
Do you think either of these approaches could result in penalties from the search engines?
Furthermore, we all know that higher rankings and increased traffic are achieved through high-quality unique content, social media presence, ongoing link building, and so on. Assuming we have a fixed amount of manpower for these tasks, do you think we have better odds of increasing our overall traffic by splitting that manpower across two websites, or by putting it all behind a single one?
Thanks for your help!
-
Hello Travis,
I'm on the same page as you - I just wanted a third party's opinion.
Thank you!
-
First, you're right about the potential duplicate content issue. Whatever you do, don't launch a straight copy. If your only goal is crowding the SERPs, you're destined for a bad time.
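If you want a rough read on how much two pages actually overlap before a search engine decides for you, here is a minimal sketch using Python's standard library. The URLs are hypothetical stand-ins for the two stores, and comparing raw HTML is crude (shared templates inflate the score), but it's enough to flag near-duplicates:

```python
import difflib
import urllib.request


def fetch(url: str) -> str:
    """Download a page as text (naive: no JS rendering, no template stripping)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="ignore")


def similarity(url_a: str, url_b: str) -> float:
    """Return a 0..1 ratio; values near 1.0 mean near-duplicate pages."""
    return difflib.SequenceMatcher(None, fetch(url_a), fetch(url_b)).ratio()


# Hypothetical product pages on the two stores from the question.
print(similarity("https://www.company1.com/product-x",
                 "https://www.company2.com/product-x"))
```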
If your manpower is fixed in the short run, you're better off focusing on one property. If the systems administration for one property alone is hard, two will be killer, and that's before you count everything else a site needs. You'll end up hating your life.
Clients have come to me with dozens of keyword+geo sites, and the results are usually the same: a pile of thin properties splitting every resource. That's when the culling of the chaff begins, and it ends in a single property that performs better than the dozen combined.
Instead, consider another strategy with your existing property. Does your product lend itself to the experiential side of things? Is there potential for repeat purchases from a single client? Can they upgrade? Consider what you can do for the buyers you already have.
If you can increase LTV (customer lifetime value), you might be able to add staff in the long run, and that means more hands for more complex and interesting work. That's how I would think about the situation.
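For reference, one common back-of-the-envelope model of LTV; a minimal sketch with made-up numbers:

```python
def lifetime_value(avg_order_value: float,
                   purchases_per_year: float,
                   retention_years: float) -> float:
    """One common simplification: LTV = average order value
    x purchases per year x years a customer stays."""
    return avg_order_value * purchases_per_year * retention_years


# Illustrative figures only: an $80 average order, 3 orders a year,
# and a customer who sticks around for 4 years.
print(lifetime_value(80.0, 3.0, 4.0))  # 960.0
```

Even small gains in repeat purchases or order value compound across every existing customer, which is why this lever can beat chasing traffic for a second property.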
Related Questions
-
Backlinks from customers' websites. Good or bad? Violation?
Hi all, let's say a company has 100 customers and somehow gets a backlink from each of their websites, usually the "powered by xyz" sort of link. Is anything wrong with this? Is it a sound backlink strategy, or a violation of Google's guidelines? Also, most of the customers' websites do not have good DA; is there any benefit to getting backlinks from such below-average-DA websites? Thanks
White Hat / Black Hat SEO | vtmoz
-
Duplicate content warning: same page but different URLs?
Hi guys, a friend of mine has a site, and when I tested it with Moz I noticed 80 duplicate content warnings. For instance, page 1 is http://yourdigitalfile.com/signing-documents.html and the warning page is http://www.yourdigitalfile.com/signing-documents.html. Another example: http://www.yourdigitalfile.com/ and http://yourdigitalfile.com. Nearly every page on the site has a second version at a different URL. Any ideas why the developer would do this? Also, the pages that received the warnings are not redirected to the newer pages; you can reach either one. Thanks very much
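One quick way to verify whether the standard fix (a sitewide 301 to a single canonical host) is in place; a minimal sketch using Python's standard library and the URLs from the question:

```python
import urllib.request


def check_redirect(url: str) -> None:
    """Print where a URL ends up; urllib follows redirects, so a changed
    final URL means a 301/302 hop is in place."""
    with urllib.request.urlopen(url) as resp:
        final = resp.geturl()
        verb = "redirects to" if final != url else "serves directly at"
        print(f"{url} {verb} {final}")


# On a healthy setup, exactly one of these hosts 301s to the other.
check_redirect("http://yourdigitalfile.com/signing-documents.html")
check_redirect("http://www.yourdigitalfile.com/signing-documents.html")
```

If neither host redirects, a sitewide 301 (or at minimum rel=canonical tags pointing at one version) is the usual remedy.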
White Hat / Black Hat SEO | ydf
-
International Website Targeting
Hello fellow Mozzers, I had a quick question. We have a new eCommerce client interested in launching a website in multiple countries: per their vision, a US site, a UK site, a Japan site, and so on. I have a few concerns about doing it this way. First, there is the issue of the sites being the same; the only difference will be the domain, such as domain.co.jp for the Japan-based site, domain.co.uk for the UK, etc. Even if we target different countries in Webmaster Tools, won't the sites still compete with one another and potentially get tagged as duplicates? I'm thinking there has to be a better way to target the world without having to clone, duplicate, and relaunch. Anyone have experience with this?
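The usual mitigation for same-content regional sites is hreflang annotations, which tell Google the domains are alternates for different markets rather than competing duplicates. A minimal sketch that builds the tag block each regional page would carry; the domains are the hypothetical ones from the question:

```python
# Hypothetical regional domains from the question, keyed by language-region.
ALTERNATES = {
    "en-us": "https://www.domain.com/",
    "en-gb": "https://www.domain.co.uk/",
    "ja-jp": "https://www.domain.co.jp/",
}


def hreflang_tags(path: str = "") -> str:
    """Build the <link rel="alternate"> block; every regional version of a
    page should list all of its siblings, including itself."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{root}{path}" />'
        for lang, root in ALTERNATES.items()
    )


print(hreflang_tags("products/widget"))
```

Localizing currency, spelling, and shipping details on each regional site helps too; identical bodies with only a different TLD are what invite the duplicate flag.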
White Hat / Black Hat SEO | David-Kley
-
Increase in spammy links from image gallery websites, e.g. myimagecollection.net
Hi there, I've recently noticed a lot of spammy links coming from image gallery sites that all look the same, e.g.: http://mypixlibrary.co/ http://hdimagegallery.net/ http://myimagecollection.net/ http://pixhder.com/ Has anyone else seen links from these? They have no contact details, and I'm not sure if they're some form of negative SEO or site spam. Any ideas on how to get rid of them? Thanks
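If the links look like negative SEO and there's no one to contact for removal, a disavow file is the usual recourse. A minimal sketch that writes one in the format Google's disavow tool expects (one domain: directive per line, # for comments), using the domains listed above:

```python
# The gallery domains named in the question.
SPAM_DOMAINS = [
    "mypixlibrary.co",
    "hdimagegallery.net",
    "myimagecollection.net",
    "pixhder.com",
]

with open("disavow.txt", "w") as f:
    f.write("# Spammy image-gallery domains, suspected negative SEO\n")
    f.writelines(f"domain:{d}\n" for d in SPAM_DOMAINS)
```

The file is then uploaded through Search Console's disavow tool.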
White Hat / Black Hat SEO | Kerry_Jones
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi all, I'll preface this by saying that we like to engage in as much white-hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize. We are an IT and management training course provider with 34 locations across the US, and each location offers the same courses. Each location has its own page on our website. However, to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings and dates for each individual topic and city. Right now these dynamic pages are being crawled and ranking well within Google. We ran a very small-scale test in our Washington DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations and 21 course topic areas, that's well over 700 pages of content to maintain, a lot more than the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., with some varying components. From our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the meta description and the custom paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use. For example, we could make the paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' Other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page; we pass the venue code to the page and build it dynamically from that code, so these aren't technically individual pages, although they seem like it on the web. If we don't standardize the text, someone will have to maintain custom text for all active venue codes, for all cities, for all topics. That could be over a thousand records to maintain, depending on how much you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area.' or 'Find your [Topic Area] training course in [City, State] with ease.', each followed by content specific to the location, and then randomize what is displayed. The key is a standardized format so additional work doesn't have to be done to maintain custom text for individual pages." So, Mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards yes, but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
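For what it's worth, here is a sketch of the template selection the engineer describes, with one tweak worth considering: deriving the choice from the venue code instead of true randomness, so a given page always serves the same opener rather than shuffling between crawls. The function and venue code are hypothetical.

```python
import zlib

# The two standardized openers proposed in the question, as format strings.
TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]


def intro_paragraph(venue_code: str, topic: str, city: str, state: str) -> str:
    """Pick an opener deterministically from the venue code, so each
    dynamic page renders stable text; crawl-to-crawl churn would look odd."""
    template = TEMPLATES[zlib.crc32(venue_code.encode()) % len(TEMPLATES)]
    return template.format(topic=topic, city=city, state=state)


# Hypothetical venue code; the unique dates and directions per location
# still have to carry most of the differentiation.
print(intro_paragraph("NYC01", "SharePoint", "New York", "NY"))
```

-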
Duplicate content or not? If you're using abstracts from external sources you link to
I was wondering if a page (a blog post, for example) that offers links to external web pages along with abstracts from those pages would be considered duplicate content and therefore penalized by Google. For example, I have a page with very little original content (just two or three sentences that summarize or frame the topic) followed by five references to different external sources. Each reference contains a title, which is a link, and a short abstract, which is basically the first few sentences copied from the page it links to. So, except for a few sentences at the beginning, everything is copied from other pages. Such a page would be very helpful for people interested in the topic, as the sources it links to have been analyzed and handpicked to enhance the user experience. But will this format be considered duplicate or near-duplicate content?
White Hat / Black Hat SEO | romanbond
-
Is it negative to put a backlink in the footer of our clients' websites?
Hello there! Everything is in the subject of this post, but here is the context: we are a web agency and, among other things, we build websites for our clients (most of them shops). Until now, we have put a link in their footers, like "developed by MyWebShop", but we don't know whether that is bad. From a single client's website we can get hundreds of backlinks at once, but is that good for SEO or not? Will Google penalize us, thinking it's a black-hat practice? Is it better to put our link in the "legal notices" or "disclaimer" section of the websites? What is the best practice for lasting SEO? I hope you understand my question. Thank you in advance!
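A common middle ground is keeping the credit but marking it nofollow, so hundreds of sitewide footer links don't read as a link scheme. A minimal sketch using BeautifulSoup (a third-party library); the markup and URL are hypothetical:

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Hypothetical footer credit of the kind described in the question.
html = '<footer><a href="https://www.mywebshop.example">developed by MyWebShop</a></footer>'

soup = BeautifulSoup(html, "html.parser")
for link in soup.select("footer a"):
    link["rel"] = "nofollow"  # keeps the credit visible without passing PageRank

print(soup)
```

Burying the link in a legal-notices page doesn't change much; what matters is whether the links are followed and sitewide.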
White Hat / Black Hat SEO | mywebshop
-
How does Google rank a website's search-query URLs?
Hello, I can't seem to find an answer anywhere. I was wondering how a website's search-query URL (the keyword-string kind) can rank above other results that have stronger backlinks. The domain is usually strong, but the URL with .php?search=keyword just seems like it doesn't fit in. How does Google index those search-string pages? Is it based on traffic to that URL alone? Those URLs typically don't have backlinks, right? Has anyone ever tried to rank their website's search-query URLs? I'm just a little curious about it. Thanks everyone. Jesse
White Hat / Black Hat SEO | getrightmusic