Duplicate Content Question With New Domain
-
Hey Everyone,
I hope your day is going well. I have a question regarding duplicate content.
Let's say that we have Website A and Website B. Website A is a directory for multiple stores & brands. Website B is a new domain that will serve the delivery niche for these stores & brands (users can click a "Delivery" anchor on Website A and be redirected to Website B). We want Website B to rank organically when someone types "<brand> delivery" into Google. Website B has NOT been created yet.
The Issue
Website B has to be a separate domain from Website A (no getting around this). Website B will also pull all of its content from Website A (menus, reviews, about pages, etc.).
Will we face any duplicate content issues on either Website A or Website B in the future? Should we rel=canonical to the main website even though we want Website B to rank organically?
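For reference, a cross-domain rel=canonical is just a link element in the head of each Website B page pointing at the equivalent Website A URL. A minimal sketch, assuming hypothetical domains and paths:

```html
<!-- On Website B's page for a given store, e.g. https://site-b.example/stores/acme/
     (hypothetical URLs). The canonical points at the equivalent Website A page: -->
<head>
  <link rel="canonical" href="https://site-a.example/stores/acme/" />
</head>
```

Bear in mind this is a consolidation hint rather than a directive, so Google can choose to ignore it.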
-
Great pleasure - good luck with it all!
-
Great response Nigel and thank you so much for your insight!
-
Hi imjonny
I'm glad you have asked around to be honest. Like I said, I would.
You are right that even if you canonicalize, Google will ultimately decide whether to rank a page it deems important and can ignore the canonicalization. If the canonical isn't bona fide, it could call into doubt the other canonicals on your site, which would be a strong negative signal for SEO and lead to a drop in trust.
So - it depends on what you want to rank for.
Let's say you have Product A on Site A and an equivalent checkout page on Site B. You can't try to rank Site B for the term "Product A" because it just isn't going to happen; we've already said that page will need to be canonicalized to its equivalent on Site A anyway.
The only thing you can hope to rank for is terms like 'Delivery Options', 'Branded Delivery', or 'The Big Delivery Option Site'.
What you can't do is try to rank for the product names; it will be impossible. But then why would you want to? Surely the important thing is to maintain rank for Site A's products, with Site B being more of a slave site - solely functional.
Ultimately you would be canonicalizing the product pages, not the whole site, so maybe there are other pages you can add: Maintenance, Technology, How-to, etc. But frankly those would suit Site A anyway, because if I am buying a product I want as much info as possible before purchase, not on the delivery page.
Oh, and don't create branded content for Site B, because once again you will carve up Site A.
I know it's a big conundrum, but I haven't seen anything like what you are trying to do, so I can only generalise on best practice.
I hope that helps!
Regards
Nigel
-
Hi Nigel,
I got some more responses from other sources, and it seems like duplicating content onto a new site IS a bad idea.
Let's say we canonical to Website A so that Google knows the main page is on Site A. Would Website B still have a chance to index & rank? I've heard that the canonical is just a signal to Google, and that Google will ultimately determine which page it wants to show even if the canonical is there. Is this true?
-
The bigger the site, the bigger the potential loss. No SEO, in my honest opinion, would sanction what you are thinking, no matter how big the site is.
Like I said - cast the question wider than here. It's a shame that other SEOs haven't come on to help you with your thinking.
-
Hi Nigel,
Thanks for the response again! I understand that you may have had sites with shared content, but what was the scale of those websites? Do you think that if Website A were a huge authority, this issue wouldn't be as big of a deal?
We're talking millions of sessions per month.
-
Hi
I have had sites myself with shared content, and the end result was that neither of them ranked at all. They were set up in a pre-Penguin world (before 2011), and when the update really cut in, in September 2012, I lost 60% of my traffic in one day.
I have also worked on many sites that shared content across their own pages, resulting in the same collapse in the SERPs. You can read about the biggest mistake website owners make here: https://moz.com/learn/seo/duplicate-content
In certain circumstances, you can share others' content by way of syndication. You'll see it on Moz occasionally: they will have produced a great article and at a later date will share it across some other article sites, as the authority will already have been established as a Moz article. Note that these are smallish articles, not whole sites.
What you are talking about is, basically, willingly creating a duplicate of Site A. If you do that, your rankings on Site A will fall, and Site B will never gain any rank at all if its content pages are duplicates.
Yes, a competitor could damage your site if they were so inclined. Negative SEO is the practice of sharing your content to a number of sources, thereby creating mass duplication. While Google should recognise yours as the original, that is rarely the case.
Duplicate content is at the very core of SEO. If someone is telling you differently then they are wrong.
However, it is your website, and I would completely agree with your strategy of playing devil's advocate. If it were my site I would want as much corroboration as possible. So go and ask other SEOs, but make damn sure they know what they are talking about and aren't just a 'bloke down the pub', because it can cost you hugely.
We probably lost £½m through our own naivety - never again!
Regards Nigel
-
Hi Nigel,
Thanks for the response again! I have a few questions:
- Why do you think I will destroy Site A? If that logic is true, theoretically, wouldn't you be able to copy someone's site 100% and cause it to be destroyed?
- Have you seen any examples of this before?
I don't mean to neglect your advice, I'm just hearing different things from different people and need an accurate response in order to make the right decision.
-
If you use Website A's content then you must canonicalize; otherwise you will destroy Site A. If you want B to rank independently then it MUST have original content.
This is how it works, I'm afraid. Get help from a copywriter, or a few if that helps keep the cost down.
Regards
Nigel
-
Hey Nigel,
Thanks for your answer! Just to give some reference: Website A is currently up and has been for a long time. It is getting A LOT of traffic and we don't want to risk anything on Website A, which is ranking REALLY well. Also, Website B is being made because of legal issues (I can't really get into it), but it's best if we keep them as separate entities.
Because we're looking at a scale of 1000s of pages of content to rewrite, that doesn't seem like an option. And yes, we will be pulling all of the content from Website A to Website B.
Is the only solution to create completely new content for Website B? Will I face any issues with Website A at all whichever strategy I choose?
-
Hi imjonny
You are going to have a major problem trying to get these two sites listed at all in my opinion.
1. You are creating a multi-brand/store website in Website A, with menus, reviews, and information about the stores & brands.
2. When someone clicks the brand's delivery link on Website A they will be directed to Site B - presumably because Site B handles all of the shipping and checkout processes. If Site B pulls the information from Site A then you will kill both sites. I presume when you say 'pull' you mean it will also display that information on its pages?
So you are creating an unindexable monster that no amount of canonicalization, redirecting or iframe manipulation will help.
Presumably, you need to rank for Site A, but that is not possible if you are pulling its content into Site B. The only sensible thing I can think of is:
1. Canonicalize 'Store 1' on Site B to the equivalent store page on Site A, so that Store 1 on Site B effectively does not exist at all.
2. Call Site B 'Brand Delivery' and write acres of content about delivering brands on the home page, plus a load of supporting pages. You just won't rank for anything on the second site apart from 'Brand Delivery' and contextually similar terms.
It's a weird way of setting things up. If I were shopping on a site I would not want to go to a different site to check out. You will also have two sites to manage, presumably with the same NAP (name, address, phone - i.e. site ownership and address), which will not help.
The only way is to keep the two sites' content mutually exclusive and use canonicals, which of course can be used between different domains.
If it were me, I would keep Site A and ditch the B idea, but ho hum!
Kind Regards
Nigel
-
Unfortunately, there is no way around it
I think the best option would be to just use the same domain, but that is really something we aren't able to do.
iframe sounds interesting, but we do still want the content (at least the menu of products) indexed.
Changing the content is also out of the question. Way too hard to scale with how many we would have to change.
-
Hey. If you rel=canonical, I don't see how these pages would rank anymore - you're basically telling Google that the other pages are the better ones. Is there no way around the duplicate content? This is a really strange/problematic situation.
I think your best bet is either using some sort of iframe, if that's an option (it doesn't necessarily need to look bad), or doing your best to change the content.
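For illustration, the iframe option would mean Site B embedding Site A's pages rather than copying their HTML. A minimal sketch, assuming hypothetical URLs:

```html
<!-- Hypothetical: Site B's menu page embeds the equivalent Site A page
     instead of duplicating its markup. Search engines generally attribute
     iframed content to its source URL, so this avoids the duplication but
     also means the menu is unlikely to be indexed as part of Site B. -->
<iframe src="https://site-a.example/stores/acme/menu/"
        title="Acme menu (served from Site A)"
        width="100%" height="800" style="border:0"></iframe>
```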