Can I Point Multiple Exact Match Domains to a Primary Domain? (Avoiding Duplicate Content)
-
For example, let's say I have three domains: two exact-match domains for individual products, plus product.com.
The first two domains would have very similar text content, but with different products. The product.com domain would have similar content, with all of the products in one place. Transactions would be handled through the primary domain (product.com).
The purpose of this would be to capitalize on exact-match domain (EMD) opportunities.
I found this seemingly old article: http://www.thesitewizard.com/domain/point-multiple-domains-one-website.shtml
The article states that you can avoid duplicate-content issues and have all links attributed to the primary domain.
What do you guys think about this? Is it possible? Is there a better way of approaching this while still taking advantage of the EMDs?
-
Sure, from what I know the EMDs that were hit were spammy. But here's the thing: why work on three sites and create partially duplicated copy, in the hope that their being EMDs gives you a boost, when you could create great, original copy for one site and have it rank on the strength of the quality work and effort you put into it? Odds are you'll have better luck that way than by linking between your own EMDs like that.
And if the plan is to 301 redirect the two EMDs to the primary, those EMDs would need strong link profiles of their own to contribute much once redirected. You could spend that time working on the primary site instead of spreading your effort across three sites.
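For reference, if you did go the redirect route, a domain-level 301 in an Apache .htaccess file would look something like this (just a sketch; the EMD hostname below is a placeholder, with product.com standing in as the primary from the question):

```apache
# Permanently (301) redirect every request for the EMD, with or
# without the www prefix, to the same path on the primary domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?blue-widgets\.com$ [NC]
RewriteRule ^(.*)$ http://product.com/$1 [R=301,L]
```

Each EMD would carry a rule like this, so visitors and crawlers requesting the EMD land on the primary instead of a duplicate copy of the content.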
-
It was my understanding that you can still benefit from an EMD that is a high-quality, relevant website.
I thought that the EMDs that did take a hit were spammy, low-quality sites?
http://moz.com/blog/googles-emd-algo-update-early-data
What is your opinion about this?
Assuming this is correct, do you have any suggestions pertaining to the original question?
-
Exact-match domains took a hit in Google recently, so trying to capitalize on something like that won't necessarily help you. Instead of splitting your work across three sites in the hope of making one rank better, you'd be better served by focusing your efforts on the main site, giving it the best, most relevant content you can while building its link equity.
-
Related Questions
-
What If I Don't Use an H1, but Rather Multiple H2s with Keywords?
The reason I don't want to use an H1 is that I can only have one H1, whereas I can use several H2s. Is that going to help me rank? Since Google gives more weight to an H1 than to an H2, will Google give my H2s the same priority or less? And if less, how much less? For example, if an H1 gets a score of 90 and my H1 is missing, how much would my H2s score out of 100? (I know there is no such metric; I'm just wondering.)
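For illustration, the conventional structure keeps a single H1 as the page's main heading, with H2s for supporting sections; a minimal sketch (heading text hypothetical):

```html
<!-- One H1 for the page's primary topic; H2s mark the subsections. -->
<h1>Blue Widget Repair Guide</h1>
<h2>Common Blue Widget Problems</h2>
<p>...</p>
<h2>Blue Widget Maintenance Tips</h2>
<p>...</p>
```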
White Hat / Black Hat SEO | Sam09schulz
-
Moving Pages Up a Folder to Come Off the Root Domain
Good morning. I've been doing some competitor research to see why competitors are ranking higher than us, and I noticed that one who seems to be doing well has changed their URL structure. Rather than www.domain.com/product-category/product-subcategory/product-info-page/, they've removed levels, so they now have www.domain.com/product-subcategory/ and www.domain.com/product-info-page/. Basically, everything comes off the root domain rather than following the traditional structure. Our rankings for the product-subcategory pages, which are probably what most people search for, have been sitting just below the first page in most instances for a while. I'm interested to hear other people's thoughts, and whether this is an approach they've taken with good results.
White Hat / Black Hat SEO | Ham1979
-
Disabling a Slider with Content... Is It Considered Cloaking?
We have a slider on our site, www.cannontrading.com, but the owner didn't like it, so I disabled it. Each slide contains links and content as well. We had another SEO guy tell me it's considered cloaking. Is this true? Please give feedback.
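For context, a minimal sketch of the distinction (markup hypothetical): a slider disabled with CSS still ships the same HTML to users and crawlers alike, whereas cloaking means serving different content to crawlers than to users.

```html
<!-- The slider is hidden from visitors with CSS, but its links and
     text remain in the HTML that both users and crawlers receive. -->
<div class="slider" style="display: none;">
  <a href="/futures-trading">Futures trading</a>
  <p>Slide text that was part of the original slider.</p>
</div>
```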
White Hat / Black Hat SEO | ACann
-
How Might Cloudflare Affect "Rank Juice" on Numerous Domains Due to a Limited IP Range?
We have implemented quite a few large websites on Cloudflare and have been very happy with our results. Since this has been successful so far, we have been considering putting some other companies on Cloudflare as well, but we have some concerns due to the structure of their business and related websites. The companies run multiple networks of technology, review, news, and informational websites. All have good content (almost all of it unique to each website) and rank well currently, but if moved to Cloudflare they would share DNS, and most likely IPs, with each other. This raises the concern that Google would reduce their link juice because it would be detected as coming from the same server, the way people used to run blog farms.
For example, they might be tasked with writing an article on XYZ company's new product. A unique article would be generated for 5-10 websites, all with unique, informative, valid, and relevant content for each domain, including links, direct or contextual, to the XYZ product or website URL. To clarify, so there is no confusion, each article is relevant to its website:
technology website - article about the engineering of the XYZ product
business website - how the XYZ product is affecting the market or stock price
how-to website - how the XYZ product is properly used
Currently all the sites are on different IPs and servers due to their size, but if they are routed through Cloudflare, will Google simply detect this as a duplicate linking effort or some type of "black hat" effort, since it's coming from Cloudflare? If yes, is there a way to prevent this while still using Cloudflare? If no, why, and how is this different from someone doing this to trick Google? Thank you in advance! I look forward to some informative answers.
White Hat / Black Hat SEO | MNoisy
-
Looking for a Way to Standardize Content for Thousands of Pages Without Getting Duplicate Content Penalties
Hi all, I'll preface this by saying that we like to engage in as much white-hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize. :)
We are an IT and management training course provider with 34 locations across the US, and each of our 34 locations offers the same courses. Each location has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings and dates for each individual topic and city. Right now, our pages are dynamic and are being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington, DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain, a lot more than just the two we tested.
Our engineers have offered to create a standard title tag, meta description, H1, H2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the meta description and the custom paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the meta and paragraph. For example, if we made the paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."
So, Mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Oftentimes I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message; hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
-
Blog on 2 domains (.org/.com), Canonical to Solve?
I have a client that has moved a large majority of their content, including the blog, to their .org domain. This is causing some issues for the .com domain. I want to retain the blog on the .org and have its content also show on the .com, and I would place the canonical tag on the .com. Is this possible? Is it recommended?
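For reference, a cross-domain canonical on the duplicated .com copy would look something like this (URLs hypothetical); it tells search engines that the .org version is the authoritative one:

```html
<!-- Placed in the <head> of the duplicate post on the .com domain,
     pointing search engines at the original post on the .org domain. -->
<link rel="canonical" href="https://www.example.org/blog/post-title/" />
```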
White Hat / Black Hat SEO | Ngst
-
Can Anyone Explain These Crazy SERPs?
Do a UK-based search for 'short term loans' on Google. There are seven sites on page 1 without any page or domain authority, several of them registered to a 'Jeremy Hughes', who I am guessing does not really exist. This is a very competitive term, and they just shouldn't be making it onto page 1. I'm thinking this must be some clever 301 redirecting, as I can't see any backlinks to any of these sites in Open Site Explorer. Any ideas how these sites are pulling this off?
White Hat / Black Hat SEO | lethal0r