Blog on 2 domains (.org/.com), Canonical to Solve?
-
I have a client that has moved a large majority of content to their .org domain, including the blog. This is causing some issues for the .com domain. I want to retain the blog on the .org and have its content also appear on the .com, and I would place the canonical tag on the .com, pointing at the .org original.
Is this possible? Is this recommended?
-
What you describe is exactly what the cross-domain canonical is intended for, but there is a risk of search engines misinterpreting the canonical signals and/or not honoring them.
So you are probably safe doing this, but it's always a little risky when you purposely publish duplicate content in multiple places, especially across separate domains.
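To make the setup concrete, here is a minimal sketch of the tag each .com copy of a post should carry. The domain names and helper function are hypothetical placeholders, not from the thread; the point is simply that the canonical URL keeps the path but uses the .org host.

```python
# Sketch: build the cross-domain canonical tag for a .com blog page,
# pointing at the .org original. Hosts here are illustrative examples.

from urllib.parse import urlsplit, urlunsplit

def cross_domain_canonical(com_url: str, canonical_host: str = "www.example.org") -> str:
    """Return the <link rel="canonical"> element for a .com blog URL,
    keeping the path and query but swapping the host to the .org original."""
    parts = urlsplit(com_url)
    canonical = urlunsplit(("https", canonical_host, parts.path, parts.query, ""))
    return f'<link rel="canonical" href="{canonical}" />'

print(cross_domain_canonical("https://www.example.com/blog/post-1"))
# <link rel="canonical" href="https://www.example.org/blog/post-1" />
```

The emitted tag goes in the `<head>` of the duplicate .com page; the .org original needs no tag (or a self-referencing one).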
-
You can also safely use the standard feed/syndication features provided by most major CMS and blogging platforms; many have this option enabled by default or make it easy to turn on.
-
Yes, the canonical tag is the best route to take in this scenario.
Related Questions
-
Technical: Duplicate content and domain name change
Hi guys, this is a tricky one. My server team just made quite a big mistake. We are a big Magento ecommerce website, selling well, with about 6,000 products, and we are about to change our domain name for administrative reasons. Let's call the current site current.com and the future one future.com.

Here is the issue: connecting to Search Console, I saw future.com sending 11,000 links to current.com, and at the same time DA was hit by 7 points. I realized future.com was incorrectly redirected and was showing a duplicate of current.com. We corrected this, and future.com now shows a landing page until we make the domain name change.

I was wondering what is the best way to avoid a penalty now, and what the consequences can be when changing the domain name. Should I set an alias in Search Console or something? Thanks
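The usual fix once the move happens is a blanket 301 from every old-domain URL to the same path on the new domain. A minimal sketch of that rule as a tiny WSGI app, using the thread's placeholder domains (this is an illustration, not the poster's actual setup):

```python
# Sketch: permanently redirect every request on the old host to the
# same path+query on the new domain. current.com/future.com are the
# thread's placeholder names.

def redirect_app(environ, start_response):
    """301 any request to the new domain, preserving path and query string."""
    path = environ.get("PATH_INFO", "/")
    query = environ.get("QUERY_STRING", "")
    location = "https://future.com" + path + ("?" + query if query else "")
    start_response("301 Moved Permanently", [("Location", location)])
    return [b""]
```

In practice the same one-liner is usually expressed as a server rewrite rule; the 301 (rather than 302) is what tells search engines the move is permanent.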
White Hat / Black Hat SEO | Kepass
Why is there such a big difference between Domain Authority and Majestic Trust Flow? Strange?
Hello all, I want to ask why there is such a difference between Domain Authority and Majestic's Trust Flow, as both of these companies say they have the best authority algorithm. See the link below for reference: http://wp.auburn.edu/bassclub/next-meeting-1-28-2014/
White Hat / Black Hat SEO | adnan1101
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and AhrefsBot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance.

The problem is that I want a solution that 1) is centrally managed for all sites (per-site administration takes too much time), 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO, user traffic should always be prioritized higher than bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems.

Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. What portion of traffic, for which bots, can be calculated at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about the experts' opinions...
White Hat / Black Hat SEO | internetwerkNU
How will Google deal with the crosslinks for my multiple-domain site?
Hi, I can't find a good answer to this question, so I thought, why not ask Moz.com!

I have a site, let's call it webshop.xx, for a few languages/markets: German, Dutch, Belgian, English, and French. I use a different TLD with a different IP for each of these languages, so I'll end up with webshop.de, webshop.nl, webshop.be, webshop.co.uk, webshop.com, and webshop.fr.

They all link to each other, and every subpage that is translated from another site gets a link as well from the other languages, so webshop.com/stuff links to webshop.de/stuff. My main website, webshop.com, gets links from every one of these domains, which Open Site Explorer as well as Majestic SEO sees as external links (this is happening).

My question: how will Google deal in the long run with the crosslinks coming from these domains? Some guesses I made: I get full external-link juice (content is translated, so unique?); I get a bit of the juice of an external link; they are actually seen as internal links; I'll get a penalty. Thanks in advance, guys!
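One standard way to mark these translated cross-domain copies is hreflang alternates, so each version competes in its own market rather than being judged only by crosslinks. A sketch, reusing the thread's hypothetical webshop.* domains (the mapping and helper are illustrative, not the poster's setup):

```python
# Sketch: each translated copy of a page lists <link rel="alternate">
# tags for all language versions, including itself. Domains mirror the
# thread's hypothetical webshop.* example.

ALTERNATES = {
    "en": "https://webshop.com",
    "de": "https://webshop.de",
    "nl": "https://webshop.nl",
    "fr": "https://webshop.fr",
}

def hreflang_links(path: str) -> list:
    """Return the alternate-link tags for a translated path, one per language."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{host}{path}" />'
        for lang, host in sorted(ALTERNATES.items())
    ]

for tag in hreflang_links("/stuff"):
    print(tag)
```

The same set of tags goes into the `<head>` of every language version of the page, so the annotations are reciprocal.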
White Hat / Black Hat SEO | pimarketing
How might Cloudflare affect "rank juice" on numerous domains due to a limited IP range?
We have implemented quite a few large websites on Cloudflare and have been very happy with our results. Since this has been successful so far, we have been considering putting some other companies on CL as well, but have some concerns due to the structure of their business and related websites.

The companies run multiple networks of technology, review, news, and informational websites. All have good content (almost all unique to each website) and good rankings currently, but if moved to Cloudflare they would be sharing DNS, and most likely IPs, with each other. This raises a concern that Google might reduce their link juice because it would be detected as coming from the same server, the way people used to run blog farms.

For example, they might be tasked to write an article on XYZ company's new product. A unique article would be generated for 5-10 websites, all with unique, informative, valid, and relevant content for each domain, including links, be they direct or contextual, to the XYZ product or website URL. To clarify, so there is no confusion, each article is relevant to its website:

technology website - article about the engineering of the xyz product
business website - how the xyz product is affecting the market or stock price
how-to website - how the xyz product is properly used

Currently all sites are on different IPs and servers due to their size, but if routed through Cloudflare, will Google simply detect this as duplicate linking efforts or some type of "black hat" effort since it's coming from Cloudflare? If yes, is there a way to prevent this while still using CL? If no, why, and how is this different from someone doing this to trick Google? Thank you in advance! I look forward to some informative answers.
White Hat / Black Hat SEO | MNoisy
New sub-domain launches thousands of local pages - is it hurting the main domain?
Would greatly appreciate some opinions on this scenario. A domain cruising along for years: top 1-3 rankings for nearly all top non-branded terms and a stronghold for branded searches. Sitelinks prominently shown with branded searches, and always ranked #1 for most variations of the brand name.

Then a sub-domain launches with over 80,000 local pages. These pages are 90-95% similar, with only the city and/or state changing to make them appear like unique local pages. Not an uncommon technique, but worrisome in a post-Panda/Penguin world. These pages are surprisingly NOT captured as duplicate content by the SEOMoz crawler in my campaigns. Additionally, at about the same time, a very aggressive, almost entirely branded paid-search campaign was launched that took 20% of the clicks previously going to the main domain organically over to PPC.

My concern is this: shortly after the launch of over 80k "local" pages on the sub-domain, and the cannibalization of organic clicks through PPC, we saw the consistency of sitelinks six-packs drop to three sitelinks, if showing at all, including some sub-domains in sitelinks (including the newly launched one) that had never been there before.

There's not a clear answer here, I'm sure, but what are the experts' thoughts? Did a massive launch of highly duplicate pages, coupled with a significant decrease in organic CTR for branded terms, harm the authority of the main domain (which is only a few dozen pages), causing fewer sitelinks and less strength as a domain? Or is all this a coincidence? Or caused by something else we aren't seeing? Thanks for thoughts!
White Hat / Black Hat SEO | VMLYRDiscoverability
One Blog Comment Now on Many Pages of The Same Domain
My question is: I commented on this site, http://blogirature.com/2012/07/01/half-of-200-signals-in-googles-ranking-algorithm-revealed/#comment-272, under the name "Peter Rota". For some reason the recent-comments list is a site-wide element, so basically my link from my website is now on pretty much every page of their site. I also noticed that the anchor text for each one of my links says "Peter Rota". This is my concern: will Google think it's spammy if I'm on a lot of pages of the same site for one blog comment, and will I be penalized for the exact same anchor text on each page? If this is the case, what could I do to try to get the links removed? Thanks
White Hat / Black Hat SEO | ilyaelbert
Unique businesses - unique domain names?
A client of mine owns a studio space where he teaches yoga and martial arts. It's a new business, and we're deciding how to create the website(s) and which domain(s) to buy. The idea right now is to have three websites, one for each side of the business, and I'm looking for validation of this idea. I haven't been able to find an answer in the Q&A forum that quite applies to our situation.

Website 1: for the studio itself. The audience is other yoga teachers, martial arts teachers, or personal trainers. He will rent out the studio space to them and they bring in their own clients. Content and keywords will relate to this.
Website 2: yoga classes. The audience is members of the public who want to take yoga classes. Content and keywords will relate to this.
Website 3: martial arts. The audience is members of the public who want to take martial arts classes. Content and keywords will relate to this.

We will make certain there's no duplicate content on the sites, but it makes sense for them to link to each other because they're similar in nature (personal health and fitness at the studio), and the latter two services are offered at the studio, of course.

Question 1: (a) Is it a good idea to get a separate domain for each site? For example: www.city-studio.com, www.unique-name-yoga.com, www.unique-name-martial-arts.com. (b) Or would it be better to keep it all under city-studio.com and use subdomains like yoga.city-studio.com and martial-arts.city-studio.com? In either case, the keywords "yoga" and "martial arts" would be in the domain name, which has benefit. Does that still apply for subdomains? (c) Or would these services be considered similar enough that I should just use www.city-studio.com/yoga.php and www.city-studio.com/martial-arts.php? There will of course be several pages on yoga and several on martial arts.

Question 2: if registering multiple domains, they will interlink as much as possible. (a) What do we consider when buying the domains? (b) Use a different address for the WHOIS of each domain? (c) Can the technical contact be the same address (mine, the consultant's)? (d) Use a different credit card for each? (e) OK if the name on the credit card is the same? (f) Can we register them all on the same day? (h) Same domain registrar? (i) Same host? We don't want to appear black hat by having multiple sites, but I think it's very legitimate to have the business split into three sites like this, just because they're separate sides of the business with different audiences, content, and keywords.

Question 3: when the domains come up for renewal in a year (or more), would it be safe to switch them all over to one credit card then, for convenience to the owner?

Question 4: is there anything important I haven't mentioned here? I appreciate any input and discussion.
White Hat / Black Hat SEO | Kenoshi