Does running 2 domains from the same website harm my SEO efforts?
-
Hi
I run a website which has 2 domains running from it. One domain has a .co.uk extension and shows products aimed at the UK market. The other domain is a .com which shows products aimed at the US market.
As both domains run from the same website, the content is mostly identical apart from the product listings, which change depending on whether you're in the UK or the US.
My reason for doing this is mainly for paid search purposes, as I can use a more country-appropriate domain depending on which country I target. Both sites are set to encourage the bots to index them, and in Google Webmaster Tools each domain is geo-targeted to its specific country.
So, my questions are,
-
Is this set-up possibly damaging my SEO efforts for either or both websites?
-
If so, would setting one domain to no-index improve SEO efforts for the other?
Thanks in advance for any replies!
Cheers
Joe
-
-
Thank you, you're most welcome.
-
Thanks again, a brilliant reply, and I'll action those suggestions immediately!
-
You're welcome Joe.
Regarding those links you mention: they could potentially cause an issue such as link spamming, which can happen when two websites send each other a lot of reciprocal links.
(Simpler solution towards the end.) My advice would be to remove them and instead make it clear on the UK site that US visitors can switch to the US site, and vice versa.
As some landing pages will no doubt be deep pages (i.e. not the homepage), perhaps a widget-style box in a prominent position would help direct visitors to the other site if they are on the wrong one at the time, e.g. "If you're visiting from the US, see our site especially for our US customers" or similar.
With some clever coding (and the use of nofollow on such links), you could make it so that when the link is clicked, the visitor goes to the corresponding page on the other site.
Alternatively, an easier solution: make sure all of those cross-site links are "nofollow" links so that search spiders don't follow them, reducing the risk of a penalty.
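As a rough sketch of that "clever coding" idea (the domain names here are hypothetical placeholders, not Joe's actual sites), a small script could map the current page to its counterpart on the sister domain, preserving the path and query string so deep-page visitors land on the matching page rather than the homepage:

```javascript
// Hypothetical domain pair: www.example.co.uk / www.example.com stand in
// for the real UK and US sites.
const SISTER_SITES = {
  "www.example.co.uk": "www.example.com", // UK site -> US site
  "www.example.com": "www.example.co.uk", // US site -> UK site
};

// Build the URL of the equivalent page on the other country's site,
// keeping the path and query string intact.
function crossSiteUrl(currentUrl) {
  const url = new URL(currentUrl);
  const sisterHost = SISTER_SITES[url.hostname];
  if (!sisterHost) return null; // not one of the paired sites
  url.hostname = sisterHost;
  return url.toString();
}

// The visible widget link itself would carry rel="nofollow", e.g.:
//   <a rel="nofollow" href="https://www.example.com/products/widget">
//     Visiting from the US? See our US site
//   </a>
```

The rel="nofollow" on the link keeps search spiders from following the cross-site links, in line with the advice above.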
Regards
Simon
-
Thanks very much for the response Simon.
I have links between the .co.uk and .com websites on every page of my site, so visitors can check products from either country's perspective. As a result, Google Webmaster Tools shows lots of backlinks from the alternative domain. Do you think this could cause a problem as well?
Thanks
Joe
-
Do more to differentiate them, insofar as that's possible. For example, interview people from different countries for the respective sites.
-
Hi Joe
A good question. You'll find most of your answer in a video by Matt Cutts at Google: http://www.youtube.com/watch?v=Ets7nHOV1Yo
As you are running different top-level domains (TLDs), chances are that your websites and content won't be classed as duplicate. That said, if they are both hosted in the same country, e.g. the UK, then they could potentially be classed as duplicate, with only one of them showing up in search results.
Your best bet would be to host the .co.uk site in the UK and the .com site in the US. That way, it's more likely that neither site will be classed as duplicate.
Regards
Simon