I am launching an international site. what is the best domain strategy
-
Hi Guys,
I am launching a site across the US, UK, and UAE. Do I go with subfolders (test.com/us, test.com/uk, test.com/uae), or do I go with subdomains (us.test.com, uk.test.com, ae.test.com)? Which is best for SEO?
-
Suggesting subfolders just because they are a "best practice," without considering the peculiarities of the site, is not ideal. Even though subfolders have real advantages, using them is not always technically affordable or even effective.
-
As always, it depends.
Is your site an ecommerce site with hundreds of thousands of products, or a news site? Then the best option may be country-code top-level domains (ccTLDs), because, apart from being better for geotargeting, technically maintaining three complete ecommerce or news sites inside a subfolder system is not the most agile approach (especially if the site is custom made).
If your case is the one described above but the ccTLDs are not available, then subdomains can be an alternative.
If your site is not a technically complex ecommerce or news site, then use subfolders; but if you see that one of the subfolders develops very good metrics (sessions, conversions), consider moving it to a ccTLD in the middle to long term.
-
There are positives and negatives to each strategy. Moz's education section has three articles on international SEO: International SEO, Hreflang, and ccTLDs. I'd suggest going through them and also reading any further resources they cite.
Good luck!
-
Also, if we want to target GCC countries with Arabic content, what domain strategy should we apply?
We already have www.tcf-me.com (for English) and www.tcf-me.ae (for Arabic). But now we want to target not only the UAE (.ae) but the entire GCC.
-
Figen, we are in the same situation, asking the same questions.
How do you tackle duplicate content?
-
Use subdomains (e.g. us.domain.com, fr.domain.com, de.domain.com) if your site will be in different languages.
If your site will be in English only, use subfolders: domain.com/us, domain.com/uk.
Be careful about duplicate content.
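The duplicate-content risk mentioned above is usually handled with hreflang annotations, which tell Google which country or language each URL targets so the versions aren't treated as duplicates. A minimal sketch for the subfolder setup from the question (test.com is the asker's hypothetical domain; every alternate page repeats the full set of tags):

```html
<!-- In the <head> of test.com/us/ (and likewise on /uk/ and /ae/) -->
<link rel="alternate" hreflang="en-us" href="https://test.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://test.com/uk/" />
<link rel="alternate" hreflang="en-ae" href="https://test.com/ae/" />
<!-- Fallback for users whose locale matches none of the above -->
<link rel="alternate" hreflang="x-default" href="https://test.com/" />
```

The same annotations work across subdomains or ccTLDs; only the href values change.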
Related Questions
-
Sitewide nav linking from subdomain to main domain
I'm working on a site that was heavily impacted by the September core update. You can see in the attached image the overall downturn in organic traffic in 2019, with a larger hit in September bringing Google organic traffic down around 50%. There are many concerning incoming links, ranging from 50-100 obviously spammy porn-related websites to just plain unnatural links. There was no effort to purchase any links, so it's unclear how these were created. There are also thousands of incoming external links (most without nofollow, and with similar or identical anchor text) from yellowpages.com. I'm trying to get this fixed with them and have added it to the disavow file in the meantime. I'm focusing on internal links as well, with a more specific question: if I have a sitewide header on a blog located at blog.domain.com that links to various sections on domain.com without nofollow tags, is this a possible source of the traffic drops and algorithm impact? The header with these links is on every page of the blog on the previously mentioned subdomain. **More generally, any advice as to how to turn this around?** The website is in the travel vertical.
White Hat / Black Hat SEO | | ShawnW0 -
Malicious links on our site indexed by Google but only visible to bots
We've been suffering from some very nasty black hat SEO. In Google's index, our pages show external links to various pharmaceutical websites, but our actual live pages don't show them. It seems as though only certain user agents see the malicious links. Setting up the Screaming Frog SEO crawler with the Googlebot user agent also reveals the malicious links. Any idea what could have caused this or how it can be stopped? We scanned all files on our web server and couldn't find any of the malicious links. We've changed our FTP and CMS passwords; is there anything else we can do? Thanks in advance!
White Hat / Black Hat SEO | | SEO-Bas0 -
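One way to confirm this kind of cloaking is to fetch the same page with two different user agents and diff the links each version contains. A minimal sketch in Python; the URLs and the spam domain below are made up for illustration:

```python
# Sketch: spot links served only to search-engine bots (cloaking).
# In practice, fetch the page twice with urllib.request, once with a normal
# browser User-Agent and once with Googlebot's:
# "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
import re

def extract_links(html: str) -> set:
    """Return the set of href values found in anchor tags."""
    return set(re.findall(r'<a[^>]+href=["\']([^"\']+)["\']', html, re.IGNORECASE))

def cloaked_links(html_as_browser: str, html_as_googlebot: str) -> set:
    """Links served to Googlebot but hidden from regular visitors."""
    return extract_links(html_as_googlebot) - extract_links(html_as_browser)

# Illustrative responses for the same URL under two user agents:
browser_html = '<p><a href="https://example.com/about">About</a></p>'
bot_html = ('<p><a href="https://example.com/about">About</a> '
            '<a href="https://spam-pharma.example/pills">cheap pills</a></p>')
print(cloaked_links(browser_html, bot_html))
# → {'https://spam-pharma.example/pills'}
```

Any link that shows up only in the Googlebot version is injected server-side, which usually points to a compromised template, plugin, or .htaccess rule rather than a file you can spot by eye.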
Exchanging links between sites in the same Google account
Hi everyone, does anybody have experience with several websites that are verified in the same Google Webmaster Tools account and exchange links between them? Is that good for the sites? We are hosted on different servers. Thank you so much.
White Hat / Black Hat SEO | | Jeepster0 -
Should I 301 redirect my old site or just add a link to my new site?
I used to offer design and web services on a site that is currently blank (no content, no links). My question is: should I add a little bit of content, maybe a brief explanation with a link to my new site, or should I just add a 301 redirect? This is purely a question of what is better for SEO and ranking for my new site (not a branding question).
White Hat / Black Hat SEO | | Tyrell0 -
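For SEO, a 301 passes the old domain's link equity to the new one, which a single link from a blank page would not. If the old site runs on Apache, the sitewide redirect is a few lines of config; the domain names below are hypothetical placeholders:

```apache
# .htaccess on the old domain: permanently (301) redirect every URL
# to the equivalent path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.example$ [NC]
RewriteRule ^(.*)$ https://new-site.example/$1 [R=301,L]
```

Mapping each old path to its closest equivalent on the new site (rather than sending everything to the homepage) preserves the most value.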
Does Duplicate Content Actually "Penalize" a Domain?
Hi all, Some co-workers and I were in a conversation this afternoon about whether duplicate content actually causes a penalty on your domain. Reference: https://support.google.com/webmasters/answer/66359?hl=en http://searchengineland.com/googles-matt-cutts-duplicate-content-wont-hurt-you-unless-it-is-spammy-167459 Neither source from Google says "duplicate content causes a penalty." However, they do allude to spammy content negatively affecting a website. Why it came up: We originally were talking about syndicated content (the same content across multiple domains; e.g. "5 explanations of bad breath") for the purpose of social media sharing. Imagine if dentists across the nation had access to this piece of content (5 explanations of bad breath) simply for engagement with their audience. They would use it to post on social media and to talk about in the office, but they would not want to rank for that piece of duplicated content. This type of duplicated content would be valuable to dentists in different cities that need engagement with their audience or simply need the content. This is all hypothetical but serious at the same time. I would love some feedback and sourced information / case studies. Is duplicated content actually penalized, or will that piece of content just not rank? (Feel free to reference that example article as a real-world example.) When I say penalized, I mean "the domain is given a negative penalty for showing up in SERPs" - therefore, the website would not rank for "dentists in san francisco, ca". That is my definition of penalty (feel free to correct me if you disagree). Thanks all, and I look forward to a fun, resourceful conversation on duplicate content for purposes outside of SEO. Cole
White Hat / Black Hat SEO | | ColeLusby0 -
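For the syndication scenario described above, Google's documented remedy is a cross-domain rel=canonical on each republished copy, pointing at the original. That lets every dentist publish the article for engagement while consolidating ranking signals on the source. A sketch, with a hypothetical URL:

```html
<!-- In the <head> of each dentist's republished copy of the article -->
<link rel="canonical"
      href="https://original-publisher.example/5-explanations-of-bad-breath" />
```

The copies then typically don't rank (they're filtered as duplicates of the canonical), but the domains carrying them aren't penalized for it.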
Secondary Domain Outranking Master Website
IEEE is a large professional association dedicated to serving engineers. The IEEE web presence is made up of flagship sites like IEEE.org, IEEEXplore, and IEEE Spectrum, mid-tier sites like Computer.org, and smaller sites like those dedicated to specific conferences. It is unclear exactly when this started, but searches in Google for [ieee] currently return ieeeusa.org before ieee.org. This is troublesome, as users are typically looking for IEEE.org with such a general query; ieeeusa.org is a site with a much narrower focus, dedicated to public policy. IEEE.org is one of the strongest domains, so I am thinking this is a glitch of some sort. I am removing a stale sitemap that is referenced in robots.txt (though again, I'm not seeing any issues with other pages; it's just two queries that are troublesome: [ieee] and [about ieee]). And it's noticeable in analytics: http://ieee.d.pr/hMg0/YhklCw7Z What do you think?
White Hat / Black Hat SEO | | thegrif3290 -
Should I Even Bother Trying To Recover This Site After Google Penguin?
Hello all, I would like to get your opinion on whether I should invest time and money to improve a website which was hit by Google Penguin in April 2014. (I know, April 2014 was nearly 2 years ago. However, this site has not been a top priority for us and we have just left it until now.) The site is www.salmonrecipes.net Basically, we aggregated over 700 salmon recipes from major supermarkets, famous chefs, and others (all with their permission) and made them available on this site. It was a good site at the time but it is showing its age now. For a few years we were occasionally #1 on Google in the US for "salmon recipes", but normally we would be between #2 and #4. We made money from the site almost entirely through Adsense. We never made a huge amount, but it paid our office rent every month, which was handy. We also built up an email database of several thousand followers, but we've not really used this much. (Yet). In the year from 25th April 2011 to 24th April 2012 the site attracted just over 500k visits. After the rankings dropped due to Google Penguin, traffic fell by 77% year over year. Rankings and traffic have not recovered at all, and are only getting worse. I am happy to accept that we deserved our rankings to fall during the Google Penguin re-shuffle. I stupidly commissioned an offshore company to build lots of links which, in hindsight, were basically just spam, and totally without any real value. However they assured me it was safe and I trusted them, despite my own nagging reservations. Anyway, I have full details of all the links they created, and therefore I could remove many of these 'relatively' easily. (Of course, removing hundreds of links would take a lot of time.) My questions ... 1. How can I evaluate the probability of this site 'recovering' from Google Penguin?
I am willing to invest time/money on link removal and new (ethical) SEO work if there is a reasonable chance of regaining a position in the top 5 on Google (US) for "salmon recipes" and various long-tail terms. But I am keen to avoid spending time/money on this if it is unlikely we will recover. How can I figure out my chances? 2. Generally, I accept that this model of site is in decline. Relying on Google to drive traffic to a site, and on Google to produce revenue via its Adsense scheme, is risky and not entirely sensible. Also, Google seems to provide more and more 'answers' itself, rather than sending people to e.g. a website listing recipes. Given this, is it worth investing any money in this at all? 3. Can you recommend anyone who specialises in this kind of recovery work? (As I said, I have a comprehensive list of all the links that were built, etc.) OK, that is all for now. I am really looking forward to whatever opinions you may have about this. I'll provide more info if required. Huge thanks
White Hat / Black Hat SEO | | smaavie
David0 -
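Since the full list of built links is available, the links that can't be removed by outreach can be disavowed through Search Console. The disavow file format is plain text, one entry per line, with `domain:` prefixes for whole domains and `#` for comments; the domains below are made up for illustration:

```text
# Spammy directories built by the offshore company; removal requests ignored
domain:spammy-links.example
domain:cheap-seo-directory.example
# Individual URLs where disavowing the whole domain would be too broad
https://another-site.example/culinary-links/page12.html
```

For a Penguin (link-based) issue like this one, disavowing at the `domain:` level is generally the thorough option, since the same spam network tends to link from many pages.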
Best way to handle an SEO error: linking from one site to another on the same IP
We committed an SEO sin and created a site with links back to our primary website. Although it doesn't matter, the site was not created for that purpose; it is actually a "directory" with categorized links to thousands of culinary sites, and ours are some of the links. This occurred back in May 2010. Starting in April 2011 we saw a large drop in page views, and it dropped again in October 2011. At this point our traffic is down over 40%. Although we don't know for sure whether this has anything to do with it, we know it is best to remove the links. The question is: given that it's a bad practice, what is the best fix? Should we redirect the second domain to the main one, or just take it down? The second domain does not have much PageRank, and I don't think there are many, if any, backlinks to it. Will it hurt us more to lose the 1,600 or so backlinks? I would think keeping the links is a bad idea. Thanks for your advice!
White Hat / Black Hat SEO | | foodsleuth0