International Website Targeting
-
Hello fellow Mozzers, had a quick question.
So we have a new eCommerce client interested in launching websites in multiple countries. According to their vision, they want a US site, a UK site, a Japan site, and so on.
I have a few concerns about doing it this way.
First, there is the issue of the sites being the same. The only difference will be the domain, such as domain.co.jp for the Japan-based site, domain.co.uk for the UK, etc.
Even if we target different countries in Webmaster Tools, won't the sites still compete with one another and potentially get flagged as duplicates?
I'm thinking there has to be a better way to target the site at the world without having to clone, duplicate, and relaunch. Does anyone have experience with this?
-
Thank you for your response. It appears the best way to go about this is to make the main site amazing and optimize it around what they sell, correct? This is what I had in mind, but just wanted to check with the community to be sure.
-
Hi David.
You are right... creating three identical sites on three different geo-targeted domain names can be a problem.
However, implementing hreflang tags to tell Google which URL to show depending on the geography of the targeted user (see more here: https://support.google.com/webmasters/answer/189077?hl=en) will avoid the risk of duplicate content, and of the stronger domain outranking the one meant, for instance, for Japan in Japan.
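As a rough illustration of what that markup looks like, each page on each country site carries a reciprocal set of hreflang annotations pointing at every locale version of itself. This sketch assumes the example domains from the question (domain.com, domain.co.uk, domain.co.jp) and generates the tags programmatically:

```python
# Sketch: generate reciprocal hreflang <link> tags for one page path.
# The domains and locale codes are illustrative, matching the question's example.
LOCALES = {
    "en-us": "https://domain.com",
    "en-gb": "https://domain.co.uk",
    "ja-jp": "https://domain.co.jp",
}

def hreflang_tags(path):
    """Return the <link> elements every locale version of `path` should include."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{base}{path}" />'
        for lang, base in LOCALES.items()
    ]
    # x-default tells Google which URL to serve users matching no listed locale.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="https://domain.com{path}" />'
    )
    return "\n".join(tags)

print(hreflang_tags("/products/widget"))
```

The key detail is that the set must be identical and self-referential on all three sites: the Japanese page links to the US and UK versions and vice versa, or Google may ignore the annotations.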
That said, this "full duplication" strategy should only be temporary, because it does not make much sense to target users living in different countries with exactly the same content:
- American English is different from British English;
- In Japan, people simply don't search in English, and may not even use Google as their main search engine (Yahoo is dominant there);
- The culture, hence how people use the internet and search the web, differs across the three countries (slightly between the USA and UK, enormously in the case of Japan).
So... yes, the proposed strategy is not the most effective one, despite the advantages of implementing the hreflang markup.