Geo-targeted Organic Search Traffic to a sub-domain
-
For a client of ours, we plan to create a sub-domain targeted at a specific country.
Most of the content on this sub-domain will come from the main site, though with some differentiation to suit that geographic market.
We intend to tell Google through Webmaster Tools that the sub-domain is targeted at a specific country. Some questions:
a) Any idea how long it could take before Google gives precedence to the content on this sub-domain for queries originating from that particular country?
b) What is the likely impact of content duplication? What extent of differentiation is necessary from a search engine perspective?
Thanks.
-
Thanks.
-
If it's not too competitive, it shouldn't take you more than 30-60 days for a geo-targeted domain.
There is no case study to look at because each situation is so different.
-
Thank you, Gianluca. Your detailed response is much appreciated.
Would you be able to give any indication of how long it could take for the sub-domain to receive all the search traffic for queries originating in that country?
Any case studies or references you could point me to would be great.
-
Thank you for your response; it's helpful.
By any chance, are you able to point me to a case study showing how long it took a geo-targeted sub-domain to start receiving all its traffic directly from the search engines?
Our concern with using a new TLD is the time it will take the domain to acquire authority and attract traffic of its own from the targeted geography.
-
Hi Manoj, in your case I suggest using the rel="alternate" hreflang="x" geotargeting annotation, in addition to targeting the subdomain to the desired country (with the main site set as "global").
The use of rel="alternate" hreflang="x" is strongly recommended when a website has an "incomplete" international version, for instance:
- Template translated, but main content in a single language;
- Broadly similar content within a single language, but targeting different countries (e.g., US, UK, Australia…)
But remember that Google suggests using it even when the site content is fully translated (i.e., the entire Spanish version is in Spanish, and so on).
This annotation, then, seems very appropriate for the Sitecore site.
How to implement it
Two options:
- HTML link element, placed in the <head> section of every page.
In this case, for instance, in the <head> section of www.domain.com we should add as many rel="alternate" hreflang="x" elements as there are country versions of the site.
E.g.: <link rel="alternate" hreflang="es" href="http://es.domain.com" />
Please note that if multiple language versions exist (a "set" in Google's jargon), every set must include rel="alternate" hreflang="x" annotations pointing to every other language version.
E.g.: if we have Global, UK, and FR versions of the site in addition to the Spanish one, the Spanish version will have to include something like:
<link rel="alternate" hreflang="en" href="http://www.domain.com" />
<link rel="alternate" hreflang="en-gb" href="http://uk.domain.com" />
<link rel="alternate" hreflang="fr" href="http://fr.domain.com" />
Obviously, every single URL must have rel="alternate" hreflang="x" tags pointing to the corresponding URL in every other language version.
- HTTP header, for non-HTML files (such as PDFs); see the sketch just below.
As implied above, this annotation works at the page level, not the domain level. That means every single page must be correctly marked up.
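For the HTTP header option, here is a minimal sketch following the Link header format Google documents for hreflang; the file names and the Spanish subdomain URL are hypothetical:
Link: <http://www.domain.com/guide-en.pdf>; rel="alternate"; hreflang="en", <http://es.domain.com/guide-es.pdf>; rel="alternate"; hreflang="es"
Each language version of the file would return the same header, keeping the set reciprocal.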
Same content and same language on different pages and language versions
If, as happens in your case, some pages show almost the same content on both the domain and the subdomain, it is highly advisable to also use rel="canonical" in order to specify to Google which version of the URL is preferred.
As Google itself says, it will "use that signal to focus on that version in search, while showing the local URLs to users where appropriate. For example, you could use this if you have the same product page in German, but want to target it separately to users searching on the Google properties for Germany, Austria, and Switzerland."
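To illustrate the combination described in that quote, here is a minimal sketch with hypothetical URLs: the preferred German URL is declared canonical, while the hreflang annotations keep each country version in its local search results. Each of the three country pages would carry the same block in its <head>:
<link rel="canonical" href="http://www.domain.com/de/produkt" />
<link rel="alternate" hreflang="de-DE" href="http://www.domain.com/de/produkt" />
<link rel="alternate" hreflang="de-AT" href="http://at.domain.com/produkt" />
<link rel="alternate" hreflang="de-CH" href="http://ch.domain.com/produkt" />
This way, per the quoted guidance, users searching on google.at should still see at.domain.com in their results, while the duplicate-content signals consolidate on the canonical URL.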
Don't forget
Don't forget that your main site is set to target the whole web, including the country targeted by your sub-domain.
That means you should run an active link building campaign for the sub-domain, in order to give it equal if not greater strength relative to the main site.
-
As soon as Google indexes it, the sub-domain will take precedence in that country for geotargeted queries. You can increase differentiation and reduce duplicate-content issues by using top-level domains and by adding geotargeted keywords to your sub-domain content. See the specific guidance below:
- Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We're more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
- Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.
The source for the above is Google's documentation on duplicate content across different countries.
Hope this helps.