Geo-targeted Organic Search Traffic to a sub-domain
-
For a client of ours, we are planning to create a sub-domain targeted at a specific country.
Most of the content on this sub-domain will come from the main site, although with some specific differentiation to suit that geographic market.
We intend to tell Google through Webmaster Central that the sub-domain is targeted at a specific country. Some questions:
a) Any idea how long it could take before Google gives precedence to the content on this sub-domain for queries originating from that particular country?
b) What is the likely impact of content duplication? What extent of differentiation is necessary from a search engine perspective?
Thanks.
-
If it's not too competitive, then it shouldn't take you more than 30-60 days for a geo-targeted domain.
There is no case study to look at because each situation is so different.
-
Thank you, Gianluca. Your detailed response is much appreciated.
Would you be able to give any indication of the time it could take for the sub-domain to get all the search traffic directly for queries originating in that country?
Any case studies or references you could point me to? That would be great.
-
Thank you for your response; it's helpful.
By any chance, are you able to point me to any case study that shows the time it took for the geo-targeted sub-domain to get all the traffic directly from the search engines?
Our concern with using a new TLD is the time it will take the domain to acquire authority and attract traffic of its own from the targeted geography.
-
Hi Manoj, in your case I suggest you use the rel="alternate" hreflang="x" geo-targeting tag, in addition to targeting the subdomain to the desired country (with the main site set as "global").
The use of rel="alternate" hreflang="x" is strongly suggested when a website has an "incomplete" international version, which can happen for very different reasons:
- Template translated, but main content in a single language;
- Broadly similar content within a single language, but targeting different countries (i.e.: US, UK, Australia…)
But remember that Google suggests using it even when the site content is fully translated (i.e.: the entire Spanish version has content in Spanish, and so on).
This rel, then, seems very appropriate for the Sitecore site.
How to implement it
Two options:
- HTML link element, placed in the <head> section of any page.
In this case, for instance, in the <head> section of www.domain.com we should add as many rel="alternate" hreflang="x" link elements as there are country versions present on the site.
I.e.: <link rel="alternate" hreflang="es" href="http://es.domain.com" /> for the Spanish version.
Please note that if multiple language versions exist (a "set" in Google's jargon), every set must include rel="alternate" hreflang="x" annotations pointing to every other language version.
I.e.: if we have Global, UK and FR versions of the site apart from the Spanish one, the Spanish version will have to include the corresponding link elements, as sketched below.
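A minimal sketch of what that could look like in the <head> of the Spanish version's homepage, assuming hypothetical uk.domain.com and fr.domain.com subdomains (the hreflang codes are illustrative; use the ones matching your real language/country versions, and note that "x-default" is one way to annotate a global, catch-all version):
<link rel="alternate" hreflang="x-default" href="http://www.domain.com/" /> <!-- Global version -->
<link rel="alternate" hreflang="en-gb" href="http://uk.domain.com/" /> <!-- UK version -->
<link rel="alternate" hreflang="fr" href="http://fr.domain.com/" /> <!-- FR version -->
<link rel="alternate" hreflang="es" href="http://es.domain.com/" /> <!-- Self-reference for the Spanish version -->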
Obviously, every single URL must have the rel=”alternate” hreflang=”x” tag pointing to the corresponding URL of any other language version.
- HTTP header, in the case of non-HTML files (such as PDFs); see the sketch just below.
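For example, the HTTP response for a PDF could carry the hreflang annotations in a Link header along these lines (file names and URLs are illustrative):
Link: <http://www.domain.com/doc.pdf>; rel="alternate"; hreflang="x-default", <http://es.domain.com/doc.pdf>; rel="alternate"; hreflang="es"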
As is implicit in the above, this tag is used at the page level, not the domain level. That means that every single page must be correctly marked up.
Same content and same language on different pages and language versions
If, as happens in your case, some pages show almost the same content on both the domain and the subdomain, then it is highly advisable to also use rel="canonical" in order to specify to Google which version of the URL is preferred.
As Google itself says here, Google will “use that signal to focus on that version in search, while showing the local URLs to users where appropriate. For example, you could use this if you have the same product page in German, but want to target it separately to users searching on the Google properties for Germany, Austria, and Switzerland.”
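A sketch of how that pattern could look on a near-duplicate page of the Spanish subdomain (URLs hypothetical):
<!-- In the <head> of http://es.domain.com/page, whose content mirrors the main site -->
<link rel="canonical" href="http://www.domain.com/page" />
<link rel="alternate" hreflang="es" href="http://es.domain.com/page" />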
Don't forget
Don't forget that your main site is set as targeting the whole web, including the country targeted by your sub-domain.
That means you will have to perform an active link building campaign for the sub-domain, in order to give it equal if not greater strength compared with the main site.
-
As soon as Google indexes it, it will take precedence in that country for geo-targeted queries. You can increase differentiation and avoid duplicate content by using top-level domains and by adding geo-targeting keywords to your sub-domain content. See the specific examples below:
Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We're more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.
The source for the above is Google's guidance on duplicate content relating to different countries.
Hope this helps.