Geo-targeted Organic Search Traffic to a sub-domain
-
For a client of ours, we are likely to create a sub-domain targeted at a specific country.
Most of the content on this sub-domain will come from the main site, although with some specific differentiation to suit that geographic market.
We intend to tell Google through Webmaster Central that the sub-domain is targeted at a specific country. Some questions:
a) Any idea how long it could take before Google gives precedence to the content on this sub-domain for queries originating from that particular country?
b) What is the likely impact of content duplication? What extent of differentiation is necessary from a search engine's perspective?
Thanks.
-
If it's not too competitive, it shouldn't take more than 30-60 days for a geo-targeted domain.
There is no case study to look at because each situation is so different.
-
Thank you, Gianluca. Your detailed response is much appreciated.
Would you be able to give any indication of the time it could take for the sub-domain to receive all the search traffic for queries originating in that country?
Are there any case studies or references you could point me to? That'd be great.
-
Thank you for your response; it's helpful.
By any chance, are you able to point me to any case study showing how long it took a geo-targeted sub-domain to start receiving all its traffic directly from the search engines?
Our concern with using a new TLD is the time it will take the domain to acquire authority and attract traffic of its own from the targeted geography.
-
Hi Manoj, in your case I suggest using the rel="alternate" hreflang="x" geotargeting annotation, in addition to targeting the subdomain at the desired country (with the main site set as "global").
The use of rel="alternate" hreflang="x" is strongly suggested when a website has an "incomplete" international version, for very different reasons:
- The template is translated, but the main content is in a single language;
- The content is broadly similar within a single language, but targets different countries (e.g. US, UK, Australia…).
But remember that Google suggests using it even when the site content is fully translated (e.g. the whole Spanish version has content in Spanish, and so on).
This annotation, then, seems very appropriate for the Sitecore site.
How to implement it
Two options:
- HTML link element, in the <head> section of every page.
In this case, for instance, in the <head> of www.domain.com we should add as many rel="alternate" hreflang="x" link elements as there are country versions of the site.
E.g.: <link rel="alternate" hreflang="es" href="http://es.domain.com/" />
Please note that if multiple language versions exist ("sets" in Google's jargon), every set must include rel="alternate" hreflang="x" annotations pointing to every other language version.
E.g.: if we have Global, UK and FR versions of the site besides the Spanish one, the Spanish version will have to include link elements pointing to the Global, UK and FR versions.
Obviously, every single URL must have rel="alternate" hreflang="x" annotations pointing to the corresponding URL in every other language version.
- HTTP header, in the case of non-HTML files (such as PDFs).
As this implies, the annotation works at the page level, not the domain level. That means every single page must be correctly marked up.
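The reciprocal-annotation rule described above (every version in a set references every other version) can be sketched programmatically. A minimal Python sketch with hypothetical URLs; for non-HTML files such as PDFs, the same information would go into an HTTP Link header rather than the page markup:

```python
# Minimal sketch: given a "set" of language/country versions
# (hypothetical URLs), build the rel="alternate" hreflang="x" link
# elements that every page in the set must carry, so that all
# versions reference each other.
VERSIONS = {
    "x-default": "http://www.domain.com/",  # global version
    "en-gb": "http://uk.domain.com/",
    "fr": "http://fr.domain.com/",
    "es": "http://es.domain.com/",
}

def hreflang_links(versions):
    """Return the link elements each page in the set should include."""
    return ['<link rel="alternate" hreflang="%s" href="%s" />' % (code, url)
            for code, url in sorted(versions.items())]

for link in hreflang_links(VERSIONS):
    print(link)
```

Because the same set of elements goes on every version, generating them from one shared table like this also guards against the common mistake of a version that points to the others but is never pointed back to.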
Same content and same language on different pages and language versions
If, as happens in your case, some pages show almost the same content on both the domain and the subdomain, then it is highly advisable to also use rel="canonical" in order to tell Google which is the preferred version of the URL.
As Google itself explains, it will "use that signal to focus on that version in search, while showing the local URLs to users where appropriate. For example, you could use this if you have the same product page in German, but want to target it separately to users searching on the Google properties for Germany, Austria, and Switzerland."
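Combined, the two annotations might look like the sketch below for a page duplicated across the main domain and the geo-targeted subdomain. This is a minimal Python sketch with hypothetical URLs, assuming www.domain.com/page is the preferred version and es.domain.com/page the country-targeted duplicate:

```python
# Sketch of the <head> markup for a duplicated page: rel="canonical"
# names the preferred URL, while hreflang keeps the local URL serving
# to users in the targeted country. All URLs are hypothetical.
PREFERRED = "http://www.domain.com/page"
ALTERNATES = {
    "x-default": "http://www.domain.com/page",  # global version
    "es": "http://es.domain.com/page",          # country-targeted duplicate
}

head = ['<link rel="canonical" href="%s" />' % PREFERRED]
head += ['<link rel="alternate" hreflang="%s" href="%s" />' % (code, url)
         for code, url in sorted(ALTERNATES.items())]

print("\n".join(head))
```

The same three elements would appear on both copies of the page, so that Google consolidates ranking signals on the preferred URL while still showing the local URL where appropriate.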
Don't forget
Don't forget that your main site is set to target the whole web, including the country targeted by your sub-domain.
That means you will need to run an active link-building campaign for the sub-domain, in order to give it equal, if not greater, strength compared with the main site.
-
As soon as they index it, it will take precedence in that country for geotargeting. You can increase differentiation and reduce duplicate content by using top-level domains and by adding geotargeting keywords to your sub-domain content. See the specific examples below:
Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We're more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.
The above comes from Google's documentation on duplicate content across different countries.
Hope this helps.
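The "minimize similar content" advice can be made concrete with a simple duplicate check. This is a hedged sketch, not Google's own algorithm: a Jaccard word-overlap score between two pages' texts, with hypothetical sample content, to flag pairs worth merging or expanding:

```python
# Hypothetical helper: flag page pairs whose wording is nearly
# identical, so they can be merged or given unique content. This is
# a simple word-set (Jaccard) similarity check for illustration only.

def jaccard(text_a, text_b):
    """Similarity between two pages' visible text, from 0.0 to 1.0."""
    a, b = set(text_a.lower().split()), set(text_b.lower().split())
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

city_1 = "Visit our hotel in Paris with free wifi and breakfast"
city_2 = "Visit our hotel in Lyon with free wifi and breakfast"

score = jaccard(city_1, city_2)
if score > 0.8:
    print("Pages look near-duplicate (%.0f%% overlap): consider merging "
          "or adding unique content." % (score * 100))
```

A threshold like 0.8 is an arbitrary illustration; the point is simply that two city pages differing by one word would score very high and deserve the merge-or-differentiate treatment described in the quoted guidance.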
Once we got warning, we did nothing. Once rankings started to fall,we filed reinclusion request...rankings fell more, and filed another more robustly written request (got denials within 1 week after each request)until about 20 days ago when we fell off of the face of the earth. 1- should I take this as some sort of sandbox? We are still indexed, and are #1 for a search on our domain name. Also still #1 in bing (big deal) 2- I've done a detailed analysis of every link they provide in GWT. reached out to whatever splog people I could get in touch with asking them to remove articles. I was going to file another request if I didn't reappear after 31 days after I fell off completely. Am I wasting my time? there is no doubt that sabatoge could be committed by competition by blasting them with spam links (previously I believed these would just be ignored by google to prevent sabatoge from becoming part of the job for most SEOs) Laugh at me, gasp in horror with me, or offer some advice... I'm open to chat and would love someone to tell me about a legit solution to this prob if they got one thanks!0