Geo-targeted Organic Search Traffic to a sub-domain
-
For a client of ours, we plan to create a sub-domain targeted at a specific country.
Most of the content on this sub-domain will come from the main site, although with some differentiation to suit that geographic market.
We intend to tell Google through Google Webmaster Tools that the sub-domain is targeted at a specific country. Some questions:
a) Any idea how long it could take before Google gives precedence to the content on this sub-domain for queries originating from that particular country?
b) What is the likely impact of content duplication? What extent of differentiation is necessary from a search engine perspective?
Thanks.
-
If it's not too competitive, it shouldn't take more than 30-60 days for a geo-targeted domain.
There is no case study to look at because each situation is so different.
-
Thank you, Gianluca. Your detailed response is much appreciated.
Would you be able to give any indication of the time it could take for the sub-domain to receive all the search traffic for queries originating in that country?
Are there any case studies or references you could point me to? That would be great.
-
Thank you for your response; it's helpful.
By any chance, are you able to point me to a case study showing how long it took a geo-targeted sub-domain to start receiving all of its traffic directly from the search engines?
Our concern with using a new TLD is the time it will take the domain to acquire authority and attract traffic of its own from the targeted geography.
-
Hi Manoj, in your case I suggest you use the rel="alternate" hreflang="x" geotargeting annotation, in addition to targeting the subdomain to the desired country (and leaving the main site set as "global").
The use of rel="alternate" hreflang="x" is strongly recommended when a website has an "incomplete" international version, which can happen for very different reasons:
- Template translated, but main content in a single language;
- Broadly similar content within a single language, but targeting different countries (e.g. US, UK, Australia…).
But remember that Google suggests using it even when the site content is fully translated (e.g. the entire Spanish version has content in Spanish, and so on).
This annotation, then, seems very appropriate for the Sitecore site.
How to implement it
Two options:
- HTML link element, placed in the <head> section of each page.
In this case, for instance, in the <head> section of www.domain.com we should add as many rel="alternate" hreflang="x" link elements as there are country versions of the site.
E.g.: <link rel="alternate" hreflang="es" href="http://es.domain.com/" />
Please note that if multiple language versions exist (a "set", in Google's terminology), every version in the set must include rel="alternate" hreflang="x" links to every other language version.
E.g.: if we have Global, UK and FR versions of the site in addition to the Spanish one, the Spanish version will have to include link elements along the lines sketched below.
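A minimal sketch of what that set could look like in the <head> of the Spanish homepage; the uk.domain.com and fr.domain.com sub-domains and the hreflang values are assumptions to be adapted to the real site structure:
<!-- hypothetical URLs and hreflang values -->
<link rel="alternate" hreflang="en" href="http://www.domain.com/" />
<link rel="alternate" hreflang="en-GB" href="http://uk.domain.com/" />
<link rel="alternate" hreflang="fr-FR" href="http://fr.domain.com/" />
<link rel="alternate" hreflang="es" href="http://es.domain.com/" />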
Obviously, every single URL must have rel="alternate" hreflang="x" tags pointing to the corresponding URLs of every other language version.
- HTTP header, in the case of non-HTML files (such as PDFs); see the sketch just below.
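For instance, a PDF available in both the global and the Spanish versions might be served with a Link header along these lines (the file name and URLs are hypothetical):
Link: <http://www.domain.com/whitepaper.pdf>; rel="alternate"; hreflang="en", <http://es.domain.com/whitepaper.pdf>; rel="alternate"; hreflang="es"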
As implied above, this annotation works at the page level, not the domain level. That means every single page must be correctly marked up.
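In other words, each page should reference the corresponding page in the other versions, not just the homepages. A hypothetical product page, with paths invented for illustration:
<link rel="alternate" hreflang="en" href="http://www.domain.com/products/widget" />
<link rel="alternate" hreflang="es" href="http://es.domain.com/productos/widget" />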
Same content and same language on different pages and language versions
If, as happens in your case, some pages show almost the same content on both the domain and the subdomain, it is highly recommended to also use rel="canonical" in order to tell Google which version of the URL is preferred.
As Google itself says here, Google will “use that signal to focus on that version in search, while showing the local URLs to users where appropriate. For example, you could use this if you have the same product page in German, but want to target it separately to users searching on the Google properties for Germany, Austria, and Switzerland.”
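To illustrate Google's German example with invented URLs, each country-targeted page could canonicalise to the preferred version while keeping its own hreflang annotations; this is only a sketch of one way to combine the two signals:
<!-- all URLs below are hypothetical -->
<link rel="canonical" href="http://www.example.com/de/produkt" />
<link rel="alternate" hreflang="de-DE" href="http://www.example.com/de-de/produkt" />
<link rel="alternate" hreflang="de-AT" href="http://www.example.com/de-at/produkt" />
<link rel="alternate" hreflang="de-CH" href="http://www.example.com/de-ch/produkt" />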
Don't forget
Don't forget that your main site is set to target the whole web, including the country targeted by your sub-domain.
That means you will need to run an active link building campaign for the sub-domain in order to give it equal, if not greater, strength compared to the main site.
-
As soon as they index it, it will take precedence in that country for geotargeted queries. You can increase the likelihood that your content is treated as differentiated rather than duplicate by using top-level domains and by adding geotargeting keywords to your sub-domain content. See the specific examples below:
Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We're more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.
The source for the above is Google's guidance on duplicate content relating to different countries.
Hope this helps.....