Geo-targeting a sub-folder that's had URLs rewritten from a sub-domain
-
I have a client that's setting up a section of his site in a different language, and we're planning to geo-target those pages to that country. I have suggested a sub-folder solution, as it's the most cost-effective option and it will allow domain authority to flow into those pages.
His developer is indicating that, for technical reasons, they can only set this up as a sub-domain, but they're suggesting they can rewrite the URLs to appear as sub-folder pages.
I'm wondering how this will work in terms of geo-targeting in Google Webmaster Tools. Do I geo-target the sub-domain or the sub-folder? In other words, does Google only see URLs, or does it actually see those pages on the sub-domain?
It seems like it might be a messy solution. Would it be a better idea just to forget about the rewrites and live with the site being a sub-domain?
Thanks,
-
OK. Thanks for the advice, Ryan.
-
My first suggestion is to push further on the "developer" issue. As an SEO, it is important to have the ability to implement recommended changes as required. If the changes are not implemented, for whatever reason, results suffer.
We all work very hard to achieve the best results for our clients. Two common reasons a client might offer for not implementing a change are "my software won't support the change" and "my developer won't support the change". This topic will likely arise again on other matters. Additionally, I recommend a direct line of communication between the SEO and the developer whenever possible. Each party can gain a better understanding and appreciation of the other, miscommunications are minimized, and it simply creates a better working environment.
With the above noted, your recommendation to move the subdomain into the main site is the commonly accepted best practice: you are consolidating your domain authority (DA). While Google has made some recent changes with respect to subdomains, it is still best practice to make the change you have recommended to your client.
If the URLs are properly rewritten at the server level, no one will even know the actual path of the files. Anyone who visits the URL will simply see the page served with a 200 (OK) response code. You can and should test this after it is implemented.
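As a quick sanity check after the rewrites go live, you can classify the header response code each rewritten URL returns. This is a minimal sketch: the `website.com/language-site/` URL in the comment is a hypothetical placeholder, and in practice you would fetch the status code with a library such as `requests` before classifying it.

```python
# In practice, fetch the status code first, e.g.:
#   import requests
#   code = requests.head("http://website.com/language-site/",
#                        allow_redirects=False).status_code

def classify_rewrite(status_code):
    """Return a human-readable verdict for a server-level rewrite test."""
    if status_code == 200:
        return "ok: page served directly at the sub-folder URL"
    if status_code in (301, 302):
        return "redirect: visitors are being bounced, not a true rewrite"
    if status_code == 404:
        return "not found: the rewrite rule is not matching"
    return "unexpected status: %d" % status_code

print(classify_rewrite(200))
# prints "ok: page served directly at the sub-folder URL"
```

A 301/302 here is the key thing to watch for: a redirect means visitors (and Google) end up on the sub-domain URL, which defeats the purpose of the rewrite.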
Robots.txt can be used to block access to the sub-domain if you wish.
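For example, a robots.txt served at the root of the sub-domain itself could block all crawling of the sub-domain URLs (the host name below is a hypothetical placeholder):

```
# Served at http://language.website.com/robots.txt
# (sub-domain name is a hypothetical example)
User-agent: *
Disallow: /
```

Note that crawlers fetch robots.txt per host name, so this only affects URLs on the sub-domain; the rewritten folder URLs on the main domain are governed by website.com/robots.txt and remain crawlable.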
-
Thanks Ryan.
I've no direct contact with the developer, so I can't answer those questions. I'm afraid I just have to work with what my client is telling me.
From what you're saying, if done correctly, the pages would look to Google as if they were in a folder on that domain, e.g. website.com/language-site, and we would geo-target that folder, and not the sub-domain?
Then we'd need to find a way to stop the search engines crawling the sub-domain. Would this be done in the robots.txt file?
Do you think we'd just be better off using the sub-domain and forgetting about the rewrites? The main reason I'm advising him to go for a folder structure is the uncertainty of domain authority flowing to a sub-domain.
-
I firmly believe software and developers should give site owners the freedom to make changes as they see fit. When a developer or a software package cannot readily implement SEO best practices, it's time to look for alternatives.
Is the software being used a particular CMS or e-commerce solution which is in an earlier stage of development? How experienced is the developer?
If the URLs were rewritten (server-side) so the target pages return a normal header response code, the process should work. My biggest concern is ensuring the sub-domain URLs are not crawled; otherwise there would be a duplicate content issue.
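For what it's worth, a server-level rewrite of this kind often looks something like the following Apache sketch. All names here (the domain, sub-domain, and path) are hypothetical placeholders, and the actual rules depend entirely on the developer's stack; a reverse-proxy rule like this also requires mod_proxy to be enabled.

```
# Hypothetical Apache config on website.com:
# serve the sub-domain's pages under a /language-site/ folder path.
RewriteEngine On

# Internally proxy /language-site/... to the sub-domain's content,
# so the visitor's address bar (and Google) only ever sees the folder URL.
RewriteRule ^/?language-site/(.*)$ http://language.website.com/$1 [P,L]
```

The [P] (proxy) flag is what makes this an internal rewrite rather than a visible redirect; if the developer uses [R] instead, visitors would be 301/302-redirected to the sub-domain and the folder URLs would never rank.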