Foreign Language Directories
-
I have a client whose site has each page in multiple languages, with each language in its own directory. Needless to say, each page is showing up with the same site title, metadata, and content. When my campaigns are crawled, these show up as thousands of page errors. Should I add each of these directories to robots.txt? Would this fix the duplicate content issue?
-
George,
I've built websites in 3 different languages without any of the described errors. Each language gets its own title, its own description, and its own content. Did you hand-code everything or did you use a CMS to build the site?
Since every language has its own folder, I would imagine that every language has its own files too. If that is not the case, then I would suggest looking into CMS systems that enable this way of building websites.
Hope this helps some.
Kind regards,
Jarno
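To make Jarno's point concrete, here is a minimal sketch of what the pages in each language directory might carry in their head section; the paths and text are hypothetical, and the hreflang links additionally tell Google the pages are translations of each other rather than duplicates:

```html
<!-- /en/about.html (hypothetical path) -->
<head>
  <title>About Us | Example Co</title>
  <meta name="description" content="Learn about Example Co.">
  <!-- hreflang alternates point to each language version of this page -->
  <link rel="alternate" hreflang="en" href="http://www.example.com/en/about.html">
  <link rel="alternate" hreflang="es" href="http://www.example.com/es/about.html">
  <link rel="alternate" hreflang="fr" href="http://www.example.com/fr/about.html">
</head>
```

Note that blocking the language folders in robots.txt would only hide those pages from crawlers; it wouldn't get the translated pages indexed and ranking properly in their own right.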
Related Questions
-
Is Building a Local Directory of Businesses on a Subdomain Good SEO?
Hello Fellow Moz'ers: I own a small digital shop in a major US city. We have a marketing idea whose soundness I'd like some input on. We are creating a professional services directory of 'digital professional services providers' in our hometown. The directory's membership will only be open to firms located within our city limits. The directory will be curated and maintained, ongoing, by us. Our motivation is 75% selfish and 25% benevolent. The idea is that, by building the directory on our subdomain, we hopefully will collect links, which ultimately will enhance search visibility. But I'm concerned about the devaluation directories have incurred in recent years, and I've even seen advice to the effect that listings in some directories might be harmful to a site's link profile. It is not our intention to harm those who might list in our directory. Any thoughts on this matter would be greatly appreciated!
Intermediate & Advanced SEO | | Daaveey0 -
Language Subdirectory homepage not indexed by Google
Hi mozzers, Our Spanish homepage doesn't seem to be indexed or cached in Google, despite being online for a month or two. All Spanish subpages are indexed and have started to rank, but not the homepage. I have submitted the XML sitemap to Google Webmaster Tools and have checked that there's no noindex on the page; it seems to be in order. And when I run the site: command in Google, it shows all pages except the homepage. What could be the problem? Here's the page: http://www.bosphorusyacht.com/es/
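One quick way to rule out a stray noindex across many pages is a small script. This is just a local sketch using Python's standard library (fetching the pages themselves, e.g. with urllib, is left out), and keep in mind that a noindex can also arrive via the X-Robots-Tag HTTP header, which this doesn't check:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

def has_noindex(html):
    """True if the page's robots meta tags contain a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

print(has_noindex('<head><meta name="robots" content="noindex,follow"></head>'))  # True
print(has_noindex('<head><title>ok</title></head>'))                              # False
```

Running this over the homepage and a few subpages makes it easy to confirm the directives really differ between the pages that rank and the one that doesn't.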
Intermediate & Advanced SEO | | emerald0 -
Novice Question - Can search engines realistically distinguish words within concatenated strings, e.g. text55fun, or should one use text-55-fun? What about foreign languages, especially more obscure ones like Finnish, which Google Translate often mistranslates?
I am attempting to understand what is realistically possible as Google, Yahoo, and Bing search websites for keywords. Technically, my understanding is that they should be able to distinguish common words within concatenated strings, although there can be confusion about word boundaries when ambiguity is involved. So in the simple example of text55fun, do search engines actually distinguish text, 55, and fun separately? There are practical processing, database, and algorithmic limitations that might turn a technically possible solution into an unrealistic one at commercial scale. What about more ambiguous strings like stringsstrummingstrongly: would that be parsed as string s strummings trongly, or strings strummings trongly, or strings strumming strongly? Does one need to use dashes or underscores to make it unambiguous to the search engine? My guess is that the engine would recognize the dash or underscore and better understand the word boundaries, yet ignore the dash or underscore from the overall concatenated-string perspective. Thanks in advance to whoever can provide any insight to an old coder who is new to this field.
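How a real search engine tokenizes runs like this is proprietary, but the underlying idea can be sketched as dictionary-based word segmentation. The word list below is a toy assumption; the point is that segmentation only succeeds when the dictionary and the boundaries cooperate:

```python
def segment(s, words):
    """Split s into the fewest dictionary words (digit runs count as words).

    Returns a list of tokens, or None when no full segmentation exists,
    which is exactly why ambiguous strings without separators are risky.
    """
    best = {0: []}  # best[i] = cheapest segmentation of s[:i] found so far
    for i in range(1, len(s) + 1):
        for j in range(i):
            piece = s[j:i]
            if j in best and (piece in words or piece.isdigit()):
                candidate = best[j] + [piece]
                if i not in best or len(candidate) < len(best[i]):
                    best[i] = candidate
    return best.get(len(s))

lexicon = {"text", "fun", "string", "strings", "strumming", "strongly"}
print(segment("text55fun", lexicon))                # ['text', '55', 'fun']
print(segment("stringsstrummingstrongly", lexicon)) # ['strings', 'strumming', 'strongly']
```

With hyphens (text-55-fun) the boundaries are explicit and none of this guessing is needed; Google has long treated hyphens as word separators (and, historically, underscores as joiners), which is why hyphens are generally the safer choice in URLs.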
Intermediate & Advanced SEO | | ny600 -
Geo targeting - same language, different countries
We appear in the SERPs in US Google, but not in the UK or Australia. We also have non-English translations of the website, and those work very well. Is it a logical option to create uk.domain.com? We can add "United Kingdom" to the descriptions and texts. Would that help in www.google.co.uk? Same with AU. The website content fits all countries.
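One common approach for same-language, multi-country targeting is hreflang with region subtags, with each variant also geo-targeted to its country in Google Webmaster Tools. A sketch, where the domain and subdomains are hypothetical:

```html
<!-- the same block appears on every country variant of the page -->
<link rel="alternate" hreflang="en-US" href="http://www.example.com/">
<link rel="alternate" hreflang="en-GB" href="http://uk.example.com/">
<link rel="alternate" hreflang="en-AU" href="http://au.example.com/">
<link rel="alternate" hreflang="x-default" href="http://www.example.com/">
```

This lets Google show the right variant in each country even when the page content is nearly identical.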
Intermediate & Advanced SEO | | bele0 -
Question about a multi-country/language site
Hi! We are building a site that is going to be available in several countries with the same language (Spanish), and we have some doubts about which is the best way to do it.
Option 1) Subdomains. Example: españa.mydomain.com, mexico.mydomain.com (the problem here is that link building is harder with subdomains).
Option 2) Language folders. Example: mydomain.com/es/es, mydomain.com/es/mx (the problem here is that the category is pushed to third position in the URL, e.g. mydomain.com/es/es/category, which is not recommended for SEO).
Option 3) Country domains. Example: mydomain.es, mydomain.mx (the link building effort is going to be much greater, because we have to multiply the links needed to rank well across the different country domains).
I am not sure which one is the best option; what do you think? The only thing I am sure of is to use the tag rel="alternate" hreflang="x" to avoid duplicate content, because the index and categories are going to be the same; the only thing that changes is the products in each country. Looking forward to your suggestions! Thanks, Regards, Exequiel
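Whichever structure wins out, the rel="alternate" hreflang annotations Exequiel mentions work across subdomains and even across separate country domains, so no choice locks you out of them. A sketch with the question's hypothetical domains:

```html
<!-- the same block appears on each country's version of the page -->
<link rel="alternate" hreflang="es-ES" href="http://mydomain.es/">
<link rel="alternate" hreflang="es-MX" href="http://mydomain.mx/">
<link rel="alternate" hreflang="es" href="http://mydomain.com/es/">
```

The annotations must be reciprocal: every version listed has to carry the same set of links back.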
Intermediate & Advanced SEO | | SeoExpertos0 -
URL Structure for Directory Site
We have a directory that we're building, and we're not sure if we should try to make each page an extension of the root domain or utilize sub-directories as users narrow down their selection. What is the best practice here for maximizing your SERP authority?
Choice #1 - Hyphenated architecture (no sub-folders):
1) State Page /state/
2) City Page /city-state/
3) Business Page /business-city-state/
4) Location Page /locationname-city-state/
or...
Choice #2 - Using sub-folders on drill down:
1) State Page /state/
2) City Page /state/city
3) Business Page /state/city/business/
4) Location Page /locationname-city-state/
Again, just to clarify, I need help in determining the best methodology for achieving the greatest SEO benefit. Just by looking, it would seem that choice #1 would work better because the URLs are very clear and search-engine friendly. But at the same time it may be less intuitive for search. I'm not sure. What do you think?
Intermediate & Advanced SEO | | knowyourbank0 -
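Whichever choice wins, generating the URLs from one place keeps them consistent across the directory. A small sketch of the two schemes; the slug rules here are a simplifying assumption (real slugifiers also strip punctuation and accents):

```python
def slug(*parts):
    """Lowercase the parts and join them with hyphens, spaces become hyphens."""
    return "-".join(p.lower().replace(" ", "-") for p in parts)

def flat_url(business, city, state):
    """Choice #1: everything hyphenated directly off the root."""
    return "/" + slug(business, city, state) + "/"

def nested_url(business, city, state):
    """Choice #2: drill-down sub-folders, one level per facet."""
    return "/{}/{}/{}/".format(slug(state), slug(city), slug(business))

print(flat_url("Acme Plumbing", "San Diego", "CA"))    # /acme-plumbing-san-diego-ca/
print(nested_url("Acme Plumbing", "San Diego", "CA"))  # /ca/san-diego/acme-plumbing/
```

One practical difference: the nested scheme gives every intermediate level (/ca/, /ca/san-diego/) a natural landing page, while the flat scheme has to create those aggregation pages separately.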
Should I robots block this directory?
There are about 43k pages indexed in this directory, and while they're helpful to end users, I don't see them being a great source of unique content for search engines. Would you robots block or meta noindex,nofollow these pages in the /blissindex/ directory? e.g. http://www.careerbliss.com/blissindex/petsmart-index-980481/ http://www.careerbliss.com/blissindex/att-index-1043730/ http://www.careerbliss.com/blissindex/facebook-index-996632/
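For reference, the two options look like this, and they behave differently: robots.txt blocks crawling (pages that are already indexed can linger in the index as URL-only results), while a meta noindex requires the pages to remain crawlable so Google can see the directive and drop them. A sketch:

```
# robots.txt option: stop crawling the directory entirely
User-agent: *
Disallow: /blissindex/

<!-- meta tag option: on each /blissindex/ page, keep it crawlable but noindexed -->
<meta name="robots" content="noindex, follow">
```

Whether to use follow or nofollow in the meta tag depends on whether you want link equity from those 43k pages to keep flowing to the pages they link to.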
Intermediate & Advanced SEO | | CareerBliss0 -
How Does the Google Crawler Cache Orphan Pages and Directories?
I have a website, www.test.com. I made some changes to the live website and uploaded them to a "demo" directory (which was recently created) for client approval. So my demo link is www.test.com/demo/. I am not doing any link building or any activity that would pass a referral link to www.test.com/demo/. So how did the Google crawler find it and cache some pages, or even the entire directory? Thanks
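To keep a staging copy like /demo/ out of the index going forward, a common belt-and-braces sketch is robots.txt plus HTTP authentication; the paths and the password file location below are hypothetical:

```
# robots.txt at www.test.com/robots.txt: ask crawlers to stay out
User-agent: *
Disallow: /demo/

# .htaccess inside /demo/ (Apache): require a login, so crawlers can't fetch it at all
AuthType Basic
AuthName "Client preview"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

The auth step matters because robots.txt is only a request, and a disallowed URL that Google discovers through some other signal can still appear in the index without its content.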
Intermediate & Advanced SEO | | darshit210