Foreign Language Directories
-
I have a client whose site has each page in multiple languages, with each language version in its own directory. Needless to say, each page is showing up with the same site title, meta data, and content. When my campaigns are crawled, they show up as thousands of page errors. Should I add each of these directories to robots.txt? Would this fix the issue of duplicate content?
-
George,
I've built websites in 3 different languages without any of the errors you describe. Each language gets its own title, its own description, and its own content. Did you hand-code everything, or did you use a CMS to build the site?
Since every language has its own folder, I would expect every language to have its own files too. If that is not the case, then I would suggest looking into CMS systems that support building a site this way.
Hope this helps.
Kind regards,
Jarno
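To make that concrete, here is a minimal sketch of what Jarno describes. The URLs, titles, and descriptions below are invented placeholders, and the hreflang links are an extra suggestion (not mentioned in the answer above) that tells search engines the pages are language alternates of each other rather than duplicates:

    <!-- /en/services.html : hypothetical English version -->
    <title>Our Services | Example Co</title>
    <meta name="description" content="Overview of the services Example Co offers.">
    <link rel="alternate" hreflang="en" href="https://www.example.com/en/services.html">
    <link rel="alternate" hreflang="fr" href="https://www.example.com/fr/services.html">

    <!-- /fr/services.html : hypothetical French version -->
    <title>Nos services | Example Co</title>
    <meta name="description" content="Aperçu des services proposés par Example Co.">
    <link rel="alternate" hreflang="en" href="https://www.example.com/en/services.html">
    <link rel="alternate" hreflang="fr" href="https://www.example.com/fr/services.html">

If every language directory serves pages with head markup along these lines, the duplicate title and description warnings should go away, because each URL genuinely has its own title and meta data.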
Related Questions
-
Block subdomain directory in robots.txt
Instead of blocking an entire sub-domain (fr.sitegeek.com) with robots.txt, we would like to block just one directory (fr.sitegeek.com/blog). 'fr.sitegeek.com/blog' and 'www.sitegeek.com/blog' contain the same articles in one language; only the labels are changed for the 'fr' version, and we assume this duplicate content causes a problem for SEO. We would like 'www.sitegeek.com/blog' articles to be crawled and indexed, but not 'fr.sitegeek.com/blog'. So, how can we block a single sub-domain directory (fr.sitegeek.com/blog) with robots.txt? This applies only to the blog directory of the 'fr' version; all other directories and pages of the 'fr' version should still be crawled and indexed. Thanks,
Rajiv
Intermediate & Advanced SEO | gamesecure
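For reference, a minimal robots.txt sketch for this case. robots.txt is read per host, so this file would have to live at fr.sitegeek.com/robots.txt; the robots.txt on www.sitegeek.com stays unchanged so its /blog keeps being crawled:

    User-agent: *
    Disallow: /blog/

Note that robots.txt only blocks crawling; articles from fr.sitegeek.com/blog that are already indexed may linger for a while, and some people prefer to let them be recrawled with a noindex meta tag before adding the robots.txt block.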
Should sub-domains be used to organise content and directories?
I'm working on a site that has directories for service providers and content about those services. My idea is to organise the services into groups, e.g. Web, Graphic, and Software Development, since they are different topics. Each sub-domain (hub) has its own sales pages, directory of service providers, and blog content. E.g. the web hub has web.servicecrowd.com.au (hub home), web.servicecrowd.com.au/blog (hub blog), and http://web.servicecrowd.com.au/dir/p (hub directory). Is this overkill, or will it help in the long run when there are hundreds of services like dog grooming and DJing? It seems better to have separate sub-domains and unique blogs for groups of services and content topics.
Intermediate & Advanced SEO | ServiceCrowd_AU
Multi-Language Blog on a Multi-Language Website
Hello! I need your ideas. My website is in 4 languages, for example domain.com/en, domain.com/fr, and so on up to 4 languages. At the moment I run domain.com/blog, but now I would like to invest in content in all languages. What is your idea? domain.com/en/blog, or domain.com/blog/en/, or blog.domain.com/en/ and blog.domain.com/fr/? Or should I use another word instead of "blog", for example domain.com/en/magazine or domain.com/en/now, or something else? Which would you recommend?
Intermediate & Advanced SEO | leadsprofi
Are links from pages in other languages ok?
Hey everyone, what are your thoughts on this? Say the site is in Canada and is in English, but we have a bunch of French links pointing to the site with English keywords. Is that OK? Will that harm us? Opinions? Facts?
Intermediate & Advanced SEO | jhinchcliffe
Optimize the root domain or a page in a subdirectory?
Hi. My root domain is already optimized for keywords, I would say branded keywords, which I do not really need, as the traffic from these does not give me any revenue (it mostly consists of our employees and returning visitors). Now I have run on-page optimization on the root domain for a set of keywords I like and got good grades (hurray!). But my website still does not show up on search engines for those keywords. I have got pretty good link building done to my root domain, but not for all keywords (only for the branded keywords). It just happened, please do not ask why. So I decided to optimize inside pages in a subdirectory for a new set of keywords I like, starting with link building and giving anchor text on various other websites linking to these particular pages. These pages are not ranked in the top 50 in Google. Is that a good practice? Or, since I do not need those branded keywords, should I re-optimize my root domain to suit my new keywords by giving less preference to the branded keywords? Is this a good practice?
Intermediate & Advanced SEO | MiddleEastSeo
Does anybody know of good SEO success stories in the field of small business directories?
We are helping a small business directory with their SEO. They cover 20 service categories (300 subcategories) with 60,000 profiles. We are focusing on the following elements:
1. Cutting the flab (they have 3.4 million pages indexed but only 30,000 visitors to the website), by removing long lists of cities and by using "nofollow".
2. Improving internal navigation and using anchor texts.
3. Focusing SEO (backlinks) on business category pages.
4. Cleaning up URLs and titles.
5. Implementing rich snippets (Schema.org); a markup sketch follows below.
6. Cleaning data.
If we cannot take traffic volume to 300,000 in a month, this project will be considered a failure. Has any directory achieved this recently? We are in the first 2 weeks of the project, and it would help with our to-do list 😉
Intermediate & Advanced SEO | UnyscapeInfocom
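On point 5, a hedged sketch of what rich-snippet markup for a single directory profile page might look like, using schema.org's LocalBusiness type in JSON-LD. The business name, address, and URL are invented placeholders, and the exact type and properties would depend on the directory's categories:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Plumbing Co",
      "url": "https://www.example-directory.com/profile/example-plumbing-co",
      "telephone": "+1-555-555-0100",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example Street",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701"
      }
    }
    </script>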
Multi-language, multi-country localized website with duplicate content penalty
My company's website is multi-language and multi-country. Content created for the Global site (English-language only, in the root directory) is automatically used when no localization exists for a given language and country choice (e.g. Brazil). I'm concerned this may be harming our SEO through duplicate content penalties. Can anyone confirm this is possible? Any recommendations on how to solve the issue? Maybe the canonical tag? Thanks very much!
Intermediate & Advanced SEO | IanTreviranus
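One hedged way to handle that fallback situation in markup (the URLs below are placeholders, not the company's real paths): while a country page is only re-serving the global English content, point its canonical at the global version; once it carries genuinely localized content, let it self-canonicalize and declare hreflang alternates instead. A minimal sketch:

    <!-- On https://www.example.com/br/widgets/ while it still shows the global English content -->
    <link rel="canonical" href="https://www.example.com/widgets/">

    <!-- Once /br/widgets/ carries localized content -->
    <link rel="canonical" href="https://www.example.com/br/widgets/">
    <link rel="alternate" hreflang="en" href="https://www.example.com/widgets/">
    <link rel="alternate" hreflang="pt-br" href="https://www.example.com/br/widgets/">
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/widgets/">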
Removing URLs in bulk when directory exclusion isn't an option?
I had a bunch of URLs on my site that followed the form http://www.example.com/abcdefg?q=&site_id=0000000048zfkf&l=. There were several million pages, each associated with a different site_id. They weren't very useful, so we've removed them entirely and they now return a 404. The problem is, they're still stuck in Google's index. I'd like to remove them manually, but how? There's no proper directory (i.e. /abcdefg/) to remove, since there's no trailing /, and removing them one by one isn't an option. Is there any other way to approach the problem or specify URLs in bulk? Any insights are much appreciated. Kurus
Intermediate & Advanced SEO | kurus1
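For what it's worth, Googlebot understands simple prefix and wildcard patterns in robots.txt, so one hedged option is to match the shared URL pattern rather than a directory. The path below just mirrors the example in the question:

    User-agent: *
    Disallow: /abcdefg?

Every URL that starts with /abcdefg? would then be blocked as a group. The trade-off is that blocking crawling also stops Google from re-fetching the 404s, so another common route is to leave them crawlable, let the 404s drop out over time, and request removal in Search Console, which can accept a URL prefix rather than individual URLs.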