Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Indexing process for a multi-language website for different countries
-
We are in charge of a website with 7 languages for 16 countries. There are only slight content differences between countries (google.de | google.co.uk). The website is set up with the correct language & country annotations, e.g. de/DE/ | de/CH/ | en/GB/ | en/IE/. All unwanted annotations are blocked by robots.txt. The hreflang alternate annotations are also set.
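For reference, a minimal sketch of what such hreflang annotations typically look like in the head of each page (the domain and paths below are hypothetical placeholders, not the real site):

```html
<!-- Hypothetical URLs; each page lists every language/country variant, including itself -->
<link rel="alternate" hreflang="de-DE" href="https://www.example.com/de/DE/" />
<link rel="alternate" hreflang="de-CH" href="https://www.example.com/de/CH/" />
<link rel="alternate" hreflang="en-GB" href="https://www.example.com/en/GB/" />
<link rel="alternate" hreflang="en-IE" href="https://www.example.com/en/IE/" />
<!-- Fallback for users whose language/region is not otherwise listed -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Note that hreflang annotations must be reciprocal: if the de/DE/ page references the en/GB/ page, the en/GB/ page must reference the de/DE/ page back, or Google may ignore the annotations.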
The objective is to make the website visible in local search engines. Therefore we have submitted an overview sitemap connected with a sitemap per country. The sitemaps have been submitted for quite a while now, but Google has indexed only 10% of the content.
We are looking for suggestions to speed up the indexing process.
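An "overview sitemap connected with a sitemap per country" is normally implemented as a sitemap index file. A generic sketch with placeholder URLs (the real file names and domain will differ):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index: one child sitemap per country/language version -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-de-DE.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-de-CH.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-en-GB.xml</loc>
  </sitemap>
</sitemapindex>
```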
-
Thank you.
-
Just a couple thoughts off the top of my head:
1. Double-check all technical international SEO issues and ensure that the robots.txt file is not mistakenly blocking any desired pages.
2. Make sure that you have a separate Google Webmaster Tools setup for each root domain / subdomain / subdirectory (however you have set up the international sites) and have submitted an individual XML sitemap for each one. Also make sure that the geographical targeting in each GWT setup is set to the desired country.
3. If Google is only indexing a small percentage of a site's pages, it is often because Google thinks (accurately or not) that the site has duplicate content. "Duplicate content" is not a penalty per se -- it is when Google, for example, sees two pages that are very similar and indexes only one of them so as not to return redundant pages in search results.
Example: say that you have an e-commerce product that has ten variations (such as color). The content of each variation page would often be very similar except for the listed color. In that case, you would want to use a rel=canonical tag on all variation pages that points to the main page for that product. (In other words, you don't want all of those pages to be indexed, and Google often would not index them anyway.)
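As a concrete sketch of that rel=canonical pattern (the product URLs here are hypothetical):

```html
<!-- Placed in the <head> of each variation page,
     e.g. /product/widget-blue/ and /product/widget-red/ -->
<link rel="canonical" href="https://www.example.com/product/widget/" />
```

The canonical target page itself usually carries a self-referencing canonical tag pointing to its own URL.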
I would use a tool such as Moz or any other SEO software to crawl the site and see if any duplicate-content issues are present. Once these are addressed (if the problem exists), Google will likely crawl and index your sites more thoroughly and accurately.
I hope this helps -- good luck!
Related Questions
-
Google indexed "Lorem Ipsum" content on an unfinished website
Hi guys. So I recently created a new WordPress site and started developing the homepage. I completely forgot to disallow robots to prevent Google from indexing it and the homepage of my site got quickly indexed with all the Lorem ipsum and some plagiarized content from sites of my competitors. What do I do now? I’m afraid that this might spoil my SEO strategy and devalue my site in the eyes of Google from the very beginning. Should I ask Google to remove the homepage using the removal tool in Google Webmaster Tools and ask it to recrawl the page after adding the unique content? Thank you so much for your replies.
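One standard way to keep an unfinished page out of the index while it is being built is a robots meta tag (a generic pattern, not specific to this site):

```html
<!-- Placed in the <head> of pages that should not be indexed yet;
     remove it once the page has its final content -->
<meta name="robots" content="noindex, nofollow" />
```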
Intermediate & Advanced SEO | Ibis150 -
How To Optimize For Same Word, Different Spelling
Hi all. Just wondering what people's stance is on using multiple variations of keywords on a webpage - those keywords that have the same meaning and search intent, but are just spelt differently, i.e. 'woodscrews' and 'wood screws' (the latter has a significantly higher search volume). You could approach the webpage in 4 different ways: 1. Use ONLY 'wood screws' on-page and in the page title. 2. Use ONLY 'woodscrews' on-page and in the page title. 3. Use BOTH 'wood screws' and 'woodscrews' on-page, and BOTH in the page title. 4. Use BOTH 'wood screws' and 'woodscrews' on-page, but ONLY one variation in the page title. We've run some tests in the past but there were never any clear takeaways, a mixed bag of results really. Also, if they are considered the same keyword by Google, why are the ranking positions always different for each variation? I'm not sure there's a specific answer to this, just interested to hear people's thoughts really. Many thanks in advance! Lee.
Intermediate & Advanced SEO | Webpresence0 -
Can you index a Google doc?
We have updated and added completely new content to our state pages. Our old state content is sitting in our Google Drive. Can I make these public to get them indexed and provide a link back to our state pages? In theory it sounds like a great link building strategy... TIA!
Intermediate & Advanced SEO | LindsayE1 -
How to Target Country Specific Website Traffic?
I have a website with a .com domain but I need to generate traffic from the UK. I have already set my geo-targeting location to the UK in Google Webmasters and set the country location to the UK in Google Analytics as well, but I still get traffic only from India. I have also set geo-targeting code at the backend of the website, but nothing seems to work. Can anyone tell me how I can do this? I am unable to understand what else can be done.
Intermediate & Advanced SEO | seoninj0 -
Why does Moz recommend subdomains for language-specific websites?
In Moz's domain recommendations, they recommend subdirectories instead of subdomains (which agrees with my experience), but make an exception for language-specific websites: Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains. (i.e. www.example.com/blog/ rather than blog.example.com) The notable exceptions to this are language-specific websites. (i.e., en.example.com for the English version of the website). Why are language-specific websites excepted from this advice? Why are subdomains preferable for language-specific websites? Google's advice says subdirectories are fine for language-specific websites, and GSC allows geographic settings at the subdirectory level (which may or may not even be needed, since language-specific sites may not be geographic-specific), so I'm unsure why Moz would suggest using subdirectories in this case.
Intermediate & Advanced SEO | AdamThompson0 -
How to de-index old URLs after redesigning the website?
Thank you for reading. After redesigning my website (5 months ago), in my crawl reports (Moz, Search Console) I still get tons of 404 pages, which all seem to be URLs from my previous website (same root domain). It would be nonsense to 301 redirect them as there are too many URLs. (Or would it be nonsense?) What is the best way to deal with this issue?
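If the old URLs follow a predictable pattern, they can often be redirected in bulk with a single pattern-based rule rather than one 301 per URL. A hypothetical Apache (.htaccess) sketch, assuming an old /shop/ section that moved to /products/:

```apache
# Hypothetical: map every URL under the old /shop/ structure
# to the same path under the new /products/ structure
RedirectMatch 301 ^/shop/(.*)$ /products/$1
```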
Intermediate & Advanced SEO | Chemometec0 -
Pages are Indexed but not Cached by Google. Why?
Here's an example: I get a 404 error for this: http://webcache.googleusercontent.com/search?q=cache:http://www.qjamba.com/restaurants-coupons/ferguson/mo/all But a search for qjamba restaurant coupons gives a clear result as does this: site:http://www.qjamba.com/restaurants-coupons/ferguson/mo/all What is going on? How can this page be indexed but not in the Google cache? I should make clear that the page is not showing up with any kind of error in webmaster tools, and Google has been crawling pages just fine. This particular page was fetched by Google yesterday with no problems, and even crawled again twice today by Google Yet, no cache.
Intermediate & Advanced SEO | friendoffood2 -
Duplicate content on sites from different countries
Hi, we have a client who currently has a lot of duplicate content with their UK and US website. Both websites are geographically targeted (via google webmaster tools) to their specific location and have the appropriate local domain extension. Is having duplicate content a major issue, since they are in two different countries and geographic regions of the world? Any statement from Google about this? Regards, Bill
Intermediate & Advanced SEO | MBASydney0