Indexing process for a multi-language website serving different countries
-
We are in charge of a website with 7 languages for 16 countries. There are only slight content differences between countries (think google.de vs. google.co.uk). The website is set up with the correct language & country annotation, e.g. de/DE/ | de/CH/ | en/GB/ | en/IE/. All unwanted annotations are blocked via robots.txt, and the «hreflang» alternate tags are also set.
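For illustration, the annotations on one of our German pages look roughly like this (domain and URLs simplified, not our real ones):
<!-- Illustrative hreflang annotations on a German page; each page lists itself and every alternate, with x-default shown as the usual fallback -->
<link rel="alternate" hreflang="de-DE" href="https://www.example.com/de/DE/" />
<link rel="alternate" hreflang="de-CH" href="https://www.example.com/de/CH/" />
<link rel="alternate" hreflang="en-GB" href="https://www.example.com/en/GB/" />
<link rel="alternate" hreflang="en-IE" href="https://www.example.com/en/IE/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />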
The objective is to make the website visible in local search engines. We have therefore submitted an overview sitemap (a sitemap index) linking to one sitemap per country. The sitemaps have been submitted for quite a while now, but Google has indexed only 10% of the content.
We are looking for suggestions to speed up the indexing process.
-
Thank you.
-
Just a couple thoughts off the top of my head:
1. Double-check all technical international SEO issues and make sure the robots.txt file is not mistakenly blocking any pages you want indexed (see the robots.txt sketch below the list).
2. Make sure that you have a separate Google Webmaster Tools setup for each root domain / subdomain / subdirectory (however you have set up the international sites) and that you have submitted an individual XML sitemap for each one (an illustrative sitemap index is sketched below). Also make sure that the geographical targeting in each GWT setup is set to the desired country.
3. If Google is only indexing a small percentage of a site's pages, it is often because Google thinks (accurately or not) that the site has duplicate content. "Duplicate content" is not a penalty per se -- it is when Google, for example, sees two pages that are very similar and indexes only one of them so as not to provide redundant pages in search results.
Example: Say that you have an e-commerce product that has ten variations (such as color). The content of each variation page would often be very similar except for the listed color. In that case, you would want to use a rel=canonical tag on all variation pages that points to the main page for that product (see the example tag below). In other words, you don't want all of those pages to be indexed, and Google often would not index them anyway.
I would most likely use a tool such as Moz or other SEO software to crawl the site and see whether any duplicate-content issues are present. Once these are addressed (if the problem exists), Google will likely crawl and index your sites more thoroughly and accurately.
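For point 1, here is a rough sketch of the kind of robots.txt pattern to double-check (paths and domain are illustrative, not taken from your site):
# Hypothetical robots.txt -- illustrative only
User-agent: *
# Intended: block only the unwanted annotations / parameter URLs
Disallow: /search/
Disallow: /*?session=
# A rule like the next one (commented out here) would silently block an entire country section:
# Disallow: /de/CH/
Sitemap: https://www.example.com/sitemap-index.xml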
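For point 2, the overview sitemap you mention would normally be a sitemap index along these lines (file names are assumptions), with each country's GWT property getting its own child sitemap submitted:
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative sitemap index: one child sitemap per language/country version -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemaps/de-DE.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemaps/de-CH.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemaps/en-GB.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemaps/en-IE.xml</loc></sitemap>
</sitemapindex>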
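And for the color-variation example in point 3, the canonical tag on each variation page would look something like this (URLs are made up):
<!-- On /products/widget-red, /products/widget-blue, etc. -- all variations point to the main product page -->
<link rel="canonical" href="https://www.example.com/products/widget" />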
I hope this helps -- good luck!
Related Questions
-
Website removed from Bing and Yahoo
Hello, Our website howtoremove.guide was recently removed from the Bing and Yahoo index. The first thing we did was contact Bing Webmaster support to ask what the issue was since we did not get any notifications or messages in our webmaster dashboard. The email that we got back said “I have escalated the issue to our engineers and will get back to you once I receive an update.” Since then, we haven't received any word back from them, but we did not find any technical problems and we strongly believe we were manually penalized. We've never had issues with a search engine before, so we are at a loss as to what to do. Could you please give us advice as to what technical issue our website might have or what could incur a deindexing penalty in our case? We want to do everything possible to get back into Bing and Yahoo search results ASAP. The website has primarily affiliate content, so we are doing everything we can to clean it up, but any recommendations will be incredibly useful to us. We are also open to contacting an expert on this, but we have no idea where to look.
Intermediate & Advanced SEO | ThreatAnalyzer
-
News articles on our website are being indexed, but not showing up for search queries.
News articles on distributed.com are being indexed by Google, but not showing up for any search queries. In Google Search, I can copy and paste the entire first paragraph of the article, and the listing still won't show up in search results. For example, https://distributed.com/news/dtcc-moves-closer-blockchain-powered-trades doesn't rank AT ALL for "DTCC Moves Closer to Blockchain-Powered Trades", the title of the article. We've tried the following so far: re-submitted the sitemap to Search Console, checked manual actions in Search Console, and checked for any noindex/nofollow tags. Please help us solve this SEO mystery!
Intermediate & Advanced SEO | BTC_Inc
-
Duplicate content across different domains in different countries?
Hi Guys, We have 4 sites: one each in NZ, the UK, Canada, and Australia, all geo-targeting their respective countries in Google Search Console. The sites are identical. We recently added the same content to all 4 sites. Will this cause duplicate content issues, or any issues at all, even though they are in different countries and geo-targeting is set? Cheers.
Intermediate & Advanced SEO | wickstar
-
Pointing hreflang to a different domain
Hi all, Let's say I have two websites: www.mywebsite.com and www.mywebsite.de - they share a lot of content but the main categories and URLs are almost always different. Am I right in saying I can't just set the hreflang tag on every page of www.mywebsite.com to read: <link rel='alternate' hreflang='de' href='http://mywebsite.de' /> That just won't do anything, right? Am I also right in saying that the only way to use hreflang properly across two domains is to have a custom hreflang tag on every page that has identical content translated into German? So for this page: www.mywebsite.com/page.html my hreflang tag for the German users would be: <link rel='alternate' hreflang='de' href='http://mywebsite.de/page.html' /> Thanks for your time.
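(For completeness: hreflang across two domains only works when both pages carry matching return tags, so the full reciprocal pair for that example would look roughly like this - domains are the ones from the question, the self-referencing tags are an assumption:)
<!-- On http://www.mywebsite.com/page.html -->
<link rel='alternate' hreflang='en' href='http://www.mywebsite.com/page.html' />
<link rel='alternate' hreflang='de' href='http://mywebsite.de/page.html' />
<!-- Return tags on http://mywebsite.de/page.html -->
<link rel='alternate' hreflang='en' href='http://www.mywebsite.com/page.html' />
<link rel='alternate' hreflang='de' href='http://mywebsite.de/page.html' />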
Intermediate & Advanced SEO | Bee159
-
Is it safe to link my websites together?
Hi Everyone, I have 10 websites which are all of good standing and related. My visitors would benefit from knowing about the other websites, but I don't want to trigger a Google penalty by linking them all together. Ideally I'd also like to pass on importance (link equity) through the links. How would you proceed in this situation? Advice would be greatly appreciated, Peter.
Intermediate & Advanced SEO | RoyalBlueCoffee
-
Google is Really Slow to Index my New Website
(Sorry for my English!) A quick background: I had a website at thewebhostinghero.com which had been slapped left and right by Google (both Panda & Penguin). It also had a manual penalty for unnatural links which had been lifted in late April / early May this year. I also had another domain, webhostinghero.com, which was redirecting to thewebhostinghero.com. When I realized I would be better off starting a new website than trying to salvage thewebhostinghero.com, I removed the redirection from webhostinghero.com and started building a new website. I waited about 5 or 6 weeks before putting any content on webhostinghero.com so Google had time to notice that the domain wasn't redirecting anymore. So about a month ago, I launched http://www.webhostinghero.com with 100% new content but I left thewebhostinghero.com online because it still brings a little (necessary) income. There are no links between the websites except on one page (www.thewebhostinghero.com/speed/) which is set to "noindex,nofollow" and is disallowed to search engines in robots.txt. I made sure the web page was deindexed before adding a "nofollow" link from thewebhostinghero.com/speed => webhostinghero.com/speed. Since the new website launch, I've been publishing new content (from 2 to 5 posts) daily. It's getting some traction from social networks but it gets barely any clicks from Google search. It seems to take at least a week before Google indexes new posts and not all posts are indexed. The cached copy of the homepage is 12 days old. In Google Webmaster Tools, it looks like Google isn't getting the latest sitemap version unless I resubmit it manually. It's always 4 or 5 days old. So is my website just too young or could it have some kind of penalty related to the old website? The domain has 4 or 5 really old spammy links from the previous domain owner which I couldn't get rid of, but otherwise I don't think there's anything tragic.
Intermediate & Advanced SEO | sbrault74
-
Different TITLEs for the same page appear for different keywords
Hi there, Can anyone please advise on this funny/strange issue? I have a title on the home page. When I type some of my keywords, the homepage appears in the SERP with a shortened TITLE (just one keyword in it). But when I type the company name, the full TITLE appears. Could anybody please advise what the problem could be and how to fix it?
Intermediate & Advanced SEO | fleetway
-
Can I use the same source for two different websites?
I have developed a successful portal-based website but would like to grow my portfolio of sites by expanding into new niches and sectors. I would like to use the same source code to fast-track new sites, but I'm not sure of the dangers involved. Content, meta details, etc. will all be unique, and the only similarity will be the HTML code. Another example of how I want to use this: my current site targets the UK, but I want to target a global market with a .com domain, and this would involve using the same source. Is this possible without a penalty, or am I overlooking something?
Intermediate & Advanced SEO | Mulith