How do you optimize for different Google search domains (google.de, google.ch)?
-
We all use the German language and .com domains for our sites.
I rank well in google.com, but not so well in google.de and google.ch, where my competitors rank much better.
I checked most of their outbound links, but found little information.
Do links from .de domains, or links from sites hosted in Germany, help rankings in a specific Google search domain (google.de, google.ch)?
Or are there other factors I missed? Please help.
-
Hi there,
In order to increase your rankings in German-speaking search results, it's recommended that you:
- Use the rel="alternate" hreflang="de" annotation, as specified by Google, to tell them that this content is in German and targeted at German-speaking users (there's an example snippet after this list).
- Build links from German-speaking websites: blogs, communities, forums, news sites, etc., especially those hosted on German IPs and using the German (.de) or Swiss (.ch) ccTLDs.
- Since you're using a .com domain with the same content for all German-speaking countries (both Germany and Switzerland), your site is language targeted rather than country targeted, so you can't geotarget it in Google Webmaster Tools. Geotargeting would be an additional option if you had, for example, a German version targeting Germany at yourdomain.com/de/ or one targeting Switzerland at yourdomain.com/ch/, so it's worth knowing about that setting if you ever do. Moreover, if you switch to a country-targeted structure, instead of a generic domain such as .com the best option is a .de domain targeting Germany and a .ch domain targeting Switzerland.
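For reference, here's a minimal sketch of what those hreflang annotations could look like in the <head> of each page. The yourdomain.com URLs and the /de/ and /ch/ paths are just the placeholders from the point above, not your real structure:
<link rel="alternate" hreflang="de" href="http://www.yourdomain.com/" />
<!-- the next two only apply if you actually build country-specific versions: -->
<link rel="alternate" hreflang="de-DE" href="http://www.yourdomain.com/de/" />
<link rel="alternate" hreflang="de-CH" href="http://www.yourdomain.com/ch/" />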
These additional signals should help!
-
Hi Yiqing,
You're right about getting links from domains like .de and .ch; they will increase your authority within the German market. As I would imagine (though we never know for sure), when you're a German site Google will give more weight to the Zeitung than to the BBC, as the Zeitung is a far more popular news source in Germany than the BBC is.
I'm not sure which site we're talking about, but I would also suggest taking a look at the hreflang tag. As this help article from Google (support.google.com/webmasters/bin/answer.py?hl=en&answer=189077) already suggests, it's used by Google to determine which language is used for which countries.
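As far as I know, the same hreflang annotations can also be supplied through an XML sitemap instead of each page's <head>, which can be easier if you can't touch the templates. A rough sketch, again with placeholder URLs:
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.yourdomain.com/</loc>
    <xhtml:link rel="alternate" hreflang="de" href="http://www.yourdomain.com/" />
    <xhtml:link rel="alternate" hreflang="de-CH" href="http://www.yourdomain.com/ch/" />
  </url>
  <!-- repeat a <url> entry for every page, each listing all of its language/country alternates -->
</urlset>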
Lately a couple of great blog posts have been written around the topic of internationalization; I'd recommend reading up on those as well.
Hope this helps a bit!
-
If you are running multiple language sites, make sure you set your geographic target in Webmaster Tools. Also make sure your content is well written in the local language, and if possible I think it could help to have a local contact address in that country displayed somewhere on the website.
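On the local contact address: one way to make it machine-readable as well is schema.org markup. Purely an illustration, the company name and address below are made up:
<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Example GmbH</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">Musterstrasse 1</span>,
    <span itemprop="postalCode">10115</span> <span itemprop="addressLocality">Berlin</span>,
    <span itemprop="addressCountry">DE</span>
  </div>
</div>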
Hope this helps slightly
James