Blocking domestic Google crawlers in robots.txt
-
Hey,
I want to block Google.co.uk from crawling a site but want Google.de to crawl it.
I know how to configure robots.txt to block Google and other engines - is there a way to block only certain domestic crawlers?
Any ideas?
Thanks
B
-
Thanks, guys, for all of the help.
I think we will just implement cross-domain GeoIP redirects to ensure users get the right location and currency.
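Roughly along these lines - a minimal sketch using Python/Flask and the geoip2 library, with hypothetical domain names and a local GeoLite2 country database assumed, not a finished implementation:

```python
# Minimal sketch of a cross-domain GeoIP redirect (hypothetical domains).
# Assumes the GeoLite2-Country database file and the geoip2 package.
import geoip2.database
import geoip2.errors
from flask import Flask, redirect, request

app = Flask(__name__)
reader = geoip2.database.Reader("GeoLite2-Country.mmdb")

COUNTRY_SITES = {
    "GB": "https://www.example.co.uk",  # UK visitors -> UK site
    "DE": "https://www.example.de",     # German visitors -> German site
}

@app.before_request
def geo_redirect():
    try:
        country = reader.country(request.remote_addr).country.iso_code
    except geoip2.errors.AddressNotFoundError:
        return None  # unknown IP: stay on the current domain
    target = COUNTRY_SITES.get(country)
    # Redirect only when the visitor landed on the "wrong" domain.
    if target and not request.host_url.startswith(target):
        return redirect(target + request.path, code=302)
    return None
```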
Cheers
-
Are you having the issue of your .de pages ranking in google.co.uk instead of your .co.uk pages?
If that's the case, I'd look at using hreflang both on-page and in the XML sitemaps. That will give Googlebot a better view of the country-language targeting for the site.
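On-page it looks something like this (a minimal sketch with hypothetical example.co.uk / example.de URLs):

```html
<!-- In the <head> of the UK page (hypothetical URLs) -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.de/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.co.uk/widgets/" />
```

The same annotations can go in the XML sitemap instead, e.g.:

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.co.uk/widgets/</loc>
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets/"/>
    <xhtml:link rel="alternate" hreflang="de-de" href="https://www.example.de/widgets/"/>
  </url>
</urlset>
```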
-
Hi, country-specific search engine spiders cannot be blocked using the robots.txt file or any other method. However, you can block certain IP ranges pertaining to certain countries.
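For example, with Apache 2.4 a range can be denied in .htaccess like this (a sketch using a placeholder documentation range, not a real country allocation):

```apacheconf
# Block one example IP range; 198.51.100.0/24 is a placeholder
# documentation range, not an actual country's allocation.
<RequireAll>
    Require all granted
    Require not ip 198.51.100.0/24
</RequireAll>
```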
Best regards,
Devanur Rafi
-
Hi Gareth,
I don't think this is going to work, as every Google crawler uses the same user agent: Googlebot. What you could do, but what I really wouldn't recommend, is generating the robots.txt automatically: check whether the requesting IP address is in another country and serve a Disallow accordingly. It probably won't work either, as the crawler for, let's say, Germany could also be used for the UK.
Also, the data for a country's search engine gets collected first and is then evaluated to decide what to serve users in each country; it's not the case that content gets crawled separately for each specific country.
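In other words, robots.txt can only address the shared user agent, so a rule like the sketch below (with a hypothetical path) blocks Googlebot for every country version at once:

```
# robots.txt - there is no country-specific Googlebot user agent
# (no "Googlebot-UK" or "Googlebot-DE"), so this blocks them all.
User-agent: Googlebot
Disallow: /uk-only-section/
```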
Hope this helps!
Related Questions
-
Google Search Console International Targeting - Works for Hungary, but not Ireland - Why?
company.com (root): USA - lang="en" | GSC target: USA (shows US site in SERPs for "companyname" search)
company.com/hungary: Hungary - lang="hu" | GSC target: Hungary (shows Hungarian site in SERPs for "companyname" search) - AWESOMENESS
company.com/ireland: Ireland - lang="en" | GSC target: Ireland (shows US site (doh!) in SERPs for "companyname" search) - NOT RIGHT!
It is our theory [please weigh in!] that because we don't have a company.com/usa folder, the TLD targeting (EN) is overriding other English language sites in some manner. In other words, the reason it's not overriding Hungary is because it's a different language. What must we do to get the Irish site ranked for "companyname" searches and to show by default in Ireland?
International SEO | scottclark
-
Will hreflang with a language and region allow Google to show the page to all users of that language regardless of region?
I'm launching translations on a website with the first translation being Brazilian Portuguese. If I use the following hreflang: If a user is outside of Brazil and has their browser language set to just Portuguese (Not Portuguese (Brazil)) will Google still serve them the Portuguese version of my pages in search results?
International SEO | Brando16
-
Does Google's algorithm work the same in every country?
I can't help but feel this is a silly question! But does Google's algorithm work exactly the same throughout all countries? I run a few sites in the UK and a couple in Spain but can't help but feel that my Spanish sites are harder to rank for. The sites that rank the best are business directories in Spain... whereas here in the UK you'd be lucky to find one on page one.
International SEO | david.smith.segarra
-
Ranking UK company in Google.com
Hi all, I have a UK client with a .com domain, hosted on a US server, but the physical business premises is based in the UK. Their product is a really great product and available for export to the US. I want to rank them higher in the US, more specifically Google.com. I've helped them rank very well organically in the UK (google.co.uk) for some great terms, however they rank almost nowhere in google.com (gl=us) for the same terms, for example: In Google.co.uk they rank #3 for the key-phrase. In Google.com they rank #90 for the same key-phrase. I've got them some great US focused links with PR coverage including MSN Cars, nydailynews.com etc. I just wondered if there was any one "golden ticket" for boosting US rankings? I've read that a physical business premises located in the US helps a lot. Can anyone confirm this and if so, would a rented PO box in the US help? The site has great social signals too, growing twitter following and many FB likes/shares etc. Any other tips/advice? Thanks in advance,
Cheers,
Woody 🙂
International SEO | seowoody
-
Getting ranked in French on Google UK?
Hellooooo the Moz community! (#superexcited, #firstpost) Here's my problem. I'm working for a client specialised in Corporate Relocation to London for French families. (I'm reworking the entire site from the ground up, so I can manoeuvre pretty easily.) The thing is, these families will either be: searching on Google FR but mostly in English (French as well), or searching on Google UK but mostly in French! (and of course, English as well) To be honest, I'm really not sure what strategy I should go with. Should I just target each local market in its native language, and Google will pick up the right language if people are searching in the "opposite" language? I'd love some tips to help get me started. Sadly, I don't have a lot of data yet. (Client didn't even have tracking up on their site before I came in.) So far here's what I got (on a very small number of visitors): Location: 50+% from UK / 20+% from France. Language: 60+% En / 35+% Fr. Thank you. Tristan
International SEO | detailedvision
-
Non US site pages indexed in US Google search
Hi, We are having a global site-wide issue with non-US site pages being indexed by Google and served up in US search results. Conversely, we have US en pages showing in the Japan Google search results. We currently use IP detection to direct users to the correct regional site, but it isn't effective if the users are entering through an incorrect regional page. At the top of each of our pages we have a drop-down menu to allow users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing these other regional pages as US and not detecting it due to our URL structure? Below are examples of two of our URLs for reference - one from Canada, the other from the US: /ca/en/prod4130078/2500058/catalog50008/ /us/en/prod4130078/2500058/catalog20038/ If that is, in fact, what is happening, would setting the links within the drop-down to 'nofollow' address the problem? Thank you. Angie
International SEO | Corel
-
Multi-lingual SEO: Country-specific TLDs, or migration to a huge .com site?
Dear SEOmoz team, I’m an in-house SEO looking after a number of sites in a competitive vertical. Right now we have our core example.com site translated into over thirty different languages, with each one sitting on its own country-specific TLD (so example.de, example.jp, example.es, example.co.kr etc…). Though we’re using a template system so that changes to the .com domain propagate across all languages, over the years things have become more complex in quite a few areas. For example, the level of analytics script hacks and filters we have created in order to channel users through to each language profile is now bordering on the epic. For a number of reasons we’ve recently been discussing the cost/benefit of migrating all of these languages into the single example.com domain. On first look this would appear to simplify things greatly; however I’m nervous about what effect this would have on our organic SE traffic. All these separate sites have cumulatively received years of on/off-site work, and even if we went through the process of setting up page-for-page redirects to their new home on example.com, I would hate to lose all this hard-work (and business) if we saw our rankings tank as a result of the move. So I guess the question is, for an international business such as ours, which is the optimal site structure in the eyes of the search engines; Local sites on local TLD’s, or one mammoth site with language identifiers in the URL path (or subdomains)? Is Google still so reliant on TLD for geo targeting search results, or is it less of a factor in today’s search engine environment? Cheers!
International SEO | linklater
-
Geo Targeting for Similar Sites to Specific Countries in Google's Index
I was hoping Webmaster Tools geo targeting would prevent this - I'm seeing in select Google searches several pages indexed from our Australian website. Both sites have unique TLDs: barraguard.com and barraguard.com.au. I've attached a screenshot as an example. The sites are both hosted here in the U.S. at our data center. Are there any other methods for preventing Google and other search engines from indexing the barraguard.com.au pages in searches that take place in the U.S.?
International SEO | longbeachjamie