Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!
-
Hi,
I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed.
Example: The games are not allowed in the USA, but they are allowed in Canada.
Present Situation:
Presently, when a user from the USA visits the site they are redirected to a restricted location page with the following message:
RESTRICTED LOCATION
Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!
Because USA visitors are blocked, Google, which primarily (but not always) crawls from the USA, is also blocked, so the company's webpages are not being crawled and indexed.
Objective / What we want to achieve:
The website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]
If a user from the USA or another restricted location tries to access our site, they should not have access but should get a restricted access message.
However, we still want Google to be able to access, crawl, and index our pages.
Can you suggest how we do this without getting done for cloaking, etc.?
Would this approach be ok? (please see below)
We continue as in the present situation, showing visitors from the USA a restricted message.
However, rather than redirecting these visitors to a restricted location page, we just black out the page and show them a floating message, as if it were a modal window.
Meanwhile, Googlebot would be allowed to visit and crawl the website.
I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and it's a restricted paid page. All public pages are accessible, but only if the visitor is from a location that is not restricted.
Any feedback and direction that can be given would be greatly appreciated, as I am new to this angle of SEO.
Sincere thanks,
-
To ensure SEO compliance while restricting access to certain countries, follow these three steps, and keep in mind that they are critical if you want to work on a multinational, multilingual site:
Page Blackout for Restricted Visitors: Instead of redirecting users, black out the content and display a message (a sketch follows at the end of this answer). For example, https://fifamobilefc.com/ shows a message to users from restricted countries while allowing Google to crawl the pages.
Implement Paywall Schema: Use paywall schema markup to signal to Google that content is restricted but not cloaked. This helps maintain transparency with search engines.
Geo-Targeting: Employ geo-targeting to identify and present the message to users from restricted countries, while still allowing Google to access the content.
By applying these methods, you can maintain SEO compliance while effectively restricting access to users from certain countries. Regular monitoring via Google Search Console ensures continued adherence to best practices.
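As a rough sketch of how steps 1 and 3 can fit together: the full page content is served to every visitor, including Googlebot, and only a client-side overlay differs. The /api/geo endpoint below is a hypothetical IP-geolocation lookup (any geo-IP service would do), so treat this as an illustration under those assumptions rather than a finished implementation:

<!-- Hypothetical sketch: content is in the HTML for everyone; the overlay
     is revealed only when the geolocation lookup reports a restricted country. -->
<div id="geo-overlay"
     style="display:none; position:fixed; inset:0; background:rgba(0,0,0,0.95);
            color:#fff; align-items:center; justify-content:center; text-align:center;">
  <p>RESTRICTED LOCATION<br>
     Due to licensing restrictions, we can't currently offer our services in your location.</p>
</div>
<script>
  const RESTRICTED = ["US"]; // countries where no license is held
  fetch("/api/geo")          // hypothetical endpoint returning e.g. {"country": "US"}
    .then((res) => res.json())
    .then(({ country }) => {
      if (RESTRICTED.includes(country)) {
        // display:flex centres the message and covers the whole viewport
        document.getElementById("geo-overlay").style.display = "flex";
        document.body.style.overflow = "hidden"; // block scrolling behind the overlay
      }
    })
    .catch(() => { /* fail open: if the lookup fails, leave the page usable */ });
</script>

Because the restriction is applied by geolocation only (never by user agent), Googlebot crawling from an allowed or unknown location sees the same HTML as every other visitor, which is the point of keeping all users on one URL.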
-
@MarkCanning
By blacking out the page for visitors from restricted locations while allowing Googlebot access, you're ensuring compliance without hindering indexing. Implementing paywall schema can further clarify to Google that the restriction is based on licensing rather than cloaking. Just ensure consistent implementation across all restricted pages and adhere to Google's guidelines to avoid any issues.
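If you do try that markup, a minimal sketch of Google's paywalled-content structured data (isAccessibleForFree with a hasPart selector) might look like the snippet below. The .restricted-content selector is a hypothetical class on the section hidden from restricted visitors, and note that later replies in this thread question whether this markup fits a geo-restriction case at all:

<!-- Hypothetical sketch: marks the geo-gated section as not freely accessible -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "isAccessibleForFree": false,
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".restricted-content"
  }
}
</script>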
-
@George_Inoriseo Hi George, I submitted a previous reply on here but can't see it anywhere.
Firstly, thank you for your feedback. I have some extra questions.
Let's assume we have a Canadian version of the website and a US human visitor tries to visit that site or any page on the site. They should be able to browse to the site, but an overlay would appear, meaning they cannot use the site or proceed any further. The overlay would say the site is restricted in their location. I see other companies doing this. How would Google handle this:
- Could Googlebot proceed to crawl the website, or would the JavaScript overlay prevent it from crawling and indexing?
- If Googlebot were to compare a hash of the page it sees against a hash of what a user sees, would they be the same? I believe a big difference in the hashes is a signal for cloaking, because it shows the information / page size is substantially different.
- Would it be wise to avoid user-agent lookups in the code? Again, I believe this can signal to Google that manipulation is taking place.
I heard from a Google official that paywall schema might not be a great method:
"Paywall markup would not be suited here since there's no log-in or payment that can be done to get access when in the wrong country."
Thanks
-
@George_Inoriseo Thanks very much, George.
The website will have a .com domain and then subfolders will branch off that for different countries / languages. So the structure would be like this:
domain.com
domain.com/en-ca (English Canada)
domain.com/fr-ca (French Canada)
The company has licenses for certain countries, and in countries where they don't have a license to operate (e.g. USA), users visiting our sites from those countries should not be able to play. So on our Canadian website, if we detect a user is from the USA (where we don't have a license), the user should get a message telling them they can't play. They should be able to visit the site, but the website would sniff the location and tell them that they can't play, with the website blacked out.
As you suggested, we could have a JavaScript overlay that loads if the user is from the USA. I assume this would only look at the geolocation and not the user agent? Looking up the user agent would be a clear sign we are doing something different for users and Googlebot, would it not? Would an overlay restrict Googlebot from crawling the site? And because the user is seeing something different to Googlebot, could this be perceived as cloaking?
I spoke to someone at Google regarding paywall schema and the feeling was this: "paywall markup would not be suited since there is no log-in or payment that can be done to get access when in the wrong country".
Thanks again George.
-
@MarkCanning here is what I would do:
Avoid Redirects for Blocked Regions: Instead of redirecting users from blocked regions to a different page, use a client-side overlay (like a modal window) to display the restricted access message. This method keeps all users on the same URL.
Implement Paywall Schema: Applying the paywall schema is a smart move. It informs Google that your content restrictions are based on user location, not pay-to-access barriers, which helps avoid penalties for cloaking.
Ensure Accessible Content for Googlebot: Allow Googlebot to crawl the original content. Ensure that your site’s robots.txt file permits Googlebot to access the URLs of region-specific pages.
Use hreflang Tags for Multi-Region Sites: For multiple language and region versions, use hreflang tags to help Google understand the geographic and language targeting of your pages (see the snippet after this list). This will also prevent duplicate content issues.
Monitor and Adapt: Keep an eye on Google Search Console to monitor how these changes affect your site's indexing and adjust your strategies as needed.
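For illustration, a minimal hreflang set for the folder structure described in this thread might look like the snippet below. All URLs are placeholders from the examples above, and each language/region version of a page should carry the same full set, including a self-referencing entry:

<!-- Placed in the <head> of every version of the page -->
<link rel="alternate" hreflang="en-ca" href="https://domain.com/en-ca/" />
<link rel="alternate" hreflang="fr-ca" href="https://domain.com/fr-ca/" />
<link rel="alternate" hreflang="es-mx" href="https://domain.com/es-mx/" />
<link rel="alternate" hreflang="pt-br" href="https://domain.com/pt-br/" />
<link rel="alternate" hreflang="hi-in" href="https://domain.co.in/hi/" />
<link rel="alternate" hreflang="x-default" href="https://domain.com/" />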
This strategy should help you manage SEO for restricted content effectively, while staying compliant with Google’s guidelines.
Best of luck!