Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!
-
Hi,
I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed.
Example: The games are not allowed in the USA, but they are allowed in Canada.

Present Situation:
Presently, when a user from the USA visits the site, they are redirected to a restricted-location page with the following message:

RESTRICTED LOCATION
Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!

Because USA visitors are blocked, Google, which primarily (but not always) crawls from the USA, is also blocked, so the company's webpages are not being crawled and indexed.
Objective / What we want to achieve:
The website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]

If a user from the USA or another restricted location tries to access our site, they should not have access but should instead see a restricted-access message.
However, we still want Google to be able to access, crawl, and index our pages.
Can you suggest how we do this without being penalized for cloaking?
Would this approach be OK? (Please see below.)
We would continue as the present situation does, showing visitors from the USA a restricted message.
However, rather than redirecting these visitors to a restricted-location page, we would just black out the page and show them a floating message, as if it were a modal window.
Googlebot, meanwhile, would be allowed to visit and crawl the website.

I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and it's a restricted paid page. All public pages are accessible, but only if the visitor is from a location that is not restricted.
Any feedback and direction that can be given would be greatly appreciated, as I am new to this angle of SEO.
Sincere thanks,
-
To ensure SEO compliance while restricting access to certain countries, follow these three steps; they are critical to follow if you want to work on multinational and multilingual sites:
Page Blackout for Restricted Visitors: Instead of redirecting users, black out the content and display a message. For example, https://fifamobilefc.com/ shows a message to users from restricted countries while allowing Google to crawl the pages.
Implement Paywall Schema: Use paywall schema markup to signal to Google that content is restricted but not cloaked. This helps maintain transparency with search engines.
Geo-Targeting: Employ geo-targeting to identify and present the message to users from restricted countries, while still allowing Google to access the content.
By applying these methods, you can maintain SEO compliance while effectively restricting access to users from certain countries. Regular monitoring via Google Search Console ensures continued adherence to best practices.
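As a rough sketch of how steps 1 and 3 fit together: the decision to show the overlay should depend only on a country code your server or CDN has already resolved from the visitor's IP. The function name and the restricted-country list below are illustrative, not a real API:

```javascript
// Decide whether to show the restricted-location overlay.
// The decision is based only on the visitor's country code,
// never on the user agent, so Googlebot crawling from a US IP
// is treated exactly like any other US visitor.
const RESTRICTED_COUNTRIES = new Set(["US"]); // illustrative list

function shouldShowOverlay(countryCode) {
  if (!countryCode) return false; // unknown location: fail open
  return RESTRICTED_COUNTRIES.has(countryCode.toUpperCase());
}

console.log(shouldShowOverlay("US")); // true
console.log(shouldShowOverlay("CA")); // false
console.log(shouldShowOverlay(null)); // false
```

The key point is that the same full HTML is served to everyone at the same URL; this check only decides whether the client-side overlay appears on top of it.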
-
@MarkCanning said in Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!:
By blacking out the page for visitors from restricted locations while allowing Googlebot access, you're ensuring compliance without hindering indexing. Implementing paywall schema can further clarify to Google that the restriction is based on licensing rather than cloaking. Just ensure consistent implementation across all restricted pages and adhere to Google's guidelines to avoid any issues.
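The paywall schema mentioned here is the markup Google documents for paywalled content: an `isAccessibleForFree` flag plus a `hasPart` pointing at the gated section. A rough sketch of generating that JSON-LD follows; the `.geo-restricted` selector and the headline are placeholders, and a Google contact quoted later in this thread doubts that paywall markup fits a geo-restriction at all:

```javascript
// Sketch of schema.org paywalled-content structured data,
// serialized as JSON-LD for embedding in the page <head>.
// ".geo-restricted" is a placeholder selector for the section
// hidden behind the overlay.
function buildPaywallJsonLd(headline) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "WebPage",
    headline,
    isAccessibleForFree: false,
    hasPart: {
      "@type": "WebPageElement",
      isAccessibleForFree: false,
      cssSelector: ".geo-restricted",
    },
  });
}

console.log(buildPaywallJsonLd("Casino games"));
```

The output would go inside a `<script type="application/ld+json">` tag on every restricted page.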
-
@George_Inoriseo Hi George, I submitted a previous reply on here but can't see it anywhere.
Firstly, thank you for your feedback. I have some extra questions.
Let's assume we have a Canadian version of the website and a US human visitor tries to visit that site or any page on it. They should be able to browse to the site, but an overlay would appear, meaning they cannot use the site or proceed any further. The overlay would say the site is restricted in their location. I see other companies doing this. How would Google handle this?
- Could Googlebot proceed to crawl the website, or would the JavaScript overlay prevent it from crawling and indexing?
- If Googlebot were to compare the hash of the page against the hash of what a user sees, would they be the same? I believe that if there is a big difference in the hashes, this is a signal of cloaking, because it shows the information / page size is substantially different.
- Would it be wise to avoid user-agent lookups in the code? Again, I believe this can signal to Google that manipulation is taking place.
I heard from a Google official that paywall schema might not be a great method:
"Paywall markup would not be suited here since there's no log-in or payment that can be done to get access when in the wrong country."

Thanks
-
@George_Inoriseo Thanks very much, George.
The website will have a .com domain, and then subfolders will branch off that for different countries / languages. So the structure would be like this:
domain.com
domain.com/en-ca (English Canada)
domain.com/fr-ca (French Canada)

The company has licenses for certain countries, and in countries where they don't have a license to operate (e.g. the USA), users visiting our sites from those countries should not be able to play. So on our Canadian website, if we detect a user is from the USA (where we don't have a license), the user should get a message telling them they can't play. They should be able to visit the site OK, but the website would sniff the location and tell them that they can't play, with the website blacked out.
As you suggested, we could have a JavaScript overlay that loads if the user is from the USA. I assume this would only look at the geolocation and not the user agent? Looking up the user agent would be a clear sign we are doing something different for users and Googlebot, would it not? Would an overlay restrict Googlebot from crawling the site, and because the user is seeing something different to Googlebot, could this be perceived as cloaking?
I spoke to someone at Google regarding paywall schema and the feeling was this: "paywall markup would not be suited since there is no log-in or payment that can be done to get access when in the wrong country".
Thanks again George.
-
@MarkCanning here is what I would do:
Avoid Redirects for Blocked Regions: Instead of redirecting users from blocked regions to a different page, use a client-side overlay (like a modal window) to display the restricted access message. This method keeps all users on the same URL.
Implement Paywall Schema: Applying the paywall schema is a smart move. It informs Google that your content restrictions are based on user location, not pay-to-access barriers, which helps avoid penalties for cloaking.
Ensure Accessible Content for Googlebot: Allow Googlebot to crawl the original content. Ensure that your site’s robots.txt file permits Googlebot to access the URLs of region-specific pages.
Use hreflang Tags for Multi-Region Sites: For multiple language and region versions, use hreflang tags to help Google understand the geographic and language targeting of your pages. This will also prevent duplicate content issues.
Monitor and Adapt: Keep an eye on Google Search Console to monitor how these changes affect your site's indexing and adjust your strategies as needed.
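The hreflang annotations in step 4 could be generated along these lines. The locale-to-URL map reuses the examples from this thread, and the choice of the .com root as the x-default target is an assumption about where non-matching visitors should land:

```javascript
// Build the hreflang <link> alternates for one page across its
// regional variants. Every variant of the page must emit the
// same full set (including itself) for the annotations to be
// reciprocal, which Google requires.
function hreflangLinks(path, variants, defaultBase) {
  const links = Object.entries(variants).map(
    ([lang, base]) =>
      `<link rel="alternate" hreflang="${lang}" href="${base}${path}" />`
  );
  links.push(
    `<link rel="alternate" hreflang="x-default" href="${defaultBase}${path}" />`
  );
  return links.join("\n");
}

const variants = {
  "en-ca": "https://domain.com/en-ca",
  "fr-ca": "https://domain.com/fr-ca",
  "es-mx": "https://domain.com/es-mx",
  "pt-br": "https://domain.com/pt-br",
  "hi-in": "https://domain.co.in/hi",
};

console.log(hreflangLinks("/games", variants, "https://domain.com"));
```

Each page's `<head>` would carry the output for its own path.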
This strategy should help you manage SEO for restricted content effectively, while staying compliant with Google’s guidelines.
Best of luck!