Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!
-
Hi,
I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed.
Example: The games are not allowed in the USA, but they are allowed in Canada.
Present Situation:
Presently, when a user from the USA visits the site they get redirected to a restricted location page with the following message:
RESTRICTED LOCATION
Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!
Because USA visitors are blocked, Google - which primarily (but not always) crawls from the USA - is also blocked, so the company webpages are not being crawled and indexed.
Objective / What we want to achieve:
The website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]
If a user from the USA or another restricted location tries to access our site, they should not have access but should get a restricted access message.
However, we still want Google to be able to access, crawl and index our pages.
Can anyone suggest how we do this without getting penalised for cloaking etc.?
Would this approach be OK? (Please see below.)
We continue as we do at present, showing visitors from the USA a restricted message.
However, rather than redirecting these visitors to a restricted location page, we just black out the page and show them a floating message, as if it were a modal window.
Meanwhile, Googlebot would be allowed to visit and crawl the website.
I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and it's a restricted paid page. All public pages are accessible, but only if the visitor is from a location that is not restricted.
Any feedback and direction would be greatly appreciated, as I am new to this angle of SEO.
Sincere thanks,
-
To ensure SEO compliance while restricting access to certain countries, follow these three steps; they are critical if you want to run a multinational, multilingual site:
Page Blackout for Restricted Visitors: Instead of redirecting users, black out the content and display a message. For example, https://fifamobilefc.com/ shows a message to users from restricted countries while allowing Google to crawl the pages.
Implement Paywall Schema: Use paywall schema markup to signal to Google that content is restricted but not cloaked. This helps maintain transparency with search engines.
Geo-Targeting: Employ geo-targeting to identify and present the message to users from restricted countries, while still allowing Google to access the content (a minimal sketch combining this with the blackout overlay follows this list).
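For illustration, here is a minimal sketch of how the blackout overlay and the geo-check could fit together. It assumes the server injects a visitorCountry value from a GeoIP lookup; the variable name, country list, and styling are illustrative assumptions, not a definitive implementation:

<script>
  // Sketch only: the full page content is always served; an overlay is
  // added client-side when the visitor's GeoIP country is restricted.
  // "visitorCountry" is assumed to be written into the page per request
  // by a server-side GeoIP lookup (hypothetical name).
  var visitorCountry = "US";   // assumed injected by the server template
  var restricted = ["US"];     // countries without a licence

  document.addEventListener("DOMContentLoaded", function () {
    if (restricted.indexOf(visitorCountry) !== -1) {
      var overlay = document.createElement("div");
      overlay.style.cssText =
        "position:fixed;top:0;left:0;right:0;bottom:0;" +
        "background:rgba(0,0,0,0.9);color:#fff;z-index:9999;" +
        "display:flex;align-items:center;justify-content:center;" +
        "text-align:center;padding:2em;";
      overlay.textContent =
        "Due to licensing restrictions, we can't currently offer our " +
        "services in your location.";
      document.body.appendChild(overlay);
    }
  });
</script>

Because the underlying HTML is identical for every visitor and the check is keyed to country rather than user agent, Googlebot fetches the same page everyone else gets.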
By applying these methods, you can maintain SEO compliance while effectively restricting access to users from certain countries. Regular monitoring via Google Search Console ensures continued adherence to best practices.
-
@MarkCanning By blacking out the page for visitors from restricted locations while allowing Googlebot access, you're ensuring compliance without hindering indexing. Implementing paywall schema can further clarify to Google that the restriction is based on licensing rather than cloaking. Just ensure consistent implementation across all restricted pages and adhere to Google's guidelines to avoid any issues.
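For reference, a minimal sketch of the paywalled-content structured data that Google documents, using isAccessibleForFree on a CreativeWork with hasPart marking the gated section. The .geo-restricted selector is a placeholder for whatever element wraps the blocked content, and note that a later reply in this thread quotes Google staff suggesting this markup may not suit a pure geo-restriction, so verify before relying on it:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "isAccessibleForFree": false,
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".geo-restricted"
  }
}
</script>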
-
@George_Inoriseo Hi George, I submitted a previous reply on here but can't see it anywhere.
Firstly, thank you for your feedback. I have some extra questions.
Let's assume we have a Canadian version of the website and a US human visitor tries to visit that site or any page on it. They should be able to browse to the site, but an overlay would appear, meaning they cannot use the site or proceed any further. The overlay would say the site is restricted in their location. I see other companies doing this. How would Google handle this:
- Could it proceed to crawl the website, or would the JavaScript overlay prevent Googlebot from crawling and indexing?
- If Googlebot were to compare a hash of the page it receives against a hash of what a user sees, would they be the same? I believe a big difference in the hash is a signal for cloaking, because it shows the information / page size is substantially different.
- Would it be wise to avoid user-agent lookups in the code? Again, I believe this can signal to Google that manipulation is taking place.
I heard from a Google official that paywall schema might not be a great method:
"Paywall markup would not be suited here since there's no log-in or payment that can be done to get access when in the wrong country."
Thanks
-
@George_Inoriseo Thanks very much, George.
The website will have a .com domain and then subfolders will branch off that for different countries / languages. So the structure would be like this:
domain.com
domain.com/en-ca (English Canada)
domain.com/fr-ca (French Canada)
The company has licenses for certain countries, and in countries where they don't have a license to operate (e.g. the USA), users visiting our sites from those countries should not be able to play. So on our Canadian website, if we detect a user is from the USA (where we don't have a license), the user should get a message telling them they can't play. They should be able to visit the site OK, but the website would sniff the location and tell them they can't play, with the page blacked out.
As you suggested, we could have a JavaScript overlay that loads if the user is from the USA. I assume this would only look at the geolocation and not the user agent? Looking up the user agent would be a clear sign we are doing something different for users and Googlebot, would it not? Would an overlay restrict Googlebot from crawling the site, and because the user is seeing something different to Googlebot, could this be perceived as cloaking?
I spoke to someone at Google regarding paywall schema and the feeling was this: "paywall markup would not be suited since there is no log-in or payment that can be done to get access when in the wrong country".
Thanks again George.
-
@MarkCanning Here is what I would do:
Avoid Redirects for Blocked Regions: Instead of redirecting users from blocked regions to a different page, use a client-side overlay (like a modal window) to display the restricted access message. This method keeps all users on the same URL.
Implement Paywall Schema: Applying the paywall schema is a smart move. It informs Google that your content restrictions are based on user location, not pay-to-access barriers, which helps avoid penalties for cloaking.
Ensure Accessible Content for Googlebot: Allow Googlebot to crawl the original content. Ensure that your site’s robots.txt file permits Googlebot to access the URLs of region-specific pages.
Use hreflang Tags for Multi-Region Sites: For multiple language and region versions, use hreflang tags to help Google understand the geographic and language targeting of your pages. This will also prevent duplicate content issues (a sketch follows this list).
Monitor and Adapt: Keep an eye on Google Search Console to monitor how these changes affect your site's indexing and adjust your strategies as needed.
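As a sketch, the head of each variant could carry the full set of alternates, including a self-reference, using the placeholder URLs from the question (codes follow the language-REGION convention):

<link rel="alternate" hreflang="en-CA" href="https://domain.com/en-ca/" />
<link rel="alternate" hreflang="fr-CA" href="https://domain.com/fr-ca/" />
<link rel="alternate" hreflang="es-MX" href="https://domain.com/es-mx/" />
<link rel="alternate" hreflang="pt-BR" href="https://domain.com/pt-br/" />
<link rel="alternate" hreflang="hi-IN" href="https://domain.co.in/hi/" />
<link rel="alternate" hreflang="x-default" href="https://domain.com/" />

The same block appears on every listed URL, so the annotations confirm each other bidirectionally.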
This strategy should help you manage SEO for restricted content effectively, while staying compliant with Google’s guidelines.
Best of luck!